ERIC Educational Resources Information Center
Cacciatore, Kristen L.
2010-01-01
This research entails integrating two novel approaches for enriching student learning in chemistry into the context of the general chemistry laboratory. The first is a pedagogical approach based on research in cognitive science and the second is the green chemistry philosophy. Research has shown that inquiry-based approaches are effective in…
A Mixed Kijima Model Using the Weibull-Based Generalized Renewal Processes
2015-01-01
Generalized Renewal Processes are useful for modeling the rejuvenation of dynamical systems resulting from planned or unplanned interventions. We present new perspectives on Generalized Renewal Processes in general and on the Weibull-based Generalized Renewal Processes in particular. Departing from the literature, we present a mixed Generalized Renewal Processes approach involving Kijima Type I and II models, allowing one to infer the impact of distinct interventions on the performance of the system under study. The first and second theoretical moments of this model are introduced, as well as its maximum likelihood estimation and random sampling approaches. To illustrate the usefulness of the proposed Weibull-based Generalized Renewal Processes model, real data sets involving improving, stable, and deteriorating systems are used. PMID:26197222
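The Kijima virtual-age recursions at the heart of this mixed model are standard; a minimal sketch under a Weibull baseline follows. The paper's specific mixed parametrization and estimation routines are not reproduced here, and the `conditional_survival` helper is an illustrative building block, not the authors' code.

```python
import numpy as np

def virtual_ages(times_between_failures, q, kijima_type=1):
    """Virtual ages after each repair under Kijima Type I or II.

    Type I:  v_i = v_{i-1} + q * x_i   (repair rejuvenates only the last sojourn)
    Type II: v_i = q * (v_{i-1} + x_i) (repair rejuvenates the whole history)
    q = 0 recovers a renewal process (as good as new);
    q = 1 recovers a minimal-repair process (as bad as old).
    """
    v = 0.0
    ages = []
    for x in times_between_failures:
        v = v + q * x if kijima_type == 1 else q * (v + x)
        ages.append(v)
    return np.array(ages)

def conditional_survival(t, v, beta, eta):
    """Weibull survival of the next sojourn given virtual age v:
    S(t | v) = S(v + t) / S(v) for a Weibull(beta, eta) baseline."""
    return np.exp(((v / eta) ** beta) - (((v + t) / eta) ** beta))
```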
A Generalized Approach for Measuring Relationships Among Genes.
Wang, Lijun; Ahsan, Md Asif; Chen, Ming
2017-07-21
Several methods for identifying relationships among pairs of genes have been developed. In this article, we present a generalized approach for measuring relationships between any pair of genes, based on statistical prediction. We derive two particular versions of the generalized approach: least squares estimation (LSE) and nearest neighbors prediction (NNP). We prove mathematically that LSE is equivalent to correlation-based methods, and NNP approximates one popular method, the maximal information coefficient (MIC), according to performance in simulations and on a real dataset. Moreover, the approach based on statistical prediction can be extended from two-gene relationships to multi-gene relationships. This extension would help to identify relationships among multiple genes.
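A minimal sketch of the two prediction-based scores, assuming expression profiles as NumPy vectors with distinct values; the function names and the choice k=3 are illustrative, not the paper's exact estimators.

```python
import numpy as np

def lse_score(x, y):
    """R^2 of the least-squares fit y ~ a + b*x; for a simple linear fit this
    equals the squared Pearson correlation, matching the claim that LSE is
    equivalent to correlation-based measures."""
    b, a = np.polyfit(x, y, 1)          # slope, intercept
    resid = y - (a + b * x)
    return 1.0 - resid.var() / y.var()

def nnp_score(x, y, k=3):
    """Prediction-based score using the k nearest neighbours of x to predict y."""
    preds = np.array([
        y[np.argsort(np.abs(x - xi))[1:k + 1]].mean()  # index 0 is the point itself
        for xi in x
    ])
    return 1.0 - ((y - preds) ** 2).mean() / y.var()
```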
"Celebration of the Neurons": The Application of Brain Based Learning in Classroom Environment
ERIC Educational Resources Information Center
Duman, Bilal
2007-01-01
The purpose of this study is to investigate approaches and techniques related to how brain-based learning is used in the classroom. This general purpose was addressed through the following questions: (1) What is the aim of brain-based learning? (2) What general approaches and techniques does brain-based learning use? and (3) How should brain-based learning be used…
NASA Technical Reports Server (NTRS)
Wallis, Graham B.
1989-01-01
Some features of two recent approaches to two-phase potential flow are presented. The first approach is based on a set of progressive examples that can be analyzed using common techniques, such as conservation laws, and taken together appear to lead in the direction of a general theory. The second approach is based on variational methods, a classical approach to conservative mechanical systems that has a respectable history of application to single-phase flows. This latter approach, exemplified by several recent papers by Geurst, appears generally to be consistent with the former approach, at least in those cases for which it is possible to obtain comparable results. Each approach has a justifiable theoretical base and is self-consistent. Moreover, both approaches appear to give the right prediction for several well-defined situations.
The Future of Computer-Based Toxicity Prediction: Mechanism-Based Models vs. Information Mining Approaches
When we speak of computer-based toxicity prediction, we are generally referring to a broad array of approaches which rely primarily upon chemical structure ...
A novel task-oriented optimal design for P300-based brain-computer interfaces.
Zhou, Zongtan; Yin, Erwei; Liu, Yang; Jiang, Jun; Hu, Dewen
2014-10-01
Objective. The number of items of a P300-based brain-computer interface (BCI) should be adjustable in accordance with the requirements of the specific tasks. To address this issue, we propose a novel task-oriented optimal approach aimed at increasing the performance of general P300 BCIs with different numbers of items. Approach. First, we proposed a stimulus presentation with variable dimensions (VD) paradigm as a generalization of the conventional single-character (SC) and row-column (RC) stimulus paradigms. Furthermore, an embedding design approach was employed for any given number of items. Finally, based on the score-P model of each subject, the VD flash pattern was selected by a linear interpolation approach for a certain task. Main results. The results indicate that the optimal BCI design consistently outperforms the conventional approaches, i.e., the SC and RC paradigms. Specifically, there is significant improvement in the practical information transfer rate for a large number of items. Significance. The results suggest that the proposed optimal approach would provide useful guidance in the practical design of general P300-based BCIs.
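The practical information transfer rate referenced here is conventionally computed with Wolpaw's formula; a short sketch follows (the paper's score-P model and interpolation step are not reproduced).

```python
import math

def wolpaw_itr_bits_per_min(n_items, accuracy, seconds_per_selection):
    """Information transfer rate (Wolpaw et al.), bits/selection scaled to bits/min:

    B = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))
    """
    n, p = n_items, accuracy
    if p >= 1.0:
        bits = math.log2(n)
    elif p <= 0.0:
        bits = 0.0
    else:
        bits = (math.log2(n) + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits * 60.0 / seconds_per_selection

# e.g. a 36-item speller at 90% accuracy, one selection every 10 s:
print(wolpaw_itr_bits_per_min(36, 0.9, 10.0))
```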
ERIC Educational Resources Information Center
Walby, Nathan
2011-01-01
Teaching musical vocabulary in a middle school general music class can often be challenging to the performance-based teacher. This article provides several teaching strategies for approaching words from both a theoretical and a practical standpoint. Based on a dialectical "this-with-that" approach by Estelle Jorgensen, this article argues that…
Phase recovery in temporal speckle pattern interferometry using the generalized S-transform.
Federico, Alejandro; Kaufmann, Guillermo H
2008-04-15
We propose a novel approach based on the generalized S-transform to retrieve optical phase distributions in temporal speckle pattern interferometry. The performance of the proposed approach is compared with those given by well-known techniques based on the continuous wavelet, the Hilbert transforms, and a smoothed time-frequency distribution by analyzing interferometric data degraded by noise, nonmodulating pixels, and modulation loss. The advantages and limitations of the proposed phase retrieval approach are discussed.
Vote Stuffing Control in IPTV-based Recommender Systems
NASA Astrophysics Data System (ADS)
Bhatt, Rajen
Vote stuffing is a general problem in content rating-based recommender systems. IPTV viewers currently browse content based on program ratings. In this paper, we propose a fuzzy clustering-based approach to remove the effects of vote stuffing and consider only the genuine ratings for programs across multiple genres. The approach requires only one authentic rating, which is generally available from recommendation system administrators or program broadcasters. The entire process is automated using fuzzy c-means clustering. Computational experiments performed on a real-world program rating database show that the proposed approach is very efficient at controlling vote stuffing.
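A minimal sketch of the fuzzy c-means step and the authentic-rating filter described above, assuming one-dimensional rating vectors; the two-cluster choice and the 0.5 membership cutoff are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means: returns cluster centers and membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))        # memberships sum to 1 per point
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        e = 2.0 / (m - 1.0)
        U = 1.0 / (d ** e * (1.0 / d ** e).sum(axis=1, keepdims=True))
    return centers, U

def genuine_ratings(ratings, authentic):
    """Keep ratings from the cluster whose center is closest to the one
    authentic rating; the rest are treated as stuffed votes."""
    X = np.asarray(ratings, dtype=float).reshape(-1, 1)
    centers, U = fuzzy_c_means(X)
    k = np.argmin(np.abs(centers.ravel() - authentic))
    return X[U[:, k] > 0.5].ravel()
```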
Lee, Kai-Hui; Chiu, Pei-Ling
2013-10-01
Conventional visual cryptography (VC) suffers from a pixel-expansion problem, or an uncontrollable display quality problem for recovered images, and lacks a general approach to constructing visual secret sharing schemes for general access structures. We propose a general and systematic approach to address these issues without sophisticated codebook design. This approach can be used for binary secret images in non-computer-aided decryption environments. To avoid pixel expansion, we design a set of column vectors to encrypt secret pixels rather than using the conventional VC-based approach. We begin by formulating a mathematical model for the VC construction problem to find the column vectors for the optimal VC construction, after which we develop a simulated-annealing-based algorithm to solve the problem. The experimental results show that the display quality of the recovered image is superior to that of previous approaches.
NASA Astrophysics Data System (ADS)
Austin, Rickey W.
In Einstein's theory of special relativity (SR), one method to derive relativistic kinetic energy is to apply the classical work-energy theorem to relativistic momentum. This approach starts with the classical work-energy theorem and applies SR's momentum to the derivation; one outcome is relativistic kinetic energy, from which it is straightforward to form a kinetic-energy-based time dilation function. In deriving general relativity, a common approach is to bypass classical laws as a starting point; instead, a rigorous development of differential geometry and Riemannian space is constructed, from which classical laws are derived. This contrasts with SR's approach of starting from classical laws and applying the consequence that all observers measure the same speed of light. A possible method to derive time dilation due to Newtonian gravitational potential energy (NGPE) is to apply SR's approach to deriving relativistic kinetic energy. It is shown that this method agrees to first order with the Schwarzschild metric. The SR kinetic energy and the newly derived NGPE terms are combined to form a Riemannian metric based on these two energies. A geodesic is derived and calculations are compared to Schwarzschild's geodesic for a test mass orbiting a central, non-rotating, non-charged massive body. The new metric yields highly accurate calculations when compared to the predictions of Einstein's general relativity. The new method provides a candidate approach for starting with classical laws and deriving general relativistic effects, mimicking SR's method of starting with classical mechanics when deriving relativistic equations. As a complement to introducing general relativity, it provides a plausible scaffolding method from classical physics for teaching introductory general relativity. A straightforward path from classical laws to general relativity is derived, with at least first-order accuracy relative to Schwarzschild's solution to Einstein's field equations.
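A compressed sketch of the derivation path the abstract describes, using only standard results (the paper's full metric construction is not reproduced):

```latex
% Work-energy theorem with relativistic momentum p = \gamma m v:
KE = \int F\,dx = \int \frac{d(\gamma m v)}{dt}\, v\,dt = (\gamma - 1)mc^2
\quad\Rightarrow\quad
\frac{d\tau}{dt} = \frac{1}{\gamma} = \left(1 + \frac{KE}{mc^2}\right)^{-1}.

% Applying the same template to the Newtonian potential \Phi = -GM/r gives,
% to first order, the Schwarzschild time dilation:
\frac{d\tau}{dt} \approx 1 + \frac{\Phi}{c^2} = 1 - \frac{GM}{rc^2}
\approx \sqrt{1 - \frac{2GM}{rc^2}}.
```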
78 FR 1690 - Semiannual Agenda of Regulations
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-08
... organizations subject to the advanced approaches capital rules, a supplementary leverage ratio that incorporates... risk-based and leverage capital requirements. Regulatory Capital Rules (Part 2): Standardized Approach... (``Standardized Approach NPR'') includes proposed changes to the agencies' general risk-based capital requirements...
Real-time simulation of biological soft tissues: a PGD approach.
Niroomandi, S; González, D; Alfaro, I; Bordeu, F; Leygue, A; Cueto, E; Chinesta, F
2013-05-01
We introduce here a novel approach for the numerical simulation of nonlinear, hyperelastic soft tissues at the kilohertz feedback rates necessary for haptic rendering. This approach is based upon proper generalized decomposition (PGD) techniques, a generalization of PODs. Proper generalized decomposition techniques can be considered a means of a priori model order reduction and provide a physics-based meta-model without the need for prior computer experiments. The suggested strategy is thus composed of an offline phase, in which a general meta-model is computed, and an online evaluation phase, in which the results are obtained in real time. Results are provided that show the potential of the proposed technique, together with benchmark tests that show the accuracy of the method. Copyright © 2013 John Wiley & Sons, Ltd.
Pourhassan, Mojgan; Neumann, Frank
2018-06-22
The generalized travelling salesperson problem is an important NP-hard combinatorial optimization problem for which meta-heuristics, such as local search and evolutionary algorithms, have been used very successfully. Two hierarchical approaches with different neighbourhood structures, namely a Cluster-Based approach and a Node-Based approach, have been proposed by Hu and Raidl (2008) for solving this problem. In this paper, local search algorithms and simple evolutionary algorithms based on these approaches are investigated from a theoretical perspective. For local search algorithms, we point out the complementary abilities of the two approaches by presenting instances where they mutually outperform each other. Afterwards, we introduce an instance which is hard for both approaches when initialized on a particular point of the search space, but where a variable neighbourhood search combining them finds the optimal solution in polynomial time. Then we turn our attention to analysing the behaviour of simple evolutionary algorithms that use these approaches. We show that the Node-Based approach solves the hard instance of the Cluster-Based approach presented in Corus et al. (2016) in polynomial time. Furthermore, we prove an exponential lower bound on the optimization time of the Node-Based approach for a class of Euclidean instances.
Hierarchical structure of biological systems: a bioengineering approach.
Alcocer-Cuarón, Carlos; Rivera, Ana L; Castaño, Victor M
2014-01-01
A general theory of biological systems, based on a few fundamental propositions, allows a generalization of both the Wiener and Bertalanffy approaches to theoretical biology. Here, a biological system is defined as a set of self-organized, differentiated elements that interact pair-wise through various networks and media, isolated from other sets by boundaries. Their relation to other systems can be described as a closed loop in a steady state, which leads to a hierarchical structure and functioning of the biological system. Our thermodynamical approach of hierarchical character can be applied to biological systems of varying sizes through some general principles, based on the exchange of energy, information, and/or mass from and within the systems.
A General and Flexible Approach to Estimating the Social Relations Model Using Bayesian Methods
ERIC Educational Resources Information Center
Ludtke, Oliver; Robitzsch, Alexander; Kenny, David A.; Trautwein, Ulrich
2013-01-01
The social relations model (SRM) is a conceptual, methodological, and analytical approach that is widely used to examine dyadic behaviors and interpersonal perception within groups. This article introduces a general and flexible approach to estimating the parameters of the SRM that is based on Bayesian methods using Markov chain Monte Carlo…
Generalized pseudopotential approach for electron-atom scattering.
NASA Technical Reports Server (NTRS)
Zarlingo, D. G.; Ishihara, T.; Poe, R. T.
1972-01-01
A generalized many-electron pseudopotential approach is presented for electron-neutral-atom scattering problems. A calculation based on this formulation is carried out for the singlet s-wave and p-wave electron-hydrogen phase shifts with excellent results. We compare the method with other approaches as well as discuss its applications for inelastic and rearrangement collision problems.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-20
... every $100 of current generally applicable leverage exposure based on a group of advanced approaches... approaches adopted by the agencies in July, 2013 (2013 revised capital approaches), the agencies established... organizations subject to the advanced approaches risk-based capital rules. In this notice of proposed rulemaking...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-18
... received. Table of Contents I. Introduction A. Statutory Framework B. Consultations C. Approach to Drafting.... Generally B. Consistency With CFTC Approach IV. Paperwork Reduction Act A. Summary of Collections of... that may rely on security-based swaps to manage risk and reduce volatility. C. Approach to Drafting the...
ERIC Educational Resources Information Center
Esmaily, Hamideh M.; Silver, Ivan; Shiva, Shadi; Gargani, Alireza; Maleki-Dizaji, Nasrin; Al-Maniri, Abdullah; Wahlstrom, Rolf
2010-01-01
Introduction: An outcome-based education approach has been proposed to develop more effective continuing medical education (CME) programs. We have used this approach in developing an outcome-based educational intervention for general physicians working in primary care (GPs) and evaluated its effectiveness compared with a concurrent CME program in…
An Open Trial of an Acceptance-Based Behavior Therapy for Generalized Anxiety Disorder
ERIC Educational Resources Information Center
Roemer, Lizabeth; Orsillo, Susan M.
2007-01-01
Research suggests that experiential avoidance may play an important role in generalized anxiety disorder (GAD; see Roemer, L., & Orsillo, S.M. (2002). "Expanding our conceptualization of and treatment for generalized anxiety disorder: Integrating mindfulness/acceptance-based approaches with existing cognitive-behavioral models." "Clinical…
Model-Based Prognostics of Hybrid Systems
NASA Technical Reports Server (NTRS)
Daigle, Matthew; Roychoudhury, Indranil; Bregon, Anibal
2015-01-01
Model-based prognostics has become a popular approach to solving the prognostics problem. However, almost all work has focused on prognostics of systems with continuous dynamics. In this paper, we extend the model-based prognostics framework to hybrid systems models that combine both continuous and discrete dynamics. In general, most systems are hybrid in nature, including those that combine physical processes with software. We generalize the model-based prognostics formulation to hybrid systems, and describe the challenges involved. We present a general approach for modeling hybrid systems, and overview methods for solving estimation and prediction in hybrid systems. As a case study, we consider the problem of conflict (i.e., loss of separation) prediction in the National Airspace System, in which the aircraft models are hybrid dynamical systems.
Sumter, Takita Felder; Owens, Patrick M
2011-01-01
The need for a revised curriculum within the life sciences has been well established. One strategy to improve student preparation in the life sciences is to redesign introductory courses like biology, chemistry, and physics so that they better reflect their disciplinary interdependence. We describe a medically relevant, context-based approach to teaching second-semester general chemistry that demonstrates the interdisciplinary nature of biology and chemistry. Our innovative method provides a model in which disciplinary barriers are diminished early in the undergraduate science curriculum. The course is divided into three principal educational modules: 1) Fundamentals of General Chemistry, 2) Medical Approaches to Inflammation, and 3) Neuroscience as a connector of chemistry, biology, and psychology. We accurately anticipated that this modified approach to teaching general chemistry would enhance student interest in chemistry and bridge the perceived gaps between biology and chemistry. The course serves as a template for context-based, interdisciplinary teaching that lays the foundation needed to train 21st century scientists. Copyright © 2010 Wiley Periodicals, Inc.
[GSH fermentation process modeling using entropy-criterion based RBF neural network model].
Tan, Zuoping; Wang, Shitong; Deng, Zhaohong; Du, Guocheng
2008-05-01
The prediction accuracy and generalization of GSH fermentation process models are often degraded by noise in the corresponding experimental data. To avoid this problem, we present a novel RBF neural network modeling approach based on an entropy criterion. Compared with traditional MSE-criterion-based parameter learning, it considers the whole distribution structure of the training data set in the parameter learning process, and thus effectively avoids weak generalization and over-learning. The proposed approach is then applied to GSH fermentation process modeling. Our results demonstrate that the proposed method has better prediction accuracy, generalization, and robustness, and thus offers potential merit for GSH fermentation process modeling.
A generalized least-squares framework for rare-variant analysis in family data.
Li, Dalin; Rotter, Jerome I; Guo, Xiuqing
2014-01-01
Rare variants may, in part, explain some of the heritability missing from current genome-wide association studies. Many gene-based rare-variant analysis approaches proposed in recent years are aimed at population-based samples, although analysis strategies for family-based samples are clearly warranted, since the family-based design has the potential to enrich for rare causal variants. We recently developed the generalized least squares sequence kernel association test, or GLS-SKAT, approach for rare-variant analyses in family samples, in which the kinship matrix computed from high-dimensional genetic data is used to decorrelate the family structure. We then applied the SKAT-O approach for gene-/region-based inference in the decorrelated data. In this study, we applied the GLS-SKAT method to the systolic blood pressure data in the simulated family sample distributed by the Genetic Analysis Workshop 18. We compared the GLS-SKAT approach to the rare-variant analysis approach implemented in family-based association test-v1 and demonstrated that the GLS-SKAT approach provides superior power and good control of the type I error rate.
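A minimal sketch of the decorrelation step, assuming phenotype covariance proportional to kinship plus independent noise; the variance-component estimation and the downstream SKAT-O call are not reproduced, and the function signature is illustrative.

```python
import numpy as np

def decorrelate(y, X, K, sigma_g2, sigma_e2):
    """Whiten phenotype y and covariates X for family structure.

    Assumes cov(y) = sigma_g2 * (2*K) + sigma_e2 * I, where K is the kinship
    matrix; the variance components would come from a null mixed model.
    After whitening, standard (population-sample) kernel tests apply.
    """
    V = sigma_g2 * 2.0 * K + sigma_e2 * np.eye(len(y))
    L = np.linalg.cholesky(V)
    y_star = np.linalg.solve(L, y)   # L^{-1} y has (approximately) iid errors
    X_star = np.linalg.solve(L, X)
    return y_star, X_star
```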
Spatial modeling in ecology: the flexibility of eigenfunction spatial analyses.
Griffith, Daniel A; Peres-Neto, Pedro R
2006-10-01
Recently, analytical approaches based on the eigenfunctions of spatial configuration matrices have been proposed in order to explicitly consider spatial predictors. The present study demonstrates the usefulness of eigenfunctions in spatial modeling applied to ecological problems and shows the equivalencies of and differences between the two current implementations of this methodology. The two approaches in this category are the distance-based (DB) eigenvector maps proposed by P. Legendre and his colleagues, and spatial filtering based upon geographic connectivity matrices (i.e., topology-based; CB) developed by D. A. Griffith and his colleagues. In both cases, the goal is to create spatial predictors that can be easily incorporated into conventional regression models. One important advantage of these two approaches over any other spatial approach is that they provide a flexible tool that allows the full range of general and generalized linear modeling theory to be applied to ecological and geographical problems in the presence of nonzero spatial autocorrelation.
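A minimal sketch of the topology-based (CB) construction, assuming a symmetric connectivity matrix; the distance-based (DB) variant differs mainly in how the spatial matrix is built.

```python
import numpy as np

def moran_eigenvector_maps(W, n_vectors=10):
    """Spatial eigenvector filters from a symmetric connectivity matrix W.

    Eigenvectors of (I - 11'/n) W (I - 11'/n) with large positive eigenvalues
    capture broad-scale positive spatial autocorrelation and can be added as
    predictors in ordinary (generalized) linear models.
    """
    n = W.shape[0]
    C = np.eye(n) - np.ones((n, n)) / n          # centering projector
    M = C @ W @ C
    vals, vecs = np.linalg.eigh(M)
    order = np.argsort(vals)[::-1]               # most positive eigenvalues first
    return vecs[:, order[:n_vectors]], vals[order[:n_vectors]]
```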
Generalized SMO algorithm for SVM-based multitask learning.
Cai, Feng; Cherkassky, Vladimir
2012-06-01
Exploiting additional information to improve traditional inductive learning is an active research area in machine learning. In many supervised-learning applications, training data can be naturally separated into several groups, and incorporating this group information into learning may improve generalization. Recently, Vapnik proposed a general approach to formalizing such problems, known as "learning with structured data" and its support vector machine (SVM) based optimization formulation called SVM+. Liang and Cherkassky showed the connection between SVM+ and multitask learning (MTL) approaches in machine learning, and proposed an SVM-based formulation for MTL called SVM+MTL for classification. Training the SVM+MTL classifier requires the solution of a large quadratic programming optimization problem which scales as O(n³) with sample size n. So there is a need to develop computationally efficient algorithms for implementing SVM+MTL. This brief generalizes Platt's sequential minimal optimization (SMO) algorithm to the SVM+MTL setting. Empirical results show that, for typical SVM+MTL problems, the proposed generalized SMO achieves over 100 times speed-up, in comparison with general-purpose optimization routines.
Writing in Chemistry: An Effective Learning Tool.
ERIC Educational Resources Information Center
Sherwood, Donna W.; Kovac, Jeffrey
1999-01-01
Presents some general strategies for using writing in chemistry courses based on experiences in developing a systematic approach to using writing as an effective learning tool in chemistry courses, and testing this approach in high-enrollment general chemistry courses at the University of Tennessee-Knoxville. Contains 18 references. (WRM)
A Graphical Approach to Teaching Amplifier Design at the Undergraduate Level
ERIC Educational Resources Information Center
Assaad, R. S.; Silva-Martinez, J.
2009-01-01
Current methods of teaching basic amplifier design at the undergraduate level need further development to match today's technological advances. The general class approach to amplifier design is analytical and heavily based on mathematical manipulations. However, students' mathematical abilities are generally modest, creating a void in which…
Deductive Error Diagnosis and Inductive Error Generalization for Intelligent Tutoring Systems.
ERIC Educational Resources Information Center
Hoppe, H. Ulrich
1994-01-01
Examines the deductive approach to error diagnosis for intelligent tutoring systems. Topics covered include the principles of the deductive approach to diagnosis; domain-specific heuristics to solve the problem of generalizing error patterns; and deductive diagnosis and the hypertext-based learning environment. (Contains 26 references.) (JLB)
Purely numerical approach for analyzing flow to a well intercepting a vertical fracture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Narasimhan, T.N.; Palen, W.A.
1979-03-01
A numerical method, based on an integral finite difference approach, is presented for investigating wells intercepting fractures in general and vertical fractures in particular. Such features as finite conductivity, wellbore storage, damage, and fracture deformability and its influence on permeability are easily handled. The advantage of the numerical approach is that it is based on fewer assumptions than analytic solutions and hence has greater generality. Illustrative examples are given to validate the method against known solutions. New results are presented to demonstrate the applicability of the method to problems not apparently considered in the literature so far.
Different perspectives on economic base.
Lisa K. Crone; Richard W. Haynes; Nicholas E. Reyna
1999-01-01
Two general approaches for measuring the economic base are discussed. Each method is used to define the economic base for each of the counties included in the Interior Columbia Basin Ecosystem Management Project area. A more detailed look at four selected counties results in similar findings from different approaches. Limitations of economic base analysis also are...
A probability-based approach for assessment of roadway safety hardware.
DOT National Transportation Integrated Search
2017-03-14
This report presents a general probability-based approach for assessment of roadway safety hardware (RSH). It was achieved using a reliability : analysis method and computational techniques. With the development of high-fidelity finite element (FE) m...
Oreshkov, Ognyan; Calsamiglia, John
2010-07-30
We propose a theory of adiabaticity in quantum Markovian dynamics based on a decomposition of the Hilbert space induced by the asymptotic behavior of the Lindblad semigroup. A central idea of our approach is that the natural generalization of the concept of eigenspace of the Hamiltonian in the case of Markovian dynamics is a noiseless subsystem with a minimal noisy cofactor. Unlike previous attempts to define adiabaticity for open systems, our approach deals exclusively with physical entities and provides a simple, intuitive picture at the Hilbert-space level, linking the notion of adiabaticity to the theory of noiseless subsystems. As two applications of our theory, we propose a general framework for decoherence-assisted computation in noiseless codes and a dissipation-driven approach to holonomic computation based on adiabatic dragging of subsystems that is generally not achievable by nondissipative means.
ERIC Educational Resources Information Center
Somba, Anne W.; Obura, Ger; Njuguna, Margaret; Itevete, Boniface; Mulwa, Jones; Wandera, Nooh
2015-01-01
The importance of writing skills in enhancing student performance in language exams and even other subject areas is widely acknowledged. At Jaffery secondary, the approach to the teaching of writing has generally been to use three approaches: product-based approach with focus on what the students composed; process-based approach that is focused on…
Rational-operator-based depth-from-defocus approach to scene reconstruction.
Li, Ang; Staunton, Richard; Tjahjadi, Tardi
2013-09-01
This paper presents a rational-operator-based approach to depth from defocus (DfD) for the reconstruction of three-dimensional scenes from two-dimensional images, which enables fast DfD computation that is independent of scene textures. Two variants of the approach, one using the Gaussian rational operators (ROs) that are based on the Gaussian point spread function (PSF) and the second based on the generalized Gaussian PSF, are considered. A novel DfD correction method is also presented to further improve the performance of the approach. Experimental results are considered for real scenes and show that both approaches outperform existing RO-based methods.
NASA Astrophysics Data System (ADS)
Kang, Jingoo; Keinonen, Tuula
2017-04-01
Since students have lost their interest in school science, several student-centered approaches, such as using topics that are relevant for students, inquiry-based learning, and discussion-based learning have been implemented to attract pupils into science. However, the effect of these approaches was usually measured in small-scale research, and thus, the large-scale evidence supporting student-centered approaches in general use is insufficient. Accordingly, this study aimed to investigate the effect of student-centered approaches on students' interest and achievement by analyzing a large-scale data set derived from Program for International Student Assessment (PISA) 2006, to add evidence for advocating these approaches in school science, and to generalize the effects on a large population. We used Finnish PISA 2006 data, which is the most recent data that measures science literacy and that contains relevant variables for the constructs of this study. As a consequence of the factor analyses, four teaching methods were grouped as student-centered approaches (relevant topic-based, open and guided inquiry-based, and discussion-based approaches in school science) from the Finnish PISA 2006 sample. The structural equation modeling result indicated that using topics relevant for students positively affected students' interest and achievement in science. Guided inquiry-based learning was also indicated as a strong positive predictor for students' achievement, and its effect was also positively associated with students' interest. On the other hand, open inquiry-based learning was indicated as a strong negative predictor for students' achievement, as was using discussion in school science. Implications and limitations of the study were discussed.
Comparing Methods for UAV-Based Autonomous Surveillance
NASA Technical Reports Server (NTRS)
Freed, Michael; Harris, Robert; Shafto, Michael
2004-01-01
We describe an approach to evaluating algorithmic and human performance in directing UAV-based surveillance. Its key elements are a decision-theoretic framework for measuring the utility of a surveillance schedule and an evaluation testbed consisting of 243 scenarios covering a well-defined space of possible missions. We apply this approach to two example UAV-based surveillance methods, a TSP-based algorithm and a human-directed approach, then compare them to identify general strengths and weaknesses of each method.
Collaborative Core Research Program for Chemical-Biological Warfare Defense
2015-01-04
Discovery through High Throughput Screening (HTS) and Fragment-Based Drug Design (FBDD). Current pharmaceutical approaches involving drug discovery…structural analysis and docking programs generally known as fragment-based drug design (FBDD). The main advantage of using these approaches is that…
Market-based approaches to tree valuation
Geoffrey H. Donovan; David T. Butry
2008-01-01
A recent four-part series in Arborist News outlined different appraisal processes used to value urban trees. The final article in the series described the three generally accepted approaches to tree valuation: the sales comparison approach, the cost approach, and the income capitalization approach. The author, D. Logan Nelson, noted that the sales comparison approach...
A Hybrid Generalized Hidden Markov Model-Based Condition Monitoring Approach for Rolling Bearings
Liu, Jie; Hu, Youmin; Wu, Bo; Wang, Yan; Xie, Fengyun
2017-01-01
The operating condition of rolling bearings affects productivity and quality in the rotating machine process. Developing an effective rolling bearing condition monitoring approach is critical to accurately identifying the operating condition. In this paper, a hybrid generalized hidden Markov model-based condition monitoring approach for rolling bearings is proposed, where interval-valued features are used to efficiently recognize and classify machine states in the machine process. In the proposed method, vibration signals are decomposed into multiple modes with variational mode decomposition (VMD). Parameters of the VMD, in the form of generalized intervals, provide a concise representation of aleatory and epistemic uncertainty and improve the robustness of identification. The multi-scale permutation entropy method is applied to extract state features from the decomposed signals in different operating conditions. Traditional principal component analysis is adopted to reduce feature size and computational cost. With the extracted feature information, the generalized hidden Markov model, based on generalized interval probability, is used to recognize and classify the fault types and fault severity levels. Finally, the experimental results show that the proposed method is effective at recognizing and classifying the fault types and fault severity levels of rolling bearings. This monitoring method is also efficient enough to quantify the two uncertainty components. PMID:28524088
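A minimal sketch of the multi-scale permutation entropy features described above; the VMD decomposition, PCA reduction, and generalized HMM stages are not reproduced, and the parameter choices are illustrative.

```python
import numpy as np
from itertools import permutations
from math import factorial, log

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of a 1-D signal (0 = regular, 1 = random)."""
    counts = {p: 0 for p in permutations(range(order))}
    n = len(x) - (order - 1) * delay
    for i in range(n):
        window = x[i:i + order * delay:delay]
        counts[tuple(np.argsort(window))] += 1     # ordinal pattern of the window
    probs = np.array([c for c in counts.values() if c > 0], dtype=float) / n
    return -np.sum(probs * np.log(probs)) / log(factorial(order))

def multiscale_pe(x, scales=(1, 2, 3, 4, 5), order=3):
    """Coarse-grain by non-overlapping averaging, then PE per scale."""
    feats = []
    for s in scales:
        m = len(x) // s
        coarse = np.asarray(x[:m * s]).reshape(m, s).mean(axis=1)
        feats.append(permutation_entropy(coarse, order))
    return np.array(feats)
```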
Using the Logarithmic Concentration Diagram, Log "C", to Teach Acid-Base Equilibrium
ERIC Educational Resources Information Center
Kovac, Jeffrey
2012-01-01
Acid-base equilibrium is one of the most important and most challenging topics in a typical general chemistry course. This article introduces an alternative to the algebraic approach generally used in textbooks, the graphical log "C" method. Log "C" diagrams provide conceptual insight into the behavior of aqueous acid-base systems and allow…
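The essence of the log C construction for a monoprotic weak acid, written as the standard textbook asymptotes (a sketch for orientation, not the article's own figures):

```latex
% For a weak acid HA with total concentration C_T:
[\mathrm{HA}] = \frac{C_T\,[\mathrm{H^+}]}{[\mathrm{H^+}] + K_a},\qquad
[\mathrm{A^-}] = \frac{C_T\,K_a}{[\mathrm{H^+}] + K_a}.

% On a log C vs. pH diagram these become straight-line asymptotes:
\mathrm{pH} \ll \mathrm{p}K_a:\quad \log[\mathrm{HA}] \approx \log C_T,\qquad
\log[\mathrm{A^-}] \approx \log C_T - \mathrm{p}K_a + \mathrm{pH};

\mathrm{pH} \gg \mathrm{p}K_a:\quad \log[\mathrm{A^-}] \approx \log C_T,\qquad
\log[\mathrm{HA}] \approx \log C_T + \mathrm{p}K_a - \mathrm{pH}.
```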
Continuity-based model interfacing for plant-wide simulation: a general approach.
Volcke, Eveline I P; van Loosdrecht, Mark C M; Vanrolleghem, Peter A
2006-08-01
In plant-wide simulation studies of wastewater treatment facilities, existing models of different origins often need to be coupled. However, as these submodels are likely to contain different state variables, their coupling is not straightforward. The continuity-based interfacing method (CBIM) provides a general framework for constructing model interfaces for models of wastewater systems, taking conservation principles into account. In this contribution, the CBIM approach is applied to study the effect of sludge digestion reject water treatment with a SHARON-Anammox process on a plant-wide scale. Separate models were available for the SHARON process and for the Anammox process. The Benchmark simulation model no. 2 (BSM2) is used to simulate the behaviour of the complete WWTP including sludge digestion. The CBIM approach is followed to develop three different model interfaces. At the same time, the generally applicable CBIM approach was further refined, and particular issues arising when coupling models in which pH is considered a state variable are pointed out.
A Fiducial Approach to Extremes and Multiple Comparisons
ERIC Educational Resources Information Center
Wandler, Damian V.
2010-01-01
Generalized fiducial inference is a powerful tool for many difficult problems. Based on an extension of R. A. Fisher's work, we used generalized fiducial inference for two extreme value problems and a multiple comparison procedure. The first extreme value problem deals with the generalized Pareto distribution. The generalized Pareto…
Hong, X; Harris, C J
2000-01-01
This paper introduces a new neurofuzzy model construction algorithm for nonlinear dynamic systems based upon basis functions that are Bézier-Bernstein polynomial functions. The approach is general in that it copes with n-dimensional inputs by utilising an additive decomposition construction to overcome the curse of dimensionality associated with high n. This new construction algorithm also introduces univariate Bézier-Bernstein polynomial functions for the completeness of the generalized procedure. Like B-spline expansion based neurofuzzy systems, Bézier-Bernstein polynomial function based neurofuzzy networks hold desirable properties such as nonnegativity of the basis functions, unity of support, and interpretability of basis functions as fuzzy membership functions, with the additional advantages of structural parsimony and Delaunay input space partition, essentially overcoming the curse of dimensionality associated with conventional fuzzy and RBF networks. This new modeling network is based on an additive decomposition approach together with two separate basis function formation approaches for the univariate and bivariate Bézier-Bernstein polynomial functions used in model construction. The overall network weights are then learnt using conventional least squares methods. Numerical examples are included to demonstrate the effectiveness of this new data-based modeling approach.
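For reference, the univariate Bernstein basis the network builds on, with the two properties (nonnegativity and partition of unity) that license the fuzzy-membership interpretation:

```latex
% Univariate Bernstein basis of degree n on [0,1]:
B_{i,n}(x) = \binom{n}{i}\, x^{i}(1-x)^{\,n-i}, \qquad i = 0,\dots,n,

% with, for all x \in [0,1]:
B_{i,n}(x) \ge 0 \quad\text{and}\quad \sum_{i=0}^{n} B_{i,n}(x) = 1.
```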
Algebraic features of some generalizations of the Lotka-Volterra system
NASA Astrophysics Data System (ADS)
Bibik, Yu. V.; Sarancha, D. A.
2010-10-01
For generalizations of the Lotka-Volterra system, an integration method is proposed based on the nontrivial algebraic structure of these generalizations. The method makes use of an auxiliary first-order differential equation derived from the phase curve equation with the help of this algebraic structure. Based on this equation, a Hamiltonian approach can be developed and canonical variables (moreover, action-angle variables) can be constructed.
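For the classical Lotka-Volterra case, the phase-curve equation yields the first integral that the algebraic machinery generalizes (a standard result, shown for orientation):

```latex
% Classical Lotka--Volterra system:
\dot{x} = x(\alpha - \beta y), \qquad \dot{y} = y(\delta x - \gamma).

% Integrating the phase-curve equation dy/dx gives the conserved quantity
V(x, y) = \delta x - \gamma \ln x + \beta y - \alpha \ln y = \text{const},
% which plays the role of the Hamiltonian after a change of variables
% (e.g.\ p = \ln x,\ q = \ln y).
```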
Student Perceptions of a Form-Based Approach to Reflective Journaling
ERIC Educational Resources Information Center
Mabrouk, Patricia Ann
2015-01-01
The author describes the principal findings of a survey study looking at student perceptions of a new form-based approach to reflective journaling. A form-based journal assignment was developed for use in introductory lecture courses and tested over a two-year period in an Honors General Chemistry course for engineers with a total of 157…
Physics-Based Stimulation for Night Vision Goggle Simulation
2006-11-01
a CRT display system can produce a darker black level than displays based on digital light processing (DLP) or liquid crystal technologies. It should… The general form of the bucket equation for any gun (color) n is p_n = f((r_n − MnR_n) / (MxR_n − MnR_n)) (3), where MnR_n and MxR_n denote the gun's minimum and maximum radiances. …To simulate the rendering approach, we began by testing the bucket rendering approach already utilized by SensorHost, which applies the same bucket equation (10).
NASA Astrophysics Data System (ADS)
Colaninno, Nicola; Marambio Castillo, Alejandro; Roca Cladera, Josep
2017-10-01
The demand for remotely sensed data is growing, due to the possibility of managing information about huge geographic areas in digital format, at different time periods, in a form suitable for analysis in GIS platforms. However, primary satellite information is not as immediately usable as desirable. Besides geometric and atmospheric limitations, clouds, cloud shadows, and haze generally contaminate optical images. In terms of land cover, such contamination is treated as missing information and should be replaced. Generally, image reconstruction is classified according to three main approaches: in-painting-based, multispectral-based, and multitemporal-based methods. This work relies on a multitemporal approach to retrieve uncontaminated pixels for an image scene. We explore an automatic method for quickly obtaining daytime cloudless and shadow-free images at moderate spatial resolution for large geographical areas. The process involves two main steps: a multitemporal effect adjustment to avoid significant seasonal variations, and a data reconstruction phase based on automatic selection of uncontaminated pixels from an image stack. The result is a composite image based on the middle values of the stack over a year. The assumption is that, for specific purposes, land cover changes at a coarse scale are not significant over relatively short time periods. Because satellite imagery over tropical areas is widely recognized to be strongly affected by clouds, the methodology is tested on the case study of the Dominican Republic for the year 2015, using Landsat 8 imagery.
A Survey of Model Evaluation Approaches with a Tutorial on Hierarchical Bayesian Methods
ERIC Educational Resources Information Center
Shiffrin, Richard M.; Lee, Michael D.; Kim, Woojae; Wagenmakers, Eric-Jan
2008-01-01
This article reviews current methods for evaluating models in the cognitive sciences, including theoretically based approaches, such as Bayes factors and minimum description length measures; simulation approaches, including model mimicry evaluations; and practical approaches, such as validation and generalization measures. This article argues…
A Principles-Based Approach to Teaching International Financial Reporting Standards (IFRS)
ERIC Educational Resources Information Center
Persons, Obeua
2014-01-01
This article discusses the principles-based approach that emphasizes a "why" question by using the International Accounting Standards Board (IASB) "Conceptual Framework for Financial Reporting" to question and understand the basis for specific differences between IFRS and U.S. generally accepted accounting principles (U.S.…
A New Approach for Solving the Generalized Traveling Salesman Problem
NASA Astrophysics Data System (ADS)
Pop, P. C.; Matei, O.; Sabo, C.
The generalized traveling salesman problem (GTSP) is an extension of the classical traveling salesman problem. The GTSP is known to be NP-hard and has many interesting applications. In this paper we present a local-global approach for the generalized traveling salesman problem. Based on this approach we describe a novel hybrid metaheuristic algorithm for solving the problem using genetic algorithms. Computational results are reported for Euclidean TSPlib instances and compared with existing ones. The obtained results indicate that our hybrid algorithm is an appropriate method to explore the search space of this complex problem and leads to good solutions in a reasonable amount of time.
Generalized graph states based on Hadamard matrices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Shawn X.; Yu, Nengkun; Department of Mathematics and Statistics, University of Guelph, Guelph, Ontario N1G 2W1
2015-07-15
Graph states are widely used in quantum information theory, including entanglement theory, quantum error correction, and one-way quantum computing. Graph states have a nice structure related to a certain graph, which is given by either a stabilizer group or an encoding circuit, both of which can be directly derived from the graph. To generalize graph states, whose stabilizer groups are abelian subgroups of the Pauli group, one approach is to study non-abelian stabilizers. In this work, we propose to generalize graph states based on the encoding circuit, which is completely determined by the graph and a Hadamard matrix. We study the entanglement structures of these generalized graph states and show that they are all maximally mixed locally. We also explore the relationship between the equivalence of Hadamard matrices and local equivalence of the corresponding generalized graph states. This leads to a natural generalization of the Pauli (X, Z) pairs, which characterizes the local symmetries of these generalized graph states. Our approach is also naturally generalized to construct graph quantum codes which are beyond stabilizer codes.
Leung, Chung-Chu
2006-03-01
Digital subtraction radiography requires close matching of the contrast in each pair of X-ray images to be subtracted. Previous studies have shown that nonparametric contrast/brightness correction methods using the cumulative density function (CDF) and its improvements, which are based on gray-level transformations associated with the pixel histogram, perform well under uniform contrast/brightness difference conditions. However, for radiographs with nonuniform contrast/brightness, the CDF produces unsatisfactory results. In this paper, we propose a new approach to contrast correction based on a generalized fuzzy operator (GFO) with a least squares method. The results show that 50% of the contrast/brightness error can be corrected using this approach when the contrast/brightness difference between a radiographic pair is 10 U. A comparison of our approach with the CDF is presented, and the modified GFO method produces better contrast normalization results than the CDF approach.
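For contrast, a minimal sketch of the baseline CDF correction that the GFO method improves upon, implemented as classic histogram matching; the GFO/least-squares stage itself is not reproduced here.

```python
import numpy as np

def cdf_match(source, reference):
    """Map source gray levels so their CDF matches the reference image's CDF
    (classic histogram matching; this is the regime where CDF-based
    correction performs well, i.e., globally uniform contrast/brightness
    differences)."""
    s_vals, s_counts = np.unique(source.ravel(), return_counts=True)
    r_vals, r_counts = np.unique(reference.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / source.size
    r_cdf = np.cumsum(r_counts) / reference.size
    mapped = np.interp(s_cdf, r_cdf, r_vals)          # gray-level transform
    return np.interp(source.ravel(), s_vals, mapped).reshape(source.shape)
```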
Vibrational multiconfiguration self-consistent field theory: implementation and test calculations.
Heislbetz, Sandra; Rauhut, Guntram
2010-03-28
A state-specific vibrational multiconfiguration self-consistent field (VMCSCF) approach based on a multimode expansion of the potential energy surface is presented for the accurate calculation of anharmonic vibrational spectra. As a special case of this general approach vibrational complete active space self-consistent field calculations will be discussed. The latter method shows better convergence than the general VMCSCF approach and must be considered the preferred choice within the multiconfigurational framework. Benchmark calculations are provided for a small set of test molecules.
Wu, Jia-ting; Wang, Jian-qiang; Wang, Jing; Zhang, Hong-yu; Chen, Xiao-hong
2014-01-01
Based on linguistic term sets and hesitant fuzzy sets, the concept of hesitant fuzzy linguistic sets was introduced. The focus of this paper is the multicriteria decision-making (MCDM) problems in which the criteria are in different priority levels and the criteria values take the form of hesitant fuzzy linguistic numbers (HFLNs). A new approach to solving these problems is proposed, which is based on the generalized prioritized aggregation operator of HFLNs. Firstly, the new operations and comparison method for HFLNs are provided and some linguistic scale functions are applied. Subsequently, two prioritized aggregation operators and a generalized prioritized aggregation operator of HFLNs are developed and applied to MCDM problems. Finally, an illustrative example is given to illustrate the effectiveness and feasibility of the proposed method, which are then compared to the existing approach.
ERIC Educational Resources Information Center
Yen, Cheng-Huang; Chen, I-Chuan; Lai, Su-Chun; Chuang, Yea-Ru
2015-01-01
Traces of learning behaviors generally provide insights into learners and the learning processes that they employ. In this article, a learning-analytics-based approach is proposed for managing cognitive load by adjusting the instructional strategies used in online courses. The technology-based learning environment examined in this study involved a…
A novel feature extraction approach for microarray data based on multi-algorithm fusion
Jiang, Zhu; Xu, Rong
2015-01-01
Feature extraction is one of the most important and effective methods for reducing dimension in data mining, especially with the emergence of high-dimensional data such as microarray gene expression data. Feature extraction for gene selection mainly serves two purposes. One is to identify certain disease-related genes. The other is to find a compact set of discriminative genes to build a pattern classifier with reduced complexity and improved generalization capabilities. Depending on the purpose of gene selection, two types of feature extraction algorithms, ranking-based feature extraction and set-based feature extraction, are employed in microarray gene expression data analysis. In ranking-based feature extraction, features are evaluated on an individual basis, generally without considering the inter-relationships between features, while set-based feature extraction evaluates features based on their role in a feature set, taking into account dependency between features. Just as with learning methods, feature extraction has a problem in its generalization ability, namely robustness. However, the issue of robustness is often overlooked in feature extraction. In order to improve the accuracy and robustness of feature extraction for microarray data, a novel approach based on multi-algorithm fusion is proposed. By fusing different types of feature extraction algorithms to select features from the sample set, the proposed approach is able to improve feature extraction performance. The new approach is tested on gene expression datasets including Colon cancer, CNS, DLBCL, and Leukemia data. The testing results show that the performance of this algorithm is better than that of existing solutions. PMID:25780277
A personalized health-monitoring system for elderly by combining rules and case-based reasoning.
Ahmed, Mobyen Uddin
2015-01-01
Health-monitoring systems for the elderly in home environments are a promising way to provide efficient medical services and are of increasing interest to researchers in this area. The challenge is often greater when the system is self-served and functions as a personalized provision. This paper proposes a personalized, self-served health-monitoring system for the elderly in a home environment that combines general rules with a case-based reasoning approach. The system generates feedback, recommendations, and alarms in a personalized manner based on the elderly person's medical information and health parameters such as blood pressure, blood glucose, weight, activity, pulse, etc. A set of general rules is used to classify individual health parameters. The case-based reasoning approach is used to combine all the different health parameters, generating an overall classification of health condition. According to the evaluation results, considering 323 cases and k=2, i.e., the top 2 most similar retrieved cases, the sensitivity, specificity, and overall accuracy achieved are 90%, 97%, and 96%, respectively. The preliminary results of the system are acceptable, since the feedback, recommendation, and alarm messages are personalized and differ from general messages. Thus, this approach could possibly be adapted to other situations in personalized elderly monitoring.
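A minimal sketch of the rules-plus-retrieval combination described above; the thresholds, parameter names, and distance metric are hypothetical placeholders, not the paper's clinical rules.

```python
import numpy as np

RULES = {  # hypothetical thresholds, for illustration only
    "systolic_bp": [(140, "high"), (90, "normal"), (0, "low")],
    "glucose":     [(11.1, "high"), (3.9, "normal"), (0, "low")],
}

def classify(params):
    """Apply general rules to each health parameter (first matching band wins)."""
    out = {}
    for name, value in params.items():
        for threshold, label in RULES[name]:
            if value >= threshold:
                out[name] = label
                break
    return out

def retrieve(case_base, new_case, k=2):
    """Return the k most similar past cases by Euclidean distance over the
    raw parameter vectors; their outcomes would drive the overall
    classification, feedback, and alarms."""
    keys = sorted(new_case)
    X = np.array([[c["params"][key] for key in keys] for c in case_base], float)
    q = np.array([new_case[key] for key in keys], float)
    idx = np.argsort(np.linalg.norm(X - q, axis=1))[:k]
    return [case_base[i] for i in idx]
```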
Guaranteed cost control of polynomial fuzzy systems via a sum of squares approach.
Tanaka, Kazuo; Ohtake, Hiroshi; Wang, Hua O
2009-04-01
This paper presents the guaranteed cost control of polynomial fuzzy systems via a sum of squares (SOS) approach. First, we present a polynomial fuzzy model and controller that are more general representations of the well-known Takagi-Sugeno (T-S) fuzzy model and controller, respectively. Second, we derive a guaranteed cost control design condition based on polynomial Lyapunov functions. Hence, the design approach discussed in this paper is more general than the existing LMI approaches (to T-S fuzzy control system designs) based on quadratic Lyapunov functions. The design condition realizes a guaranteed cost control by minimizing the upper bound of a given performance function. In addition, the design condition in the proposed approach can be represented in terms of SOS and is numerically (partially symbolically) solved via the recently developed SOSTOOLS. To illustrate the validity of the design approach, two design examples are provided. The first example deals with a complicated nonlinear system. The second example presents micro helicopter control. Both examples show that our approach provides more extensive design results than the existing LMI approach.
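For orientation, the generic guaranteed-cost structure can be written as follows; this is schematic notation, and the paper's actual SOS conditions for the polynomial fuzzy system are more involved:

```latex
% Schematic guaranteed-cost setup (generic notation, not the paper's
% exact SOS conditions): a polynomial Lyapunov function V(x) bounds
% the quadratic performance index.
\begin{align}
  J &= \int_0^\infty \bigl( x^\top Q\, x + u^\top R\, u \bigr)\,dt
      \;\le\; V\bigl(x(0)\bigr), \\
  \text{s.t.}\quad & V(x) - \varepsilon\, x^\top x \ \text{is SOS}, \qquad
  -\Bigl( \dot V(x) + x^\top Q\, x + u^\top R\, u \Bigr) \ \text{is SOS}.
\end{align}
```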
Priority setting in health care: trends and models from Scandinavian experiences.
Hofmann, Bjørn
2013-08-01
The Scandinavian welfare states have public health care systems which have universal coverage and traditionally low influence of private insurance and private provision. Due to rising costs, elaborate public control of health care, and significant technological development in health care, priority setting came onto the public agenda comparatively early in the Scandinavian countries. The development of health care priority setting has been partly homogeneous and appears to follow certain phases. This can be of broader interest as it may shed light on alternative models and strategies in health care priority setting. Some general trends have been identified: from principles to procedures, from closed to open processes, and from experts to participation. Five general approaches have been recognized: the moral principles and values based approach, the moral principles and economic assessment approach, the procedural approach, the expert based practice defining approach, and the participatory practice defining approach. There are pros and cons with all of these approaches. For the time being the fifth approach appears attractive, but its lack of true participation and the lack of clear success criteria may pose significant challenges in the future.
Activity-Based Approach for Teaching Aqueous Solubility, Energy, and Entropy
ERIC Educational Resources Information Center
Eisen, Laura; Marano, Nadia; Glazier, Samantha
2014-01-01
We describe an activity-based approach for teaching aqueous solubility to introductory chemistry students that provides a more balanced presentation of the roles of energy and entropy in dissolution than is found in most general chemistry textbooks. In the first few activities, students observe that polar substances dissolve in water, whereas…
Content Based Image Retrieval and Information Theory: A General Approach.
ERIC Educational Resources Information Center
Zachary, John; Iyengar, S. S.; Barhen, Jacob
2001-01-01
Proposes an alternative real valued representation of color based on the information theoretic concept of entropy. A theoretical presentation of image entropy is accompanied by a practical description of the merits and limitations of image entropy compared to color histograms. Results suggest that image entropy is a promising approach to image…
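The following Python sketch, using the usual Shannon-entropy definition, shows how an image's histogram collapses to a single entropy value; it illustrates the general idea rather than the authors' exact representation:

```python
# Minimal sketch of image entropy versus a color histogram, in the
# spirit of the entropy-based representation the abstract describes.
import numpy as np

def image_entropy(img, bins=256):
    """Shannon entropy (bits) of an image's intensity histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]                      # ignore empty bins
    return float(-(p * np.log2(p)).sum())

# A color histogram keeps the full distribution (high-dimensional);
# entropy compresses it to one real value per channel or image.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64))
print(image_entropy(img))             # close to 8 bits for uniform noise
```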
Expanding Omani Learners' Horizons through Project-Based Learning: A Case Study
ERIC Educational Resources Information Center
Dauletova, Victoria
2014-01-01
As a relatively innovative teaching/learning approach in the Arabian Gulf region, in general, and in Oman, in particular, project-based learning requires progressive amendments and adaptations to the national culture of the learner. This article offers analysis of the current state of the approach in the local educational environment. Furthermore,…
ERIC Educational Resources Information Center
Ansary, Nadia S.; Elias, Maurice J.; Greene, Michael B.; Green, Stuart
2015-01-01
This article synthesizes the current research on bullying prevention and intervention in order to provide guidance to schools seeking to select and implement antibullying strategies. Evidence-based best practices that are shared across generally effective antibullying approaches are elucidated, and these strategies are grounded in examples…
NASA Technical Reports Server (NTRS)
Pahr, D. H.; Arnold, S. M.
2001-01-01
The paper begins with a short overview of recent work in the field of discontinuously reinforced composites, focusing on the different parameters which influence the material behavior of such composites, as well as the various analysis approaches undertaken. Based on this overview it became evident that, in order to investigate the enumerated effects in an efficient and comprehensive manner, an alternative to the computationally intensive finite-element based micromechanics approach is required. Therefore, an investigation is conducted to demonstrate the utility of the generalized method of cells (GMC), a semi-analytical micromechanics-based approach, for simulating the elastic and elastoplastic material behavior of aligned short fiber composites. The results are compared with (1) simulations using other micromechanics-based mean field models and finite element (FE) unit cell models found in the literature, given elastic material behavior, as well as (2) finite element unit cell and a new semianalytical elastoplastic shear lag model in the inelastic range. GMC is shown to have a definite window of applicability when simulating discontinuously reinforced composite material behavior.
Support-vector-based emergent self-organising approach for emotional understanding
NASA Astrophysics Data System (ADS)
Nguwi, Yok-Yen; Cho, Siu-Yeung
2010-12-01
This study discusses the computational analysis of general emotion understanding based on a questionnaire methodology. The questionnaire method approaches the subject by investigating the real experiences that accompanied the emotions, whereas laboratory approaches are generally associated with exaggerated elements. We adopted a connectionist model called the support-vector-based emergent self-organising map (SVESOM) to analyse emotion profiling from the questionnaire method. The SVESOM first identifies the important variables by giving discriminative features a high ranking. The classifier then performs the classification based on the selected features. Experimental results show that the top-ranked features are in line with the work of Scherer and Wallbott [(1994), 'Evidence for Universality and Cultural Variation of Differential Emotion Response Patterning', Journal of Personality and Social Psychology, 66, 310-328], which approached the emotions physiologically. While the performance measures show that using the full feature set for classification can degrade performance, the selected features provide superior results in terms of accuracy and generalisation.
Improved Shaping Approach to the Preliminary Design of Low-Thrust Trajectories
NASA Astrophysics Data System (ADS)
Novak, D. M.; Vasile, M.
2011-01-01
This paper presents a general framework for the development of shape-based approaches to low-thrust trajectory design. A novel shaping method, based on a three-dimensional description of the trajectory in spherical coordinates, is developed within this general framework. Both the exponential sinusoid and the inverse polynomial shaping are demonstrated to be particular two-dimensional cases of the spherical one. The pseudoequinoctial shaping is revisited within the new framework, and the nonosculating nature of the pseudoequinoctial elements is analyzed. A two-step approach is introduced to solve the time of flight constraint, related to the design of low-thrust arcs with boundary constraints for both spherical and pseudoequinoctial shaping. The solution derived from the shaping approach is improved with a feedback linear-quadratic controller and compared against a direct collocation method based on finite elements in time. The new shaping approach and the combination of shaping and linear-quadratic controller are tested on four case studies: a mission to Mars, a mission to asteroid 1989ML, a mission to comet Tempel-1, and a mission to Neptune.
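The two planar shapes recovered as special cases have standard forms in the low-thrust literature; written with generic parameter symbols (not necessarily the paper's notation):

```latex
% Standard planar shaping laws (generic parameters):
\begin{align}
  r(\theta) &= k_0 \exp\!\bigl[ k_1 \sin(k_2\theta + \phi) \bigr]
    && \text{(exponential sinusoid)} \\
  r(\theta) &= \frac{1}{a_0 + a_1\theta + a_2\theta^2 + \dots + a_n\theta^n}
    && \text{(inverse polynomial)}
\end{align}
```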
The Relevance of Arieti's Work in the Age of Medication.
Balbuena Rivera, Francisco
2016-09-01
This paper looks at the relevance of psychoanalysis as a treatment option for psychotic individuals at a time when psychosis is invariably considered to be a biologically-based brain disease, for which the preferred course of treatment is psychotropic medication. In recent years, the use of psychoanalysis has declined noticeably in favor of evidence-based biomedical approaches, which rely heavily upon statistical probabilities for ameliorating specific psychotic symptoms. Well-publicized biological approaches have proliferated, often to the detriment of the psychotic individual's general health, emotional recovery, and long-term rehabilitation. Sadly, these approaches may also be a significant factor in the reduced life expectancy of those suffering from psychosis, which is known to be about 25 years shorter, on average, than that of the general population.
Galbraith, Kevin; Ward, Alison; Heneghan, Carl
2017-05-03
Evidence-Based Medicine (EBM) skills have been included in general practice curricula and competency frameworks. However, GPs experience numerous barriers to developing and maintaining EBM skills, and some GPs feel the EBM movement misunderstands and threatens their traditional role. We therefore need a new approach that acknowledges the constraints encountered in real-world general practice. The aim of this study was to synthesise from empirical research a real-world EBM competency framework for general practice, which could be applied in training, in the individual pursuit of continuing professional development, and in routine care. We sought to integrate evidence from the literature with evidence derived from the opinions of experts in the fields of general practice and EBM. We synthesised two sets of themes describing the meaning of EBM in general practice. One set of themes was derived from a mixed-methods systematic review of the literature; the other set was derived from the further development of those themes using a Delphi process among a panel of EBM and general practice experts. From these two sets of themes we constructed a real-world EBM competency framework for general practice. A simple competency framework was constructed that acknowledges the constraints of real-world general practice: (1) mindfulness - in one's approach towards EBM itself, and to the influences on decision-making; (2) pragmatism - in one's approach to finding and evaluating evidence; and (3) knowledge of the patient - as the most useful resource in effective communication of evidence. We present a clinical scenario to illustrate how a GP might demonstrate these competencies in their routine daily work. We have proposed a real-world EBM competency framework for general practice, derived from empirical research, which acknowledges the constraints encountered in modern general practice. Further validation of these competencies is required, both as an educational resource and as a strategy for actual practice.
Balancing Generalization and Lexical Conservatism: An Artificial Language Study with Child Learners
ERIC Educational Resources Information Center
Wonnacott, Elizabeth
2011-01-01
Successful language acquisition involves generalization, but learners must balance this against the acquisition of lexical constraints. Such learning has been considered problematic for theories of acquisition: if learners generalize abstract patterns to new words, how do they learn lexically-based exceptions? One approach claims that learners use…
TOPICS IN THEORY OF GENERALIZED PARTON DISTRIBUTIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radyushkin, Anatoly V.
Several topics in the theory of generalized parton distributions (GPDs) are reviewed. First, we give a brief overview of the basics of the theory of generalized parton distributions and their relationship with simpler phenomenological functions, viz. form factors, parton densities and distribution amplitudes. Then, we discuss recent developments in building models for GPDs that are based on the formalism of double distributions (DDs). Particular attention is given to a careful analysis of the singularity structure of DDs. The DD formalism is applied to the construction of model GPDs with a singular Regge behavior. Within the developed DD-based approach, we discuss the structure of GPD sum rules. It is shown that separation of DDs into the so-called "plus" part and the $D$-term part may be treated as a renormalization procedure for the GPD sum rules. This approach is compared with an alternative prescription based on analytic regularization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Huiqiang; Wu, Xizeng, E-mail: xwu@uabmc.edu; Xiao, Tiqiao, E-mail: tqxiao@sinap.ac.cn
Purpose: Propagation-based phase-contrast CT (PPCT) utilizes highly sensitive phase-contrast technology applied to x-ray microtomography. Performing phase retrieval on the acquired angular projections can enhance image contrast and enable quantitative imaging. In this work, the authors demonstrate the validity and advantages of a novel technique for high-resolution PPCT by using the generalized phase-attenuation duality (PAD) method of phase retrieval. Methods: A high-resolution angular projection data set of a fish head specimen was acquired with a monochromatic 60-keV x-ray beam. In one approach, the projection data were directly used for tomographic reconstruction. In two other approaches, the projection data were preprocessed by phase retrieval based on either the linearized PAD method or the generalized PAD method. The reconstructed images from all three approaches were then compared in terms of tissue contrast-to-noise ratio and spatial resolution. Results: The authors' experimental results demonstrated the validity of the PPCT technique based on the generalized PAD-based method. In addition, the results show that the authors' technique is superior to the direct PPCT technique as well as the linearized PAD-based PPCT technique in terms of their relative capabilities for tissue discrimination and characterization. Conclusions: This novel PPCT technique demonstrates great potential for biomedical imaging, especially for applications that require high spatial resolution and limited radiation exposure.
NASA Astrophysics Data System (ADS)
Lachowicz, Mirosław
2016-03-01
The very stimulating paper [6] discusses an approach to perception and learning in a large population of living agents. The approach is based on a generalization of kinetic theory methods in which the interactions between agents are described in terms of game theory. Such an approach was already discussed in Refs. [2-4] (see also references therein) in various contexts. The processes of perception and learning are based on the interactions between agents, and therefore general kinetic theory is a suitable tool for modeling them. However, the main question that arises is how the perception and learning processes may be treated in mathematical modeling. How may we precisely deliver suitable mathematical structures that are able to capture the various aspects of perception and learning?
Management of venous leg ulcers in general practice - a practical guideline.
Sinha, Sankar; Sreedharan, Sadhishaan
2014-09-01
Chronic venous leg ulcers are the most common wounds seen in general practice. Their management can be both challenging and time-consuming. To produce a short practical guideline incorporating the TIME concept and A2BC2D approach to help general practitioners and their practice nurses in delivering evidence-based initial care to patients with chronic venous leg ulcers. Most chronic venous leg ulcers can be managed effectively in the general practice setting by following the simple, evidence-based approach described in this article. Figure 1 provides a flow chart to aid in this process. Figure 2 illustrates the principles of management in general practice. Effective management of chronic ulcers involves the assessment of both the ulcer and the patient. The essential requirements of management are to debride the ulcer with appropriate precautions, choose dressings that maintain adequate moisture balance, apply graduated compression bandage after evaluation of the arterial circulation and address the patient's concerns, such as pain and offensive wound discharge.
Improving software maintenance through measurement
NASA Technical Reports Server (NTRS)
Rombach, H. Dieter; Ulery, Bradford T.
1989-01-01
A practical approach to improving software maintenance through measurements is presented. This approach is based on general models for measurement and improvement. Both models, their integration, and practical guidelines for transferring them into industrial maintenance settings are presented. Several examples of applications of the approach to real-world maintenance environments are discussed.
Influence of a Scientific Approach-Based Learning Program on Students' Generic Science Skills
ERIC Educational Resources Information Center
Wahyuni, Ida; Amdani, Khairul
2016-01-01
This study aims to determine the influence of a scientific approach-based learning program (P2BPS) on students' generic science skills. The method used in this research is a "quasi experiment" with a "two-group pretest posttest" design. The population in this study comprised all students taking courses in general physics II at the…
ERIC Educational Resources Information Center
Rapp, Brenda; Miozzo, Michele
2011-01-01
The papers in this special issue of "Language and Cognitive Processing" on the neural bases of language production illustrate two general approaches in current cognitive neuroscience. One approach focuses on investigating cognitive issues, making use of the logic of associations/dissociations or the logic of neural markers as key investigative…
ERIC Educational Resources Information Center
Kaiser, Gabriele; Busse, Andreas; Hoth, Jessica; König, Johannes; Blömeke, Sigrid
2015-01-01
Research on the evaluation of the professional knowledge of mathematics teachers (comprising for example mathematical content knowledge, mathematics pedagogical content knowledge and general pedagogical knowledge) has become prominent in the last decade; however, the development of video-based assessment approaches is a more recent topic. This…
Pflueger, Marlon O; Franke, Irina; Graf, Marc; Hachtel, Henning
2015-03-29
Psychiatric expert opinions are supposed to assess the accused individual's risk of reoffending based on a valid scientific foundation. In contrast to specific recidivism, general recidivism has received little attention in Continental Europe; we therefore aimed to develop a valid instrument for assessing the risk of general criminal recidivism of mentally ill offenders. Data of 259 mentally ill offenders with a median time at risk of 107 months were analyzed and combined with the individuals' criminal records. We derived risk factors for general criminal recidivism and classified re-offences using a random forest approach. In our sample of mentally ill offenders, 51% were reconvicted. The most important predictive factors for general criminal recidivism were: number of prior convictions, age, type of index offence, diversity of criminal history, and substance abuse. With our statistical approach we were able to correctly identify 58-95% of all reoffenders and 65-97% of all committed offences (AUC = .90). Our study presents a new statistical approach to forensic-psychiatric risk assessment, allowing experts to evaluate the general risk of reoffending in mentally disordered individuals, with a special focus on high-risk groups. This approach might serve not only for expert opinions in court, but also for risk management strategies and therapeutic interventions.
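A hedged sketch of this style of analysis, using scikit-learn's random forest; the feature coding and labels below are synthetic placeholders, not the study data:

```python
# Illustrative random-forest risk model with synthetic data standing in
# for the predictors named in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 259
X = np.column_stack([
    rng.poisson(3, n),          # number of prior convictions
    rng.integers(18, 70, n),    # age
    rng.integers(0, 5, n),      # type of index offence (coded)
    rng.integers(0, 6, n),      # diversity of criminal history
    rng.integers(0, 2, n),      # substance abuse (yes/no)
])
y = rng.integers(0, 2, n)       # reconviction (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
print(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```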
NASA Astrophysics Data System (ADS)
Cave, Robert J.; Newton, Marshall D.
1997-06-01
Two independent methods are presented for the nonperturbative calculation of the electronic coupling matrix element (Hab) for electron transfer reactions using ab initio electronic structure theory. The first is based on the generalized Mulliken-Hush (GMH) model, a multistate generalization of the Mulliken-Hush formalism for the electronic coupling. The second is based on the block diagonalization (BD) approach of Cederbaum, Domcke, and co-workers. Detailed quantitative comparisons of the two methods are carried out based on results for (a) several states of the system Zn2OH2+ and (b) the low-lying states of the benzene-Cl atom complex and its contact ion pair. Generally good agreement between the two methods is obtained over a range of geometries. Either method can be applied at an arbitrary nuclear geometry and, as a result, may be used to test the validity of the Condon approximation. Examples of nonmonotonic behavior of the electronic coupling as a function of nuclear coordinates are observed for Zn2OH2+. Both methods also yield a natural definition of the effective distance (rDA) between donor (D) and acceptor (A) sites, in contrast to earlier approaches which required independent estimates of rDA, generally based on molecular structure data.
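For reference, the two-state form of the GMH coupling is commonly written as follows; the paper treats the general multistate case, and the notation here is the standard one:

```latex
% Two-state GMH coupling (standard form): \Delta E_{12} is the
% adiabatic energy gap, \mu_{12} the transition dipole moment, and
% \Delta\mu_{12} the adiabatic dipole moment difference along the
% charge-transfer direction.
\begin{equation}
  H_{ab} \;=\;
  \frac{\mu_{12}\,\Delta E_{12}}
       {\sqrt{(\Delta\mu_{12})^{2} + 4\,\mu_{12}^{2}}}
\end{equation}
```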
NASA Astrophysics Data System (ADS)
Plimak, L. I.; Fleischhauer, M.; Olsen, M. K.; Collett, M. J.
2003-01-01
We present an introduction to phase-space techniques (PST) based on a quantum-field-theoretical (QFT) approach. In addition to bridging the gap between PST and QFT, our approach results in a number of generalizations of the PST. First, for problems where the usual PST do not result in a genuine Fokker-Planck equation (even after phase-space doubling) and hence fail to produce a stochastic differential equation (SDE), we show how the system in question may be approximated via stochastic difference equations (SΔE). Second, we show that introducing sources into the SDE’s (or SΔE’s) generalizes them to a full quantum nonlinear stochastic response problem (thus generalizing Kubo’s linear reaction theory to a quantum nonlinear stochastic response theory). Third, we establish general relations linking quantum response properties of the system in question to averages of operator products ordered in a way different from time normal. This extends PST to a much wider assemblage of operator products than are usually considered in phase-space approaches. In all cases, our approach yields a very simple and straightforward way of deriving stochastic equations in phase space.
Simulation-based robust optimization for signal timing and setting.
DOT National Transportation Integrated Search
2009-12-30
The performance of signal timing plans obtained from traditional approaches for pre-timed (fixed-time or actuated) control systems is often unstable under fluctuating traffic conditions. This report develops a general approach for optimizing the ...
Lippert, Maria Laura; Kousgaard, Marius Brostrøm; Bjerrum, Lars
2014-11-30
Currently, there is a strong focus on the diffusion and implementation of indicator-based technologies for assessing and improving the quality of care in general practice. The aim of this study was to explore how and for what purposes indicator-based feedback is used by the general practitioners (GPs) and how they perceive it to contribute to their work. Qualitative interviews with nine GPs in two regions in Denmark. The main selection criterion was that the informants had experience with retrieving electronic feedback. The data generation was explorative and open-ended and the analysis took an iterative approach with continuous refinement of themes that emerged from the data. The study identified two main uses of feedback: i) Administration of a regular disease control schedule for patients with chronic disease and ii) Routine monitoring of outcomes for purposes of resource prioritisation and medication management. Both uses were deemed valuable by the GPs, but also as an additional extra to the clinical core task. All the GPs experienced the feedback to be of limited relevance to the most central and challenging aspects of clinical work understood as the care for individuals. This led to different reactions: Some GPs would use the feedback as a point of departure for broader deliberations about individual patient needs and treatment approaches. For others, the perceived limitations decreased their overall motivation to seek feedback. The study points to the importance of clarifying limitations as well as possibilities with respect to different aspects of clinical quality when introducing indicator-based technologies to practitioners. The results also emphasize that an indicator-based approach to quality improvement should not stand alone in general practice since some of the most central and challenging aspects of clinical work are not covered by this approach.
Bradford, Daniel E.; Starr, Mark J.; Shackman, Alexander J.
2015-01-01
Startle potentiation is a well-validated translational measure of negative affect. Startle potentiation is widely used in clinical and affective science, and there are multiple approaches for its quantification. The three most commonly used approaches quantify startle potentiation as the increase in startle response from a neutral to threat condition based on (1) raw potentiation, (2) standardized potentiation, or (3) percent-change potentiation. These three quantification approaches may yield qualitatively different conclusions about effects of independent variables (IVs) on affect when within- or between-group differences exist for startle response in the neutral condition. Accordingly, we directly compared these quantification approaches in a shock-threat task using four IVs known to influence startle response in the no-threat condition: probe intensity, time (i.e., habituation), alcohol administration, and individual differences in general startle reactivity measured at baseline. We confirmed the expected effects of time, alcohol, and general startle reactivity on affect using self-reported fear/anxiety as a criterion. The percent-change approach displayed apparent artifact across all four IVs, which raises substantial concerns about its validity. Both raw and standardized potentiation approaches were stable across probe intensity and time, which supports their validity. However, only raw potentiation displayed effects that were consistent with a priori specifications and/or the self-report criterion for the effects of alcohol and general startle reactivity. Supplemental analyses of reliability and validity for each approach provided additional evidence in support of raw potentiation. PMID:26372120
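In code form, the three quantification approaches compared above can be sketched as follows; the variable names and the particular standardization are illustrative assumptions, not the study's exact formulas:

```python
# The three quantification approaches, on toy data.
import numpy as np

neutral = np.array([48.0, 55.0, 60.0])   # startle magnitude, neutral
threat  = np.array([70.0, 81.0, 85.0])   # startle magnitude, threat

raw_potentiation = threat - neutral
standardized     = raw_potentiation / np.std(neutral, ddof=1)
percent_change   = 100.0 * raw_potentiation / neutral

# Percent-change divides by the neutral-condition response, so group or
# individual differences in neutral startle directly rescale the score,
# which is the kind of artifact the study reports.
print(raw_potentiation, standardized, percent_change)
```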
Different approach to the modeling of nonfree particle diffusion
NASA Astrophysics Data System (ADS)
Buhl, Niels
2018-03-01
A new approach to the modeling of nonfree particle diffusion is presented. The approach uses a general setup based on geometric graphs (networks of curves), which means that particle diffusion in anything from arrays of barriers and pore networks to general geometric domains can be considered and that the (free random walk) central limit theorem can be generalized to cover also the nonfree case. The latter gives rise to a continuum-limit description of the diffusive motion where the effect of partially absorbing barriers is accounted for in a natural and non-Markovian way that, in contrast to the traditional approach, quantifies the absorptivity of a barrier in terms of a dimensionless parameter in the range 0 to 1. The generalized theorem gives two general analytic expressions for the continuum-limit propagator: an infinite sum of Gaussians and an infinite sum of plane waves. These expressions entail the known method-of-images and Laplace eigenfunction expansions as special cases and show how the presence of partially absorbing barriers can lead to phenomena such as line splitting and band gap formation in the plane wave wave-number spectrum.
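For orientation, the classical one-dimensional limiting cases that the generalized theorem must recover are shown below; the paper's dimensionless absorptivity parameter interpolates between the two barrier cases:

```latex
% Classical 1D limiting cases (D = diffusion constant):
\begin{align}
  g(x,t) &= \frac{1}{\sqrt{4\pi D t}}\; e^{-x^{2}/(4Dt)}
    && \text{(free propagator)} \\
  G(x,t \mid x_0) &= g(x - x_0, t) \pm g(x + x_0, t)
    && \text{(barrier at } x = 0\text{; } +\text{: reflecting, } -\text{: absorbing)}
\end{align}
```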
ERIC Educational Resources Information Center
Boyd, Susan L.
2007-01-01
Several puzzles are designed to be used by chemistry students as learning tools and teach them basic chemical concepts. The topics of the puzzles are based on the chapters from Chemistry, The Central Science used in general chemistry course and the puzzles are in various forms like crosswords, word searches, number searches, puzzles based on…
ERIC Educational Resources Information Center
Elmelegy, Reda Ibrahim
2015-01-01
The current research aims at clarifying how school-based management (SBM) can contribute to achieve the decision-making quality in Egyptian general secondary schools and determine the requirements of quality decision-making. It depends on the descriptive method in order to acknowledge the basics of the SBM and its relationship with the quality of…
ERIC Educational Resources Information Center
Li, Deping; Oranje, Andreas
2007-01-01
Two versions of a general method for approximating standard error of regression effect estimates within an IRT-based latent regression model are compared. The general method is based on Binder's (1983) approach, accounting for complex samples and finite populations by Taylor series linearization. In contrast, the current National Assessment of…
The Estimation of Gestational Age at Birth in Database Studies.
Eberg, Maria; Platt, Robert W; Filion, Kristian B
2017-11-01
Studies on the safety of prenatal medication use require valid estimation of the pregnancy duration. However, gestational age is often incompletely recorded in administrative and clinical databases. Our objective was to compare different approaches to estimating the pregnancy duration. Using data from the Clinical Practice Research Datalink and Hospital Episode Statistics, we examined the following four approaches to estimating missing gestational age: (1) generalized estimating equations for longitudinal data; (2) multiple imputation; (3) estimation based on fetal birth weight and sex; and (4) conventional approaches that assigned a fixed value (39 weeks for all or 39 weeks for full term and 35 weeks for preterm). The gestational age recorded in Hospital Episode Statistics was considered the gold standard. We conducted a simulation study comparing the described approaches in terms of estimated bias and mean square error. A total of 25,929 infants from 22,774 mothers were included in our "gold standard" cohort. The smallest average absolute bias was observed for the generalized estimating equation that included birth weight, while the largest absolute bias occurred when assigning 39-week gestation to all those with missing values. The smallest mean square errors were detected with generalized estimating equations while multiple imputation had the highest mean square errors. The use of generalized estimating equations resulted in the most accurate estimation of missing gestational age when birth weight information was available. In the absence of birth weight, assignment of fixed gestational age based on term/preterm status may be the optimal approach.
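A minimal sketch of approach (1), using statsmodels' GEE with hypothetical column names (these are not the CPRD field names):

```python
# GEE imputation of missing gestational age, clustered on mother;
# column names are assumptions for illustration.
import statsmodels.formula.api as smf

def impute_gestational_age(df):
    """Fill missing gestational age via a GEE fit on complete records."""
    known = df.dropna(subset=["gest_age_weeks"])
    model = smf.gee("gest_age_weeks ~ birth_weight_g + C(sex)",
                    groups="mother_id", data=known).fit()
    missing = df["gest_age_weeks"].isna()
    df.loc[missing, "gest_age_weeks"] = model.predict(df.loc[missing])
    return df
```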
A Comparison of Two Approaches for Measuring Educational Growth from CTBS and P-ACT+ Scores.
ERIC Educational Resources Information Center
Noble, Julie; Sawyer, Richard
The purpose of the study was to compare two regression-based approaches for measuring educational effectiveness in Tennessee high schools: the mean residual approach (MR), and a more general linear models (LM) approach. Data were obtained from a sample of 1,011 students who were enrolled in 48 high schools, and who had taken the Comprehensive…
A quasiparticle-based multi-reference coupled-cluster method.
Rolik, Zoltán; Kállay, Mihály
2014-10-07
The purpose of this paper is to introduce a quasiparticle-based multi-reference coupled-cluster (MRCC) approach. The quasiparticles are introduced via a unitary transformation which allows us to represent a complete active space reference function and other elements of an orthonormal multi-reference (MR) basis in a determinant-like form. The quasiparticle creation and annihilation operators satisfy the fermion anti-commutation relations. On the basis of these quasiparticles, a generalization of the normal-ordered operator products for the MR case can be introduced as an alternative to the approach of Mukherjee and Kutzelnigg [Recent Prog. Many-Body Theor. 4, 127 (1995); Mukherjee and Kutzelnigg, J. Chem. Phys. 107, 432 (1997)]. Based on the new normal ordering any quasiparticle-based theory can be formulated using the well-known diagram techniques. Beyond the general quasiparticle framework we also present a possible realization of the unitary transformation. The suggested transformation has an exponential form where the parameters, holding exclusively active indices, are defined in a form similar to the wave operator of the unitary coupled-cluster approach. The definition of our quasiparticle-based MRCC approach strictly follows the form of the single-reference coupled-cluster method and retains several of its beneficial properties. Test results for small systems are presented using a pilot implementation of the new approach and compared to those obtained by other MR methods.
Phylogenetic diversity measures based on Hill numbers.
Chao, Anne; Chiu, Chun-Huo; Jost, Lou
2010-11-27
We propose a parametric class of phylogenetic diversity (PD) measures that are sensitive to both species abundance and species taxonomic or phylogenetic distances. This work extends the conventional parametric species-neutral approach (based on 'effective number of species' or Hill numbers) to take into account species relatedness, and also generalizes the traditional phylogenetic approach (based on 'total phylogenetic length') to incorporate species abundances. The proposed measure quantifies 'the mean effective number of species' over any time interval of interest, or the 'effective number of maximally distinct lineages' over that time interval. The product of the measure and the interval length quantifies the 'branch diversity' of the phylogenetic tree during that interval. The new measures generalize and unify many existing measures and lead to a natural definition of taxonomic diversity as a special case. The replication principle (or doubling property), an important requirement for species-neutral diversity, is generalized to PD. The widely used Rao's quadratic entropy and the phylogenetic entropy do not satisfy this essential property, but a simple transformation converts each to our measures, which do satisfy the property. The proposed approach is applied to forest data for interpreting the effects of thinning.
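A small Python sketch of the measure's algebraic form as we read it from the abstract, with the q = 1 limit handled explicitly; treat this as an illustration, not a reference implementation:

```python
# 'Mean effective number of species' over a time interval T, in the
# parametric Hill-number form: L[i] is the length of branch i within
# the interval, a[i] the total relative abundance descending from it.
import numpy as np

def phylogenetic_hill(L, a, T, q):
    L, a = np.asarray(L, float), np.asarray(a, float)
    w = L / T                              # branch weights
    if np.isclose(q, 1.0):                 # q = 1 via the limit
        return float(np.exp(-np.sum(w * a * np.log(a))))
    return float(np.sum(w * a ** q) ** (1.0 / (1.0 - q)))

# Branch diversity over the interval is T times this quantity.
print(phylogenetic_hill(L=[1.0, 0.5, 0.5], a=[1.0, 0.6, 0.4], T=1.0, q=2))
```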
Automating Media Centers and Small Libraries: A Microcomputer-Based Approach.
ERIC Educational Resources Information Center
Meghabghab, Dania Bilal
Although the general automation process can be applied to most libraries, small libraries and media centers require a customized approach. Using a systematic approach, this guide covers each step and aspect of automation in a small library setting, and combines the principles of automation with field- tested activities. After discussing needs…
An adhesive contact mechanics formulation based on atomistically induced surface traction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fan, Houfu; Ren, Bo; Li, Shaofan, E-mail: shaofan@berkeley.edu
2015-12-01
In this work, we have developed a novel multiscale computational contact formulation based on the generalized Derjaguin approximation for continua that are characterized by atomistically enriched constitutive relations in order to study macroscopic interaction between arbitrarily shaped deformable continua. The proposed adhesive contact formulation makes use of the microscopic interaction forces between individual particles in the interacting bodies. In particular, the double-layer volume integral describing the contact interaction (energy, force vector, matrix) is converted into a double-layer surface integral through a mathematically consistent approach that employs the divergence theorem and a special partitioning technique. The proposed contact model is formulated in the nonlinear continuum mechanics framework and implemented using the standard finite element method. With no large penalty constant, the stiffness matrix of the system will in general be well-conditioned, which is of great significance for quasi-static analysis. Three numerical examples are presented to illustrate the capability of the proposed method. Results indicate that with the same mesh configuration, the finite element computation based on the surface integral approach is faster and more accurate than the volume integral based approach. In addition, the proposed approach is energy preserving even in a very long dynamic simulation.
MacDonald, Donald D.; Dipinto, Lisa M.; Field, Jay; Ingersoll, Christopher G.; Long, Edward R.; Swartz, Richard C.
2000-01-01
Sediment-quality guidelines (SQGs) have been published for polychlorinated biphenyls (PCBs) using both empirical and theoretical approaches. Empirically based guidelines have been developed using the screening-level concentration, effects range, effects level, and apparent effects threshold approaches. Theoretically based guidelines have been developed using the equilibrium-partitioning approach. Empirically-based guidelines were classified into three general categories, in accordance with their original narrative intents, and used to develop three consensus-based sediment effect concentrations (SECs) for total PCBs (tPCBs), including a threshold effect concentration, a midrange effect concentration, and an extreme effect concentration. Consensus-based SECs were derived because they estimate the central tendency of the published SQGs and, thus, reconcile the guidance values that have been derived using various approaches. Initially, consensus-based SECs for tPCBs were developed separately for freshwater sediments and for marine and estuarine sediments. Because the respective SECs were statistically similar, the underlying SQGs were subsequently merged and used to formulate more generally applicable SECs. The three consensus-based SECs were then evaluated for reliability using matching sediment chemistry and toxicity data from field studies, dose-response data from spiked-sediment toxicity tests, and SQGs derived from the equilibrium-partitioning approach. The results of this evaluation demonstrated that the consensus-based SECs can accurately predict both the presence and absence of toxicity in field-collected sediments. Importantly, the incidence of toxicity increases incrementally with increasing concentrations of tPCBs. Moreover, the consensus-based SECs are comparable to the chronic toxicity thresholds that have been estimated from dose-response data and equilibrium-partitioning models. Therefore, consensus-based SECs provide a unifying synthesis of existing SQGs, reflect causal rather than correlative effects, and accurately predict sediment toxicity in PCB-contaminated sediments.
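The consensus values are central-tendency estimates of the published SQGs; a geometric mean, sketched below with hypothetical numbers, is one common way to compute such an estimate:

```python
# Central-tendency estimate of published guideline values; the
# geometric mean is a usual choice in this literature. Values are
# illustrative, not the paper's data.
import numpy as np

def consensus_sec(guideline_values):
    x = np.asarray(guideline_values, float)
    return float(np.exp(np.log(x).mean()))    # geometric mean

threshold_like_sqgs = [0.02, 0.05, 0.03, 0.07]  # hypothetical tPCB SQGs
print(consensus_sec(threshold_like_sqgs))
```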
Multi-Level Building Reconstruction for Automatic Enhancement of High Resolution Dsms
NASA Astrophysics Data System (ADS)
Arefi, H.; Reinartz, P.
2012-07-01
In this article a multi-level approach is proposed for the reconstruction-based improvement of high resolution Digital Surface Models (DSMs). The concept of Levels of Detail (LOD) defined by the CityGML standard has been taken as the basis for the abstraction levels of building roof structures. Here, LOD1 and LOD2, which correspond to prismatic and parametric roof shapes, are reconstructed. Besides proposing a new approach for automatic LOD1 and LOD2 generation from high resolution DSMs, the algorithm contains two generalization levels, namely horizontal and vertical. Both generalization levels are applied to the prismatic model of buildings. The horizontal generalization allows controlling the approximation level of building footprints, similar to the cartographic generalization concept for urban maps. In vertical generalization, the prismatic model is formed using an individual building height and is extended to include all flat structures located at different height levels. The concept of LOD1 generation is based on approximation of the building footprints by rectangular or non-rectangular polygons. For a rectangular building containing one main orientation, a method based on the Minimum Bounding Rectangle (MBR) is employed. In contrast, a Combined Minimum Bounding Rectangle (CMBR) approach is proposed for the regularization of non-rectilinear polygons, i.e., buildings without perpendicular edge directions. Both MBR- and CMBR-based approaches are iteratively employed on building segments to reduce the original building footprints to a minimum number of nodes with maximum similarity to the original shapes. A model-driven approach based on the analysis of the 3D points of DSMs in a 2D projection plane is proposed for LOD2 generation. Accordingly, a building block is divided into smaller parts according to the direction and number of existing ridge lines. The 3D model is derived for each building part and, finally, a complete parametric model is formed by merging all the 3D models of the individual parts and adjusting the nodes after the merging step. In order to provide an enhanced DSM, a surface model is generated for each building by interpolation of the internal points of the generated models. All interpolated models are placed on a Digital Terrain Model (DTM) of the corresponding area to shape the enhanced DSM. The proposed DSM enhancement approach has been tested on a dataset from the central area of Munich. The original DSM was created using robust stereo matching of WorldView-2 stereo images. A quantitative assessment of the new DSM, comparing the heights of ridges and eaves, shows a standard deviation of better than 50 cm.
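The MBR step can be sketched with shapely's rotated bounding rectangle, a simplified stand-in for the paper's iterative MBR/CMBR procedure:

```python
# Approximate a building segment's outline by its minimum rotated
# bounding rectangle; the CMBR handling of non-rectilinear outlines
# is not shown here.
from shapely.geometry import Polygon

footprint = Polygon([(0, 0), (10, 1), (11, 6), (1, 5)])  # noisy outline
mbr = footprint.minimum_rotated_rectangle
print(list(mbr.exterior.coords))  # corner coordinates (closed ring)
```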
Passive vibration control: a structure–immittance approach
Zhang, Sara Ying; Jiang, Jason Zheng; Neild, Simon A.
2017-01-01
Linear passive vibration absorbers, such as tuned mass dampers, often contain springs, dampers and masses, although recently there has been a growing trend to employ or supplement the mass elements with inerters. When considering possible configurations with these elements broadly, two approaches are normally used: one structure-based and one immittance-based. Both approaches have their advantages and disadvantages. In this paper, a new approach is proposed: the structure–immittance approach. Using this approach, a full set of possible series–parallel networks with predetermined numbers of each element type can be represented by structural immittances, obtained via a proposed general formulation process. Using the structural immittances, both the ability to investigate a class of absorber possibilities together (advantage of the immittance-based approach), and the ability to control the complexity, topology and element values in resulting absorber configurations (advantages of the structure-based approach) are provided at the same time. The advantages of the proposed approach are demonstrated through two case studies on building vibration suppression and automotive suspension design, respectively. PMID:28588407
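For concreteness, the element admittances that enter such network enumerations, and one simple parallel layout, are given below; these are standard forms from the inerter literature, while the paper's structural immittances parametrize whole classes of such networks:

```latex
% Element admittances (force/velocity) for spring k, damper c and
% inerter b, plus the simplest parallel layout as an example:
\begin{equation}
  Y_k(s) = \frac{k}{s}, \qquad
  Y_c(s) = c, \qquad
  Y_b(s) = b\,s, \qquad
  Y_{\mathrm{par}}(s) = b\,s + c + \frac{k}{s}.
\end{equation}
```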
A high speed model-based approach for wavefront sensorless adaptive optics systems
NASA Astrophysics Data System (ADS)
Lianghua, Wen; Yang, Ping; Shuai, Wang; Wenjing, Liu; Shanqiu, Chen; Xu, Bing
2018-02-01
To improve the temporal-frequency properties of wavefront sensorless adaptive optics (AO) systems, a fast general model-based aberration correction algorithm is presented. The fast general model-based approach is based on the approximately linear relation between the mean square of the aberration gradients and the second moment of the far-field intensity distribution. The presented model-based method is capable of effectively correcting a mode aberration by applying just one disturbance to the deformable mirror (one correction per disturbance); the mode is reconstructed by singular value decomposition of the correlation matrix of the Zernike functions' gradients. Numerical simulations of AO corrections under various random and dynamic aberrations are implemented. The simulation results indicate that the equivalent control bandwidth is 2-3 times that of the previous method, which achieves one aberration correction after applying N disturbances to the deformable mirror (one correction per N disturbances).
Sun, Xu; May, Andrew; Wang, Qingfeng
2016-05-01
This article describes an experimental study investigating the impact on user experience of two approaches to personalization of content provided on a mobile device, for spectators at large sports events. A lab-based experiment showed that a system-driven approach to personalization was generally preferable, but that there were advantages to retaining some user control over the process. Usability implications for a hybrid approach and design implications are discussed, with general support for countermeasures designed to overcome recognised limitations of adaptive systems. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Distorted Representations of the "Capability Approach" in Australian School Education
ERIC Educational Resources Information Center
Skourdoumbis, Andrew
2015-01-01
Recently, curriculum developments in Australia have seen the incorporation of functionalist "general capabilities" as essential markers of schooling, meaning that any pedagogical expression of classroom-based practice, including subsequent instruction, should entail the identification and development of operational general capabilities.…
Generalized Flip-Flop Input Equations Based on a Four-Valued Boolean Algebra
NASA Technical Reports Server (NTRS)
Tucker, Jerry H.; Tapia, Moiez A.
1996-01-01
A procedure is developed for obtaining generalized flip-flop input equations, and a concise method is presented for representing these equations. The procedure is based on solving a four-valued characteristic equation of the flip-flop, and can encompass flip-flops that are too complex to approach intuitively. The technique is presented using Karnaugh maps, but could easily be implemented in software.
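The classical two-valued special case helps fix ideas: the J-K excitation table with don't cares, from which input equations are read off. The paper's four-valued algebra handles this derivation symbolically and for flip-flops too complex to approach intuitively:

```python
# Illustrative derivation of J-K flip-flop inputs from the classical
# excitation table (the well-known special case, not the paper's
# four-valued procedure).
# Transition Q -> Q+ :   J  K
EXCITATION = {
    (0, 0): ("0", "x"),   # hold 0:  J=0, K=don't care
    (0, 1): ("1", "x"),   # set:     J=1, K=don't care
    (1, 0): ("x", "1"),   # reset:   J=don't care, K=1
    (1, 1): ("x", "0"),   # hold 1:  J=don't care, K=0
}

def jk_inputs(q, q_next):
    return EXCITATION[(q, q_next)]

# e.g. realizing Q+ = D (a D flip-flop from a J-K) yields J=D, K=D'
for q in (0, 1):
    for d in (0, 1):
        print(f"Q={q} D={d} -> J,K = {jk_inputs(q, d)}")
```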
NASA Technical Reports Server (NTRS)
Liou, J.; Tezduyar, T. E.
1990-01-01
Adaptive implicit-explicit (AIE), grouped element-by-element (GEBE), and generalized minimal residual (GMRES) solution techniques for incompressible flows are combined. In this approach, the GEBE and GMRES iteration methods are employed to solve the equation systems resulting from the implicitly treated elements, and therefore no direct solution effort is involved. The benchmarking results demonstrate that this approach can substantially reduce the CPU time and memory requirements in large-scale flow problems. Although the description of the concepts and the numerical demonstration are based on incompressible flows, the approach presented here is applicable to a larger class of problems in computational mechanics.
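A present-day analogue of the iterative solution step, using SciPy's GMRES on a sparse system; this is a runnable illustration, not the paper's GEBE implementation:

```python
# GMRES solve of a sparse tridiagonal system.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import gmres

n = 1000
A = sp.diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

x, info = gmres(A, b)              # info == 0 signals convergence
print(info, np.linalg.norm(A @ x - b))
```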
Model compilation: An approach to automated model derivation
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Baudin, Catherine; Iwasaki, Yumi; Nayak, Pandurang; Tanaka, Kazuo
1990-01-01
An approach is introduced to automated model derivation for knowledge based systems. The approach, model compilation, involves procedurally generating the set of domain models used by a knowledge based system. With an implemented example, it is illustrated how this approach can be used to derive models of different precision and abstraction, tailored to different tasks, from a given set of base domain models. In particular, two implemented model compilers are described, each of which takes as input a base model that describes the structure and behavior of a simple electromechanical device, the Reaction Wheel Assembly of NASA's Hubble Space Telescope. The compilers transform this relatively general base model into simple task-specific models for troubleshooting and redesign, respectively, by applying a sequence of model transformations. Each transformation in this sequence produces an increasingly more specialized model. The compilation approach lessens the burden of updating and maintaining consistency among models by enabling their automatic regeneration.
NASA Technical Reports Server (NTRS)
Waszak, Martin R.
1992-01-01
The application of a sector-based stability theory approach to the formulation of useful uncertainty descriptions for linear, time-invariant, multivariable systems is explored. A review of basic sector properties and the sector-based approach is presented first. The sector-based approach is then applied to several general forms of parameter uncertainty to investigate its advantages and limitations. The results indicate that the sector uncertainty bound can be used effectively to evaluate the impact of parameter uncertainties on the frequency response of the design model. Inherent conservatism is a potential limitation of the sector-based approach, especially for highly dependent uncertain parameters. In addition, the representation of the system dynamics can affect the amount of conservatism reflected in the sector bound. Careful application of the model can help to reduce this conservatism, however, and the solution approach has some degrees of freedom that may be further exploited to reduce the conservatism.
Implementing evidence-based medicine in general practice: a focus group based study
Hannes, Karin; Leys, Marcus; Vermeire, Etienne; Aertgeerts, Bert; Buntinx, Frank; Depoorter, Anne-Marie
2005-01-01
Background Over the past years concerns are rising about the use of Evidence-Based Medicine (EBM) in health care. The calls for an increase in the practice of EBM, seem to be obstructed by many barriers preventing the implementation of evidence-based thinking and acting in general practice. This study aims to explore the barriers of Flemish GPs (General Practitioners) to the implementation of EBM in routine clinical work and to identify possible strategies for integrating EBM in daily work. Methods We used a qualitative research strategy to gather and analyse data. We organised focus groups between September 2002 and April 2003. The focus group data were analysed using a combined strategy of 'between-case' analysis and 'grounded theory approach'. Thirty-one general practitioners participated in four focus groups. Purposeful sampling was used to recruit participants. Results A basic classification model documents the influencing factors and actors on a micro-, meso- as well as macro-level. Patients, colleagues, competences, logistics and time were identified on the micro-level (the GPs' individual practice), commercial and consumer organisations on the meso-level (institutions, organisations) and health care policy, media and specific characteristics of evidence on the macro-level (policy level and international scientific community). Existing barriers and possible strategies to overcome these barriers were described. Conclusion In order to implement EBM in routine general practice, an integrated approach on different levels needs to be developed. PMID:16153300
A nonlinear generalized continuum approach for electro-elasticity including scale effects
NASA Astrophysics Data System (ADS)
Skatulla, S.; Arockiarajan, A.; Sansour, C.
2009-01-01
Materials characterized by an electro-mechanically coupled behaviour fall into the category of so-called smart materials. In particular, electro-active polymers (EAP) recently attracted much interest, because, upon electrical loading, EAP exhibit a large amount of deformation while sustaining large forces. This property can be utilized for actuators in electro-mechanical systems, artificial muscles and so forth. When it comes to smaller structures, it is a well-known fact that the mechanical response deviates from the prediction of classical mechanics theory. These scale effects are due to the fact that the size of the microscopic material constituents of such structures cannot be considered negligibly small anymore compared to the structure's overall dimensions. In this context so-called generalized continuum formulations have been proven to account for the micro-structural influence on the macroscopic material response. Here, we want to adopt a strain gradient approach based on a generalized continuum framework [Sansour, C., 1998. A unified concept of elastic-viscoplastic Cosserat and micromorphic continua. J. Phys. IV Proc. 8, 341-348; Sansour, C., Skatulla, S., 2007. A higher gradient formulation and meshfree-based computation for elastic rock. Geomech. Geoeng. 2, 3-15] and extend it to also encompass the electro-mechanically coupled behaviour of EAP. The approach introduces new strain and stress measures which lead to the formulation of a corresponding generalized variational principle. The theory is completed by Dirichlet boundary conditions for the displacement field and its derivatives normal to the boundary as well as the electric potential. The basic idea behind this generalized continuum theory is the consideration of a micro- and a macro-space which together span the generalized space. As all quantities are defined in this generalized space, the constitutive law too, which in this work is conventional electro-mechanically coupled nonlinear hyperelasticity, is embedded in the generalized continuum. In this way, material information from the micro-space, which here consists only of the geometrical specifications of the micro-continuum, can naturally enter the constitutive law. Several applications with moving least square-based approximations (MLS) demonstrate the potential of the proposed method. This particular meshfree method is chosen, as it has been proven to be highly flexible with regard to the continuity and consistency required by this generalized approach.
Functional characterization of two concrete biofilms using pyrosequencing data
Phylogenetic studies of concrete biofilms using 16S rRNA-based approaches have demonstrated that concrete surfaces harbor a diverse microbial community. These approaches can provide information on the general taxonomical groups present in a sample but cannot shed light on the func...
Adaptive Role Playing Games: An Immersive Approach for Problem Based Learning
ERIC Educational Resources Information Center
Sancho, Pilar; Moreno-Ger, Pablo; Fuentes-Fernandez, Ruben; Fernandez-Manjon, Baltasar
2009-01-01
In this paper we present a general framework, called NUCLEO, for the application of socio-constructive educational approaches in higher education. The underlying pedagogical approach relies on an adaptation model in order to improve group dynamics, as this has been identified as one of the key features in the success of collaborative learning…
Evaluation of environmental aspects significance in ISO 14001.
Põder, Tõnis
2006-05-01
The methodological framework set by the standards ISO 14001 and ISO 14004 gives only general principles for environmental aspects assessment, which is regarded as one of the most critical stages of implementing an environmental management system. In Estonia, about 100 organizations have been certified to ISO 14001. Experience obtained from numerous companies has demonstrated that limited transparency and reproducibility of the assessment process is a common shortcoming. Despite the rather complicated assessment schemes sometimes used, the evaluation procedures have been largely based on subjective judgments because of ill-defined and inadequate assessment criteria. A comparison with similar studies in other countries indicates the general nature of the observed inconsistencies. The diversity of approaches to aspects assessment in the conceptual literature, and the related problems, are discussed. The general structure of basic assessment criteria, compatible with environmental impact assessment and environmental risk analysis, is also outlined. Based on this general structure, the article presents a tiered approach to help organize the assessment in a more consistent manner.
Pattern recognition tool based on complex network-based approach
NASA Astrophysics Data System (ADS)
Casanova, Dalcimar; Backes, André Ricardo; Martinez Bruno, Odemir
2013-02-01
This work proposes a generalization of the method introduced by the authors in 'A complex network-based approach for boundary shape analysis'. Instead of modelling a contour as a graph and using complex network rules to characterize it, here we generalize the technique: the work proposes a mathematical tool for characterizing signals, curves and sets of points. To evaluate the pattern description power of the proposal, an experiment on plant identification based on leaf vein images is conducted. Leaf venation is a taxonomic characteristic used for plant identification purposes, and these structures are complex and difficult to represent as signals or curves, and hence to analyze with classical pattern recognition approaches. Here, we model the veins as sets of points and represent them as graphs. As features, we use the degree and joint degree measurements in a dynamic evolution. The results demonstrate that the technique has good discrimination power and can be used for plant identification, as well as for other complex pattern recognition tasks.
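A minimal sketch of the point-set idea described above, assuming (as an illustration, not as the authors' exact procedure) that a proximity graph is thresholded at a sequence of growing distances and degree statistics are recorded at each evolution step; all names are hypothetical:

```python
import numpy as np

def degree_signature(points, thresholds):
    """Characterize a 2-D point set (e.g. sampled leaf-vein pixels) by the
    degree statistics of a proximity graph evolving over distance thresholds."""
    pts = np.asarray(points, dtype=float)
    # Pairwise Euclidean distances between all points.
    diff = pts[:, None, :] - pts[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    features = []
    for t in thresholds:
        adj = (dist <= t) & (dist > 0)       # threshold the distances into edges
        deg = adj.sum(axis=1)                # node degrees at this evolution step
        features += [deg.mean(), deg.max()]  # simple degree-based descriptors
    return np.array(features)

# Usage: compare two shapes by the distance between their signatures.
rng = np.random.default_rng(0)
a = rng.random((50, 2)); b = rng.random((50, 2))
ts = np.linspace(0.05, 0.5, 10)
print(np.linalg.norm(degree_signature(a, ts) - degree_signature(b, ts)))
```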
Gocłowska, Małgorzata A; Aldhobaiban, Nawal; Elliot, Andrew J; Murayama, Kou; Kobeisy, Ahmed; Abdelaziz, Ashraf
2017-06-01
People vary in the extent to which they prefer cooperative, competitive or individualistic achievement tasks. In this research, we conducted two studies designed to investigate correlates and possible roots of these social interdependence orientations, namely approach and avoidance temperament, general self-efficacy, implicit theories of intelligence, and contingencies of self-worth based in others' approval, competition and academic competence. The results indicated that approach temperament, general self-efficacy and incremental theory were positively related, and entity theory was negatively related to cooperative preferences (|r| range from .11 to .41); approach temperament, general self-efficacy, competition contingencies and academic competence contingencies were positively related to competitive preferences (|r| range from .16 to .46); and avoidance temperament, entity theory, competitive contingencies and academic competence contingencies were positively related, and incremental theory was negatively related to individualistic preferences (|r| range from .09 to .15). The findings are discussed with regard to the meaning of each of the three social interdependence orientations, cultural differences among the observed relations and implications for practitioners. © 2015 International Union of Psychological Science.
The Role of Perceived In-group Moral Superiority in Reparative Intentions and Approach Motivation
Szabó, Zsolt P.; Mészáros, Noémi Z.; Csertő, István
2017-01-01
Three studies examined how members of a national group react to in-group wrongdoings. We expected that perceived in-group moral superiority would lead to unwillingness to repair the aggression. We also expected that internally focused emotions such as group-based guilt and group-based shame would predict specific, misdeed-related reparative intentions but not general approach motivation toward the victim groups. In Study 1, facing the in-group's recent aggression, participants who believed that the Hungarians have been more moral throughout their history than members of other nations used more exonerating cognitions, experienced less in-group critical emotions and showed less willingness to provide reparations for the members of the victim group. Study 2 and Study 3 confirmed most findings of Study 1. Perceived in-group moral superiority directly or indirectly reduced willingness to provide either general or specific reparations, while internally focused in-group critical emotions predicted specific misdeed-related reparative intentions but not general approach motivation. The role of emotional attachment to the in-group is considered. PMID:28620333
NASA Astrophysics Data System (ADS)
Mananga, Eugene Stephane; Charpentier, Thibault
2015-04-01
In this paper we present a theoretical perturbative approach for describing the NMR spectrum of strongly dipolar-coupled spin systems under fast magic-angle spinning. Our treatment is based on two approaches: the Floquet approach and the Floquet-Magnus expansion. The Floquet approach is well known in the NMR community as a perturbative approach for obtaining analytical approximations. Numerical procedures are based on step-by-step numerical integration of the corresponding differential equations. The Floquet-Magnus expansion is a perturbative approach within Floquet theory. Furthermore, we address the 'γ-encoding' effect using the Floquet-Magnus expansion approach. We show that the average over the γ angle can be performed for any Hamiltonian with γ symmetry.
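For orientation, the first two orders of the classical Magnus (average Hamiltonian) expansion over one modulation period T, the standard result that Floquet-Magnus treatments build upon, read:

```latex
\bar{H}^{(1)} = \frac{1}{T}\int_0^T H(t)\,dt, \qquad
\bar{H}^{(2)} = \frac{-i}{2T}\int_0^T \! dt_2 \int_0^{t_2} \! dt_1\, \bigl[H(t_2),\,H(t_1)\bigr].
```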
Rank-preserving regression: a more robust rank regression model against outliers.
Chen, Tian; Kowalski, Jeanne; Chen, Rui; Wu, Pan; Zhang, Hui; Feng, Changyong; Tu, Xin M
2016-08-30
Mean-based semi-parametric regression models such as the popular generalized estimating equations are widely used to improve robustness of inference over parametric models. Unfortunately, such models are quite sensitive to outlying observations. The Wilcoxon-score-based rank regression (RR) provides estimates that are more robust to outliers than generalized estimating equations. However, the RR and its extensions do not sufficiently address missing data arising in longitudinal studies. In this paper, we propose a new approach to address outliers under a different framework based on the functional response models. This functional-response-model-based alternative not only addresses limitations of the RR and its extensions for longitudinal data, but, with its rank-preserving property, even provides more robust estimates than these alternatives. The proposed approach is illustrated with both real and simulated data. Copyright © 2016 John Wiley & Sons, Ltd.
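As background, a minimal sketch of the classical Wilcoxon-score rank regression the paper improves upon: the slope estimate minimizes Jaeckel's rank-based dispersion of the residuals. This is the generic textbook RR, not the authors' functional-response estimator; the data are synthetic:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import rankdata

def wilcoxon_dispersion(beta, X, y):
    """Jaeckel's dispersion with Wilcoxon scores a(i) = sqrt(12)*(i/(n+1) - 0.5);
    minimizing it over beta gives the rank-regression slope estimates."""
    e = y - X @ beta                       # residuals (intercept drops out)
    n = len(e)
    a = np.sqrt(12) * (rankdata(e) / (n + 1) - 0.5)
    return np.sum(a * e)

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = X @ np.array([1.5, -2.0]) + rng.standard_t(df=2, size=100)  # heavy-tailed noise
fit = minimize(wilcoxon_dispersion, x0=np.zeros(2), args=(X, y), method="Nelder-Mead")
print(fit.x)  # slope estimates, robust to the outlying residuals
```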
NASA Technical Reports Server (NTRS)
Seidman, T. I.; Munteanu, M. J.
1979-01-01
The relationships among a variety of general computational methods (and variants) for treating ill-posed problems such as geophysical inverse problems are considered. Differences in approach and interpretation based on varying assumptions as to, e.g., the nature of measurement uncertainties are discussed, along with the factors to be considered in selecting an approach. The reliability of the results of such computation is addressed.
Mortensen, Martin B; Afzal, Shoaib; Nordestgaard, Børge G; Falk, Erling
2015-12-22
Guidelines recommend initiating primary prevention for atherosclerotic cardiovascular disease (ASCVD) with statins based on absolute ASCVD risk assessment. Recently, alternative trial-based and hybrid approaches were suggested for statin treatment eligibility. This study compared these approaches in a direct head-to-head fashion in a contemporary population. The study used the CGPS (Copenhagen General Population Study) with 37,892 subjects aged 40 to 75 years recruited in 2003 to 2008, all free of ASCVD, diabetes, and statin use at baseline. Among the population studied, 42% were eligible for statin therapy according to the 2013 American College of Cardiology/American Heart Association (ACC/AHA) risk assessment and cholesterol treatment guidelines approach, versus 56% with the trial-based approach and 21% with the hybrid approach. Among these statin-eligible subjects, the ASCVD event rate per 1,000 person-years was 9.8, 6.8, and 11.2, respectively. The ACC/AHA-recommended absolute risk score was well calibrated around the 7.5% 10-year ASCVD risk treatment threshold and discriminated better than the trial-based or hybrid approaches. Compared with the ACC/AHA risk-based approach, the net reclassification index for eligibility for statin therapy among 40- to 75-year-old subjects from the CGPS was -0.21 for the trial-based approach and -0.13 for the hybrid approach. The clinical performance of the ACC/AHA risk-based approach for primary prevention of ASCVD with statins was superior to the trial-based and hybrid approaches. Our results indicate that the ACC/AHA guidelines will prevent more ASCVD events than the trial-based and hybrid approaches, while treating fewer people compared with the trial-based approach. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Maji, Kaushik; Kouri, Donald J
2011-03-28
We have developed a new method for solving quantum dynamical scattering problems, using the time-independent Schrödinger equation (TISE), based on a novel method to generalize a "one-way" quantum mechanical wave equation, impose correct boundary conditions, and eliminate exponentially growing closed channel solutions. The approach is readily parallelized to achieve approximate N² scaling, where N is the number of coupled equations. The full two-way nature of the TISE is included while propagating the wave function in the scattering variable and the full S-matrix is obtained. The new algorithm is based on a "Modified Cayley" operator splitting approach, generalizing earlier work where the method was applied to the time-dependent Schrödinger equation. All scattering variable propagation approaches to solving the TISE involve solving a Helmholtz-type equation, and for more than one degree of freedom, these are notoriously ill-behaved, due to the unavoidable presence of exponentially growing contributions to the numerical solution. Traditionally, the method used to eliminate exponential growth has posed a major obstacle to the full parallelization of such propagation algorithms. We stabilize by using the Feshbach projection operator technique to remove all the nonphysical exponentially growing closed channels, while retaining all of the propagating open channel components, as well as exponentially decaying closed channel components.
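For context, the classical Cayley (Crank-Nicolson) approximant that operator splittings of this family build on replaces the exponential propagator by a unitary rational form,

```latex
\exp(-\mathrm{i}\,A\,\Delta) \;\approx\;
\Bigl(1+\tfrac{\mathrm{i}\Delta}{2}A\Bigr)^{-1}\Bigl(1-\tfrac{\mathrm{i}\Delta}{2}A\Bigr),
```

which is exactly unitary for Hermitian A; the "Modified Cayley" splitting named in the abstract generalizes this idea to propagation in the scattering variable.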
A General Architecture for Intelligent Tutoring of Diagnostic Classification Problem Solving
Crowley, Rebecca S.; Medvedeva, Olga
2003-01-01
We report on a general architecture for creating knowledge-based medical training systems to teach diagnostic classification problem solving. The approach is informed by our previous work describing the development of expertise in classification problem solving in Pathology. The architecture envelops the traditional Intelligent Tutoring System design within the Unified Problem-solving Method description Language (UPML) architecture, supporting component modularity and reuse. Based on the domain ontology, domain task ontology and case data, the abstract problem-solving methods of the expert model create a dynamic solution graph. Student interaction with the solution graph is filtered through an instructional layer, which is created by a second set of abstract problem-solving methods and pedagogic ontologies, in response to the current state of the student model. We outline the advantages and limitations of this general approach, and describe its implementation in SlideTutor, a developing Intelligent Tutoring System in Dermatopathology. PMID:14728159
Flavoured tobacco products and the public's health: lessons from the TPSAC menthol report.
Samet, Jonathan M; Pentz, Mary Ann; Unger, Jennifer B
2016-11-01
The menthol report developed by the Tobacco Products Scientific Advisory Committee (TPSAC) of the Center for Tobacco Products elaborated a methodology for considering the public health impact of menthol in cigarettes that has relevance to flavourings generally. The TPSAC report was based on a conceptual framework of how menthol in cigarettes has public health impact, results of evidence from related systematic reviews, and an evidence-based statistical model. In extending this approach to flavourings generally, consideration will need to be given to the existence of multiple flavourings, a very dynamic marketplace, and regulatory interventions and industry activities. Now is the time to begin to develop the research strategies and models needed to extend the TPSAC approach to flavoured tobacco products generally. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Nielsen, Henrik
2017-01-01
Many computational methods are available for predicting protein sorting in bacteria. When comparing them, it is important to know that they can be grouped into three fundamentally different approaches: signal-based, global-property-based and homology-based prediction. In this chapter, the strengths and drawbacks of each of these approaches are described through many examples of methods that predict secretion, integration into membranes, or subcellular locations in general. The aim of this chapter is to provide a user-level introduction to the field with a minimum of computational theory.
On Decision-Making Among Multiple Rule-Bases in Fuzzy Control Systems
NASA Technical Reports Server (NTRS)
Tunstel, Edward; Jamshidi, Mo
1997-01-01
Intelligent control of complex multi-variable systems can be a challenge for single fuzzy rule-based controllers. This class of problems can often be managed with less difficulty by distributing intelligent decision-making amongst a collection of rule-bases. Such an approach requires that a mechanism be chosen to ensure goal-oriented interaction between the multiple rule-bases. In this paper, a hierarchical rule-based approach is described. Decision-making mechanisms based on generalized concepts from single-rule-based fuzzy control are described. Finally, the effects of different aggregation operators on multi-rule-base decision-making are examined in a navigation control problem for mobile robots.
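To make the aggregation step concrete, here is a minimal sketch (not the paper's implementation; the operators and names are illustrative) of combining recommendations from two fuzzy rule-bases under different aggregation operators:

```python
import numpy as np

def aggregate(recommendations, op="max"):
    """Combine crisp outputs from several fuzzy rule-bases, each giving a
    (value, activation-degree) pair; an illustrative stand-in for generalized
    multi-rule-base decision-making mechanisms."""
    vals = np.array([r[0] for r in recommendations])
    degs = np.array([r[1] for r in recommendations])
    if op == "max":                      # follow the most strongly activated rule-base
        return vals[degs.argmax()]
    if op == "mean":                     # activation-weighted compromise
        return np.average(vals, weights=degs)
    raise ValueError(op)

# Two rule-bases steer a mobile robot: goal-seeking vs. obstacle-avoidance.
recs = [(0.3, 0.8), (-0.6, 0.4)]         # (steering command, firing strength)
print(aggregate(recs, "max"), aggregate(recs, "mean"))
```

Note how the choice of operator changes the decision: "max" commits to one behaviour, while "mean" blends them, which is exactly the kind of effect the paper examines.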
Michalowski, Martin; Wilk, Szymon; Tan, Xing; Michalowski, Wojtek
2014-01-01
Clinical practice guidelines (CPGs) implement evidence-based medicine designed to help generate a therapy for a patient suffering from a single disease. When applied to a comorbid patient, the concurrent combination of treatment steps from multiple CPGs is susceptible to adverse interactions in the resulting combined therapy (i.e., a therapy established according to all considered CPGs). This inability to concurrently apply CPGs has been shown to be one of the key shortcomings of CPG uptake in a clinical setting [1]. Several research efforts are underway to address this issue, such as the K4CARE [2] and GuideLine INteraction Detection Assistant (GLINDA) [3] projects and our previous research on applying constraint logic programming to developing a consistent combined therapy for a comorbid patient [4]. However, there is no generalized framework for mitigation that effectively captures general characteristics of the problem while handling nuances such as time and ordering requirements imposed by specific CPGs. In this paper we propose a first-order logic-based (FOL) approach for developing a generalized framework of mitigation. This approach uses a meta-algorithm and entailment properties to mitigate (i.e., identify and address) adverse interactions introduced by concurrently applied CPGs. We use an illustrative case study of a patient suffering from type 2 diabetes being treated for an onset of severe rheumatoid arthritis to show the expressiveness and robustness of our proposed FOL-based approach, and we discuss its appropriateness as the basis for the generalized theory.
ERIC Educational Resources Information Center
Onorato, P.
2011-01-01
An introduction to quantum mechanics based on the sum-over-paths (SOP) method originated by Richard P. Feynman and developed by E. F. Taylor and coworkers is presented. The Einstein-Brillouin-Keller (EBK) semiclassical quantization rules are obtained following the SOP approach for bounded systems, and a general approach to the calculation of…
ERIC Educational Resources Information Center
Drabinová, Adéla; Martinková, Patrícia
2017-01-01
In this article we present a general approach not relying on item response theory models (non-IRT) to detect differential item functioning (DIF) in dichotomous items in the presence of guessing. The proposed nonlinear regression (NLR) procedure for DIF detection is an extension of a method based on logistic regression. As a non-IRT approach, NLR can…
FAMILY FINANCE EDUCATION, AN INTERDISCIPLINARY APPROACH. VOLUME I.
ERIC Educational Resources Information Center
GIBBS, MARY S., ED.; AND OTHERS
THE FIRST OF TWO VOLUMES PRESENTS SCHOOL CURRICULUM DEVELOPMENT AS IT RELATES TO FAMILY FINANCE AND BACKGROUND FOR MONEY MANAGEMENT. AN INTERDISCIPLINARY APPROACH IS USED, BASED ON PHILOSOPHY, SOCIOLOGY, AND PSYCHOLOGY. PART I DEALS WITH GENERAL CURRICULUM PLANNING, CONCEPT FORMATION, ESTABLISHING BEHAVIORAL OBJECTIVES, OVERVIEW OF CURRICULUM…
Teaching Thinking and Problem Solving.
ERIC Educational Resources Information Center
Bransford, John; And Others
1986-01-01
This article focuses on two approaches to teaching reasoning and problem solving. One emphasizes the role of domain-specific knowledge; the other emphasizes general strategic and metacognitive knowledge. Many instructional programs are based on the latter approach. The article concludes that these programs can be strengthened by focusing on domain…
General purpose graphic processing unit implementation of adaptive pulse compression algorithms
NASA Astrophysics Data System (ADS)
Cai, Jingxiao; Zhang, Yan
2017-07-01
This study introduces a practical approach to implementing real-time signal processing algorithms for general surveillance radar based on NVIDIA graphics processing units (GPUs). The pulse compression algorithms are implemented using compute unified device architecture (CUDA) libraries such as the CUDA basic linear algebra subroutines and the CUDA fast Fourier transform library, which are adopted from open source libraries and optimized for NVIDIA GPUs. For more advanced, adaptive processing algorithms such as adaptive pulse compression, customized kernel optimization is needed and investigated. A statistical optimization approach is developed for this purpose without needing much knowledge of the physical configurations of the kernels. It was found that the kernel optimization approach can significantly improve the performance. Benchmark performance is compared with the CPU performance in terms of processing acceleration. The proposed implementation framework can be used in various radar systems including ground-based phased array radar, airborne sense and avoid radar, and aerospace surveillance radar.
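The core pulse-compression step is frequency-domain matched filtering. The sketch below shows that computation with numpy's FFT as a CPU stand-in for the cuFFT-based GPU version described above; the chirp parameters are invented for illustration:

```python
import numpy as np

def pulse_compress(rx, chirp):
    """Frequency-domain matched filtering (pulse compression): multiply the
    received spectrum by the conjugate spectrum of the transmitted pulse."""
    n = len(rx) + len(chirp) - 1
    RX = np.fft.fft(rx, n)
    H = np.conj(np.fft.fft(chirp, n))    # matched filter = conjugate of pulse spectrum
    return np.fft.ifft(RX * H)

fs, T = 1e6, 1e-4
t = np.arange(0, T, 1 / fs)
chirp = np.exp(1j * np.pi * 2e9 * t ** 2)          # linear FM pulse
rx = np.zeros(2048, complex)
rx[300:300 + len(t)] += 0.5 * chirp                # echo buried at sample 300
rx += 0.1 * (np.random.randn(2048) + 1j * np.random.randn(2048))
print(np.abs(pulse_compress(rx, chirp)).argmax())  # peak near 300 + len(t) - 1
```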
Fluorescent Protein Approaches in Alpha Herpesvirus Research
Hogue, Ian B.; Bosse, Jens B.; Engel, Esteban A.; Scherer, Julian; Hu, Jiun-Ruey; del Rio, Tony; Enquist, Lynn W.
2015-01-01
In the nearly two decades since the popularization of green fluorescent protein (GFP), fluorescent protein-based methodologies have revolutionized molecular and cell biology, allowing us to literally see biological processes as never before. Naturally, this revolution has extended to virology in general, and to the study of alpha herpesviruses in particular. In this review, we provide a compendium of reported fluorescent protein fusions to herpes simplex virus 1 (HSV-1) and pseudorabies virus (PRV) structural proteins, discuss the underappreciated challenges of fluorescent protein-based approaches in the context of a replicating virus, and describe general strategies and best practices for creating new fluorescent fusions. We compare fluorescent protein methods to alternative approaches, and review two instructive examples of the caveats associated with fluorescent protein fusions, including describing several improved fluorescent capsid fusions in PRV. Finally, we present our future perspectives on the types of powerful experiments these tools now offer. PMID:26610544
A general system for automatic biomedical image segmentation using intensity neighborhoods.
Chen, Cheng; Ozolek, John A; Wang, Wei; Rohde, Gustavo K
2011-01-01
Image segmentation is important, with applications to several problems in biology and medicine. Although segmentation has been extensively researched, current methods generally perform adequately in the applications for which they were designed but often require extensive modifications or calibrations before being used in a different application. We describe an approach that, with few modifications, can be used in a variety of image segmentation problems. The approach is based on a supervised learning strategy that utilizes intensity neighborhoods to assign each pixel in a test image its correct class based on training data. We describe methods for modeling rotations and variations in scale as well as a subset selection for training the classifiers. We show that the performance of our approach in tissue segmentation tasks in magnetic resonance and histopathology microscopy images, as well as nuclei segmentation from fluorescence microscopy images, is similar to or better than several algorithms specifically designed for each of these applications.
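A minimal sketch of the intensity-neighborhood idea, assuming a plain k-nearest-neighbor classifier as a stand-in for the paper's classifiers and omitting the rotation/scale modeling and subset selection; the data here are synthetic:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def neighborhoods(img, r=2):
    """Stack each pixel's (2r+1)x(2r+1) intensity neighborhood into one
    feature row; a minimal version of a per-pixel representation."""
    p = np.pad(img, r, mode="reflect")
    h, w = img.shape
    feats = [p[i:i + 2 * r + 1, j:j + 2 * r + 1].ravel()
             for i in range(h) for j in range(w)]
    return np.array(feats)

rng = np.random.default_rng(2)
train = rng.random((32, 32)); train[8:24, 8:24] += 1.0   # bright "tissue" block
labels = np.zeros((32, 32), int); labels[8:24, 8:24] = 1
clf = KNeighborsClassifier(n_neighbors=3).fit(neighborhoods(train), labels.ravel())
test = train + 0.05 * rng.standard_normal(train.shape)   # noisy copy to segment
pred = clf.predict(neighborhoods(test)).reshape(32, 32)
print((pred == labels).mean())                           # per-pixel accuracy
```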
NASA Astrophysics Data System (ADS)
Ma, Yuan-Zhuo; Li, Hong-Shuang; Yao, Wei-Xing
2018-05-01
The evaluation of the probabilistic constraints in reliability-based design optimization (RBDO) problems has always been significant and challenging work, which strongly affects the performance of RBDO methods. This article deals with RBDO problems using a recently developed generalized subset simulation (GSS) method and a posterior approximation approach. The posterior approximation approach is used to transform all the probabilistic constraints into ordinary constraints as in deterministic optimization. The assessment of multiple failure probabilities required by the posterior approximation approach is achieved by GSS in a single run at all supporting points, which are selected by a proper experimental design scheme combining Sobol' sequences and Bucher's design. Sequentially, the transformed deterministic design optimization problem can be solved by optimization algorithms, for example, the sequential quadratic programming method. Three optimization problems are used to demonstrate the efficiency and accuracy of the proposed method.
Generalized Bezout's Theorem and its applications in coding theory
NASA Technical Reports Server (NTRS)
Berg, Gene A.; Feng, Gui-Liang; Rao, T. R. N.
1996-01-01
This paper presents a generalized Bezout theorem which can be used to determine a tighter lower bound on the number of distinct points of intersection of two or more curves for a large class of plane curves. A new approach to determine a lower bound on the minimum distance (and also the generalized Hamming weights) for algebraic-geometric codes defined from a class of plane curves is introduced, based on the generalized Bezout theorem. Examples of more efficient linear codes are constructed using the generalized Bezout theorem and the new approach. For d = 4, the codes obtained with the new construction are better than or equal to the known linear codes. For d greater than 5, these new codes are better than the known codes. The Klein code over GF(2³) is also constructed.
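For reference, the classical Bezout theorem being generalized here states that two projective plane curves of degrees m and n with no common component intersect in at most mn distinct points (exactly mn when counted with multiplicity over an algebraically closed field):

```latex
\#\bigl(C_m \cap C_n\bigr) \;\le\; m\,n .
```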
NASA Astrophysics Data System (ADS)
Zafar, I.; Edirisinghe, E. A.; Acar, S.; Bez, H. E.
2007-02-01
Automatic vehicle Make and Model Recognition (MMR) systems provide useful performance enhancements to vehicle recognition systems that are solely based on Automatic License Plate Recognition (ALPR). Several car MMR systems have been proposed in the literature. However, these approaches are based on feature detection algorithms that can perform sub-optimally under adverse lighting and/or occlusion conditions. In this paper we propose a real-time, appearance-based car MMR approach using Two-Dimensional Linear Discriminant Analysis (2D-LDA) that is capable of addressing this limitation. We provide experimental results to analyse the proposed algorithm's robustness under varying illumination and occlusion conditions. We have shown that the best performance with the proposed 2D-LDA based car MMR approach is obtained when the eigenvectors of lower significance are ignored. For the given database of 200 car images of 25 different make-model classifications, a best accuracy of 91% was obtained with the 2D-LDA approach. We use a direct Principal Component Analysis (PCA) based approach as a benchmark to compare and contrast the performance of the proposed 2D-LDA approach to car MMR. We conclude that in general the 2D-LDA based algorithm surpasses the performance of the PCA based approach.
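A minimal numpy sketch of right-projection 2D-LDA as commonly formulated: image-level between- and within-class scatter matrices and the top generalized eigenvectors. This is the generic technique, not the authors' tuned pipeline, and the data are synthetic:

```python
import numpy as np

def two_d_lda(images, labels, k=3):
    """2D-LDA operating directly on image matrices: build between-class (Sb)
    and within-class (Sw) scatter, take the leading eigenvectors of Sw^-1 Sb."""
    X = np.asarray(images, float)
    M = X.mean(axis=0)                                 # global mean image
    d = X.shape[2]
    Sb = np.zeros((d, d)); Sw = np.zeros((d, d))
    for c in np.unique(labels):
        Xc = X[labels == c]
        Mc = Xc.mean(axis=0)
        Sb += len(Xc) * (Mc - M).T @ (Mc - M)          # between-class scatter
        for A in Xc:
            Sw += (A - Mc).T @ (A - Mc)                # within-class scatter
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:k]]                    # discriminant projection

rng = np.random.default_rng(3)
imgs = rng.random((20, 16, 16)); labels = np.repeat([0, 1], 10)
imgs[labels == 1, :, :8] += 0.5                        # class-dependent structure
W = two_d_lda(imgs, labels)
Y = imgs @ W                                           # (20, 16, 3) feature matrices
print(Y.shape)
```

Dropping the trailing (low-eigenvalue) columns of W mirrors the paper's observation that ignoring eigenvectors of lower significance improves accuracy.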
Students' Written Arguments in General Chemistry Laboratory Investigations
ERIC Educational Resources Information Center
Choi, Aeran; Hand, Brian; Greenbowe, Thomas
2013-01-01
This study aimed to examine the written arguments developed by college freshman students using the Science Writing Heuristic approach in inquiry-based general chemistry laboratory classrooms and its relationships with students' achievement in chemistry courses. Fourteen freshman students participated in the first year of the study while 19…
Generalized Structured Component Analysis with Latent Interactions
ERIC Educational Resources Information Center
Hwang, Heungsun; Ho, Moon-Ho Ringo; Lee, Jonathan
2010-01-01
Generalized structured component analysis (GSCA) is a component-based approach to structural equation modeling. In practice, researchers may often be interested in examining the interaction effects of latent variables. However, GSCA has been geared only for the specification and testing of the main effects of variables. Thus, an extension of GSCA…
An Investigative, Cooperative Learning Approach to the General Microbiology Laboratory
ERIC Educational Resources Information Center
Seifert, Kyle; Fenster, Amy; Dilts, Judith A.; Temple, Louise
2009-01-01
Investigative- and cooperative-based learning strategies have been used effectively in a variety of classrooms to enhance student learning and engagement. In the General Microbiology laboratory for juniors and seniors at James Madison University, these strategies were combined to make a semester-long, investigative, cooperative learning experience…
Regularized Generalized Structured Component Analysis
ERIC Educational Resources Information Center
Hwang, Heungsun
2009-01-01
Generalized structured component analysis (GSCA) has been proposed as a component-based approach to structural equation modeling. In practice, GSCA may suffer from multi-collinearity, i.e., high correlations among exogenous variables. GSCA has yet no remedy for this problem. Thus, a regularized extension of GSCA is proposed that integrates a ridge…
Using a Thematic Laboratory-Centered Curriculum to Teach General Chemistry
ERIC Educational Resources Information Center
Hopkins, Todd A.; Samide, Michael
2013-01-01
This article describes an approach to general chemistry that involves teaching chemical concepts in the context of two thematic laboratory modules: environmental remediation and the fate of pharmaceuticals in the environment. These modules were designed based on active-learning pedagogies and involve multiple-week projects that dictate what…
Informal and Formal Learning of General Practitioners
ERIC Educational Resources Information Center
Spaan, Nadia Roos; Dekker, Anne R. J.; van der Velden, Alike W.; de Groot, Esther
2016-01-01
Purpose: The purpose of this study is to understand the influence of formal learning from a web-based training and informal (workplace) learning afterwards on the behaviour of general practitioners (GPs) with respect to prescription of antibiotics. Design/methodology/approach: To obtain insight in various learning processes, semi-structured…
Some requirements and suggestions for a methodology to develop knowledge based systems.
Green, D W; Colbert, M; Long, J
1989-11-01
This paper describes an approach to the creation of a methodology for the development of knowledge based systems. It specifies some requirements and suggests how these requirements might be met. General requirements can be satisfied using a systems approach. More specific ones can be met by viewing an organization as a network of consultations for coordinating expertise. The nature of consultations is described and the form of a possible cognitive model using a blackboard architecture is outlined. The value of the approach is illustrated in terms of certain knowledge elicitation methods.
NASA Astrophysics Data System (ADS)
Liu, Y.; Guo, Q.; Sun, Y.
2014-04-01
In map production and generalization, spatial conflicts inevitably arise, and their detection and resolution still require manual operation. This has become a bottleneck hindering the development of automated cartographic generalization. Displacement is the most useful contextual operator for resolving conflicts arising between two or more map objects. Research on automated generalization has reported many displacement approaches, including sequential approaches and optimization approaches. As an effective optimization approach based on energy minimization principles, the elastic beams model has been used several times to resolve displacement problems for roads and buildings. However, to realize a complete displacement solution, techniques for conflict detection and spatial context analysis must also be taken into consideration. In this paper we therefore propose a complete displacement solution based on the combined use of the elastic beams model and constrained Delaunay triangulation (CDT). The solution is designed as a cyclic, iterative process containing two phases: a detection phase and a displacement phase. In the detection phase, the CDT of the map is used to detect proximity conflicts, identify spatial relationships and structures, and construct auxiliary structures, so as to support the elastic-beams-based displacement phase. In addition, to improve the displacement algorithm, a method for adaptive parameter setting and a new iterative strategy are put forward. Finally, we implemented our solution on a map generalization testing platform and successfully tested it against two hand-generated test datasets of roads and buildings, respectively.
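A minimal sketch of the detection phase's proximity test, assuming an unconstrained Delaunay triangulation from scipy as a stand-in for the CDT and representative points for map objects; the threshold and coordinates are invented:

```python
import numpy as np
from scipy.spatial import Delaunay

def proximity_conflicts(points, min_sep):
    """Flag proximity conflicts between map objects via Delaunay edges: any
    edge shorter than the minimum legible separation marks a conflict. A
    constrained DT, as in the paper, would additionally respect object outlines."""
    tri = Delaunay(points)
    edges = set()
    for simplex in tri.simplices:                  # collect unique triangle edges
        for i in range(3):
            a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
            edges.add((a, b))
    return [(a, b) for a, b in edges
            if np.linalg.norm(points[a] - points[b]) < min_sep]

pts = np.array([[0, 0], [10, 0], [10.4, 0.2], [5, 8], [0, 9.0]])
print(proximity_conflicts(pts, min_sep=1.0))       # [(1, 2)]: too close to render
```

In the full solution, each detected conflict would then drive the elastic-beams displacement, and the cycle repeats until no conflicts remain.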
Suspension parameter estimation in the frequency domain using a matrix inversion approach
NASA Astrophysics Data System (ADS)
Thite, A. N.; Banvidi, S.; Ibicek, T.; Bennett, L.
2011-12-01
The dynamic lumped parameter models used to optimise the ride and handling of a vehicle require base values of the suspension parameters. These parameters are generally identified experimentally. The accuracy of the identified parameters can depend on the measurement noise and the validity of the model used. Existing publications on suspension parameter identification are generally based on the time domain and use a limited number of degrees of freedom. Further, the data used are either from a simulated 'experiment' or from a laboratory test on an idealised quarter- or half-car model. In this paper, a method is developed in the frequency domain which effectively accounts for the measurement noise. Additional dynamic constraining equations are incorporated, and the proposed formulation results in a matrix inversion approach. The nonlinearities in damping are, however, estimated using a time-domain approach. Full-scale 4-post rig test data of a vehicle are used. The variations in the results are discussed in terms of the modal resonant behaviour. Further, a method is implemented to show how the results can be improved when the inverted matrix is ill-conditioned. The case study shows good agreement between the estimates based on the proposed frequency-domain approach and measurable physical parameters.
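A minimal illustration of the frequency-domain matrix-inversion idea on a single-degree-of-freedom suspension model: the equation of motion is stacked at many frequencies and solved for (m, c, k) by least squares. The parameter values are invented for the example, and the paper's multi-DOF constraining equations are omitted:

```python
import numpy as np

# Identify m, c, k from frequency-response data by stacking
# (-w^2 m + i w c + k) X(w) = F(w) and solving in a least-squares sense.
m_true, c_true, k_true = 300.0, 1.2e3, 2.0e4
w = 2 * np.pi * np.linspace(0.5, 20, 80)
H = k_true - m_true * w**2 + 1j * c_true * w          # true dynamic stiffness
X = 1.0 / H                                           # response to unit force
A = np.column_stack([-w**2 * X, 1j * w * X, X])       # columns multiply m, c, k
b = np.ones_like(w, dtype=complex)                    # unit force spectrum
# Stack real and imaginary parts so the unknowns stay real-valued.
A_ri = np.vstack([A.real, A.imag])
b_ri = np.concatenate([b.real, b.imag])
params, *_ = np.linalg.lstsq(A_ri, b_ri, rcond=None)  # pseudo-inverse also copes
print(params)                                         # with ill-conditioning
```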
NASA Astrophysics Data System (ADS)
Daude, F.; Galon, P.
2018-06-01
A Finite-Volume scheme for the numerical computation of compressible single- and two-phase flows in flexible pipelines is proposed, based on an approximate Godunov-type approach. The spatial discretization is obtained using the HLLC scheme. In addition, the numerical treatment of abrupt changes in area and of networks including several pipelines connected at junctions is also considered. The proposed approach is based on the integral form of the governing equations, making it possible to tackle general equations of state. A coupled approach for the resolution of fluid-structure interaction of compressible fluid flowing in flexible pipes is considered. The structural problem is solved using Euler-Bernoulli beam finite elements. The present Finite-Volume method is applied to an ideal gas and to two-phase steam-water flows based on the Homogeneous Equilibrium Model (HEM) in conjunction with a tabulated equation of state in order to demonstrate its ability to tackle general equations of state. The extensive application of the scheme to both shock tube and other transient flow problems demonstrates its capability to resolve such problems accurately and robustly. Finally, the proposed 1-D fluid-structure interaction model appears to be computationally efficient.
Simulating Runoff from a Grid Based Mercury Model: Flow Comparisons
Several mercury cycling models, including general mass balance approaches, mixed-batch reactors in streams or lakes, or regional process-based models, exist to assess the ecological exposure risks associated with anthropogenically increased atmospheric mercury (Hg) deposition, so...
Graph-based similarity concepts in virtual screening.
Hutter, Michael C
2011-03-01
Applying similarity for finding new promising compounds is a key issue in drug design. However, quantifying similarity between molecules has remained a difficult task despite the numerous approaches. Here, some general aspects along with recent developments regarding similarity criteria are collected. For the purpose of virtual screening, the compounds have to be encoded into a computer-readable format that permits a comparison according to given similarity criteria, comprising the use of the 3D structure, fingerprints, graph-based and alignment-based approaches. Whereas finding the most common substructures is the most obvious method, more recent approaches take into account chemical modifications that appear throughout existing drugs, from various therapeutic categories and targets.
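As a concrete instance of the fingerprint-based similarity mentioned above, a minimal Tanimoto screen over binary fingerprints; synthetic bit vectors stand in for real molecular fingerprints:

```python
import numpy as np

def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient on binary fingerprints: |A & B| / |A | B|,
    the standard fingerprint-based similarity criterion."""
    a, b = np.asarray(fp_a, bool), np.asarray(fp_b, bool)
    union = (a | b).sum()
    return (a & b).sum() / union if union else 1.0

rng = np.random.default_rng(4)
query = rng.random(1024) < 0.1                 # sparse 1024-bit fingerprint
library = rng.random((1000, 1024)) < 0.1
scores = np.array([tanimoto(query, fp) for fp in library])
print(scores.argsort()[::-1][:5])              # indices of the 5 nearest molecules
```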
Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.
2012-01-01
An approach for incorporating embedded simulation and analysis capabilities in complex simulation codes through template-based generic programming is presented. This approach relies on templating and operator overloading within the C++ language to transform a given calculation into one that can compute a variety of additional quantities that are necessary for many state-of-the-art simulation and analysis algorithms. An approach for incorporating these ideas into complex simulation codes through general graph-based assembly is also presented. These ideas have been implemented within a set of packages in the Trilinos framework and are demonstrated on a simple problem from chemical engineering.
Modelling vortex-induced fluid-structure interaction.
Benaroya, Haym; Gabbai, Rene D
2008-04-13
The principal goal of this research is developing physics-based, reduced-order, analytical models of nonlinear fluid-structure interactions associated with offshore structures. Our primary focus is to generalize Hamilton's variational framework so that systems of flow-oscillator equations can be derived from first principles. This is an extension of earlier work that led to a single energy equation describing the fluid-structure interaction. It is demonstrated here that flow-oscillator models are a subclass of the general, physics-based framework. A flow-oscillator model is a reduced-order mechanical model, generally comprising two mechanical oscillators, one modelling the structural oscillation and the other a nonlinear oscillator representing the fluid behaviour coupled to the structural motion. Reduced-order analytical model development continues to be carried out using a Hamilton's-principle-based variational approach. This provides flexibility in the long run for generalizing the modelling paradigm to complex, three-dimensional problems with multiple degrees of freedom, although such extension is very difficult. As both experimental and analytical capabilities advance, the critical research path to developing and implementing fluid-structure interaction models entails formulating generalized equations of motion, as a superset of the flow-oscillator models, and developing experimentally derived, semi-analytical functions to describe key terms in the governing equations of motion. The developed variational approach yields a system of governing equations that allows modelling of multiple-degree-of-freedom systems. The extensions derived generalize Hamilton's variational formulation for such problems. The Navier-Stokes equations are derived and coupled to the structural oscillator. This general model has been shown to be a superset of the flow-oscillator model. Based on different assumptions, one can derive a variety of flow-oscillator models.
A novel logic-based approach for quantitative toxicology prediction.
Amini, Ata; Muggleton, Stephen H; Lodhi, Huma; Sternberg, Michael J E
2007-01-01
There is a pressing need for accurate in silico methods to predict the toxicity of molecules that are being introduced into the environment or are being developed into new pharmaceuticals. Predictive toxicology is in the realm of structure-activity relationships (SAR), and many approaches have been used to derive such SAR. Previous work has shown that inductive logic programming (ILP) is a powerful approach that circumvents several major difficulties, such as molecular superposition, faced by some other SAR methods. The ILP approach reasons with chemical substructures within a relational framework and yields chemically understandable rules. Here, we report a general new approach, support vector inductive logic programming (SVILP), which extends the essentially qualitative ILP-based SAR to quantitative modeling. First, ILP is used to learn rules, the predictions of which are then used within a novel kernel to derive a support-vector generalization model. For a highly heterogeneous dataset of 576 molecules with known fathead minnow fish toxicity, the cross-validated correlation coefficients (R²(CV)) from a chemical descriptor method (CHEM) and SVILP are 0.52 and 0.66, respectively. The ILP, CHEM, and SVILP approaches correctly predict 55, 58, and 73%, respectively, of toxic molecules. In a set of 165 unseen molecules, the R² values from the commercial software TOPKAT and SVILP are 0.26 and 0.57, respectively. In all calculations, SVILP showed significant improvements in comparison with the other methods. The SVILP approach has a major advantage in that it uses ILP automatically and consistently to derive rules, mostly novel, describing fragments that are toxicity alerts. SVILP is a general machine-learning approach and has the potential of tackling many problems relevant to chemoinformatics, including in silico drug design.
Real-time traffic sign recognition based on a general purpose GPU and deep-learning.
Lim, Kwangyong; Hong, Yongwon; Choi, Yeongwoo; Byun, Hyeran
2017-01-01
We present a General Purpose Graphics Processing Unit (GPGPU) based real-time traffic sign detection and recognition method that is robust against illumination changes. There have been many approaches to traffic sign recognition in various research fields; however, previous approaches faced several limitations when under low illumination or wide variance of light conditions. To overcome these drawbacks and improve processing speeds, we propose a method that 1) is robust against illumination changes, 2) uses GPGPU-based real-time traffic sign detection, and 3) performs region detecting and recognition using a hierarchical model. This method produces stable results in low illumination environments. Both detection and hierarchical recognition are performed in real-time, and the proposed method achieves 0.97 F1-score on our collective dataset, which uses the Vienna convention traffic rules (Germany and South Korea).
ERIC Educational Resources Information Center
Flewelling, Robert L.; Austin, David; Hale, Kelly; LaPlante, Marcia; Liebig, Melissa; Piasecki, Linda; Uerz, Lori
2005-01-01
Despite the popularity and perceived potential effectiveness of community-based coalitions in helping to prevent and reduce adolescent substance use, empirical evidence supporting this approach is sparse. Many reasons have been suggested for why coalition-based prevention initiatives, and community-level interventions in general, have not…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cave, R.J.; Newton, M.D.
1997-06-01
Two independent methods are presented for the nonperturbative calculation of the electronic coupling matrix element (H_ab) for electron transfer reactions using ab initio electronic structure theory. The first is based on the generalized Mulliken-Hush (GMH) model, a multistate generalization of the Mulliken-Hush formalism for the electronic coupling. The second is based on the block diagonalization (BD) approach of Cederbaum, Domcke, and co-workers. Detailed quantitative comparisons of the two methods are carried out based on results for (a) several states of the system Zn2OH2+ and (b) the low-lying states of the benzene-Cl atom complex and its contact ion pair. Generally good agreement between the two methods is obtained over a range of geometries. Either method can be applied at an arbitrary nuclear geometry and, as a result, may be used to test the validity of the Condon approximation. Examples of nonmonotonic behavior of the electronic coupling as a function of nuclear coordinates are observed for Zn2OH2+. Both methods also yield a natural definition of the effective distance (r_DA) between donor (D) and acceptor (A) sites, in contrast to earlier approaches which required independent estimates of r_DA, generally based on molecular structure data. © 1997 American Institute of Physics.
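In the two-state case, the GMH prescription reduces to the well-known expression below, with mu_12 the transition dipole moment, Delta mu_12 the difference of adiabatic state dipoles, and Delta E_12 the vertical energy gap:

```latex
H_{ab} \;=\; \frac{\mu_{12}\,\Delta E_{12}}{\sqrt{(\Delta\mu_{12})^{2} + 4\,\mu_{12}^{2}}}.
```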
Gender Competence of the Modern Teacher
ERIC Educational Resources Information Center
Auhadeeva, Ludmila A.; Yarmakeev, Iskander E.; Aukhadeev, Aver E.
2015-01-01
This article discusses improvements in education and modern teacher's gender training in terms of a competence-based approach as a basic strategy of general and vocational education development in Russia. The article substantiates the relevance of teachers' gender training and the necessity to use the gender approach in their professional…
The Promise of Dynamic Systems Approaches for an Integrated Account of Human Development.
ERIC Educational Resources Information Center
Lewis, Marc D.
2000-01-01
Argues that dynamic systems approaches may provide an explanatory framework based on general scientific principles for developmental psychology, using principles of self-organization to explain how novel forms emerge without predetermination and become increasingly complex with development. Contends that self-organization provides a single…
Administrator Preparation: Looking Backwards and Forwards
ERIC Educational Resources Information Center
Bridges, Edwin
2012-01-01
Purpose: The purpose of this paper was to conduct a critical analysis of the origins and implementation of problem-based learning in educational administration as a window into the limitations of this approach and more generally administrator preparation. Design/methodology/approach: The author reviewed the published work of the originator from…
A Comparison of Two Mathematics Problem-Solving Strategies: Facilitate Algebra-Readiness
ERIC Educational Resources Information Center
Xin, Yan Ping; Zhang, Dake; Park, Joo Young; Tom, Kinsey; Whipple, Amanda; Si, Luo
2011-01-01
The authors compared a conceptual model-based problem-solving (COMPS) approach with a general heuristic instructional approach for teaching multiplication-division word-problem solving to elementary students with learning problems (LP). The results indicate that only the COMPS group significantly improved, from pretests to posttests, their…
Supervision--growing and building a sustainable general practice supervisor system.
Thomson, Jennifer S; Anderson, Katrina J; Mara, Paul R; Stevenson, Alexander D
2011-06-06
This article explores various models and ideas for future sustainable general practice vocational training supervision in Australia. The general practitioner supervisor in the clinical practice setting is currently central to training the future general practice workforce. Finding ways to recruit, retain and motivate both new and experienced GP teachers is discussed, as is the creation of career paths for such teachers. Some of the newer methods of practice-based teaching are considered for further development, including vertically integrated teaching, e-learning, wave consulting and teaching on the run, teaching teams and remote teaching. Approaches to supporting and resourcing teaching and the required infrastructure are also considered. Further research into sustaining the practice-based general practice supervision model will be required.
Learning Grasp Context Distinctions that Generalize
NASA Technical Reports Server (NTRS)
Platt, Robert; Grupen, Roderic A.; Fagg, Andrew H.
2006-01-01
Control-based approaches to grasp synthesis create grasping behavior by sequencing and combining control primitives. In the absence of any other structure, these approaches must evaluate a large number of feasible control sequences as a function of object shape, object pose, and task. This work explores a new approach to grasp synthesis that limits consideration to variations on a generalized localize-reach-grasp control policy. A new learning algorithm, known as schema structured learning, is used to learn which instantiations of the generalized policy are most likely to lead to a successful grasp in different problem contexts. Two experiments are described where Dexter, a bimanual upper torso, learns to select an appropriate grasp strategy as a function of object eccentricity and orientation. In addition, it is shown that grasp skills learned in this way can generalize to new objects. Results are presented showing that after learning how to grasp a small, representative set of objects, the robot's performance quantitatively improves for similar objects that it has not experienced before.
Efficient approach to the free energy of crystals via Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Navascués, G.; Velasco, E.
2015-08-01
We present a general approach to compute the absolute free energy of a system of particles with constrained center of mass based on the Monte Carlo thermodynamic coupling integral method. The version of the Frenkel-Ladd approach [J. Chem. Phys. 81, 3188 (1984)], 10.1063/1.448024, which uses a harmonic coupling potential, is recovered. Also, we propose a different choice, based on one-particle square-well coupling potentials, which is much simpler, more accurate, and free from some of the difficulties of the Frenkel-Ladd method. We apply our approach to hard spheres and compare with the standard harmonic method.
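Both coupling-potential variants rest on the standard thermodynamic coupling (Kirkwood) integral, which switches from a reference system with known free energy to the system of interest:

```latex
F_{1} - F_{0} \;=\; \int_{0}^{1} \Bigl\langle \frac{\partial U_{\lambda}}{\partial \lambda} \Bigr\rangle_{\lambda}\, d\lambda,
\qquad U_{\lambda} = (1-\lambda)\,U_{0} + \lambda\,U_{1},
```

so that the integrand reduces to the ensemble average of U_1 - U_0, accumulated by Monte Carlo sampling at a set of lambda values.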
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis
2013-10-01
Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common Information Systems Development practice, business modeling activities are still of a mostly empirical nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business-modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
Multidomain approach for calculating compressible flows
NASA Technical Reports Server (NTRS)
Cambier, L.; Chazzi, W.; Veuillot, J. P.; Viviand, H.
1982-01-01
A multidomain approach for calculating compressible flows using unsteady or pseudo-unsteady methods is presented. This approach is based on a general technique for connecting two domains in which hyperbolic systems (which may differ) are solved with the aid of the compatibility relations associated with these systems. Some examples of the approach's application to calculating transonic flows in ideal fluids are shown, in particular the adjustment of shock waves. The approach is then applied to a shock/boundary-layer interaction problem in a transonic channel.
Application of Complex Adaptive Systems in Portfolio Management
ERIC Educational Resources Information Center
Su, Zheyuan
2017-01-01
Simulation-based methods are becoming a promising research tool in financial markets. A general Complex Adaptive System can be tailored to different application scenarios. Based on the current research, we built two models that would benefit portfolio management by utilizing Complex Adaptive Systems (CAS) in an Agent-Based Modeling (ABM) approach.…
Classification Framework for ICT-Based Learning Technologies for Disabled People
ERIC Educational Resources Information Center
Hersh, Marion
2017-01-01
The paper presents the first systematic approach to the classification of inclusive information and communication technologies (ICT)-based learning technologies and ICT-based learning technologies for disabled people which covers both assistive and general learning technologies, is valid for all disabled people and considers the full range of…
Heddam, Salim
2014-11-01
The prediction of colored dissolved organic matter (CDOM) using artificial neural network approaches has received little attention in the past few decades. In this study, CDOM was modeled using generalized regression neural network (GRNN) and multiple linear regression (MLR) models as a function of water temperature (TE), pH, specific conductance (SC), and turbidity (TU). Evaluation of the prediction accuracy of the models is based on the root mean square error (RMSE), mean absolute error (MAE), coefficient of correlation (CC), and Willmott's index of agreement (d). The results indicated that GRNN can be applied successfully for the prediction of CDOM.
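The four evaluation criteria are standard and easy to reproduce; a short sketch with synthetic numbers, not the study's data:

```python
import numpy as np

def fit_metrics(obs, pred):
    """RMSE, MAE, correlation coefficient (CC) and Willmott's index of
    agreement (d): the four criteria used to score the GRNN and MLR models."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    err = pred - obs
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    cc = np.corrcoef(obs, pred)[0, 1]
    d = 1 - np.sum(err ** 2) / np.sum(
        (np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return rmse, mae, cc, d

obs = np.array([2.1, 3.4, 4.0, 5.2, 6.8])
pred = np.array([2.3, 3.1, 4.4, 5.0, 6.5])
print(fit_metrics(obs, pred))
```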
NASA Astrophysics Data System (ADS)
Boé, Julien; Terray, Laurent
2014-05-01
Ensemble approaches for climate change projections have become ubiquitous. Because of large model-to-model variations and a general lack of rationale for choosing one particular climate model over others, it is widely accepted that future climate change and its impacts should not be estimated based on a single climate model. Generally, as a default approach, the multi-model ensemble mean (MMEM) is considered to provide the best estimate of climate change signals. The MMEM approach is based on the implicit hypothesis that all the models provide equally credible projections of future climate change. This hypothesis is unlikely to be true, and ideally one would want to give more weight to more realistic models. A major issue with this alternative approach lies in the assessment of the relative credibility of future climate projections from different climate models, as they can only be evaluated against present-day observations: which present-day metric(s) should be used to decide which models are "good" and which models are "bad" for the future climate? Once a supposedly informative metric has been found, other issues arise. What is the best statistical method to combine multiple model results taking into account their relative credibility as measured by a given metric? How can one be sure in the end that the metric-based estimate of future climate change is not in fact less realistic than the MMEM? It is impossible to provide strict answers to those questions in the climate change context. Yet, in this presentation, we propose a methodological approach based on a perfect model framework that could bring some useful elements of answer to the questions previously mentioned. The basic idea is to take a random climate model in the ensemble and treat it as if it were the truth (the results of this model, in both past and future climate, are called "synthetic observations"). Then, all the other members of the multi-model ensemble are used to derive, through a metric-based approach, a posterior estimate of climate change based on the synthetic observation of the metric. Finally, it is possible to compare the posterior estimate to the synthetic observation of future climate change to evaluate the skill of the method. The main objective of this presentation is to describe and apply this perfect model framework to test different methodological issues associated with non-uniform model weighting and similar metric-based approaches. The methodology presented is general, but will be applied to the specific case of summer temperature change in France, for which previous works have suggested potentially useful metrics associated with soil-atmosphere and cloud-temperature interactions. The relative performance of different simple statistical approaches to combining multiple model results based on metrics will be tested. The impact of ensemble size, observational errors, internal variability, and model similarity will be characterized. The potential improvements associated with metric-based approaches compared to the MMEM in terms of errors and uncertainties will be quantified.
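A toy version of the perfect-model test described above, with a synthetic ensemble in which the present-day metric and the future change are correlated; the Gaussian weighting kernel is an arbitrary illustrative choice, not the presentation's method:

```python
import numpy as np

rng = np.random.default_rng(5)
n_models = 20
# Synthetic ensemble: each model has a present-day metric value and a future
# change; the two are correlated, which is what makes weighting worthwhile.
skill = rng.normal(size=n_models)
change = 2.0 + 0.8 * skill + 0.4 * rng.normal(size=n_models)

errors_w, errors_mmem = [], []
for truth in range(n_models):                     # each model plays "truth" once
    others = np.delete(np.arange(n_models), truth)
    d = np.abs(skill[others] - skill[truth])      # metric distance to synthetic obs
    w = np.exp(-(d / d.std()) ** 2)               # ad-hoc Gaussian weighting kernel
    post = np.average(change[others], weights=w)  # metric-weighted estimate
    errors_w.append(abs(post - change[truth]))
    errors_mmem.append(abs(change[others].mean() - change[truth]))
print(np.mean(errors_w), np.mean(errors_mmem))    # weighting beats the plain MMEM
```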
Data splitting for artificial neural networks using SOM-based stratified sampling.
May, R J; Maier, H R; Dandy, G C
2010-03-01
Data splitting is an important consideration during artificial neural network (ANN) development, where hold-out cross-validation is commonly employed to ensure generalization. Even for a moderate sample size, the sampling methodology used for data splitting can have a significant effect on the quality of the subsets used for training, testing and validating an ANN. Poor data splitting can result in inaccurate and highly variable model performance; however, the choice of sampling methodology is rarely given due consideration by ANN modellers. Increased confidence in the sampling is of paramount importance, since the hold-out sampling is generally performed only once during ANN development. This paper considers the variability in the quality of subsets obtained using different data splitting approaches. A novel approach to stratified sampling, based on Neyman sampling of the self-organizing map (SOM), is developed, with several guidelines identified for setting the SOM size and sample allocation in order to minimize the bias and variance in the datasets. Using an example ANN function approximation task, the SOM-based approach is evaluated in comparison to random sampling, DUPLEX, systematic stratified sampling, and trial-and-error sampling that minimizes the statistical differences between datasets. Of these approaches, DUPLEX is found to provide benchmark performance, yielding good models with no variability. The results show that the SOM-based approach also reliably generates high-quality samples and can therefore be used with greater confidence than other approaches, especially in the case of non-uniform datasets, with the benefit of scalability for performing data splitting on large datasets. Copyright 2009 Elsevier Ltd. All rights reserved.
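A minimal sketch of stratified splitting in the spirit of the paper, with scikit-learn's KMeans standing in for the SOM (an assumption; the paper clusters with a self-organizing map). Samples are allocated to strata by Neyman allocation, i.e., proportionally to stratum size times within-stratum standard deviation of the target.

```python
import numpy as np
from sklearn.cluster import KMeans

def neyman_split(X, y, n_sample=30, n_strata=5, seed=0):
    """Return indices of a hold-out sample drawn by stratified Neyman allocation."""
    rng = np.random.default_rng(seed)
    labels = KMeans(n_clusters=n_strata, n_init=10, random_state=seed).fit_predict(X)
    sizes = np.array([(labels == h).sum() for h in range(n_strata)])
    stds = np.array([y[labels == h].std() + 1e-12 for h in range(n_strata)])
    # Neyman allocation: n_h proportional to N_h * s_h (at least 1 per stratum).
    alloc = np.maximum(1, np.round(n_sample * sizes * stds / np.sum(sizes * stds)).astype(int))
    idx = []
    for h in range(n_strata):
        members = np.flatnonzero(labels == h)
        idx.extend(rng.choice(members, size=min(alloc[h], members.size), replace=False))
    return np.array(idx)

X = np.random.default_rng(2).normal(size=(200, 3))
y = X[:, 0] ** 2 + X[:, 1]
test_idx = neyman_split(X, y)
print(test_idx.size, "points allocated to the hold-out set")
```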
NASA Astrophysics Data System (ADS)
Delgado, Francisco
2017-12-01
Quantum information is an emergent area merging physics, mathematics, computer science and engineering. To reach its technological goals, it requires adequate approaches for understanding how to combine physical restrictions, computational approaches and technological requirements into functional universal quantum information processing. This work presents the modeling and analysis of a certain general type of Hamiltonian representing several physical systems used in quantum information, establishing a dynamics reduction into a natural grammar for bipartite processing based on entangled states.
Cobalt: A GPU-based correlator and beamformer for LOFAR
NASA Astrophysics Data System (ADS)
Broekema, P. Chris; Mol, J. Jan David; Nijboer, R.; van Amesfoort, A. S.; Brentjens, M. A.; Loose, G. Marcel; Klijn, W. F. A.; Romein, J. W.
2018-04-01
For low-frequency radio astronomy, software correlation and beamforming on general purpose hardware is a viable alternative to custom designed hardware. LOFAR, a new-generation radio telescope centered in the Netherlands with international stations in Germany, France, Ireland, Poland, Sweden and the UK, has successfully used software real-time processors based on IBM Blue Gene technology since 2004. Since then, developments in technology have allowed us to build a system based on commercial off-the-shelf components that combines the same capabilities with lower operational cost. In this paper, we describe the design and implementation of a GPU-based correlator and beamformer with the same capabilities as the Blue Gene based systems. We focus on the design approach taken, and show the challenges faced in selecting an appropriate system. The design, implementation and verification of the software system show the value of a modern test-driven development approach. Operational experience, based on three years of operations, demonstrates that a general purpose system is a good alternative to the previous supercomputer-based system or custom-designed hardware.
Scaling the Fractal Plain: Towards a General View of Knowledge Management
ERIC Educational Resources Information Center
Griffiths, David; Evans, Peter
2011-01-01
Purpose: The purpose of the paper is to explore coherence across key disciplines of knowledge management (KM) for a general model as a way to address performance dissatisfaction in the field. Design/methodology/approach: Research employed an evidence-based meta-analysis (287 aspects of literature), triangulated through an exploratory survey (91…
General Training System; GENTRAS. Final Report.
ERIC Educational Resources Information Center
International Business Machines Corp., Gaithersburg, MD. Federal Systems Div.
GENTRAS (General Training System) is a computer-based training model for the Marine Corps which makes use of a systems approach. The model defines the skill levels applicable for career growth and classifies and defines the training needed for this growth. It also provides a training cost subsystem which will provide a more efficient means of…
Improving Reading and Social Studies Learning for Secondary Students with Reading Disabilities
ERIC Educational Resources Information Center
Capin, Philip; Vaughn, Sharon
2017-01-01
This article describes evidence-based practices that beginning special education teachers can readily implement in special or general education settings that promote reading and content outcomes for students with disabilities as well as general education students. We describe two approaches: (a) Promoting Adolescents' Comprehension of Text (PACT),…
Physical Activity Promotion in General Practices of Barcelona: A Case Study
ERIC Educational Resources Information Center
Puig Ribera, Anna; McKenna, Jim; Riddoch, Chris
2006-01-01
This case study aimed to generate explanations for the lack of integration of physical activity (PA) promotion in general practices of Barcelona, the capital of Catalonia. This explanatory study adopted a qualitative approach, based on three techniques; focus groups (n = 3), semi-structured (n = 25) and short individual interviews (n = 5). These…
USDA-ARS?s Scientific Manuscript database
A general method was developed for the quantification of hydroxycinnamic acid derivatives and flavones, flavonols, and their glycosides based on the UV molar relative response factors (MRRF) of the standards. Each of these phenolic compounds contains a cinnamoyl structure and has a maximum absorban...
ERIC Educational Resources Information Center
Sugiyama, Keimei; Cavanagh, Kevin V.; van Esch, Chantal; Bilimoria, Diana; Brown, Cara
2016-01-01
Trends in extant literature suggest that more relational and identity-based leadership approaches are necessary for leadership that can harness the benefits of the diverse and globalized workforces of today and the future. In this study, we compared general leadership development programs (GLDPs) and women's leadership development programs (WLDPs)…
ERIC Educational Resources Information Center
George, Linda A.; Brenner, Johanna
2010-01-01
The authors have developed and implemented a novel general education science course that examines scientific knowledge, laboratory experimentation, and science-related public policy through the lens of feminist science studies. They argue that this approach to teaching general science education is useful for improving science literacy. Goals for…
ERIC Educational Resources Information Center
Bouck, Emily C.; Satsangi, Rajiv; Doughty, Teresa Taber; Courtney, William T.
2014-01-01
Students with autism spectrum disorder (ASD) are included in general education classes and expected to participate in general education content, such as mathematics. Yet, little research explores academically-based mathematics instruction for this population. This single subject alternating treatment design study explored the effectiveness of…
Acquiring Software Design Schemas: A Machine Learning Perspective
NASA Technical Reports Server (NTRS)
Harandi, Mehdi T.; Lee, Hing-Yan
1991-01-01
In this paper, we describe an approach based on machine learning that acquires software design schemas from design cases of existing applications. An overview of the technique, design representation, and acquisition system is presented. The paper also addresses issues, such as biases, associated with generalizing common features. The generalization process is illustrated using an example.
A new method for calculating differential distributions directly in Mellin space
NASA Astrophysics Data System (ADS)
Mitov, Alexander
2006-12-01
We present a new method for the calculation of differential distributions directly in Mellin space without recourse to the usual momentum-fraction (or z-) space. The method is completely general and can be applied to any process. It is based on solving the integration-by-parts identities when one of the powers of the propagators is an abstract number. The method retains the full dependence on the Mellin variable and can be implemented in any program for solving the IBP identities based on algebraic elimination, like Laporta. General features of the method are: (1) faster reduction, (2) smaller number of master integrals compared to the usual z-space approach and (3) the master integrals satisfy difference instead of differential equations. This approach generalizes previous results related to fully inclusive observables like the recently calculated three-loop space-like anomalous dimensions and coefficient functions in inclusive DIS to more general processes requiring separate treatment of the various physical cuts. Many possible applications of this method exist, the most notable being the direct evaluation of the three-loop time-like splitting functions in QCD.
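For reference, the Mellin moments that the method computes directly are the standard transform of a z-space distribution; the definitions below are textbook material rather than results of the paper.

```latex
% Mellin transform of a momentum-fraction distribution f(z), and its inversion.
\[
  F(N) = \int_0^1 \mathrm{d}z\, z^{\,N-1} f(z),
  \qquad
  f(z) = \frac{1}{2\pi i}\int_{c-i\infty}^{c+i\infty} \mathrm{d}N\, z^{-N} F(N).
\]
```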
A new approach for describing glass transition kinetics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vasin, N. M.; Shchelkachev, M. G.; Vinokur, V. M.
2010-04-01
We use a functional integral technique generalizing the Keldysh diagram technique to describe glass transition kinetics. We show that the Keldysh functional approach takes the dynamical determinant arising in the glass dynamics into account exactly and generalizes the traditional approach based on using the supersymmetric dynamic generating functional method. In contrast to the supersymmetric method, this approach allows avoiding additional Grassmannian fields and tracking the violation of the fluctuation-dissipation theorem explicitly. We use this method to describe the dynamics of an Edwards-Anderson soft spin-glass-type model near the paramagnet-glass transition. We show that a Vogel-Fulcher-type dynamics arises in the fluctuation region only if the fluctuation-dissipation theorem is violated in the process of dynamical renormalization of the Keldysh action in the replica space.
Knowledge-based verification of clinical guidelines by detection of anomalies.
Duftschmid, G; Miksch, S
2001-04-01
As shown in numerous studies, a significant part of published clinical guidelines is tainted with different types of semantic errors that interfere with their practical application. The adaptation of generic guidelines, necessitated by circumstances such as resource limitations within the applying organization or unexpected events arising in the course of patient care, further promotes the introduction of defects. Still, most current approaches for the automation of clinical guidelines lack mechanisms that check the overall correctness of their output. In the domain of software engineering in general, and in the domain of knowledge-based systems (KBS) in particular, a common strategy for examining a system for potential defects consists in its verification. The focus of this work is to present an approach that helps to ensure the semantic correctness of clinical guidelines in a three-step process. We use a particular guideline specification language called Asbru to demonstrate our verification mechanism. A scenario-based evaluation of our method is provided, based on a guideline for the artificial ventilation of newborn infants. The described approach is kept sufficiently general to allow its application to several other guideline representation formats.
Quasi-classical approaches to vibronic spectra revisited
NASA Astrophysics Data System (ADS)
Karsten, Sven; Ivanov, Sergei D.; Bokarev, Sergey I.; Kühn, Oliver
2018-03-01
The framework to approach quasi-classical dynamics in the electronic ground state is well established and is based on the Kubo-transformed time correlation function (TCF), being the most classical-like quantum TCF. Here we discuss whether the choice of the Kubo-transformed TCF as a starting point for simulating vibronic spectra is as unambiguous as it is for vibrational ones. Employing imaginary-time path integral techniques in combination with the interaction representation allowed us to formulate a method for simulating vibronic spectra in the adiabatic regime that takes nuclear quantum effects and dynamics on multiple potential energy surfaces into account. Further, a generalized quantum TCF is proposed that contains many well-established TCFs, including the Kubo one, as particular cases. Importantly, it also provides a framework to construct new quantum TCFs. Applying the developed methodology to the generalized TCF leads to a plethora of simulation protocols, which are based on the well-known TCFs as well as on new ones. Their performance is investigated on 1D anharmonic model systems at finite temperatures. It is shown that the protocols based on the new TCFs may lead to superior results with respect to those based on the common ones. The strategies to find the optimal approach are discussed.
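For context, the Kubo-transformed TCF referred to above has the standard definition below, with β = 1/k_BT and Z the canonical partition function; this is textbook notation, not a result of the paper.

```latex
% Kubo-transformed quantum time correlation function of operators A and B.
\[
  C^{\mathrm{Kubo}}_{AB}(t)
  = \frac{1}{\beta Z}\int_0^{\beta}\mathrm{d}\lambda\,
    \mathrm{Tr}\!\left[e^{-(\beta-\lambda)\hat H}\,\hat A\,
      e^{-\lambda\hat H}\,\hat B(t)\right],
  \qquad
  \hat B(t) = e^{\,i\hat H t/\hbar}\,\hat B\,e^{-i\hat H t/\hbar}.
\]
```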
Bourjaily, Jacob L.; Herrmann, Enrico; Trnka, Jaroslav
2017-06-12
We introduce a prescriptive approach to generalized unitarity, resulting in a strictly-diagonal basis of loop integrands with coefficients given by specifically-tailored residues in field theory. We illustrate the power of this strategy in the case of planar, maximally supersymmetric Yang-Mills theory (SYM), where we construct closed-form representations of all (n-point N^k MHV) scattering amplitudes through three loops. The prescriptive approach contrasts with the ordinary description of unitarity-based methods by avoiding any need for linear algebra to determine integrand coefficients. We describe this approach in general terms as it should have applications to many quantum field theories, including those without planarity, supersymmetry, or massless spectra defined in any number of dimensions.
High Order Modulation Protograph Codes
NASA Technical Reports Server (NTRS)
Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)
2014-01-01
Digital communication coding methods are presented for designing protograph-based bit-interleaved coded modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two-stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.
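The second (circulant) lifting stage can be sketched in a few lines: each nonzero entry of a base protograph matrix is replaced by a cyclically shifted Z x Z identity block. The base matrix and shift values below are illustrative assumptions, not the patented code designs.

```python
import numpy as np

def circulant_lift(base, shifts, Z):
    """Expand each nonzero protograph entry into a Z x Z shifted identity block."""
    m, n = base.shape
    H = np.zeros((m * Z, n * Z), dtype=int)
    I = np.eye(Z, dtype=int)
    for i in range(m):
        for j in range(n):
            if base[i, j]:
                H[i*Z:(i+1)*Z, j*Z:(j+1)*Z] = np.roll(I, shifts[i, j], axis=1)
    return H

base = np.array([[1, 1, 1, 0],
                 [1, 0, 1, 1]])    # toy protograph (2 checks, 4 variable nodes)
shifts = np.array([[0, 1, 2, 0],
                   [3, 0, 1, 2]])  # illustrative circulant shifts
H = circulant_lift(base, shifts, Z=4)
print(H.shape)  # (8, 16): parity-check matrix of the lifted code
```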
Truccolo, Wilson
2017-01-01
This review presents a perspective on capturing collective dynamics in recorded neuronal ensembles based on multivariate point process models, inference of low-dimensional dynamics and coarse graining of spatiotemporal measurements. A general probabilistic framework for continuous time point processes is reviewed, with an emphasis on multivariate nonlinear Hawkes processes with exogenous inputs. A point process generalized linear model (PP-GLM) framework for the estimation of discrete time multivariate nonlinear Hawkes processes is described. The approach is illustrated with the modeling of collective dynamics in neocortical neuronal ensembles recorded in human and non-human primates, and prediction of single-neuron spiking. A complementary approach to capture collective dynamics based on low-dimensional dynamics ("order parameters") inferred via latent state-space models with point process observations is presented. The approach is illustrated by inferring and decoding low-dimensional dynamics in primate motor cortex during naturalistic reach and grasp movements. Finally, we briefly review hypothesis tests based on conditional inference and spatiotemporal coarse graining for assessing collective dynamics in recorded neuronal ensembles. PMID:28336305
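A hedged sketch of the discrete-time PP-GLM / nonlinear Hawkes intensity described above: the log-intensity of each neuron is a baseline plus a weighted sum of the recent spiking history of all neurons. The parameters are randomly drawn for illustration; in practice the filters are estimated by maximizing the point-process likelihood.

```python
import numpy as np

rng = np.random.default_rng(3)
n_neurons, n_lags, T = 5, 10, 1000
mu = rng.normal(-3.0, 0.2, n_neurons)                      # baseline log-rates
W = rng.normal(0.0, 0.1, (n_neurons, n_neurons, n_lags))   # coupling filters

spikes = np.zeros((n_neurons, T), dtype=int)
for t in range(n_lags, T):
    hist = spikes[:, t - n_lags:t][:, ::-1]                # recent history, newest first
    drive = np.einsum('ijk,jk->i', W, hist)                # summed filtered input
    lam = np.exp(mu + drive)                               # conditional intensity per bin
    spikes[:, t] = rng.random(n_neurons) < 1.0 - np.exp(-lam)  # Bernoulli draw per bin
print("mean spikes per bin:", spikes.mean())
```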
Bayesian networks improve causal environmental assessments for evidence-based policy
Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the p...
Lim, Seong-Rin; Lam, Carl W; Schoenung, Julie M
2011-09-01
Life Cycle Impact Assessment (LCIA) and Risk Assessment (RA) employ different approaches to evaluate toxic impact potential for their own general applications. LCIA is often used to evaluate toxicity potentials for corporate environmental management and RA is often used to evaluate a risk score for environmental policy in government. This study evaluates the cancer, non-cancer, and ecotoxicity potentials and risk scores of chemicals and industry sectors in the United States on the basis of the LCIA- and RA-based tools developed by U.S. EPA, and compares the priority screening of toxic chemicals and industry sectors identified with each method to examine whether the LCIA- and RA-based results lead to the same prioritization schemes. The Tool for the Reduction and Assessment of Chemical and other environmental Impacts (TRACI) is applied as an LCIA-based screening approach with a focus on air and water emissions, and the Risk-Screening Environmental Indicator (RSEI) is applied in equivalent fashion as an RA-based screening approach. The U.S. Toxic Release Inventory is used as the dataset for this analysis, because of its general applicability to a comprehensive list of chemical substances and industry sectors. Overall, the TRACI and RSEI results do not agree with each other in part due to the unavailability of characterization factors and toxic scores for select substances, but primarily because of their different evaluation approaches. Therefore, TRACI and RSEI should be used together both to support a more comprehensive and robust approach to screening of chemicals for environmental management and policy and to highlight substances that are found to be of concern from both perspectives. Copyright © 2011 Elsevier Ltd. All rights reserved.
An evaluation of risk estimation procedures for mixtures of carcinogens
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hwang, J.S.; Chen, J.J.
1999-12-01
The estimation of health risks from exposure to a mixture of chemical carcinogens is generally based on combining information from several available single-compound studies. The current practice of directly summing the upper-bound risk estimates of individual carcinogenic components as an upper bound on the total risk of a mixture is known to be generally too conservative. Gaylor and Chen (1996, Risk Analysis) proposed a simple procedure to compute an upper bound on the total risk using only the upper confidence limits and central risk estimates of individual carcinogens. The Gaylor-Chen procedure was derived based on an underlying assumption of normality for the distributions of individual risk estimates. In this paper the authors evaluated the Gaylor-Chen approach in terms of the coverage of the upper confidence limits on the true risks of individual carcinogens. In general, if the coverage probabilities for the individual carcinogens are all approximately equal to the nominal level, then the Gaylor-Chen approach should perform well. However, the Gaylor-Chen approach can be conservative or anti-conservative if some or all of the individual upper confidence limit estimates are conservative or anti-conservative.
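A minimal sketch of the Gaylor-Chen combination referred to above, under its normality assumption: sum the central risk estimates and add the one-sided confidence half-widths combined in quadrature. The numbers are illustrative.

```python
import math

def gaylor_chen_upper_bound(central, upper):
    """central, upper: per-chemical central risk estimates and upper confidence limits."""
    total_central = sum(central)
    half_widths = [u - c for u, c in zip(upper, central)]
    # Combine half-widths in quadrature (variances add for independent normals).
    return total_central + math.sqrt(sum(h * h for h in half_widths))

central = [1e-6, 2e-6, 5e-7]   # illustrative individual risk estimates
upper   = [4e-6, 6e-6, 2e-6]   # illustrative upper confidence limits
print(gaylor_chen_upper_bound(central, upper))  # less conservative than sum(upper) = 1.2e-5
```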
Campbell, D A; Chkrebtii, O
2013-12-01
Statistical inference for biochemical models often faces a variety of characteristic challenges. In this paper we examine state and parameter estimation for the JAK-STAT intracellular signalling mechanism, which exemplifies the implementation intricacies common in many biochemical inference problems. We introduce an extension to the Generalized Smoothing approach for estimating delay differential equation models, addressing selection of complexity parameters, choice of the basis system, and appropriate optimization strategies. Motivated by the JAK-STAT system, we further extend the generalized smoothing approach to consider a nonlinear observation process with additional unknown parameters, and highlight how the approach handles unobserved states and unevenly spaced observations. The methodology developed is generally applicable to problems of estimation for differential equation models with delays, unobserved states, nonlinear observation processes, and partially observed histories. Crown Copyright © 2013. Published by Elsevier Inc. All rights reserved.
Computational Approaches to the Chemical Equilibrium Constant in Protein-ligand Binding.
Montalvo-Acosta, Joel José; Cecchini, Marco
2016-12-01
The physiological role played by protein-ligand recognition has motivated the development of several computational approaches to the ligand binding affinity. Some of them, termed rigorous, have a strong theoretical foundation but involve too much computation to be generally useful. Some others alleviate the computational burden by introducing strong approximations and/or empirical calibrations, which also limit their general use. Most importantly, there is no straightforward correlation between the predictive power and the level of approximation introduced. Here, we present a general framework for the quantitative interpretation of protein-ligand binding based on statistical mechanics. Within this framework, we re-derive self-consistently the fundamental equations of some popular approaches to the binding constant and pinpoint the inherent approximations. Our analysis represents a first step towards the development of variants with optimum accuracy/efficiency ratio for each stage of the drug discovery pipeline. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Kiani, M.; Abdolali, A.; Safari, M.
2018-03-01
In this article, an analytical approach is presented for the analysis of electromagnetic (EM) scattering from radially inhomogeneous spherical structures (RISSs) based on the duality principle. According to the spherical symmetry, similar angular dependencies in all the regions are considered using spherical harmonics. To extract the radial dependency, the system of differential equations of wave propagation toward the inhomogeneity direction is equated with the dual planar ones. A general duality between the electromagnetic fields and parameters and the scattering parameters of the two structures is introduced. The validity of the proposed approach is verified through a comprehensive example. The presented approach substitutes a complicated problem in spherical coordinates with an easy, well-posed, and previously solved problem in planar geometry. This approach is valid for all continuously varying inhomogeneity profiles. One of the major advantages of the proposed method is the capability of studying two general and applicable types of RISSs. As an interesting application, a class of lens antenna based on the physical concept of gradient refractive index materials is introduced. The approach is used to analyze the EM scattering from the structure and validate the strong performance of the lens.
Dutta, Shuchismita; Zardecki, Christine; Goodsell, David S; Berman, Helen M
2010-10-01
The Research Collaboratory for Structural Bioinformatics Protein Data Bank (RCSB PDB) supports scientific research and education worldwide by providing an essential resource of information on biomolecular structures. In addition to serving as a deposition, data-processing and distribution center for PDB data, the RCSB PDB offers resources and online materials that different audiences can use to customize their structural biology instruction. These include resources for general audiences that present macromolecular structure in the context of a biological theme, method-based materials for researchers who take a more traditional approach to the presentation of structural science, and materials that mix theme-based and method-based approaches for educators and students. Through these efforts the RCSB PDB aims to enable optimal use of structural data by researchers, educators and students designing and understanding experiments in biology, chemistry and medicine, and by general users making informed decisions about their life and health.
Discrete Adjoint-Based Design for Unsteady Turbulent Flows On Dynamic Overset Unstructured Grids
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.; Diskin, Boris
2012-01-01
A discrete adjoint-based design methodology for unsteady turbulent flows on three-dimensional dynamic overset unstructured grids is formulated, implemented, and verified. The methodology supports both compressible and incompressible flows and is amenable to massively parallel computing environments. The approach provides a general framework for performing highly efficient and discretely consistent sensitivity analysis for problems involving arbitrary combinations of overset unstructured grids which may be static, undergoing rigid or deforming motions, or any combination thereof. General parent-child motions are also accommodated, and the accuracy of the implementation is established using an independent verification based on a complex-variable approach. The methodology is used to demonstrate aerodynamic optimizations of a wind turbine geometry, a biologically-inspired flapping wing, and a complex helicopter configuration subject to trimming constraints. The objective function for each problem is successfully reduced and all specified constraints are satisfied.
Real-time traffic sign recognition based on a general purpose GPU and deep-learning
Hong, Yongwon; Choi, Yeongwoo; Byun, Hyeran
2017-01-01
We present a General Purpose Graphics Processing Unit (GPGPU) based real-time traffic sign detection and recognition method that is robust against illumination changes. There have been many approaches to traffic sign recognition in various research fields; however, previous approaches faced several limitations under low illumination or wide variance of light conditions. To overcome these drawbacks and improve processing speeds, we propose a method that 1) is robust against illumination changes, 2) uses GPGPU-based real-time traffic sign detection, and 3) performs region detection and recognition using a hierarchical model. This method produces stable results in low-illumination environments. Both detection and hierarchical recognition are performed in real-time, and the proposed method achieves a 0.97 F1-score on our collective dataset, which uses the Vienna convention traffic rules (Germany and South Korea). PMID:28264011
Toyota Prius HEV neurocontrol and diagnostics.
Prokhorov, Danil V
2008-01-01
A neural network controller for improved fuel efficiency of the Toyota Prius hybrid electric vehicle is proposed. A new method to detect and mitigate a battery fault is also presented. The approach is based on recurrent neural networks and includes the extended Kalman filter. The proposed approach is quite general and applicable to other control systems.
Revisiting First Language Acquisition through Empirical and Rational Perspectives
ERIC Educational Resources Information Center
Tahriri, Abdorreza
2012-01-01
Acquisition in general, and first language acquisition in particular, is a very complex and multifaceted phenomenon. The way that children acquire a language in a very limited period is astonishing. Various approaches have been proposed so far to account for this extraordinary phenomenon. These approaches are indeed based on various philosophical…
Learning English with an Invisible Teacher: An Experimental Video Approach.
ERIC Educational Resources Information Center
Eisenstein, Miriam; And Others
1987-01-01
Reports on an experimental teaching approach, based on an innovative video series, used in an English-as-a-second-language (ESL) class for beginning learners. The tapes, which focused on students as they learned (with the viewers learning along with them), showed generally favorable results for ESL students. (Author/CB)
A Feminist Theory of Psychotherapy Based on Authenticity.
ERIC Educational Resources Information Center
Brody, Claire M.
In a "direct" approach to psychotherapy, the therapist generally uses herself as a model and communicates her own values, thereby influencing the gender roles of her clients, particularly her female clients. In this approach, the therapist is seen as more authentic by the client, especially by clients from diverse cultural and social backgrounds.…
ERIC Educational Resources Information Center
Seth, Anupam
2009-01-01
Production planning and scheduling for printed circuit board assembly has so far defied standard operations research approaches due to the size and complexity of the underlying problems, resulting in unexploited automation flexibility. In this thesis, the increasingly popular collect-and-place machine configuration is studied and the assembly…
Weighted Least Squares Fitting Using Ordinary Least Squares Algorithms.
ERIC Educational Resources Information Center
Kiers, Henk A. L.
1997-01-01
A general approach for fitting a model to a data matrix by weighted least squares (WLS) is studied. The approach consists of iteratively performing steps of existing algorithms for ordinary least squares fitting of the same model and is based on maximizing a function that majorizes the WLS loss function. (Author/SLD)
ERIC Educational Resources Information Center
Cox, Pamela L.; Bobrowski, Paula E.; Spector, Margaret
2004-01-01
Reforms in higher education are appearing in the new guidelines that are being developed for general education curriculums across the country. Constituents leading education reform have suggested that writing be integrated across the curriculum and embedded within several discipline-based courses. Advocates of this approach require that schools…
Teaching and Learning Cycles in a Constructivist Approach to Instruction
ERIC Educational Resources Information Center
Singer, Florence Mihaela; Moscovici, Hedy
2008-01-01
This study attempts to analyze and synthesize the knowledge collected in the area of conceptual models used in teaching and learning during inquiry-based projects, and to propose a new frame for organizing the classroom interactions within a constructivist approach. The IMSTRA model consists of three general phases: Immersion, Structuring,…
NASA Astrophysics Data System (ADS)
Divine, D. V.; Godtliebsen, F.; Rue, H.
2012-01-01
The paper proposes an approach to the assessment of timescale errors in proxy-based series with chronological uncertainties. The method relies on approximating the physical process(es) forming a proxy archive by a random Gamma process. Parameters of the process are partly data-driven and partly determined from prior assumptions. For the particular case of a linear accumulation model and absolutely dated tie points, an analytical solution is found, suggesting a Beta-distributed probability density for age estimates along the length of a proxy archive. In the general situation of uncertainties in the ages of the tie points, the proposed method employs MCMC simulations of age-depth profiles, yielding empirical confidence intervals on the constructed piecewise-linear best-guess timescale. It is suggested that the approach can be further extended to the more general case of a time-varying expected accumulation between the tie points. The approach is illustrated using two ice and two lake/marine sediment cores representing typical examples of paleoproxy archives with age models based on tie points of mixed origin.
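A hedged sketch of the Gamma-process idea above for a linear accumulation model: simulate gamma-distributed accumulation increments, condition each path on two absolutely dated tie points by rescaling, and read empirical confidence intervals off the ensemble. The shape/scale parameters and sizes are illustrative assumptions, not the paper's data-driven choices.

```python
import numpy as np

rng = np.random.default_rng(4)
n_layers, n_sims = 100, 2000
age_top, age_bottom = 0.0, 1000.0   # known tie-point ages (years)

# Gamma-distributed accumulation increments per layer, one path per simulation.
increments = rng.gamma(shape=2.0, scale=1.0, size=(n_sims, n_layers))
ages = np.cumsum(increments, axis=1)
# Condition each simulated path on the two tie points by linear rescaling.
ages = age_top + (age_bottom - age_top) * ages / ages[:, -1:]

mid = n_layers // 2
lo, hi = np.percentile(ages[:, mid], [2.5, 97.5])
print(f"95% CI for the age at mid-depth: [{lo:.0f}, {hi:.0f}] years")
```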
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beau, Mathieu, E-mail: mbeau@stp.dias.ie; Savoie, Baptiste, E-mail: baptiste.savoie@gmail.com
2014-05-15
In this paper, we rigorously investigate the reduced density matrix (RDM) associated to the ideal Bose gas in harmonic traps. We present a method based on a sum-decomposition of the RDM allowing to treat not only the isotropic trap, but also general anisotropic traps. When focusing on the isotropic trap, the method is analogous to the loop-gas approach developed by Mullin ["The loop-gas approach to Bose-Einstein condensation for trapped particles," Am. J. Phys. 68(2), 120 (2000)]. Turning to the case of anisotropic traps, we examine the RDM for some anisotropic trap models corresponding to some quasi-1D and quasi-2D regimes. For such models, we bring out an additional contribution in the local density of particles which arises from the mesoscopic loops. The close connection with the occurrence of generalized Bose-Einstein condensation is discussed. Our loop-gas-like approach provides relevant information which can help guide numerical investigations on highly anisotropic systems based on the Path Integral Monte Carlo method.
Numerically pricing American options under the generalized mixed fractional Brownian motion model
NASA Astrophysics Data System (ADS)
Chen, Wenting; Yan, Bowen; Lian, Guanghua; Zhang, Ying
2016-06-01
In this paper, we introduce a robust numerical method, based on the upwind scheme, for the pricing of American puts under the generalized mixed fractional Brownian motion (GMFBM) model. By using portfolio analysis and applying the Wick-Itô formula, a partial differential equation (PDE) governing the prices of vanilla options under the GMFBM is successfully derived for the first time. Based on this, we formulate the pricing of American puts under the current model as a linear complementarity problem (LCP). Unlike the classical Black-Scholes (B-S) model or the generalized B-S model discussed in Cen and Le (2011), the newly obtained LCP under the GMFBM model is difficult to solve accurately because of the numerical instability resulting from the degeneration of the governing PDE as time approaches zero. To overcome this difficulty, a numerical approach based on the upwind scheme is adopted. It is shown that the coefficient matrix of the current method is an M-matrix, which ensures its stability in the maximum-norm sense. Remarkably, we have managed to provide a sharp theoretical error estimate for the current method, which is further verified numerically. The results of various numerical experiments also suggest that this new approach is quite accurate, and can be easily extended to price other types of financial derivatives with an American-style exercise feature under the GMFBM model.
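A hedged sketch of the numerical strategy described above, with classical Black-Scholes dynamics standing in for the GMFBM model (an assumption; the paper's PDE differs): the American put LCP is discretized implicitly with upwinding on the convection term, which keeps the system matrix an M-matrix, and solved by projected SOR at each time step. All grid sizes and parameters are illustrative.

```python
import numpy as np

r, sigma, K, T = 0.05, 0.3, 1.0, 1.0
M, N, Smax = 100, 100, 3.0
dS, dt = Smax / M, T / N
S = np.linspace(0.0, Smax, M + 1)
payoff = np.maximum(K - S, 0.0)
V = payoff.copy()

i = np.arange(1, M)                           # interior nodes
diff = 0.5 * sigma**2 * S[i]**2 / dS**2
conv = r * S[i] / dS                          # drift > 0: upwind with forward difference
a = -dt * diff                                # sub-diagonal
b = 1.0 + dt * (2.0 * diff + conv + r)        # diagonal (dominant -> M-matrix)
c = -dt * (diff + conv)                       # super-diagonal

for _ in range(N):                            # march backward in time
    rhs = V[i].copy()
    x = V[i].copy()
    for _ in range(50):                       # projected SOR sweeps
        for k in range(M - 1):
            lo = x[k - 1] if k > 0 else V[0]
            hi = x[k + 1] if k < M - 2 else V[M]
            gs = (rhs[k] - a[k] * lo - c[k] * hi) / b[k]
            x[k] = max(payoff[1 + k], x[k] + 1.5 * (gs - x[k]))  # project onto payoff
    V[i] = x
    V[0], V[M] = K, 0.0                       # boundary conditions for a put

print("American put price at S=K:", np.interp(K, S, V))
```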
NASA Astrophysics Data System (ADS)
Boyer, Frédéric; Porez, Mathieu; Morsli, Ferhat; Morel, Yannick
2017-08-01
In animal locomotion, whether in fish or flying insects, the use of flexible terminal organs or appendages greatly improves the performance of locomotion (thrust and lift). In this article, we propose a general unified framework for modeling and simulating the (bio-inspired) locomotion of robots using soft organs. The proposed approach is based on the model of Mobile Multibody Systems (MMS). The distributed flexibilities are modeled according to two major approaches: the Floating Frame Approach (FFA) and the Geometrically Exact Approach (GEA). Encompassing these two approaches in the Newton-Euler modeling formalism of robotics, this article proposes a single modeling framework suited to the fast numerical integration of the dynamics of an MMS in both the FFA and the GEA. This general framework is applied to two illustrative examples drawn from bio-inspired locomotion: passive swimming in a von Kármán vortex street, and hovering flight with flexible flapping wings.
Efficient Proximity Computation Techniques Using ZIP Code Data for Smart Cities †
Murdani, Muhammad Harist; Hong, Bonghee
2018-01-01
In this paper, we are interested in computing ZIP code proximity from two perspectives: proximity between two ZIP codes (Ad-Hoc) and neighborhood proximity (Top-K). Such a computation can be used for ZIP code-based target marketing as one of the smart city applications. A naïve approach to this computation uses the distance between ZIP codes. We redefine a distance metric combining the centroid distance with the intersecting road network between ZIP codes by using a weighted sum method. Furthermore, we prove that the results of our combined approach conform to the characteristics of distance measurement. We have proposed a general and a heuristic approach for computing Ad-Hoc proximity, while for computing Top-K proximity, we have proposed a general approach only. Our experimental results indicate that our approaches are verifiable and effective in reducing the execution time and search space. PMID:29587366
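A minimal sketch of the combined metric idea above: a weighted sum of a normalized centroid distance and a road-connectivity term, used for both ad-hoc and top-k queries. The weight alpha, the normalization constants, and the road score are illustrative assumptions, not the paper's calibrated metric.

```python
import math

def zip_proximity(centroid_a, centroid_b, shared_road_km,
                  alpha=0.7, d_max=50.0, road_max=20.0):
    """Smaller value = closer. alpha trades off geometry vs. road connectivity."""
    d = math.dist(centroid_a, centroid_b) / d_max           # normalized centroid distance
    road = 1.0 - min(shared_road_km, road_max) / road_max   # more shared road => closer
    return alpha * d + (1.0 - alpha) * road

# Ad-hoc proximity between two ZIP codes (toy coordinates in km):
print(zip_proximity((0.0, 0.0), (12.0, 5.0), shared_road_km=8.0))

# Top-K neighborhood proximity: rank candidate ZIP codes by the metric.
candidates = {"98101": ((3.0, 4.0), 12.0), "98102": ((20.0, 1.0), 0.5)}
ranked = sorted(candidates, key=lambda z: zip_proximity((0.0, 0.0), *candidates[z]))
print(ranked[:1])
```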
Progressive fracture of polymer matrix composite structures: A new approach
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Minnetyan, L.
1992-01-01
A new approach, independent of stress intensity factors and fracture toughness parameters, has been developed and is described for the computational simulation of progressive fracture of polymer matrix composite structures. The damage stages are quantified based on physics via composite mechanics, while the degradation of the structural behavior is quantified via the finite element method. The approach accounts for all types of composite behavior, structures, load conditions, and fracture processes, starting from damage initiation, to unstable propagation, to global structural collapse. Results of structural fracture in composite beams, panels, plates, and shells are presented to demonstrate the effectiveness and versatility of this new approach. Parameters and guidelines are identified which can be used as criteria for structural fracture, inspection intervals, and retirement for cause. Generalization to structures made of monolithic metallic materials is outlined, and lessons learned in undertaking the development of new approaches in general are summarized.
The Sociological Imagination and Community-Based Learning: Using an Asset-Based Approach
ERIC Educational Resources Information Center
Garoutte, Lisa
2018-01-01
Fostering a sociological imagination in students is a central goal for most introductory sociology courses and sociology departments generally, yet success is difficult to achieve. This project suggests that elements of asset-based community development can be used in sociology classrooms to develop a sociological perspective. After…
Version Control in Project-Based Learning
ERIC Educational Resources Information Center
Milentijevic, Ivan; Ciric, Vladimir; Vojinovic, Oliver
2008-01-01
This paper deals with the development of a generalized model for version control systems application as a support in a range of project-based learning methods. The model is given as UML sequence diagram and described in detail. The proposed model encompasses a wide range of different project-based learning approaches by assigning a supervisory…
Concept Cartoons Supported Problem Based Learning Method in Middle School Science Classrooms
ERIC Educational Resources Information Center
Balim, Ali Günay; Inel-Ekici, Didem; Özcan, Erkan
2016-01-01
Problem-based learning, in which events from daily life are presented as interesting scenarios, is one of the active learning approaches that encourage students to self-direct their learning. Problem-based learning, generally used in higher education, requires students to use high-end thinking skills in learning environments. In order to use…
Evidence-Based Practice for Teachers of Children with Autism: A Dynamic Approach
ERIC Educational Resources Information Center
Lubas, Margaret; Mitchell, Jennifer; De Leo, Gianluca
2016-01-01
Evidence-based practice related to autism research is a controversial topic. Governmental entities and national agencies are defining evidence-based practice as a specific set of interventions that educators should implement; however, large-scale efforts to generalize autism research, which are often single-subject case designs, may be a setback…
A method for evaluating models that use galaxy rotation curves to derive the density profiles
NASA Astrophysics Data System (ADS)
de Almeida, Álefe O. F.; Piattella, Oliver F.; Rodrigues, Davi C.
2016-11-01
There are some approaches, based either on General Relativity (GR) or on modified gravity, that use galaxy rotation curves to derive the matter density of the corresponding galaxy; this procedure would indicate either a partial or a complete elimination of dark matter in galaxies. Here we review these approaches, clarify the difficulties of this inverted procedure, present a method for evaluating them, and use it to test two specific approaches that are based on GR: the Cooperstock-Tieu (CT) and the Balasin-Grumiller (BG) approaches. Using this new method, we find that neither of the tested approaches can satisfactorily fit the observational data without dark matter. The CT approach results can be significantly improved if some dark matter is considered, while for the BG approach no usual dark matter halo can improve its results.
Model-based synthesis of locally contingent responses to global market signals
NASA Astrophysics Data System (ADS)
Magliocca, N. R.
2015-12-01
Rural livelihoods and the land systems on which they depend are increasingly influenced by distant markets through economic globalization. Place-based analyses of land and livelihood system sustainability must then consider both proximate and distant influences on local decision-making. Thus, advancing land change theory in the context of economic globalization calls for a systematic understanding of the general processes as well as the local contingencies shaping local responses to global signals. Synthesis of insights from place-based case studies of land and livelihood change is a path forward for developing such systematic knowledge. This paper introduces a model-based synthesis approach to investigating the influence of local socio-environmental and agent-level factors in mediating land-use and livelihood responses to changing global market signals. A generalized agent-based modeling framework is applied to six case-study sites that differ in environmental conditions, market access and influence, and livelihood settings. The largest modeled land conversions and livelihood transitions to market-oriented production occurred in sites with relatively productive agricultural land and/or with limited livelihood options. Experimental shifts in the distributions of agents' risk tolerances generally acted to attenuate or amplify responses to changes in global market signals. Importantly, however, responses of agents at different points in the risk tolerance distribution varied widely, with the wealth gap growing wider between agents with higher or lower risk tolerance. These results demonstrate that model-based synthesis is a promising approach to overcome many of the challenges of current synthesis methods in land change science, and to identify generalized as well as locally contingent responses to global market signals.
Photochemically Initiated Single Polymer Immobilization
2015-01-01
This Concept article surveys methods for attaching single polymer molecules on solid substrates. A general approach to single polymer immobilization based on the photochemistry of perfluorophenylazides is elaborated. PMID:17444538
Pickup, John Alexander; Dewaele, Joost; Furmanski, Nicola L; Kowalczyk, Agnieszka; Luijkx, Gerard Ca; Mathieu, Sophie; Stelter, Norbert
2017-01-01
Cleaning products have long been a focus of efforts to improve sustainability and assure safety for the aquatic environment when disposed of after use. The latter is addressed at ingredient level through environmental risk assessment, including in formal frameworks such as REACH. Nevertheless, in the context of programs to improve overall sustainability, stakeholders demand both environmental safety assurance and progress at product level. Current product-level approaches for aquatic toxicity (e.g., USEtox™, Critical Dilution Volume) can be seen as predominantly hazard-based. The more logical approach would be risk-based, because ecotoxicity is generally threshold-dependent and hazard-based assessment produces conflicts with risk-based learnings. The development of a risk-based approach to assess formulated products is described: the International Association for Soaps, Detergents and Maintenance Products (A.I.S.E.) Charter Environmental Safety Check (ESC), which is consistent with the scientific principles underlying REACH. This is implemented through a simple spreadsheet tool and internal database of ingredient parameters including predicted no-effect concentration (PNEC) and removal rate. A novel feature is applying market volume information for both product types and ingredients to permit a risk-based calculation. To pass the ESC check, the projected environmental safety ratio (PESR) for each ingredient as formulated and dosed (unless cleared by a published risk assessment or exempted as inherently low risk) must be less than 1. The advantages of a risk-based approach are discussed. The strengths and limitations of various possible approaches to standard-setting, product-ranking and driving continuous improvement in respect of potential ecotoxic impacts on the aquatic environment are considered. It is proposed that as ecotoxicity is generally accepted to be threshold-dependent, with no effect below the threshold, the most constructive approach to continuous improvement of sustainability with regard to ecotoxicity is to focus efforts on instances where the safety margins for ingredients as used in specific products are narrow. This necessitates a risk-based approach. Integr Environ Assess Manag 2017;13:127-138. © 2016 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of SETAC.
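A hedged sketch of a risk-based product check in the spirit of the PESR described above: for each ingredient, a predicted environmental concentration derived from tonnage, wastewater removal and dilution is compared against its PNEC, and the ratio must stay below 1. All parameter names and numbers are illustrative assumptions, not the Charter's actual algorithm or data.

```python
def pesr(tonnes_per_year, removal_fraction, pnec_mg_per_l,
         wastewater_l_per_year=1.5e13, dilution_factor=10.0):
    """Toy projected-safety-ratio: PEC / PNEC, pass if < 1."""
    emitted_mg = tonnes_per_year * 1e9 * (1.0 - removal_fraction)  # 1 tonne = 1e9 mg
    pec = emitted_mg / (wastewater_l_per_year * dilution_factor)   # predicted env. conc.
    return pec / pnec_mg_per_l

ingredients = {
    "surfactant A": (5000.0, 0.95, 0.1),   # (tonnage, removal, PNEC mg/L), illustrative
    "chelant B": (800.0, 0.40, 0.02),
}
for name, (tonnage, removal, pnec) in ingredients.items():
    ratio = pesr(tonnage, removal, pnec)
    print(f"{name}: PESR-like ratio = {ratio:.3f} -> {'pass' if ratio < 1 else 'fail'}")
```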
Contemporary Didactics in Higher Education in Russia
ERIC Educational Resources Information Center
Shershneva, Victoria A.; Shkerina, Lyudmila V.; Sidorov, Valery N.; Sidorova, Tatiana V.; Safonov, Konstantin V.
2016-01-01
The article presents the theoretical framework for a competency-based approach in higher education. It shows that the general didactic principles of professional direction, interdisciplinary connections, fundamentalization and informatization form the didactic basis for the competency-based training in university. The article also actualizes the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saygin, H.; Hebert, A.
The calculation of a dilution cross section $\bar{\sigma}_e$ is the most important step in the self-shielding formalism based on the equivalence principle. If a dilution cross section that accurately characterizes the physical situation can be calculated, it can then be used for calculating the effective resonance integrals and obtaining accurate self-shielded cross sections. A new technique for the calculation of equivalent cross sections, based on the formalism of Riemann integration in the resolved energy domain, is proposed. This new method is compared to the generalized Stammler method, which is also based on an equivalence principle, for a two-region cylindrical cell and for a small pressurized water reactor assembly in two dimensions. The accuracy of each computing approach is obtained using reference results from a fine-group slowing-down code named CESCOL. It is shown that the proposed method leads to slightly better performance than the generalized Stammler approach.
Adaptive Long-Term Monitoring at Environmental Restoration Sites (ER-0629)
2009-05-01
Figures: Figure 2-1, General Flowchart of Software Application; Figure 2-2, Overview of the Genetic Algorithm Approach; Figure 2-3, Example (truncated). Surviving body text: the software components (including Model Builder) are highlighted on Figure 2-1, a general flowchart illustrating the application of the software; modeling per monitoring event (e.g., contaminant mass based on interpolation) is provided by Model Builder.
Courses of action for effects based operations using evolutionary algorithms
NASA Astrophysics Data System (ADS)
Haider, Sajjad; Levis, Alexander H.
2006-05-01
This paper presents an Evolutionary Algorithms (EAs) based approach to identify effective courses of action (COAs) in Effects Based Operations. The approach uses Timed Influence Nets (TINs) as the underlying mathematical model to capture a dynamic, uncertain situation. TINs provide a concise graph-theoretic probabilistic approach to specify the cause-and-effect relationships that exist among the variables of interest (actions, desired effects, and other uncertain events) in a problem domain. The purpose of building these TIN models is to identify and analyze several alternative courses of action. The current practice is to use trial-and-error techniques, which are not only labor intensive but also produce sub-optimal results and are not capable of modeling constraints among actionable events. The EA-based approach presented in this paper aims to overcome these limitations. The approach generates multiple COAs that are close enough in terms of achieving the desired effect. The purpose of generating multiple COAs is to give several alternatives to a decision maker. Moreover, the alternative COAs can be generalized based on the relationships that exist among the actions and their execution timings. The approach also allows a system analyst to capture certain types of constraints among actionable events.
An internet graph model based on trade-off optimization
NASA Astrophysics Data System (ADS)
Alvarez-Hamelin, J. I.; Schabanel, N.
2004-03-01
This paper presents a new model for the Internet graph (AS graph) based on the concept of heuristic trade-off optimization, introduced by Fabrikant, Koutsoupias and Papadimitriou in [CITE] to grow a random tree with a heavy-tailed degree distribution. We propose here a generalization of this approach to generate a general graph, as a candidate for modeling the Internet. We present the results of our simulations and an analysis of the standard parameters measured in our model, compared with measurements from the physical Internet graph.
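A minimal sketch of the underlying heuristic trade-off rule (the FKP tree model that the paper generalizes): each arriving node attaches to the existing node minimizing alpha times its Euclidean distance plus the hop distance to the root. Parameters are illustrative; the paper's extension beyond trees is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(5)
alpha, n = 4.0, 200
pos = rng.random((n, 2))        # random node positions in the unit square
hops = np.zeros(n)              # hop distance to node 0 (the root)
parent = np.full(n, -1)

for v in range(1, n):
    existing = np.arange(v)
    # Trade-off: geometric closeness vs. centrality (hop count to root).
    cost = alpha * np.linalg.norm(pos[existing] - pos[v], axis=1) + hops[existing]
    u = existing[np.argmin(cost)]
    parent[v] = u
    hops[v] = hops[u] + 1

degree = np.bincount(parent[1:], minlength=n) + (parent != -1)
print("max degree:", degree.max())   # heavy-tailed for suitable alpha
```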
Center for Parallel Optimization.
1996-03-19
A new optimization-based approach to improving generalization in machine learning has been proposed and computationally validated on simple linear models as well as on highly nonlinear systems such as neural networks.
ERIC Educational Resources Information Center
Chopra, I.; O'Connor, J.; Pancho, R.; Chrzanowski, M.; Sandi-Urena, S.
2017-01-01
This qualitative study investigated the experience of a cohort of students exposed consecutively to two substantially different environments in their General Chemistry Laboratory programme. To this end, the first semester in a traditional expository programme was followed by a semester in a cooperative, problem-based, multi-week format. The focus…
Information Seeking Research Needs Extension towards Tasks and Technology
ERIC Educational Resources Information Center
Järvelin, Kalervo; Ingwersen, Peter
2004-01-01
This paper discusses the research into information seeking and its directions at a general level. We approach this topic by analysis and argumentation based on past research in the domain. We begin by presenting a general model of information seeking and retrieval which is used to derive nine broad dimensions that are needed to analyze information…
ERIC Educational Resources Information Center
Graham, John P.
2014-01-01
Symmetry properties of molecules are generally introduced in second-year or third-year-level inorganic or physical chemistry courses. Students generally adapt readily to understanding and applying the operations of rotation (C_n), reflection (σ), and inversion (i). However, the two-step operation of improper rotation-reflection…
Haque, Mohammad Nazmul; Noman, Nasimul; Berretta, Regina; Moscato, Pablo
2016-01-01
Classification of datasets with imbalanced sample distributions has always been a challenge. In general, a popular approach for enhancing classification performance is the construction of an ensemble of classifiers. However, the performance of an ensemble is dependent on the choice of constituent base classifiers. Therefore, we propose a genetic algorithm-based search method for finding the optimum combination from a pool of base classifiers to form a heterogeneous ensemble. The algorithm, called GA-EoC, utilises 10-fold cross-validation on training data to evaluate the quality of each candidate ensemble. To combine the base classifiers' decisions into the ensemble's output, we used the simple and widely used majority voting approach. The proposed algorithm, along with a random sub-sampling approach to balance the class distribution, has been used for classifying class-imbalanced datasets. Additionally, if a feature set was not available, we used the (α, β)-k Feature Set method to select a better subset of features for classification. We have tested GA-EoC with three benchmarking datasets from the UCI Machine Learning repository, one Alzheimer's disease dataset and a subset of the PubFig database of Columbia University. In general, the performance of the proposed method on the chosen datasets is robust and better than that of the constituent base classifiers and many other well-known ensembles. Based on our empirical study, we claim that a genetic algorithm is a superior and reliable approach to heterogeneous ensemble construction, and we expect that the proposed GA-EoC would perform consistently in other cases.
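A hedged sketch of GA-EoC-style ensemble selection using scikit-learn: bitstrings encode which pool members join a majority-voting ensemble, and fitness is cross-validated accuracy (3-fold here for speed; the paper uses 10-fold). The classifier pool, GA settings, and dataset are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(6)
X, y = make_classification(n_samples=300, weights=[0.8, 0.2], random_state=0)
pool = [("nb", GaussianNB()),
        ("knn", KNeighborsClassifier()),
        ("lr", LogisticRegression(max_iter=1000)),
        ("dt", DecisionTreeClassifier(random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=25, random_state=0))]

def fitness(mask):
    chosen = [pool[i] for i in np.flatnonzero(mask)]
    if not chosen:
        return 0.0
    ensemble = VotingClassifier(chosen, voting="hard")   # majority voting
    return cross_val_score(ensemble, X, y, cv=3).mean()

popn = rng.integers(0, 2, (8, len(pool)))                # random initial bitstrings
for _ in range(4):                                       # a few GA generations
    scores = np.array([fitness(m) for m in popn])
    parents = popn[np.argsort(scores)[-4:]]              # truncation selection
    children = []
    for _ in range(len(popn)):
        a, b = parents[rng.integers(4)], parents[rng.integers(4)]
        cut = int(rng.integers(1, len(pool)))
        child = np.concatenate([a[:cut], b[cut:]])       # one-point crossover
        flip = rng.random(len(pool)) < 0.1               # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    popn = np.array(children)

best = max(popn, key=fitness)
print("selected base classifiers:", [pool[i][0] for i in np.flatnonzero(best)])
```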
Knowledge-based nonuniform sampling in multidimensional NMR.
Schuyler, Adam D; Maciejewski, Mark W; Arthanari, Haribabu; Hoch, Jeffrey C
2011-07-01
The full resolution afforded by high-field magnets is rarely realized in the indirect dimensions of multidimensional NMR experiments because of the time cost of uniformly sampling to long evolution times. Emerging methods utilizing nonuniform sampling (NUS) enable high resolution along indirect dimensions by sampling long evolution times without sampling at every multiple of the Nyquist sampling interval. While the earliest NUS approaches matched the decay of sampling density to the decay of the signal envelope, recent approaches based on coupled evolution times attempt to optimize sampling by choosing projection angles that increase the likelihood of resolving closely-spaced resonances. These approaches employ knowledge about chemical shifts to predict optimal projection angles, whereas prior applications of tailored sampling employed only knowledge of the decay rate. In this work we adapt the matched filter approach as a general strategy for knowledge-based nonuniform sampling that can exploit prior knowledge about chemical shifts and is not restricted to sampling projections. Based on several measures of performance, we find that exponentially weighted random sampling (envelope matched sampling) performs better than shift-based sampling (beat matched sampling). While shift-based sampling can yield small advantages in sensitivity, the gains are generally outweighed by diminished robustness. Our observation that more robust sampling schemes are only slightly less sensitive than schemes highly optimized using prior knowledge about chemical shifts has broad implications for any multidimensional NMR study employing NUS. The results derived from simulated data are demonstrated with a sample application to PfPMT, the phosphoethanolamine methyltransferase of the human malaria parasite Plasmodium falciparum.
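The envelope matched sampling favored by the authors can be sketched in a few lines: sample times are drawn without replacement with probability proportional to an assumed exponential signal envelope. The decay constant and grid sizes here are illustrative.

```python
import numpy as np

def envelope_matched_schedule(n_grid, n_select, t2=30.0, seed=0):
    """Exponentially weighted random sampling of an indirect dimension.

    Sample times on the Nyquist grid are drawn without replacement with
    density matched to an assumed signal envelope exp(-t / t2), where t2
    is expressed in grid-point units (an illustrative value here).
    """
    rng = np.random.default_rng(seed)
    t = np.arange(n_grid)
    w = np.exp(-t / t2)
    w /= w.sum()
    picked = rng.choice(n_grid, size=n_select, replace=False, p=w)
    return np.sort(picked)

# keep 25% of a 256-point evolution grid
print(envelope_matched_schedule(256, 64)[:12])
```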
Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.
Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James
2009-04-01
The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
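Of the probabilistic approaches reviewed, Monte Carlo simulation is the simplest to demonstrate. The sketch below propagates assumed (entirely hypothetical) input distributions through a standard EPA-style ingestion dose and cancer risk calculation; none of the numbers are from the review.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical input distributions for a trihalomethane-type DBP:
conc = rng.lognormal(mean=np.log(40.0), sigma=0.5, size=n)  # ug/L in tap water
intake = rng.normal(2.0, 0.4, size=n).clip(0.5)             # L/day ingested
bw = rng.normal(70.0, 12.0, size=n).clip(30.0)              # body weight, kg
sf = 0.006                                # slope factor, (mg/kg/day)^-1 (assumed)

# chronic daily dose (mg/kg/day) and lifetime excess cancer risk
dose = conc * 1e-3 * intake / bw
risk = dose * sf

print("median risk:     %.2e" % np.median(risk))
print("95th percentile: %.2e" % np.percentile(risk, 95))
print("P(risk > 1e-4):  %.3f" % (risk > 1e-4).mean())
```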
The National Children's Study: Recruitment Outcomes Using the Provider-Based Recruitment Approach.
Hale, Daniel E; Wyatt, Sharon B; Buka, Stephen; Cherry, Debra; Cislo, Kendall K; Dudley, Donald J; McElfish, Pearl Anna; Norman, Gwendolyn S; Reynolds, Simone A; Siega-Riz, Anna Maria; Wadlinger, Sandra; Walker, Cheryl K; Robbins, James M
2016-06-01
In 2009, the National Children's Study (NCS) Vanguard Study tested the feasibility of household-based recruitment and participant enrollment using a birth-rate probability sample. In 2010, the NCS Program Office launched 3 additional recruitment approaches. We tested whether provider-based recruitment could improve recruitment outcomes compared with household-based recruitment. The NCS aimed to recruit 18- to 49-year-old women who were pregnant or at risk for becoming pregnant and who lived in designated geographic segments within primary sampling units, generally counties. Using provider-based recruitment, 10 study centers engaged providers to enroll eligible participants at their practice. Recruitment models used different levels of provider engagement (full, intermediate, information-only). The percentage of eligible women per county ranged from 1.5% to 57.3%. Across the centers, 3371 potential participants were approached for screening, 3459 (92%) were screened and 1479 were eligible (43%). Of those, 1181 (80.0%) gave consent and 1008 (94%) were retained until delivery. Recruited participants were generally representative of the county population. Provider-based recruitment was successful in recruiting NCS participants. Challenges included the time-intensity of engaging the clinical practices, differential willingness of providers to participate, and necessary reliance on providers for participant identification. The vast majority of practices cooperated to some degree. Recruitment from obstetric practices is an effective means of obtaining a representative sample. Copyright © 2016 by the American Academy of Pediatrics.
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
Planning perception and action for cognitive mobile manipulators
NASA Astrophysics Data System (ADS)
Gaschler, Andre; Nogina, Svetlana; Petrick, Ronald P. A.; Knoll, Alois
2013-12-01
We present a general approach to perception and manipulation planning for cognitive mobile manipulators. Rather than hard-coding single-purpose robot applications, a robot should be able to reason about its basic skills in order to solve complex problems autonomously. Humans intuitively solve tasks in real-world scenarios by breaking down abstract problems into smaller sub-tasks and use heuristics based on their previous experience. We apply a similar idea for planning perception and manipulation to cognitive mobile robots. Our approach is based on contingent planning and run-time sensing, integrated in our "knowledge of volumes" planning framework, called KVP. Using the general-purpose PKS planner, we model information-gathering actions at plan time that have multiple possible outcomes at run time. As a result, perception and sensing arise as necessary preconditions for manipulation, rather than being hard-coded as tasks themselves. We demonstrate the effectiveness of our approach on two scenarios covering visual and force sensing on a real mobile manipulator.
Landsgesell, Jonas; Holm, Christian; Smiatek, Jens
2017-02-14
We present a novel method for the study of weak polyelectrolytes and general acid-base reactions in molecular dynamics and Monte Carlo simulations. The approach combines the advantages of the reaction ensemble and the Wang-Landau sampling method. Deprotonation and protonation reactions are simulated explicitly with the help of the reaction ensemble method, while the accurate sampling of the corresponding phase space is achieved by the Wang-Landau approach. The combination of both techniques provides a sufficient statistical accuracy such that meaningful estimates for the density of states and the partition sum can be obtained. With regard to these estimates, several thermodynamic observables like the heat capacity or reaction free energies can be calculated. We demonstrate that the computation times for the calculation of titration curves with a high statistical accuracy can be significantly decreased when compared to the original reaction ensemble method. The applicability of our approach is validated by the study of weak polyelectrolytes and their thermodynamic properties.
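The Wang-Landau half of the combined method can be illustrated generically. The sketch below is not the authors' reaction-ensemble code; it runs the flat-histogram loop (density-of-states update with a shrinking modification factor) on a toy 1D Ising ring so the mechanism is visible in isolation.

```python
import numpy as np

def wang_landau_ising(L=10, f_final=1e-3, flatness=0.8, seed=0):
    """Flat-histogram (Wang-Landau) estimate of ln g(E) for a 1D Ising ring.

    ln g at the current energy level is raised by ln_f after every proposal;
    when the visit histogram is flat, ln_f is halved. In the paper this
    sampling is combined with reaction-ensemble protonation moves.
    """
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=L)
    E = -int(np.sum(spins * np.roll(spins, 1)))
    levels = list(range(-L, L + 1, 4))          # allowed energies on a ring
    idx = {e: i for i, e in enumerate(levels)}
    ln_g = np.zeros(len(levels))
    ln_f = 1.0
    while ln_f > f_final:                       # much smaller f_final in practice
        hist = np.zeros(len(levels))
        while True:
            for _ in range(1000 * L):
                i = rng.integers(L)
                dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % L])
                # accept flip with probability min(1, g(E) / g(E + dE))
                if rng.random() < np.exp(min(0.0, ln_g[idx[E]] - ln_g[idx[E + dE]])):
                    spins[i] *= -1
                    E += dE
                ln_g[idx[E]] += ln_f
                hist[idx[E]] += 1
            if hist.min() > flatness * hist.mean():
                break
        ln_f /= 2.0
    return dict(zip(levels, np.round(ln_g - ln_g.min(), 2)))

print(wang_landau_ising())
```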
Why psychopathy matters: Implications for public health and violence prevention
Reidy, Dennis E.; Kearns, Megan C.; DeGue, Sarah; Lilienfeld, Scott O.; Massetti, Greta; Kiehl, Kent A.
2018-01-01
Psychopathy is an early-appearing risk factor for severe and chronic violence. The violence largely attributable to psychopathy constitutes a substantial portion of the societal burden to the public health and criminal justice systems, and thus necessitates significant attention from prevention experts. Yet, despite a vast base of research in psychology and criminology, the public health approach to violence has generally neglected to consider this key variable. Fundamentally, the public health approach to violence prevention is focused on achieving change at the population level to provide the most benefit to the maximum number of people. Increasing attention to the individual-level factor of psychopathy in public health could improve our ability to reduce violence at the community and societal levels. We conclude that the research literature on psychopathy points to a pressing need for a broad-based public health approach with a focus on primary prevention. Further, we consider how measuring psychopathy in public health research may benefit violence prevention, and ultimately society, in general. PMID:29593448
MacCarthy, Dan; Hollander, Marcus J
2014-01-01
In 2002, the British Columbia Ministry of Health and the British Columbia Medical Association (now Doctors of BC) came together to form the British Columbia General Practice Services Committee to bring about transformative change in primary care in British Columbia, Canada. This committee's approach to primary care was to respond to an operational problem (the decline of family practice in British Columbia) with an operational solution: assisting general practitioners to provide better care by introducing new incentive fees into the fee-for-service payment schedule, and by providing additional training to general practitioners. This may be referred to as a "soft power" approach, which can be summarized in the abbreviation RISQ: focus on Relationships; provide Incentives for general practitioners to spend more time with their patients and provide guidelines-based care; Support general practitioners by developing learning modules to improve their practices; and, through the incentive payments and learning modules, provide better Quality care to patients and improved satisfaction to physicians. There are many similarities between the British Columbian approach to primary care and the US patient-centered medical home.
Audio-Visual Situational Awareness for General Aviation Pilots
NASA Technical Reports Server (NTRS)
Spirkovska, Lilly; Lodha, Suresh K.; Clancy, Daniel (Technical Monitor)
2001-01-01
Weather is one of the major causes of general aviation accidents. Researchers are addressing this problem from various perspectives including improving meteorological forecasting techniques, collecting additional weather data automatically via on-board sensors and "flight" modems, and improving weather data dissemination and presentation. We approach the problem from the improved presentation perspective and propose weather visualization and interaction methods tailored for general aviation pilots. Our system, Aviation Weather Data Visualization Environment (AWE), utilizes information visualization techniques, a direct manipulation graphical interface, and a speech-based interface to improve a pilot's situational awareness of relevant weather data. The system design is based on a user study and feedback from pilots.
General framework for dynamic large deformation contact problems based on phantom-node X-FEM
NASA Astrophysics Data System (ADS)
Broumand, P.; Khoei, A. R.
2018-04-01
This paper presents a general framework for modeling dynamic large deformation contact-impact problems based on the phantom-node extended finite element method. The large sliding penalty contact formulation is presented based on a master-slave approach which is implemented within the phantom-node X-FEM and an explicit central difference scheme is used to model the inertial effects. The method is compared with conventional contact X-FEM; advantages, limitations and implementational aspects are also addressed. Several numerical examples are presented to show the robustness and accuracy of the proposed method.
Reinforcing loose foundation stones in trait-based plant ecology.
Shipley, Bill; De Bello, Francesco; Cornelissen, J Hans C; Laliberté, Etienne; Laughlin, Daniel C; Reich, Peter B
2016-04-01
The promise of "trait-based" plant ecology is one of generalized prediction across organizational and spatial scales, independent of taxonomy. This promise is a major reason for the increased popularity of this approach. Here, we argue that some important foundational assumptions of trait-based ecology have not received sufficient empirical evaluation. We identify three such assumptions and, where possible, suggest methods of improvement: (i) traits are functional to the degree that they determine individual fitness, (ii) intraspecific variation in functional traits can be largely ignored, and (iii) functional traits show general predictive relationships to measurable environmental gradients.
A General Algorithm for Reusing Krylov Subspace Information. I. Unsteady Navier-Stokes
NASA Technical Reports Server (NTRS)
Carpenter, Mark H.; Vuik, C.; Lucas, Peter; vanGijzen, Martin; Bijl, Hester
2010-01-01
A general algorithm is developed that reuses available information to accelerate the iterative convergence of linear systems with multiple right-hand sides Ax = b^(i), which are commonly encountered in steady or unsteady simulations of nonlinear equations. The algorithm is based on the classical GMRES algorithm with eigenvector enrichment but also includes a Galerkin projection preprocessing step and several novel Krylov subspace reuse strategies. The new approach is applied to a set of test problems, including an unsteady turbulent airfoil, and is shown in some cases to provide significant improvement in computational efficiency relative to baseline approaches.
A Prior for Neural Networks utilizing Enclosing Spheres for Normalization
NASA Astrophysics Data System (ADS)
v. Toussaint, U.; Gori, S.; Dose, V.
2004-11-01
Neural networks are famous for their advantageous flexibility for problems when there is insufficient knowledge to set up a proper model. On the other hand, this flexibility can cause over-fitting and can hamper the generalization properties of neural networks. Many approaches to regularizing neural networks have been suggested, but most of them are based on ad hoc arguments. Employing the principle of transformation invariance, we derive a general prior in accordance with Bayesian probability theory for a class of feedforward networks. Optimal networks are determined by Bayesian model comparison, verifying the applicability of this approach.
Compiler-directed cache management in multiprocessors
NASA Technical Reports Server (NTRS)
Cheong, Hoichi; Veidenbaum, Alexander V.
1990-01-01
The necessity of finding alternatives to hardware-based cache coherence strategies for large-scale multiprocessor systems is discussed. Three different software-based strategies sharing the same goals and general approach are presented. They consist of a simple invalidation approach, a fast selective invalidation scheme, and a version control scheme. The strategies are suitable for shared-memory multiprocessor systems with interconnection networks and a large number of processors. Results of trace-driven simulations conducted on numerical benchmark routines to compare the performance of the three schemes are presented.
Collective learning modeling based on the kinetic theory of active particles
NASA Astrophysics Data System (ADS)
Burini, D.; De Lillo, S.; Gibelli, L.
2016-03-01
This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom.
Level statistics of words: Finding keywords in literary texts and symbolic sequences
NASA Astrophysics Data System (ADS)
Carpena, P.; Bernaola-Galván, P.; Hackenberg, M.; Coronado, A. V.; Oliver, J. L.
2009-03-01
Using a generalization of the level statistics analysis of quantum disordered systems, we present an approach able to extract automatically keywords in literary texts. Our approach takes into account not only the frequencies of the words present in the text but also their spatial distribution along the text, and is based on the fact that relevant words are significantly clustered (i.e., they self-attract each other), while irrelevant words are distributed randomly in the text. Since a reference corpus is not needed, our approach is especially suitable for single documents for which no a priori information is available. In addition, we show that our method works also in generic symbolic sequences (continuous texts without spaces), thus suggesting its general applicability.
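A simplified version of the clustering statistic is easy to state in code: for each word, compute the normalized fluctuation of the gaps between its occurrences; clustered (relevant) words give values well above the near-1 level expected for randomly placed words. This is a reduced form of the authors' level-statistics measure, and the corpus file name is a placeholder.

```python
import numpy as np

def clustering_score(tokens, word):
    """Normalized fluctuation of inter-occurrence gaps of a word.

    Randomly placed words give values near 1 (geometric spacing);
    self-attracting, relevant words give noticeably larger values.
    """
    pos = np.array([i for i, t in enumerate(tokens) if t == word])
    if len(pos) < 3:
        return None
    gaps = np.diff(pos)
    return gaps.std() / gaps.mean()

# corpus file name is a placeholder for any plain-text document
tokens = open("corpus.txt", encoding="utf-8").read().lower().split()
for w in ("whale", "the", "captain"):
    print(w, clustering_score(tokens, w))
```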
Density-matrix approach for the electroluminescence of molecules in a scanning tunneling microscope.
Tian, Guangjun; Liu, Ji-Cai; Luo, Yi
2011-04-29
The electroluminescence (EL) of molecules confined inside a nanocavity in the scanning tunneling microscope possesses many intriguing but unexplained features. We present here a general theoretical approach based on the density-matrix formalism to describe the EL from molecules near a metal surface induced by both electron tunneling and localized surface plasmon excitations simultaneously. It reveals the underlying physical mechanism for the external bias dependent EL. The important role played by the localized surface plasmon on the EL is highlighted. Calculations for porphyrin derivatives have reproduced corresponding experimental spectra and nicely explained the observed unusual large variation of emission spectral profiles. This general theoretical approach can find many applications in the design of molecular electronic and photonic devices.
ERIC Educational Resources Information Center
Harriman, Stanley L.
2011-01-01
The introduction of the glass cockpit, as well as a whole new generation of high performance general aviation aircraft, highlights the need for a comprehensive overhaul of the traditional approach to training pilots. Collegiate aviation institutions that are interested in upgrading their training aircraft fleets will need to design new curricula…
ERIC Educational Resources Information Center
Bryant, Doug
This paper, titled "The Components of Emotional Intelligence and the Relationship to Sales Performance," presents two general approaches to studying emotional intelligence. The first is a broad model approach that considers abilities as well as a series of personality traits. The second is based on ability models. The possible correlation between…
An Approach to Teaching General Chemistry II that Highlights the Interdisciplinary Nature of Science
ERIC Educational Resources Information Center
Sumter, Takita Felder; Owens, Patrick M.
2011-01-01
The need for a revised curriculum within the life sciences has been well-established. One strategy to improve student preparation in the life sciences is to redesign introductory courses like biology, chemistry, and physics so that they better reflect their disciplinary interdependence. We describe a medically relevant, context-based approach to…
Figuration & Frequency: A Usage-Based Approach to Metaphor
ERIC Educational Resources Information Center
Sanford, Daniel
2010-01-01
Two of the major claims of the cognitivist approach to metaphor, the paradigm which has emerged as dominant over the last three decades, are (1) that metaphor is a conceptual, rather than strictly linguistic, phenomenon, and (2) that metaphor exemplifies processes which are at work in cognition more generally. This view of metaphor is here placed…
ERIC Educational Resources Information Center
Kim, Jiseon
2010-01-01
Classification testing has been widely used to make categorical decisions by determining whether an examinee has a certain degree of ability required by established standards. As computer technologies have developed, classification testing has become more computerized. Several approaches have been proposed and investigated in the context of…
Campus-Based Geographic Learning: A Field Oriented Teaching Scenario
ERIC Educational Resources Information Center
Jennings, Steven A.; Huber, Thomas P.
2003-01-01
The use of field classes and the need for university master planning are presented as a way to enhance learning. This field-oriented, goal-oriented approach to learning is proposed as a general model for university-level geographic education. This approach is presented for physical geography classes, but could also be applied to other subdivisions…
ERIC Educational Resources Information Center
Taylor, Josephine Ann
2007-01-01
Approaches to intercultural communication competence (ICC) generally argue the need for objective knowledge about another culture as well as knowledge about and the ability to achieve appropriate behaviors of that target culture. Most of these approaches continue to base themselves on a conception of culture as comprehensive but static.…
Singapore Primary Students' Pursuit of Multiple Achievement Goals: A Latent Profile Analysis
ERIC Educational Resources Information Center
Ning, Hoi Kwan
2018-01-01
Based on measures of approach and avoidance mastery and performance goals delineated in the 2 × 2 achievement goal framework, this study utilized a person-centered approach to examine Singapore primary students' (N = 819) multiple goals pursuit in the general school context. Latent profile analysis identified six types of students with distinct…
Analyzing the Efficacy of the Testing Effect Using Kahoot™ on Student Performance
ERIC Educational Resources Information Center
Iwamoto, Darren H.; Hargis, Jace; Taitano, Erik Jon; Vuong, Ky
2017-01-01
Lower than expected high-stakes examination scores were being observed in a first-year general psychology class. This research sought an alternate approach that would assist students in preparing for high-stakes examinations. The purpose of this study was to measure the effectiveness of an alternate teaching approach based on the testing effect to…
ERIC Educational Resources Information Center
Wilkie, Karina J.
2016-01-01
Algebra has been explicit in many school curriculum programs from the early years but there are competing views on what content and approaches are appropriate for different levels of schooling. This study investigated 12-13-year-old Australian students' algebraic thinking in a hybrid environment of functional and equation-based approaches to…
An Inquiry-Based Approach to Critical Literacy: Pedagogical Nuances of a Second Grade Classroom
ERIC Educational Resources Information Center
Beach, Pamela; Cleovoulou, Yiola
2014-01-01
This case study explores the pedagogy and practices of an elementary school teacher who combines inquiry pedagogy and critical literacy. The authors gathered data for this analysis by conducting two interviews with a classroom teacher and observing classroom practices 12 times over a 6 month period. Through a general inductive approach to…
Geometric Lagrangian approach to the physical degree of freedom count in field theory
NASA Astrophysics Data System (ADS)
Díaz, Bogar; Montesinos, Merced
2018-05-01
To circumvent some technical difficulties faced by the geometric Lagrangian approach to the physical degree of freedom count presented in the work of Díaz, Higuita, and Montesinos [J. Math. Phys. 55, 122901 (2014)] that prevent its direct implementation to field theory, in this paper, we slightly modify the geometric Lagrangian approach in such a way that its resulting version works perfectly for field theory (and for particle systems, of course). As in previous work, the current approach also allows us to directly get the Lagrangian constraints, a new Lagrangian formula for the counting of the number of physical degrees of freedom, the gauge transformations, and the number of first- and second-class constraints for any action principle based on a Lagrangian depending on the fields and their first derivatives without performing any Dirac's canonical analysis. An advantage of this approach over the previous work is that it also allows us to handle the reducibility of the constraints and to get the off-shell gauge transformations. The theoretical framework is illustrated in 3-dimensional generalized general relativity (Palatini and Witten's exotic actions), Chern-Simons theory, 4-dimensional BF theory, and 4-dimensional general relativity given by Palatini's action with a cosmological constant.
Assessing the Effectiveness of Web-Based Tutorials Using Pre-and Post-Test Measurements
ERIC Educational Resources Information Center
Guy, Retta Sweat; Lownes-Jackson, Millicent
2012-01-01
Computer technology in general and the Internet in particular have facilitated as well as motivated the development of Web-based tutorials (MacKinnon & Williams, 2006). The current research study describes a pedagogical approach that exploits the use of self-paced, Web-based tutorials for assisting students with reviewing grammar and mechanics…
ERIC Educational Resources Information Center
Almuntasheri, S.; Gillies, R. M.; Wright, T.
2016-01-01
Despite a general consensus on the educational effectiveness of inquiry-based instruction, the enacted type of inquiry in science classrooms remains debatable in many countries including Saudi Arabia. This study compared guided-inquiry based teachers' professional development to teacher-directed approach in supporting Saudi students to understand…
Narrating practice: reflective accounts and the textual construction of reality.
Taylor, Carolyn
2003-05-01
Two approaches dominate current thinking in health and welfare: evidence-based practice and reflective practice. Whilst there is debate about the merits of evidence-based practice, reflective practice is generally accepted without critical debate as an important educational tool. Where critique does exist it tends to adopt a Foucauldian approach, focusing on the surveillance and self-regulatory aspects of reflective practice. This article acknowledges the critical purchase on the concept of reflective practice offered by Foucauldian approaches but argues that microsociological and discourse analytic approaches can further illuminate the subject and thus serve as a complement to them. The claims of proponents of reflective practice are explored, in opposition to the technical-rational approach of evidence-based practice. Reflective practice tends to adopt a naive or romantic realist position and fails to acknowledge the ways in which reflective accounts construct the world of practice. Microsociological approaches can help us to understand reflective accounts as examples of case-talk, constructed in a narrative form in the same way as case records and presentations.
Premnath, Kannan N; Pattison, Martin J; Banerjee, Sanjoy
2009-02-01
In this paper, we present a framework based on the generalized lattice Boltzmann equation (GLBE) using multiple relaxation times with forcing term for eddy capturing simulation of wall-bounded turbulent flows. Due to its flexibility in using disparate relaxation times, the GLBE is well suited to maintaining numerical stability on coarser grids and in obtaining improved solution fidelity of near-wall turbulent fluctuations. The subgrid scale (SGS) turbulence effects are represented by the standard Smagorinsky eddy viscosity model, which is modified by using the van Driest wall-damping function to account for reduction of turbulent length scales near walls. In order to be able to simulate a wider class of problems, we introduce forcing terms, which can represent the effects of general nonuniform forms of forces, in the natural moment space of the GLBE. Expressions for the strain rate tensor used in the SGS model are derived in terms of the nonequilibrium moments of the GLBE to include such forcing terms, which comprise a generalization of those presented in a recent work [Yu, Comput. Fluids 35, 957 (2006)]. Variable resolutions are introduced into this extended GLBE framework through a conservative multiblock approach. The approach, whose optimized implementation is also discussed, is assessed for two canonical flow problems bounded by walls, viz., fully developed turbulent channel flow at a shear or friction Reynolds number (Re) of 183.6 based on the channel half-width and three-dimensional (3D) shear-driven flows in a cubical cavity at a Re of 12 000 based on the side length of the cavity. Comparisons of detailed computed near-wall turbulent flow structure, given in terms of various turbulence statistics, with available data, including those from direct numerical simulations (DNS) and experiments showed good agreement. The GLBE approach also exhibited markedly better stability characteristics and avoided spurious near-wall turbulent fluctuations on coarser grids when compared with the single-relaxation-time (SRT)-based approach. Moreover, its implementation showed excellent parallel scalability on a large parallel cluster with over a thousand processors.
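The van Driest modification mentioned in the abstract is compact enough to show directly: the Smagorinsky length scale is damped by a factor (1 - exp(-y+/A+)) near the wall. The constants below are the conventional ones, assumed rather than taken from the paper.

```python
import numpy as np

def smagorinsky_nu_t(strain_norm, delta, y_plus, cs=0.1, a_plus=25.0):
    """Smagorinsky eddy viscosity with van Driest wall damping:

        nu_t = (Cs * delta * (1 - exp(-y+ / A+)))^2 * |S|

    strain_norm: |S| = sqrt(2 S_ij S_ij); delta: filter width;
    y_plus: wall distance in viscous units; Cs, A+ conventional constants.
    """
    damping = 1.0 - np.exp(-y_plus / a_plus)
    return (cs * delta * damping) ** 2 * strain_norm

print(smagorinsky_nu_t(50.0, 1.0, y_plus=5.0))    # near wall: strongly damped
print(smagorinsky_nu_t(50.0, 1.0, y_plus=300.0))  # outer flow: full Smagorinsky value
```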
Anticipating the emergence of infectious diseases
Drake, John M.; Rohani, Pejman
2017-01-01
In spite of medical breakthroughs, the emergence of pathogens continues to pose threats to both human and animal populations. We present candidate approaches for anticipating disease emergence prior to large-scale outbreaks. Through use of ideas from the theories of dynamical systems and stochastic processes we develop approaches which are not specific to a particular disease system or model, but instead have general applicability. The indicators of disease emergence detailed in this paper can be classified into two parallel approaches: a set of early-warning signals based around the theory of critical slowing down and a likelihood-based approach. To test the reliability of these two approaches we contrast theoretical predictions with simulated data. We find good support for our methods across a range of different model structures and parameter values. PMID:28679666
Language acquisition from a biolinguistic perspective.
Crain, Stephen; Koring, Loes; Thornton, Rosalind
2017-10-01
This paper describes the biolinguistic approach to language acquisition. We contrast the biolinguistic approach with a usage-based approach. We argue that the biolinguistic approach is superior because it provides more accurate and more extensive generalizations about the properties of human languages, as well as a better account of how children acquire human languages. To distinguish between these accounts, we focus on how child and adult language differ both in sentence production and in sentence understanding. We argue that the observed differences resist explanation using the cognitive mechanisms that are invoked by the usage-based approach. In contrast, the biolinguistic approach explains the qualitative parametric differences between child and adult language. Explaining how child and adult language differ and demonstrating that children perceive unity despite apparent diversity are two of the hallmarks of the biolinguistic approach to language acquisition. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng
2016-09-01
This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach with an aim to improving sampling efficiency for multiple metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated compared with Latin hypercube sampling (LHS) through analyzing sampling efficiency, multiple metrics performance, parameter uncertainty and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for Qing River reservoir, China. Results obtained demonstrate the following advantages of the ɛ-NSGAII based sampling approach in comparison to LHS: (1) The former performs more effectively and efficiently than LHS; for example, the simulation time required to generate 1000 behavioral parameter sets is about nine times shorter; (2) The Pareto tradeoffs between metrics are demonstrated clearly with the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, which means better forecasting accuracy of ɛ-NSGAII parameter sets; (3) The parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in the appropriate ranges rather than uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly; (4) The forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB) and average deviation amplitude (D). The flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple metrics uncertainty analysis under the framework of GLUE, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
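Independent of the sampling engine, the GLUE step itself reduces to thresholding a likelihood measure over sampled parameter sets. The sketch below uses Nash-Sutcliffe efficiency as the (commonly chosen) likelihood and a toy one-parameter model; the sampler argument is where LHS or an ɛ-NSGAII archive would plug in.

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, a common GLUE likelihood measure."""
    sim, obs = np.asarray(sim), np.asarray(obs)
    return 1.0 - ((sim - obs) ** 2).sum() / ((obs - obs.mean()) ** 2).sum()

def glue(sampler, model, obs, n_sets, threshold=0.7):
    """Keep 'behavioral' parameter sets whose likelihood exceeds a threshold."""
    kept, weights = [], []
    for _ in range(n_sets):
        theta = sampler()              # LHS or an e-NSGAII archive plugs in here
        score = nse(model(theta), obs)
        if score >= threshold:
            kept.append(theta)
            weights.append(score)
    w = np.array(weights)
    return np.array(kept), w / w.sum()   # likelihood-weighted ensemble

# toy one-parameter model y = a * x with true a = 2
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
obs = 2.0 * x + rng.normal(0.0, 0.05, 50)
sets, w = glue(lambda: rng.uniform(0.0, 4.0, 1), lambda th: th[0] * x, obs, 2000)
print(len(sets), "behavioral sets; weighted mean a =", float(sets[:, 0] @ w))
```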
Tips from the toolkit: 1 - know yourself.
Steer, Neville
2010-01-01
High performance organisations review their strategy and business processes as part of usual business operations. If you are new to the field of general practice, do you have a career plan for the next 5-10 years? If you are an experienced general practitioner, are you using much the same business model and processes as when you started out? The following article sets out some ideas you might use to have a fresh approach to your professional career. It is based on The Royal Australian College of General Practitioners' 'General practice management toolkit'.
NASA Astrophysics Data System (ADS)
Hassan Asemani, Mohammad; Johari Majd, Vahid
2015-12-01
This paper addresses a robust H∞ fuzzy observer-based tracking design problem for uncertain Takagi-Sugeno fuzzy systems with external disturbances. To have a practical observer-based controller, the premise variables of the system are assumed to be not measurable in general, which leads to a more complex design process. The tracker is synthesised based on a fuzzy Lyapunov function approach and non-parallel distributed compensation (non-PDC) scheme. Using the descriptor redundancy approach, the robust stability conditions are derived in the form of strict linear matrix inequalities (LMIs) even in the presence of uncertainties in the system, input, and output matrices simultaneously. Numerical simulations are provided to show the effectiveness of the proposed method.
A Voronoi interior adjacency-based approach for generating a contour tree
NASA Astrophysics Data System (ADS)
Chen, Jun; Qiao, Chaofei; Zhao, Renliang
2004-05-01
A contour tree is a good graphical tool for representing the spatial relations of contour lines and has found many applications in map generalization, map annotation, terrain analysis, etc. A new approach for generating contour trees by introducing a Voronoi-based interior adjacency set concept is proposed in this paper. The immediate interior adjacency set is employed to identify all of the children contours of each contour without contour elevations. It has advantages over existing methods such as the point-in-polygon method and the region growing-based method. This new approach can be used for spatial data mining and knowledge discovering, such as the automatic extraction of terrain features and construction of multi-resolution digital elevation model.
Model-based Acceleration Control of Turbofan Engines with a Hammerstein-Wiener Representation
NASA Astrophysics Data System (ADS)
Wang, Jiqiang; Ye, Zhifeng; Hu, Zhongzhi; Wu, Xin; Dimirovsky, Georgi; Yue, Hong
2017-05-01
Acceleration control of turbofan engines is conventionally designed through either a schedule-based or an acceleration-based approach. With the widespread acceptance of model-based design in the aviation industry, it becomes necessary to investigate the issues associated with model-based design for acceleration control. In this paper, the challenges of implementing model-based acceleration control are explained; a novel Hammerstein-Wiener representation of engine models is introduced; based on the Hammerstein-Wiener model, a nonlinear generalized minimum variance type of optimal control law is derived; the feature of the proposed approach is that it does not require the inversion operation that usually upsets those nonlinear control techniques. The effectiveness of the proposed control design method is validated through a detailed numerical study.
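For readers unfamiliar with the model class: a Hammerstein-Wiener model wraps a linear dynamic block between two static nonlinearities. A minimal simulation, with illustrative nonlinearities standing in for the engine maps, might look like this.

```python
import numpy as np

def hammerstein_wiener(u, f, g, a, b):
    """Simulate a Hammerstein-Wiener model:

        v_k = f(u_k)                                   (static input nonlinearity)
        x_k = sum_i a_i x_{k-1-i} + sum_j b_j v_{k-j}  (linear dynamics)
        y_k = g(x_k)                                   (static output nonlinearity)
    """
    v = f(np.asarray(u, dtype=float))
    x = np.zeros(len(v))
    for k in range(len(v)):
        ar = sum(a[i] * x[k - 1 - i] for i in range(len(a)) if k - 1 - i >= 0)
        ma = sum(b[j] * v[k - j] for j in range(len(b)) if k - j >= 0)
        x[k] = ar + ma
    return g(x)

# illustrative stand-ins: actuator saturation in, mildly quadratic sensor out
u = np.sin(np.linspace(0.0, 20.0, 200))
y = hammerstein_wiener(u,
                       f=lambda s: np.clip(s, -0.5, 0.5),
                       g=lambda s: s + 0.1 * s ** 2,
                       a=[0.8], b=[0.2])
print(np.round(y[:5], 4))
```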
Template-based procedures for neural network interpretation.
Alexander, J A.; Mozer, M C.
1999-04-01
Although neural networks often achieve impressive learning and generalization performance, their internal workings are typically all but impossible to decipher. This characteristic of the networks, their opacity, is one of the disadvantages of connectionism compared to more traditional, rule-oriented approaches to artificial intelligence. Without a thorough understanding of the network behavior, confidence in a system's results is lowered, and the transfer of learned knowledge to other processing systems - including humans - is precluded. Methods that address the opacity problem by casting network weights in symbolic terms are commonly referred to as rule extraction techniques. This work describes a principled approach to symbolic rule extraction from standard multilayer feedforward networks based on the notion of weight templates, parameterized regions of weight space corresponding to specific symbolic expressions. With an appropriate choice of representation, we show how template parameters may be efficiently identified and instantiated to yield the optimal match to the actual weights of a unit. Depending on the requirements of the application domain, the approach can accommodate n-ary disjunctions and conjunctions with O(k) complexity, simple n-of-m expressions with O(k^2) complexity, or more general classes of recursive n-of-m expressions with O(k^(L+2)) complexity, where k is the number of inputs to a unit and L the recursion level of the expression class. Compared to other approaches in the literature, our method of rule extraction offers benefits in simplicity, computational performance, and overall flexibility. Simulation results on a variety of problems demonstrate the application of our procedures as well as the strengths and the weaknesses of our general approach.
Efficient computation of PDF-based characteristics from diffusion MR signal.
Assemlal, Haz-Edine; Tschumperlé, David; Brun, Luc
2008-01-01
We present a general method for the computation of PDF-based characteristics of the tissue micro-architecture in MR imaging. The approach relies on the approximation of the MR signal by a series expansion based on Spherical Harmonics and Laguerre-Gaussian functions, followed by a simple projection step that is efficiently done in a finite dimensional space. The resulting algorithm is generic, flexible and is able to compute a large set of useful characteristics of the local tissues structure. We illustrate the effectiveness of this approach by showing results on synthetic and real MR datasets acquired in a clinical time-frame.
Quantitative photoacoustic imaging in the acoustic regime using SPIM
NASA Astrophysics Data System (ADS)
Beigl, Alexander; Elbau, Peter; Sadiq, Kamran; Scherzer, Otmar
2018-05-01
While in standard photoacoustic imaging the propagation of sound waves is modeled by the standard wave equation, our approach is based on a generalized wave equation with variable sound speed and material density, respectively. In this paper we present an approach for photoacoustic imaging, which in addition to the recovery of the absorption density parameter, the imaging parameter of standard photoacoustics, also allows us to reconstruct the spatially varying sound speed and density, respectively, of the medium. We provide analytical reconstruction formulas for all three parameters in a linearized model based on single plane illumination microscopy (SPIM) techniques.
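For concreteness, a generalized acoustic wave equation of the kind referred to, with spatially varying sound speed c(x) and density ρ(x), can be written as follows; the exact formulation used in the paper may differ.

```latex
% Acoustic pressure p(x,t) with variable sound speed c(x) and density rho(x):
\frac{1}{\rho(x)\,c(x)^{2}}\,\frac{\partial^{2} p}{\partial t^{2}}
  - \nabla \cdot \left( \frac{1}{\rho(x)}\,\nabla p \right) = 0
% which reduces to the standard wave equation when c and rho are constant.
```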
Assessing the Role of Online Technologies in Project-Based Learning
ERIC Educational Resources Information Center
Ravitz, Jason; Blazevski, Juliane
2014-01-01
This study examines the relationships between teacher-reported use of online resources, and preparedness, implementation challenges, and time spent implementing project- or problem-based learning, or approaches that are similar to what we call "PBL" in general. Variables were measured using self-reports from those who teach in reform…
A Model for Student Adoption of Online Interactivity
ERIC Educational Resources Information Center
Karamanos, Neophytos; Gibbs, Paul
2012-01-01
Acknowledging the general difficulty of new e-learning pedagogical approaches in achieving wide acceptance and use, the study described in this article examines a class of MBA students' adoption of a proposed online interactive learning environment. To this end, a web-based, case-based constructivist learning environment was developed, embedding…
The Potential of Multivariate Analysis in Assessing Students' Attitude to Curriculum Subjects
ERIC Educational Resources Information Center
Gaotlhobogwe, Michael; Laugharne, Janet; Durance, Isabelle
2011-01-01
Background: Understanding student attitudes to curriculum subjects is central to providing evidence-based options to policy makers in education. Purpose: We illustrate how quantitative approaches used in the social sciences and based on multivariate analysis (categorical Principal Components Analysis, Clustering Analysis and General Linear…
SNP-based genotyping in lentil: linking sequence information with phenotypes
USDA-ARS?s Scientific Manuscript database
Lentil (Lens culinaris) has been late to enter the world of high throughput molecular analysis due to a general lack of genomic resources. Using a 454 sequencing-based approach, SNPs have been identified in genes across the lentil genome. Several hundred have been turned into single SNP KASP assay...
Galvanin, Federico; Ballan, Carlo C; Barolo, Massimiliano; Bezzo, Fabrizio
2013-08-01
The use of pharmacokinetic (PK) and pharmacodynamic (PD) models is a common and widespread practice in the preliminary stages of drug development. However, PK-PD models may be affected by structural identifiability issues intrinsically related to their mathematical formulation. A preliminary structural identifiability analysis is usually carried out to check if the set of model parameters can be uniquely determined from experimental observations under the ideal assumptions of noise-free data and no model uncertainty. However, even for structurally identifiable models, real-life experimental conditions and model uncertainty may strongly affect the practical possibility to estimate the model parameters in a statistically sound way. A systematic procedure coupling the numerical assessment of structural identifiability with advanced model-based design of experiments formulations is presented in this paper. The objective is to propose a general approach to design experiments in an optimal way, detecting a proper set of experimental settings that ensure the practical identifiability of PK-PD models. Two simulated case studies based on in vitro bacterial growth and killing models are presented to demonstrate the applicability and generality of the methodology to tackle model identifiability issues effectively, through the design of feasible and highly informative experiments.
A general modeling framework for describing spatially structured population dynamics
Sample, Christine; Fryxell, John; Bieri, Joanna; Federico, Paula; Earl, Julia; Wiederholt, Ruscena; Mattsson, Brady; Flockhart, Tyler; Nicol, Sam; Diffendorfer, James E.; Thogmartin, Wayne E.; Erickson, Richard A.; Norris, D. Ryan
2017-01-01
Variation in movement across time and space fundamentally shapes the abundance and distribution of populations. Although a variety of approaches model structured population dynamics, they are limited to specific types of spatially structured populations and lack a unifying framework. Here, we propose a unified network-based framework sufficiently novel in its flexibility to capture a wide variety of spatiotemporal processes including metapopulations and a range of migratory patterns. It can accommodate different kinds of age structures, forms of population growth, dispersal, nomadism and migration, and alternative life-history strategies. Our objective was to link three general elements common to all spatially structured populations (space, time and movement) under a single mathematical framework. To do this, we adopt a network modeling approach. The spatial structure of a population is represented by a weighted and directed network. Each node and each edge has a set of attributes which vary through time. The dynamics of our network-based population is modeled with discrete time steps. Using both theoretical and real-world examples, we show how common elements recur across species with disparate movement strategies and how they can be combined under a unified mathematical framework. We illustrate how metapopulations, various migratory patterns, and nomadism can be represented with this modeling approach. We also apply our network-based framework to four organisms spanning a wide range of life histories, movement patterns, and carrying capacities. General computer code to implement our framework is provided, which can be applied to almost any spatially structured population. This framework contributes to our theoretical understanding of population dynamics and has practical management applications, including understanding the impact of perturbations on population size, distribution, and movement patterns. By working within a common framework, there is less chance that comparative analyses are colored by model details rather than general principles
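The core update of such a network model fits in a few lines: node abundances grow in place and are then redistributed along weighted directed edges each discrete time step. The growth rates and movement fractions below are invented for illustration.

```python
import numpy as np

# Node abundances grow in place, then redistribute along weighted directed
# edges; both the growth rates and move[i, j] could vary with time.
growth = np.array([1.05, 1.02, 0.98])        # per-node per-step growth (invented)
move = np.array([[0.90, 0.10, 0.00],         # move[i, j]: fraction of node i's
                 [0.05, 0.90, 0.05],         # population moving to node j
                 [0.00, 0.10, 0.90]])        # (rows sum to 1)

n = np.array([100.0, 50.0, 10.0])            # initial abundances
for t in range(50):
    n = (growth * n) @ move                  # grow, then redistribute
print(np.round(n, 1))
```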
ERIC Educational Resources Information Center
Gulyaev, Sergei A.; Stonyer, Heather R.
2002-01-01
Develops an integrated approach based on the use of general systems theory (GST) and the concept of 'mapping' scientific knowledge to provide students with tools for a more holistic understanding of science. Uses GST as the core methodology for understanding science and its complexity. Discusses the role of scientific community in producing…
ERIC Educational Resources Information Center
Pesce, Sebastien
2011-01-01
My aim in this paper is to show the relevance of an "effective semiotics"; that is, a field study based upon Peirce's semiotics. The general context of this investigation is educational semiotics rather than semiotics of teaching: I am concerned with a general approach of educational processes, not with skills and curricula. My paper is…
ERIC Educational Resources Information Center
Kronenberg, J.; And Others
1994-01-01
Describes anorexia nervosa as a condition variable in etiology and resistant to treatment, which may lead to mortality in 5% of treated cases. Notes that efforts have been made to treat the disorder in nonstigmatizing medical units outside psychiatric hospitals. Describes, through presentation of a short case vignette, the advantages of treating…
ERIC Educational Resources Information Center
Geel, Regula; Mure, Johannes; Backes-Gellner, Uschi
2011-01-01
According to standard human capital theory, firm-financed training cannot be explained if the skills obtained are general in nature. Nevertheless, in German-speaking countries, firms invest heavily in apprenticeship training although the skills are assumed to be general. In our paper, we study the extent to which apprenticeship training is general…
Developing a General Outcome Measure of Growth in Social Skills for Infants and Toddlers
ERIC Educational Resources Information Center
Carta, Judith; Greenwood, Charles; Luze, Gayle; Cline, Gabriel; Kuntz, Susan
2004-01-01
Proficiency in social interaction with adults and peers is an important outcome in early childhood. The development of an experimental measure for assessing growth in social skills in children birth to 3 years is described. Based on the general outcome measurement (GOM) approach (e.g., Deno, 1997), the measure is intended for use by early…
USDA-ARS?s Scientific Manuscript database
Current treatments for losing weight based mainly on diet and exercise are, in general, unsuccessful. So, as an alternative to the general strategy of one-size-fits-all, a more individualized approach is proposed through the so-called Personalized Medicine in which genotype data are used to personal...
Cartographic generalization of urban street networks based on gravitational field theory
NASA Astrophysics Data System (ADS)
Liu, Gang; Li, Yongshu; Li, Zheng; Guo, Jiawei
2014-05-01
The automatic generalization of urban street networks is a constant and important aspect of geographical information science. Previous studies show that the dual graph for street-street relationships more accurately reflects the overall morphological properties and importance of streets than do other methods. In this study, we construct a dual graph to represent street-street relationship and propose an approach to generalize street networks based on gravitational field theory. We retain the global structural properties and topological connectivity of an original street network and borrow from gravitational field theory to define the gravitational force between nodes. The concept of multi-order neighbors is introduced and the gravitational force is taken as the measure of the importance contribution between nodes. The importance of a node is defined as the result of the interaction between a given node and its multi-order neighbors. Degree distribution is used to evaluate the level of maintaining the global structure and topological characteristics of a street network and to illustrate the efficiency of the suggested method. Experimental results indicate that the proposed approach can be used in generalizing street networks and retaining their density characteristics, connectivity and global structure.
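A minimal sketch of the importance computation, under the stated reading of the abstract: each street is a node of the dual graph, and its importance is the summed gravitational force k*m_i*m_j/d^2 over its multi-order neighbors, with hop distance as d. The masses and the toy graph are illustrative.

```python
from collections import deque

def hop_distances(adj, src, max_order):
    """BFS hop distances from src, limited to multi-order neighbors."""
    dist, queue = {src: 0}, deque([src])
    while queue:
        u = queue.popleft()
        if dist[u] == max_order:
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def gravitational_importance(adj, mass, max_order=3, k=1.0):
    """Importance of each node of the dual graph as the summed force
    k * m_i * m_j / d^2 over its multi-order neighbors (d = hop distance)."""
    return {i: sum(k * mass[i] * mass[j] / d ** 2
                   for j, d in hop_distances(adj, i, max_order).items() if j != i)
            for i in adj}

# toy dual graph: nodes are streets, edges mean "these streets intersect"
adj = {"A": ["B", "C"], "B": ["A", "C", "D"], "C": ["A", "B"], "D": ["B"]}
mass = {"A": 2.0, "B": 3.0, "C": 1.5, "D": 1.0}  # e.g. street lengths (invented)
print(gravitational_importance(adj, mass))
```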
The influence of environmental conditions on safety management in hospitals: a qualitative study.
Alingh, Carien W; van Wijngaarden, Jeroen D H; Huijsman, Robbert; Paauwe, Jaap
2018-05-02
Hospitals are confronted with increasing safety demands from a diverse set of stakeholders, including governmental organisations, professional associations, health insurance companies, patient associations and the media. However, little is known about the effects of these institutional and competitive pressures on hospital safety management. Previous research has shown that organisations generally shape their safety management approach along the lines of control- or commitment-based management. Using a heuristic framework, based on the contextually-based human resource theory, we analysed how environmental pressures affect the safety management approach used by hospitals. A qualitative study was conducted into hospital care in the Netherlands. Five hospitals were selected for participation, based on organisational characteristics as well as variation in their reputation for patient safety. We interviewed hospital managers and staff with a central role in safety management. A total of 43 semi-structured interviews were conducted with 48 respondents. The heuristic framework was used as an initial model for analysing the data, though new codes emerged from the data as well. In order to ensure safe care delivery, institutional and competitive stakeholders often impose detailed safety requirements, strong forces for compliance and growing demands for accountability. As a consequence, hospitals experience a decrease in the room to manoeuvre. Hence, organisations increasingly choose a control-based management approach to make sure that safety demands are met. In contrast, in case of more abstract safety demands and an organisational culture which favours patient safety, hospitals generally experience more leeway. This often results in a stronger focus on commitment-based management. Institutional and competitive conditions as well as strategic choices that hospitals make have resulted in various combinations of control- and commitment-based safety management. A balanced approach is required. A strong focus on control-based management generates extrinsic motivation in employees but may, at the same time, undermine or even diminish intrinsic motivation to work on patient safety. Emphasising commitment-based management may, in contrast, strengthen intrinsic motivation but increases the risk of priorities being set elsewhere. Currently, external pressures frequently lead to the adoption of control-based management. A balanced approach requires a shift towards more trust-based safety demands.
Data base design for a worldwide multicrop information system
NASA Technical Reports Server (NTRS)
Driggers, W. G.; Downs, J. M.; Hickman, J. R.; Packard, R. L. (Principal Investigator)
1979-01-01
A description of the USDA Application Test System data base design approach and resources is presented. The data are described in detail by category, with emphasis on those characteristics which most influenced the design. It was concluded that the use of a generalized data base in support of crop assessment is a sound concept. The IDMS11 minicomputer-based system is recommended for this purpose.
A generalized locomotion CPG architecture based on oscillatory building blocks.
Yang, Zhijun; França, Felipe M G
2003-07-01
Neural oscillation is one of the most extensively investigated topics of artificial neural networks. Scientific approaches to the functionalities of both natural and artificial intelligences are strongly related to mechanisms underlying oscillatory activities. This paper concerns itself with the assumption of the existence of central pattern generators (CPGs), which are the plausible neural architectures with oscillatory capabilities, and presents a discrete and generalized approach to the functionality of locomotor CPGs of legged animals. Based on scheduling by multiple edge reversal (SMER), a primitive and deterministic distributed algorithm, it is shown how oscillatory building block (OBB) modules can be created and, hence, how OBB-based networks can be formulated as asymmetric Hopfield-like neural networks for the generation of complex coordinated rhythmic patterns observed among pairs of biological motor neurons working during different gait patterns. It is also shown that the resulting Hopfield-like network possesses the property of reproducing the whole spectrum of different gaits intrinsic to the target locomotor CPGs. Although the new approach is not restricted to the understanding of the neurolocomotor system of any particular animal, hexapodal and quadrupedal gait patterns are chosen as illustrations given the wide interest expressed by the ongoing research in the area.
Brindis, Ralph G; Douglas, Pamela S; Hendel, Robert C; Peterson, Eric D; Wolk, Michael J; Allen, Joseph M; Patel, Manesh R; Raskin, Ira E; Hendel, Robert C; Bateman, Timothy M; Cerqueira, Manuel D; Gibbons, Raymond J; Gillam, Linda D; Gillespie, John A; Hendel, Robert C; Iskandrian, Ami E; Jerome, Scott D; Krumholz, Harlan M; Messer, Joseph V; Spertus, John A; Stowers, Stephen A
2005-10-18
Under the auspices of the American College of Cardiology Foundation (ACCF) and the American Society of Nuclear Cardiology (ASNC), an appropriateness review was conducted for radionuclide cardiovascular imaging (RNI), specifically gated single-photon emission computed tomography myocardial perfusion imaging (SPECT MPI). The review assessed the risks and benefits of the imaging test for several indications or clinical scenarios and scored them based on a scale of 1 to 9, where the upper range (7 to 9) implies that the test is generally acceptable and is a reasonable approach, and the lower range (1 to 3) implies that the test is generally not acceptable and is not a reasonable approach. The mid range (4 to 6) implies that the test may be generally acceptable and may be a reasonable approach for the indication. The indications for this review were primarily drawn from existing clinical practice guidelines and modified based on discussion by the ACCF Appropriateness Criteria Working Group and the Technical Panel members who rated the indications. The method for this review was based on the RAND/UCLA approach for evaluating appropriateness, which blends scientific evidence and practice experience. A modified Delphi technique was used to obtain first- and second-round ratings of 52 clinical indications. The ratings were done by a Technical Panel with diverse membership, including nuclear cardiologists, referring physicians (including an echocardiographer), health services researchers, and a payer (chief medical officer). These results are expected to have a significant impact on physician decision making and performance, reimbursement policy, and future research directions. Periodic assessment and updating of criteria will be undertaken as needed.
A general method to correct PET data for tissue metabolites using a dual-scan approach.
Gunn, R N; Yap, J T; Wells, P; Osman, S; Price, P; Jones, T; Cunningham, V J
2000-04-01
This article presents and analyses a general method of correcting for the presence of radiolabeled metabolites from a parent radiotracer in tissue during PET scanning. The method is based on a dual-scan approach, i.e., a parent scan together with an independent supplementary scan in which the radiolabeled metabolite of interest itself is administered. The method corrects for the presence of systemically derived radiolabeled metabolite delivered to the tissues of interest through the blood. Data from the supplementary scan are analyzed to obtain the tissue impulse response function for the metabolite. The time course of the radiolabeled metabolite in plasma in the parent scan is convolved with its tissue impulse response function to derive a correction term. This is not a simple subtraction technique but one that takes account of the different time-activity curves of the radiolabeled metabolite in the two scans. The method, its implications, and its limitations are discussed with respect to [11C]thymidine and its principal metabolite 11CO2. The general method, based on a dual-scan approach, can be used to correct for radiolabeled metabolites in tissues of interest during PET scanning. The correction accounts for radiolabeled metabolites that are derived systemically and delivered to the tissues of interest through the blood.
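A minimal numerical sketch of the correction step, assuming uniformly sampled time frames (array names are illustrative; h_met is the metabolite tissue impulse response estimated from the supplementary scan):

```python
import numpy as np

def metabolite_correction(c_tissue, p_met, h_met, dt):
    """Correct a measured tissue curve for systemically derived metabolite.

    c_tissue: measured tissue time-activity curve (parent scan)
    p_met:    metabolite plasma curve during the parent scan
    h_met:    metabolite tissue impulse response (from the supplementary scan)
    dt:       frame duration; the discrete convolution approximates the integral
    """
    correction = np.convolve(p_met, h_met)[: len(c_tissue)] * dt
    return c_tissue - correction
```

This reflects the abstract's point that the correction is a convolution-based term, not a simple subtraction of a scaled metabolite curve.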
Modelling uncertainty in incompressible flow simulation using Galerkin based generalized ANOVA
NASA Astrophysics Data System (ADS)
Chakraborty, Souvik; Chowdhury, Rajib
2016-11-01
This paper presents a new algorithm, referred to here as Galerkin based generalized analysis of variance decomposition (GG-ANOVA), for modelling input uncertainties and their propagation in incompressible fluid flow. The proposed approach utilizes ANOVA to represent the unknown stochastic response. Further, the unknown component functions of ANOVA are represented using the generalized polynomial chaos expansion (PCE). The resulting functional form obtained by coupling the ANOVA and PCE is substituted into the stochastic Navier-Stokes equation (NSE), and Galerkin projection is employed to decompose it into a set of coupled deterministic 'Navier-Stokes alike' equations. Temporal discretization of the set of coupled deterministic equations is performed by employing the Adams-Bashforth scheme for the convective term and the Crank-Nicolson scheme for the diffusion term. Spatial discretization is performed by employing a finite difference scheme. Implementation of the proposed approach has been illustrated by two examples. In the first example, a stochastic ordinary differential equation has been considered. This example illustrates the performance of the proposed approach as the nature of the random variable changes. Furthermore, the convergence characteristics of GG-ANOVA are also demonstrated. The second example investigates flow through a micro channel. Two case studies, namely the stochastic Kelvin-Helmholtz instability and the stochastic vortex dipole, have been investigated. For all the problems, results obtained using GG-ANOVA are in excellent agreement with benchmark solutions.
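A toy sketch of the temporal scheme applied to each deterministic 'Navier-Stokes alike' mode, on a 1D periodic domain (second-order Adams-Bashforth for convection, Crank-Nicolson for diffusion; the discretization choices here are illustrative):

```python
import numpy as np

def imex_step(u, N_prev, nu, dx, dt):
    """One AB2 (convection) / Crank-Nicolson (diffusion) step for u_t + u u_x = nu u_xx."""
    N_u = u * (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)    # convective term
    lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2   # diffusive term
    rhs = u - dt * (1.5 * N_u - 0.5 * N_prev) + 0.5 * dt * nu * lap
    # implicit half of Crank-Nicolson solved diagonally in Fourier space
    kwav = 2 * np.pi * np.fft.fftfreq(u.size, d=dx)
    u_new = np.real(np.fft.ifft(np.fft.fft(rhs) / (1 + 0.5 * dt * nu * kwav**2)))
    return u_new, N_u
```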
Relational similarity-based model of data part 1: foundations and query systems
NASA Astrophysics Data System (ADS)
Belohlavek, Radim; Vychodil, Vilem
2017-10-01
We present a general rank-aware model of data which supports the handling of similarity in relational databases. The model is based on the assumption that in many cases it is desirable to replace equalities on values in data tables by similarity relations expressing degrees to which the values are similar. In this context, we study various phenomena which emerge in the model, including similarity-based queries and similarity-based data dependencies. The central notion in our model is that of a ranked data table over domains with similarities, which is our counterpart to the notion of a relation on a relation scheme from the classical relational model. Compared to other approaches which cover related problems, we do not propose a similarity-based or ranking module on top of the classical relational model. Instead, we generalize the very core of the model by replacing the classical, two-valued logic upon which the classical model is built by a more general logic involving a scale of truth degrees that, in addition to the classical truth degrees 0 and 1, contains intermediate truth degrees. While the classical truth degrees 0 and 1 represent nonequality and equality of values, and subsequently mismatch and match of queries, the intermediate truth degrees in the new model represent similarity of values and partial match of queries. Moreover, the truth functions of many-valued logical connectives in the new model serve to aggregate degrees of similarity. The presented approach is conceptually clean, logically sound, and retains most properties of the classical model while enabling us to employ new types of queries and data dependencies. Most importantly, similarity is not handled in an ad hoc way or by putting a "similarity module" atop the classical model; rather, it is consistently viewed as a notion that generalizes and replaces equality in the very core of the relational model. We present the fundamentals of the formal model and two equivalent query systems which are analogues of the classical relational algebra and of the domain relational calculus with range declarations. In the sequel to this paper, we deal with similarity-based dependencies.
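A minimal sketch of the flavor of similarity-based querying (illustrative only: a toy domain similarity and ranked answers carrying truth degrees):

```python
# Toy ranked data table: each query answer carries a truth degree in [0, 1].
def similar_age(a, b, scale=10.0):
    """Domain similarity replacing equality on ages (1 = equal, 0 = far apart)."""
    return max(0.0, 1.0 - abs(a - b) / scale)

rows = [{"name": "Ann", "age": 30}, {"name": "Bob", "age": 34}]

def query(rows, target_age):
    """Rank rows by the degree to which 'age is similar to target_age'."""
    ranked = [(similar_age(r["age"], target_age), r) for r in rows]
    return sorted(ranked, key=lambda t: -t[0])

print(query(rows, 31))   # Ann matches to degree 0.9, Bob to degree 0.7
```

A conjunction of several similarity conditions would be aggregated by a many-valued connective (e.g., a t-norm such as minimum or product) rather than by Boolean AND.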
Selecting Indicator Portfolios for Marine Species and Food Webs: A Puget Sound Case Study
Kershner, Jessi; Samhouri, Jameal F.; James, C. Andrew; Levin, Phillip S.
2011-01-01
Ecosystem-based management (EBM) has emerged as a promising approach for maintaining the benefits humans want and need from the ocean, yet concrete approaches for implementing EBM remain scarce. A key challenge lies in the development of indicators that can provide useful information on ecosystem status and trends, and assess progress towards management goals. In this paper, we describe a generalized framework for the methodical and transparent selection of ecosystem indicators. We apply the framework to the second largest estuary in the United States – Puget Sound, Washington – where one of the most advanced EBM processes is currently underway. Rather than introduce a new method, this paper integrates a variety of familiar approaches into one step-by-step approach that will lead to more consistent and reliable reporting on ecosystem condition. Importantly, we demonstrate how a framework linking indicators to policy goals, as well as a clearly defined indicator evaluation and scoring process, can result in a portfolio of useful and complementary indicators based on the needs of different users (e.g., policy makers and scientists). Although the set of indicators described in this paper is specific to marine species and food webs, we provide a general approach that could be applied to any set of management objectives or ecological system. PMID:21991305
Relativistic GLONASS and geodesy
NASA Astrophysics Data System (ADS)
Mazurova, E. M.; Kopeikin, S. M.; Karpik, A. P.
2016-12-01
GNSS technology is playing a major role in civil, industrial and scientific applications. Nowadays, there are two fully functional GNSS: the American GPS and the Russian GLONASS. Their data processing algorithms have historically been based on the Newtonian theory of space and time, with only a few relativistic effects taken into account as small corrections preventing the system from degradation over a fairly long time. The continuously growing accuracy of geodetic measurements and atomic clocks suggests reconsidering the overall approach to the GNSS theoretical model and basing it on the Einstein theory of general relativity. This is an essentially more challenging but fundamentally consistent theoretical approach to relativistic space geodesy. In this paper, we overview the basic principles of the relativistic GNSS model and explain the advantages of such a system for GLONASS and other positioning systems. Keywords: relativistic GLONASS, Einstein theory of general relativity.
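As a concrete illustration of the relativistic effects involved (a standard textbook estimate, not a result of the paper): the fractional frequency offset of a satellite clock relative to a ground clock combines gravitational blueshift and velocity time dilation,

$$\frac{\Delta f}{f} \;\approx\; \frac{\Phi_{\mathrm{sat}} - \Phi_{\mathrm{grd}}}{c^{2}} \;-\; \frac{v_{\mathrm{sat}}^{2} - v_{\mathrm{grd}}^{2}}{2c^{2}},$$

which for GPS-type orbits amounts to a net clock advance of roughly 38 microseconds per day; GLONASS, at a slightly lower altitude, experiences an effect of the same order. Left uncorrected, this drift would accumulate into kilometre-scale positioning errors within a single day.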
Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.
Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing
2017-01-01
Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
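A minimal sketch of the coverage-guided genetic algorithm (the harness run_target is hypothetical and stubbed here; a real setup would instrument the protocol implementation and return the covered basic blocks):

```python
import random

def run_target(message):
    # Hypothetical instrumentation harness: returns the set of basic blocks
    # covered when the implementation parses `message` (stubbed for the sketch).
    return set(message[:8])

def fitness(message):
    return len(run_target(message))          # more covered blocks = fitter input

def evolve(population, generations=100, mut_rate=0.05):
    for _ in range(generations):
        scored = sorted(population, key=fitness, reverse=True)
        parents = scored[: max(1, len(scored) // 2)]
        children = []
        for p in parents:
            child = bytearray(p)
            for i in range(len(child)):      # byte-level mutation
                if random.random() < mut_rate:
                    child[i] = random.randrange(256)
            children.append(bytes(child))
        population = parents + children
    return population

seeds = [bytes(random.randrange(256) for _ in range(32)) for _ in range(20)]
best = max(evolve(seeds), key=fitness)
```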
Andrić, Filip; Héberger, Károly
2015-02-06
Lipophilicity (logP) is one of the most studied and most frequently used fundamental physicochemical properties. At present there are several possibilities for its quantitative expression, and many of them stem from chromatographic experiments. Numerous attempts have been made to compare different computational methods, chromatographic methods vs. computational approaches, as well as chromatographic methods and the direct shake-flask procedure, without definite results, or with findings that are not generally accepted. In the present work, numerous chromatographically derived lipophilicity measures, in combination with diverse computational methods, were ranked and clustered using novel variable discrimination and ranking approaches based on the sum of ranking differences and the generalized pair correlation method. Available literature logP data measured on HILIC and classical reversed-phase systems, combining different classes of compounds, were compared with the most frequently used multivariate data analysis techniques (principal component and hierarchical cluster analysis) as well as with the conclusions in the original sources. Chromatographic lipophilicity measures obtained under typical reversed-phase conditions outperform the majority of computationally estimated logPs. Conversely, in the case of HILIC none of the many proposed chromatographic indices outperforms any of the computationally assessed logPs; only two of them (log kmin and kmin) may be selected as recommended chromatographic lipophilicity measures. The two ranking approaches, sum of ranking differences and the generalized pair correlation method, although based on different backgrounds, provide highly similar variable orderings and groupings, leading to the same conclusions. Copyright © 2015. Published by Elsevier B.V.
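A minimal sketch of the sum of ranking differences (SRD) computation, assuming the common choice of the row-wise average as the reference ranking (the original studies may use a different reference or add a validation step against random rankings):

```python
import numpy as np

def srd(X):
    """Sum of ranking differences per method.

    X: objects x methods matrix (e.g., compounds x lipophilicity measures).
    Smaller SRD = ordering closer to the reference ordering.
    """
    ref_rank = X.mean(axis=1).argsort().argsort()       # ranks of the row averages
    ranks = X.argsort(axis=0).argsort(axis=0)           # column-wise ranks
    return np.abs(ranks - ref_rank[:, None]).sum(axis=0)
```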
Massa, K; Olsen, A; Sheshe, A; Ntakamulenga, R; Ndawi, B; Magnussen, P
2009-11-01
Control programmes generally use a school-based strategy of mass drug administration to reduce morbidity of schistosomiasis and soil-transmitted helminthiasis (STH) in school-aged populations. The success of school-based programmes depends on treatment coverage. The community-directed treatment (ComDT) approach has been implemented in the control of onchocerciasis and lymphatic filariasis in Africa and improves treatment coverage. This study compared the treatment coverage between the ComDT approach and the school-based treatment approach, where non-enrolled school-aged children were invited for treatment, in the control of schistosomiasis and STH among enrolled and non-enrolled school-aged children. Coverage during the first treatment round among enrolled children was similar for the two approaches (ComDT: 80.3% versus school: 82.1%, P=0.072). However, for the non-enrolled children the ComDT approach achieved a significantly higher coverage than the school-based approach (80.0 versus 59.2%, P<0.001). Similar treatment coverage levels were attained at the second treatment round. Again, equal levels of treatment coverage were found between the two approaches for the enrolled school-aged children, while the ComDT approach achieved a significantly higher coverage in the non-enrolled children. The results of this study showed that the ComDT approach can obtain significantly higher treatment coverage among the non-enrolled school-aged children compared to the school-based treatment approach for the control of schistosomiasis and STH.
Computer-based, Jeopardy™-like game in general chemistry for engineering majors
NASA Astrophysics Data System (ADS)
Ling, S. S.; Saffre, F.; Kadadha, M.; Gater, D. L.; Isakovic, A. F.
2013-03-01
We report on the design of a Jeopardy™-like computer game for enhancing the learning of general chemistry by engineering majors. While we examine several parameters of student achievement and attitude, our primary concern is addressing the motivation of students, which tends to be low in traditionally run chemistry lectures. The effect of game-playing is tested by comparing a paper-based game quiz, which constitutes the control group, with a computer-based game quiz, constituting the treatment group. Computer-based game quizzes are Java™-based applications that students run once a week in the second part of the last lecture of the week. The overall effectiveness of the semester-long program is measured through pretest-posttest conceptual testing of general chemistry. The objective of this research is to determine to what extent this "gamification" of the course delivery and course evaluation processes may be beneficial to undergraduates' learning of science in general, and chemistry in particular. We present data addressing gender-specific differences in performance, as well as background (pre-college) levels of general science and chemistry preparation. We outline a plan for extending this approach to general physics courses and to modern-science-driven electives, and we offer live, in-lecture examples of our computer gaming experience. We acknowledge support from Khalifa University, Abu Dhabi.
Universal Quantification in a Constraint-Based Planner
NASA Technical Reports Server (NTRS)
Golden, Keith; Frank, Jeremy; Clancy, Daniel (Technical Monitor)
2002-01-01
Constraints and universal quantification are both useful in planning, but handling universally quantified constraints presents some novel challenges. We present a general approach to proving the validity of universally quantified constraints. The approach essentially consists of checking that the constraint is not violated by any member of the universe. We show that this approach can sometimes be applied even when variable domains are infinite, and we present some useful special cases where this can be done efficiently.
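When the universe is finite, the validity check is direct; a minimal sketch:

```python
# A universally quantified constraint holds iff no member of the universe violates it.
def holds_universally(constraint, universe):
    return all(constraint(x) for x in universe)

# e.g. "every file is smaller than 1 MB"
print(holds_universally(lambda size: size < 2**20, [512, 2048, 900_000]))  # True
```

The interesting cases discussed in the paper are those where the universe is infinite and the check must still be discharged, sometimes symbolically.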
Generic approach to access barriers in dehydrogenation reactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Liang; Vilella, Laia; Abild-Pedersen, Frank
The introduction of linear energy correlations, which explicitly relate adsorption energies of reaction intermediates and activation energies in heterogeneous catalysis, has proven to be a key component in the computational search for new and promising catalysts. A simple linear approach to estimate activation energies still requires a significant computational effort. To simplify this process and at the same time incorporate the need for enhanced complexity of reaction intermediates, we generalize a recently proposed approach that evaluates transition state energies based entirely on bond-order conservation arguments. Here, we show that similar variation of the local electronic structure along the reaction coordinate introduces a set of general functions that accurately define the transition state energy and are transferable to other reactions with similar bonding nature. With such an approach, more complex reaction intermediates can be targeted with an insignificant increase in computational effort and without loss of accuracy.
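The linear energy correlations referred to above are commonly written in Brønsted-Evans-Polanyi form (a standard relation, stated here for orientation rather than quoted from the paper):

$$E_{a} \;=\; \gamma\,\Delta E \;+\; \xi,$$

where the activation energy E_a is a linear function of the reaction energy ΔE, with slope γ and offset ξ fitted per reaction family; the bond-order-conservation approach described here aims to replace such per-family fits with transferable functions.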
Positional cloning in maize (Zea mays subsp. mays, Poaceae)
Gallavotti, Andrea; Whipple, Clinton J.
2015-01-01
• Premise of the study: Positional (or map-based) cloning is a common approach to identify the molecular lesions causing mutant phenotypes. Despite its large and complex genome, positional cloning has been recently shown to be feasible in maize, opening up a diverse collection of mutants to molecular characterization. • Methods and Results: Here we outline a general protocol for positional cloning in maize. While the general strategy is similar to that used in other plant species, we focus on the unique resources and approaches that should be considered when applied to maize mutants. • Conclusions: Positional cloning approaches are appropriate for maize mutants and quantitative traits, opening up to molecular characterization the large array of genetic diversity in this agronomically important species. The cloning approach described should be broadly applicable to other species as more plant genomes become available. PMID:25606355
Parsimonious nonstationary flood frequency analysis
NASA Astrophysics Data System (ADS)
Serago, Jake M.; Vogel, Richard M.
2018-02-01
There is now widespread awareness of the impact of anthropogenic influences on extreme floods (and droughts) and thus an increasing need for methods to account for such influences when estimating a frequency distribution. We introduce a parsimonious approach to nonstationary flood frequency analysis (NFFA) based on a bivariate regression equation which describes the relationship between annual maximum floods, x, and an exogenous variable which may explain the nonstationary behavior of x. The conditional mean, variance and skewness of both x and y = ln (x) are derived, and combined with numerous common probability distributions including the lognormal, generalized extreme value and log Pearson type III models, resulting in a very simple and general approach to NFFA. Our approach offers several advantages over existing approaches including: parsimony, ease of use, graphical display, prediction intervals, and opportunities for uncertainty analysis. We introduce nonstationary probability plots and document how such plots can be used to assess the improved goodness of fit associated with a NFFA.
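A compact sketch of this regression backbone, in generic notation (z denotes the exogenous covariate; the exact parameterization in the paper may differ):

$$y_t = \ln x_t, \qquad y_t = a + b\,z_t + \varepsilon_t,$$

$$\mathbb{E}[\,y_t \mid z_t\,] = a + b\,z_t, \qquad \operatorname{Var}[\,y_t \mid z_t\,] = \sigma_y^{2}\bigl(1 - \rho_{yz}^{2}\bigr),$$

so the conditional moments of y (and, through them, of x) vary with z_t and can be inserted into the lognormal, GEV or log-Pearson type III quantile functions to yield nonstationary flood quantiles and prediction intervals.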
Organic Chemistry and the Native Plants of the Sonoran Desert: Conversion of Jojoba Oil to Biodiesel
ERIC Educational Resources Information Center
Daconta, Lisa V.; Minger, Timothy; Nedelkova, Valentina; Zikopoulos, John N.
2015-01-01
A new, general approach to the organic chemistry laboratory is introduced that is based on learning about organic chemistry techniques and research methods by exploring the natural products found in local native plants. As an example of this approach for the Sonoran desert region, the extraction of jojoba oil and its transesterification to…
Algorithmic Approaches for Place Recognition in Featureless, Walled Environments
2015-01-01
[Front-matter residue from the report: an acronym list (IMU: inertial measurement unit; LIDAR: light detection and ranging; RANSAC: random sample consensus; SLAM: simultaneous localization and mapping; SUSAN: smallest univalue segment assimilating nucleus) and a list of figures, including "Typical input image for general junction based algorithm" and "Short exposure image of hallway junction taken by LIDAR".] The discipline of simultaneous localization and mapping (SLAM) has been studied intensively over the past several years. Many technical approaches…
ERIC Educational Resources Information Center
Southern Connecticut State Coll., New Haven. Center for Interdisciplinary Creativity.
In this collection of papers Harold G. Cassidy outlines the conceptual framework for the conference which is based on a systems approach to development of practical action programs in education. A basic model is presented as a basis for shifting from the post-crisis to the pre-crisis approach to curriculum development and educational…
ERIC Educational Resources Information Center
Martinez, Angel; Lasser, Jon
2013-01-01
The process of creating child-developed board games in a counseling setting may promote social, emotional, and behavioral development in children. Using this creative approach, counselors can actively work with children to address referred concerns and build skills that may generalize outside of counseling sessions. A description of the method is…
Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology.
Schaff, James C; Gao, Fei; Li, Ye; Novak, Igor L; Slepchenko, Boris M
2016-12-01
Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium 'sparks' as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell.
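A toy sketch of one hybrid step under strong simplifications (1D periodic domain, explicit finite-difference diffusion for the continuous field, and a Poisson/tau-leap stand-in for the particle-based stochastic reactions; names and rates are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def hybrid_step(c, n_stoch, D, k_on, dx, dt):
    """Advance the deterministic field c and the discrete counts n_stoch by dt."""
    # deterministic part: explicit diffusion of the continuous concentration
    c = c + dt * D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    # stochastic part: discrete binding events, rate k_on * c per voxel
    events = rng.poisson(np.maximum(k_on * c * dt, 0.0))
    return c, n_stoch + events
```

A production solver such as the one described couples a full PDE solver with Smoldyn's particle-based simulator instead of these stand-ins.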
On the role of general system theory for functional neuroimaging.
Stephan, Klaas Enno
2004-12-01
One of the most important goals of neuroscience is to establish precise structure-function relationships in the brain. Since the 19th century, a major scientific endeavour has been to associate structurally distinct cortical regions with specific cognitive functions. This was traditionally accomplished by correlating microstructurally defined areas with lesion sites found in patients with specific neuropsychological symptoms. Modern neuroimaging techniques with high spatial resolution have promised an alternative approach, enabling non-invasive measurements of regionally specific changes of brain activity that are correlated with certain components of a cognitive process. Reviewing classic approaches towards brain structure-function relationships that are based on correlational approaches, this article argues that these approaches are not sufficient to provide an understanding of the operational principles of a dynamic system such as the brain but must be complemented by models based on general system theory. These models reflect the connectional structure of the system under investigation and emphasize context-dependent couplings between the system elements in terms of effective connectivity. The usefulness of system models whose parameters are fitted to measured functional imaging data for testing hypotheses about structure-function relationships in the brain and their potential for clinical applications is demonstrated by several empirical examples.
A stochastic approach for automatic generation of urban drainage systems.
Möderl, M; Butler, D; Rauch, W
2009-01-01
Typically, performance evaluation of newly developed methodologies is based on one or more case studies. The investigation of multiple real-world case studies is tedious and time consuming. Moreover, extrapolating conclusions from individual investigations to a general basis is arguable and sometimes even wrong. In this article, a stochastic approach is presented to evaluate newly developed methodologies on a broader basis. For this approach, the Matlab tool "Case Study Generator" was developed, which automatically generates a variety of different virtual urban drainage systems using boundary conditions, e.g., length of the urban drainage system, slope of the catchment surface, etc., as input. The layout of the sewer system is based on an adapted Galton-Watson branching process. The sub-catchments are allocated using a digital terrain model. Sewer system components are designed according to standard values. In total, 10,000 different virtual case studies of urban drainage systems were generated and simulated. The simulation results were then evaluated using a performance indicator for surface flooding. Comparison between the results of the virtual case studies and those of two real-world case studies indicates the promise of the method. The novelty of the approach is that it allows more general conclusions to be drawn, in contrast to traditional evaluations based on a few case studies.
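A minimal sketch of the branching idea (offspring distribution and size cap invented for illustration; the actual generator additionally allocates sub-catchments from a terrain model and designs the components):

```python
import random

def generate_layout(max_pipes=50, offspring_weights=(0.3, 0.5, 0.2), seed=1):
    """Random tree-shaped sewer layout via a Galton-Watson branching process.

    Returns edges (child_pipe -> parent_pipe), growing upstream from outlet 0.
    """
    random.seed(seed)
    edges, frontier, next_id = [], [0], 1
    while frontier and next_id < max_pipes:
        node = frontier.pop(0)
        n_children = random.choices([0, 1, 2], weights=offspring_weights)[0]
        for _ in range(n_children):
            if next_id >= max_pipes:
                break
            edges.append((next_id, node))
            frontier.append(next_id)
            next_id += 1
    return edges
```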
Li, Hong Zhi; Tao, Wei; Gao, Ting; Li, Hui; Lu, Ying Hua; Su, Zhong Min
2011-01-01
We propose a generalized regression neural network (GRNN) approach based on grey relational analysis (GRA) and principal component analysis (PCA) (GP-GRNN) to improve the accuracy of density functional theory (DFT) calculations of the homolysis bond dissociation energies (BDE) of the Y-NO bond. As a demonstration, this combination of quantum chemistry calculation with the GP-GRNN approach has been applied to evaluate the homolysis BDE of 92 Y-NO organic molecules. The results show that the full-descriptor GRNN without GRA and PCA (F-GRNN) and with GRA (G-GRNN) approaches reduce the root-mean-square (RMS) error of the calculated homolysis BDE of the 92 organic molecules from 5.31 to 0.49 and 0.39 kcal mol(-1), respectively, for the B3LYP/6-31G (d) calculation. The newly developed GP-GRNN approach then further reduces the RMS error to 0.31 kcal mol(-1). Thus, the GP-GRNN correction on top of B3LYP/6-31G (d) can improve the accuracy of calculating the homolysis BDE in quantum chemistry and can predict homolysis BDE which cannot be obtained experimentally.
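For orientation, the core of a GRNN is a Nadaraya-Watson kernel regression over the training set, with the kernel width sigma as the only free parameter; a minimal sketch (in GP-GRNN, descriptor selection by GRA and compression by PCA precede this step):

```python
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=1.0):
    """Predict y at descriptor vector x as a Gaussian-kernel weighted average."""
    d2 = np.sum((X_train - x) ** 2, axis=1)     # squared distances to training patterns
    w = np.exp(-d2 / (2.0 * sigma**2))          # pattern-layer activations
    return np.dot(w, y_train) / np.sum(w)       # normalized weighted average
```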
Meerpohl, Joerg J; Schell, Lisa K; Bassler, Dirk; Gallus, Silvano; Kleijnen, Jos; Kulig, Michael; La Vecchia, Carlo; Marušić, Ana; Ravaud, Philippe; Reis, Andreas; Schmucker, Christine; Strech, Daniel; Urrútia, Gerard; Antes, Gerd
2015-01-01
Background Dissemination bias in clinical research severely impedes informed decision-making not only for healthcare professionals and patients, but also for funders, research ethics committees, regulatory bodies and other stakeholder groups that make health-related decisions. Decisions based on incomplete and biased evidence can not only harm people, but may also have huge financial implications by wasting resources on ineffective or harmful diagnostic and therapeutic measures, and on unnecessary research. Owing to the involvement of multiple stakeholders, it remains easy for any single group to assign responsibility for resolving the problem to others. Objective To develop evidence-informed general and targeted recommendations addressing the various stakeholders involved in knowledge generation and dissemination to help overcome the problem of dissemination bias on the basis of previously collated evidence. Methods Based on findings from systematic reviews, document analyses and surveys, we developed general and targeted draft recommendations. During a 2-day workshop in summer 2013, these draft recommendations were discussed with external experts and key stakeholders, and refined following a rigorous and transparent methodological approach. Results Four general, overarching recommendations applicable to all or most stakeholder groups were formulated, addressing (1) awareness raising, (2) implementation of targeted recommendations, (3) trial registration and results posting, and (4) systematic approaches to evidence synthesis. These general recommendations are complemented and specified by 47 targeted recommendations tailored towards funding agencies, pharmaceutical and device companies, research institutions, researchers (systematic reviewers and trialists), research ethics committees, trial registries, journal editors and publishers, regulatory agencies, benefit (health technology) assessment institutions and legislators. Conclusions Despite various recent examples of dissemination bias and several initiatives to reduce it, the problem of dissemination bias has not been resolved. Tailored recommendations based on a comprehensive approach will hopefully help increase transparency in biomedical research by overcoming the failure to disseminate negative findings. PMID:25943371
The educational and awareness purposes of the Paideia approach for heritage management
NASA Astrophysics Data System (ADS)
Carbone, F.; Oosterbeek, L.; Costa, C.
2012-06-01
The need to raise awareness among communities about the challenge of resource use - and, more generally, about the principles of sustainability - is the reason why the United Nations General Assembly proclaimed, in December 2002, the United Nations Decade of Education for Sustainable Development, 2005-2014 (DESD). For operators and managers of cultural and natural heritage, a profound challenge lies in transmitting the content of scientific knowledge to the general public, in order to empower everyone in the preservation of cultural and natural resources and to raise awareness of the potential that mankind has at its disposal. In this context, the application of the PAIDEIA APPROACH to the management of cultural heritage is the key to the recovery of the socio-economic values intrinsic to these resources. This approach to management is based on the enhancement of cultural (namely archaeological) and natural heritage for social benefit, and it involves the tourist trade as a vehicle of knowledge transmission, intercultural dialogue and socio-economic sustainable development.
Blangiardo, Marta; Finazzi, Francesco; Cameletti, Michela
2016-08-01
Exposure to high levels of air pollutant concentration is known to be associated with respiratory problems which can translate into higher morbidity and mortality rates. The link between air pollution and population health has mainly been assessed considering air quality and hospitalisation or mortality data. However, this approach limits the analysis to individuals characterised by severe conditions. In this paper we evaluate the link between air pollution and respiratory diseases using general practice drug prescriptions for chronic respiratory diseases, which allow conclusions to be drawn about the general population. We propose a two-stage statistical approach: in the first stage, we specify a space-time model to estimate the monthly NO2 concentration, integrating several data sources characterised by different spatio-temporal resolutions; in the second stage, we link the concentration to the β2-agonists prescribed monthly by general practices in England and model the prescription rates through a small-area approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
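A generic form of the two stages (illustrative notation; the paper's exact specification may differ):

$$\text{Stage 1:}\quad \mathrm{NO}_2(s,t) = \mu(s,t) + \omega(s,t) + \varepsilon(s,t),$$

with a covariate-driven mean term μ, a spatio-temporal random field ω and measurement error ε;

$$\text{Stage 2:}\quad y_{it} \sim \mathrm{Poisson}(E_{it}\,\rho_{it}), \qquad \log \rho_{it} = \alpha + \beta\,\widehat{\mathrm{NO}}_{2}(s_i,t) + u_i,$$

where y_it are the monthly β2-agonist prescriptions of practice i, E_it the expected counts, and u_i a small-area random effect; β then measures the association between estimated exposure and prescription rates.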
Bioinspired Methodology for Artificial Olfaction
Raman, Baranidharan; Hertz, Joshua L.; Benkstein, Kurt D.; Semancik, Steve
2008-01-01
Artificial olfaction is a potential tool for noninvasive chemical monitoring. Application of “electronic noses” typically involves recognition of “pretrained” chemicals, while long-term operation and generalization of training to allow chemical classification of “unknown” analytes remain challenges. The latter analytical capability is critically important, as it is unfeasible to pre-expose the sensor to every analyte it might encounter. Here, we demonstrate a biologically inspired approach where the recognition and generalization problems are decoupled and resolved in a hierarchical fashion. Analyte composition is refined in a progression from general (e.g., target is a hydrocarbon) to precise (e.g., target is ethane), using highly optimized response features for each step. We validate this approach using a MEMS-based chemiresistive microsensor array. We show that this approach, a unique departure from existing methodologies in artificial olfaction, allows the recognition module to better mitigate sensor-aging effects and to better classify unknowns, enhancing the utility of chemical sensors for real-world applications. PMID:18855409
Machine learning and computer vision approaches for phenotypic profiling.
Grys, Ben T; Lo, Dara S; Sahin, Nil; Kraus, Oren Z; Morris, Quaid; Boone, Charles; Andrews, Brenda J
2017-01-02
With recent advances in high-throughput, automated microscopy, there has been an increased demand for effective computational strategies to analyze large-scale, image-based data. To this end, computer vision approaches have been applied to cell segmentation and feature extraction, whereas machine-learning approaches have been developed to aid in phenotypic classification and clustering of data acquired from biological images. Here, we provide an overview of the commonly used computer vision and machine-learning methods for generating and categorizing phenotypic profiles, highlighting the general biological utility of each approach. © 2017 Grys et al.
Single-particle dynamics of the Anderson model: a local moment approach
NASA Astrophysics Data System (ADS)
Glossop, Matthew T.; Logan, David E.
2002-07-01
A non-perturbative local moment approach to single-particle dynamics of the general asymmetric Anderson impurity model is developed. The approach encompasses all energy scales and interaction strengths. It captures thereby strong coupling Kondo behaviour, including the resultant universal scaling behaviour of the single-particle spectrum; as well as the mixed valence and essentially perturbative empty orbital regimes. The underlying approach is physically transparent and innately simple, and as such is capable of practical extension to lattice-based models within the framework of dynamical mean-field theory.
Machine learning and computer vision approaches for phenotypic profiling
Morris, Quaid
2017-01-01
With recent advances in high-throughput, automated microscopy, there has been an increased demand for effective computational strategies to analyze large-scale, image-based data. To this end, computer vision approaches have been applied to cell segmentation and feature extraction, whereas machine-learning approaches have been developed to aid in phenotypic classification and clustering of data acquired from biological images. Here, we provide an overview of the commonly used computer vision and machine-learning methods for generating and categorizing phenotypic profiles, highlighting the general biological utility of each approach. PMID:27940887
A fuzzy case based reasoning tool for model based approach to rocket engine health monitoring
NASA Technical Reports Server (NTRS)
Krovvidy, Srinivas; Nolan, Adam; Hu, Yong-Lin; Wee, William G.
1992-01-01
In this system, we develop a fuzzy case-based reasoner that can build a case representation for several past detected anomalies, and we develop case retrieval methods, based on fuzzy sets, that can be used to index a relevant case when a new problem (case) is presented. The choice of fuzzy sets is justified by the uncertainty of the data. The new problem can then be solved using knowledge of the model along with the old cases. This system can then be used to generalize the knowledge from previous cases and use this generalization to refine the existing model definition. This, in turn, can help to detect failures using the model-based algorithms.
Design issues of a multimode interference-based 3-dB splitter.
Themistos, Christos; Rahman, B M Azizur
2002-11-20
We have investigated important issues such as the power loss, the loss imbalance, the fabrication tolerances, and the wavelength dependence for the design of a multimode interference-based 3-dB splitter on deeply etched InP waveguides under general, restricted, and symmetric interference mechanisms. For this investigation, we used the finite-element-based beam propagation approach. Results are presented.
Mathematical models for nonparametric inferences from line transect data
Burnham, K.P.; Anderson, D.R.
1976-01-01
A general mathematical theory of line transects is developed which supplies a framework for nonparametric density estimation based on either right-angle or sighting distances. The probability of observing a point given its right-angle distance (y) from the line is generalized to an arbitrary function g(y). Given only that g(0) = 1, it is shown that there are nonparametric approaches to density estimation using the observed right-angle distances. The model is then generalized to include sighting distances (r). Let f(y|r) be the conditional distribution of right-angle distance given sighting distance. It is shown that nonparametric estimation based only on sighting distances requires that we know the transformation of r given by f(0|r).
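For orientation, the classical consequence of this framework (standard line-transect theory, stated in the abstract's notation): the pdf of observed right-angle distances is f(y) = g(y) / ∫₀^W g(u) du, so g(0) = 1 gives f(0) = 1 / ∫₀^W g(u) du, and the density estimate from n detections along a transect of total length L is

$$\hat{D} \;=\; \frac{n\,\hat{f}(0)}{2L},$$

which is why nonparametric density estimation reduces to estimating f(0) from the observed distances.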
Phases, phase equilibria, and phase rules in low-dimensional systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frolov, T., E-mail: timfrol@berkeley.edu; Mishin, Y., E-mail: ymishin@gmu.edu
2015-07-28
We present a unified approach to the thermodynamic description of one-, two-, and three-dimensional phases and the phase transformations among them. The approach is based on a rigorous definition of a phase applicable to thermodynamic systems of any dimensionality. Within this approach, the same thermodynamic formalism can be applied to the description of phase transformations in bulk systems, interfaces, and line defects separating interface phases. For both lines and interfaces, we rigorously derive an adsorption equation, the phase coexistence equations, and other thermodynamic relations expressed in terms of generalized line and interface excess quantities. As a generalization of the Gibbs phase rule for bulk phases, we derive phase rules for lines and interfaces and predict the maximum number of phases that may coexist in systems of the respective dimensionality.
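For orientation, the bulk special case that the derived rules generalize is the classical Gibbs phase rule (a textbook relation, not quoted from the paper):

$$F = C - P + 2,$$

where F is the number of degrees of freedom, C the number of components and P the number of coexisting bulk phases.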
Nonlinear dynamics of laser systems with elements of a chaos: Advanced computational code
NASA Astrophysics Data System (ADS)
Buyadzhi, V. V.; Glushkov, A. V.; Khetselius, O. Yu; Kuznetsova, A. A.; Buyadzhi, A. A.; Prepelitsa, G. P.; Ternovsky, V. B.
2017-10-01
A general, uniform chaos-geometric computational approach to the analysis, modelling and prediction of the non-linear dynamics of quantum and laser systems (laser and quantum generator systems, etc.) with elements of deterministic chaos is briefly presented. The approach is based on advanced generalized techniques such as wavelet analysis, the multi-fractal formalism, the mutual information approach, correlation integral analysis, the false nearest neighbour algorithm, Lyapunov exponent analysis, the surrogate data method, and prediction models. We first present numerical data on the topological and dynamical invariants (in particular, the correlation, embedding and Kaplan-Yorke dimensions, the Lyapunov exponents, the Kolmogorov entropy and other parameters) for the dynamics of a laser system (a semiconductor GaAs/GaAlAs laser with retarded feedback) in chaotic and hyperchaotic regimes.
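As one concrete example of the invariants listed above, a minimal sketch of the Grassberger-Procaccia correlation integral for a scalar time series (the embedding dimension m and delay tau are illustrative choices):

```python
import numpy as np

def correlation_integral(x, m=3, tau=1, r=0.1):
    """C(r): fraction of embedded state-space pairs closer than r."""
    N = len(x) - (m - 1) * tau
    emb = np.column_stack([x[i * tau: i * tau + N] for i in range(m)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    iu = np.triu_indices(N, k=1)
    return 2.0 * np.count_nonzero(d[iu] < r) / (N * (N - 1))

# The correlation dimension is the slope of log C(r) versus log r
# over the scaling region.
```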
Dangel, Bärbel; Korporal, Johannes
2003-02-01
Activating nursing, based on the criteria of the long-term care insurance, may be understood as a second, specifically nursing-based approach to rehabilitation alongside medical rehabilitation. As characterized by the norms and guidelines of the long-term care insurance, activating nursing remains unspecific, yet it is defined as the general norm of practical nursing. A professional nursing definition of a specific concept is lacking, as is funding of nursing science. Establishing activating nursing as a nursing complement to medical rehabilitation within the framework of the long-term care insurance requires professional development and funding. Furthermore, more support in social law is necessary, which depends on a professional, nursing-science-based indication and intervention approach. The article develops such an approach - based on a study of the rehabilitation of people in need of care - and reflects on its implementation and acceptance by people in need of care.
Dutta, Shuchismita; Zardecki, Christine; Goodsell, David S.; Berman, Helen M.
2010-01-01
The Research Collaboratory for Structural Bioinformatics Protein Data Bank (RCSB PDB) supports scientific research and education worldwide by providing an essential resource of information on biomolecular structures. In addition to serving as a deposition, data-processing and distribution center for PDB data, the RCSB PDB offers resources and online materials that different audiences can use to customize their structural biology instruction. These include resources for general audiences that present macromolecular structure in the context of a biological theme, method-based materials for researchers who take a more traditional approach to the presentation of structural science, and materials that mix theme-based and method-based approaches for educators and students. Through these efforts the RCSB PDB aims to enable optimal use of structural data by researchers, educators and students designing and understanding experiments in biology, chemistry and medicine, and by general users making informed decisions about their life and health. PMID:20877496
Megger, Dominik A; Pott, Leona L; Rosowski, Kristin; Zülch, Birgit; Tautges, Stephanie; Bracht, Thilo; Sitek, Barbara
2017-01-01
Tandem mass tags (TMT) are usually introduced at the level of isolated proteins or peptides. Here, for the first time, we report the labeling of whole cells and a critical evaluation of its performance in comparison to conventional labeling approaches. The obtained results indicated that TMT protein labeling using intact cells is generally possible if it is coupled to a subsequent enrichment using an anti-TMT antibody. The quantitative results were similar to those obtained after labeling of isolated proteins, and both were found to be slightly complementary to peptide labeling. Furthermore, when using NHS-based TMT, no specificity towards cell surface proteins was observed in the case of cell labeling. In summary, the study revealed the first evidence of the general feasibility of TMT cell labeling and highlighted limitations of NHS-based labeling reagents. Future studies should therefore focus on the synthesis and investigation of membrane-impermeable TMTs to increase specificity towards cell surface proteins.
Chen, Lipeng; Borrelli, Raffaele; Zhao, Yang
2017-11-22
The dynamics of a coupled electron-boson system is investigated by employing a multitude of the Davydov D1 trial states, also known as the multi-D1 Ansatz, and a second trial state based on a superposition of time-dependent generalized coherent states (the GCS Ansatz). The two Ansätze are applied to study population dynamics in the spin-boson model and the Holstein molecular crystal model, and a detailed comparison with numerically exact results obtained by the (multilayer) multiconfiguration time-dependent Hartree method and the hierarchy equations of motion approach is drawn. It is found that the two methodologies proposed here improve significantly over the single D1 Ansatz, yielding quantitatively accurate results even in the critical cases of large energy biases and large transfer integrals. The two methodologies provide new, effective tools for the accurate, efficient simulation of many-body quantum dynamics, thanks to the relatively small number of parameters which characterize the electron-nuclear wave functions. The wave-function-based approaches are capable of tracking explicitly detailed bosonic dynamics, which is absent by construction in approaches based on the reduced density matrix. The efficiency and flexibility of our methods are also advantages compared with numerically exact approaches such as QUAPI and HEOM, especially at low temperatures and in the strong coupling regime.
Broad-based visual benefits from training with an integrated perceptual-learning video game.
Deveau, Jenni; Lovcik, Gary; Seitz, Aaron R
2014-06-01
Perception is the window through which we understand all information about our environment, and therefore deficits in perception due to disease, injury, stroke or aging can have significant negative impacts on individuals' lives. Research in the field of perceptual learning has demonstrated that vision can be improved in both normally seeing and visually impaired individuals; however, a limitation of most perceptual learning approaches is their emphasis on isolating particular mechanisms. In the current study, we adopted an integrative approach where the goal is not to achieve highly specific learning but instead to achieve general improvements to vision. We combined multiple perceptual learning approaches that have individually contributed to increasing the speed, magnitude and generality of learning into a perceptual-learning-based video game. Our results demonstrate broad-based benefits to vision in a healthy adult population. Transfer from the game includes improvements in acuity (measured with self-paced standard eye charts), improvement along the full contrast sensitivity function, and improvements in peripheral acuity and contrast thresholds. This type of custom video-game framework, built up from psychophysical approaches, takes advantage of the benefits of video-game training while maintaining a tight link to psychophysical designs that enable understanding of the mechanisms of perceptual learning; it has great potential both as a scientific tool and as a therapy to help improve vision. Copyright © 2014 Elsevier B.V. All rights reserved.
Duda, Piotr; Jaworski, Maciej; Rutkowski, Leszek
2018-03-01
One of the greatest challenges in data mining is related to the processing and analysis of massive data streams. Contrary to traditional static data mining problems, data streams require that each element be processed only once, that the amount of allocated memory be constant, and that the models incorporate changes in the investigated streams. The vast majority of available methods have been developed for data stream classification, and only a few of them attempt to solve regression problems, using various heuristic approaches. In this paper, we develop mathematically justified regression models working in a time-varying environment. More specifically, we study incremental versions of generalized regression neural networks, called IGRNNs, and we prove their tracking properties - weak (in probability) and strong (with probability one) convergence under various concept drift scenarios. First, we present the IGRNNs, based on Parzen kernels, for modeling stationary systems under nonstationary noise. Next, we extend our approach to modeling time-varying systems under nonstationary noise. We present several types of concept drift to be handled by our approach in such a way that weak and strong convergence holds under certain conditions. In a series of simulations, we compare our method with commonly used heuristic approaches, based on a forgetting mechanism or sliding windows, for dealing with concept drift. Finally, we apply our concept in a real-life scenario, solving the problem of currency exchange rate prediction.
On the Use of Kronecker Operators for the Solution of Generalized Stochastic Petri Nets
NASA Technical Reports Server (NTRS)
Ciardo, Gianfranco; Tilgner, Marco
1996-01-01
We discuss how to describe the Markov chain underlying a generalized stochastic Petri net using Kronecker operators on smaller matrices. We extend previous approaches by allowing both an extensive type of marking-dependent behavior for the transitions and the presence of immediate synchronizations. The derivation of the results is thoroughly formalized, including the use of Kronecker operators in the treatment of the vanishing markings and the computation of impulse-based reward measures. We use our techniques to analyze a model whose solution using conventional methods would fail because of the state-space explosion. In the conclusion, we point out ideas to parallelize our approach.
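A toy illustration of the idea (components, events and rates are invented for the example): the global rate matrix is assembled from Kronecker products of small per-component matrices, one term per local event and one product per synchronization, instead of building one monolithic matrix over the full state space.

```python
import numpy as np

def kron_all(mats):
    out = np.array([[1.0]])
    for m in mats:
        out = np.kron(out, m)
    return out

I2 = np.eye(2)
W_loc = np.array([[0.0, 1.0], [0.0, 0.0]])   # local event in component 1
W_s1  = np.array([[0.0, 0.0], [2.0, 0.0]])   # synchronizing event: effect on comp. 1
W_s2  = np.array([[0.0, 1.0], [0.0, 0.0]])   # same event: effect on comp. 2

R = kron_all([W_loc, I2]) + kron_all([W_s1, W_s2])   # global transition rates
Q = R - np.diag(R.sum(axis=1))                       # CTMC generator (rows sum to 0)
# In practice Q is kept in this Kronecker "descriptor" form, never materialized.
```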
To what extent information technology can be really useful in education?
NASA Astrophysics Data System (ADS)
Kalashnikov, N. P.; Olchak, A. S.; Scherbachev, O. V.
2017-01-01
The authors consider particular cases in which the generally beneficial introduction of information technologies into the educational process comes up against certain psychological limitations, turning its benefits into losses. The evolution of the approach to education, from traditional to IT-based, is traced. Examples are provided in which an exaggerated IT component of the educational process leads to evident losses in both the professional education and the general cultural background of students. The authors discuss certain compromise solutions between conservative and modernistic educational approaches. In the authors' opinion, a healthy portion of traditional, conservative educational technologies can only benefit the newer generations of the globalized IT society.
ERIC Educational Resources Information Center
Botts, Dawn C.; Losardo, Angela S.; Tillery, Christina Y.; Werts, Margaret G.
2014-01-01
This replication study focused on the effectiveness of two different intervention approaches, activity-based intervention and embedded direct instruction, on the acquisition, generalization, and maintenance of phonological awareness, a key area of emergent literacy, by preschool children with language delays. Five male preschool participants with…
Soft Assembling Project-Based Learning and Leadership in Japan
ERIC Educational Resources Information Center
Knight, Kevin; Murphey, Tim
2017-01-01
In this article, we initially focus on how the conceptualization of leadership by Knight (2013a) in his leadership seminars became the basis for choosing a project-based learning (PBL) approach. We then consider how soft assembling can enhance the leadership project activities of student teams and group-work in general classes. Soft assembling…
Liu, Jing-fu; Liu, Rui; Yin, Yong-guang; Jiang, Gui-bin
2009-03-28
Capable of preserving the sizes and shapes of nanomaterials during phase transfer, Triton X-114 based cloud point extraction provides a general, simple, and cost-effective route for reversible concentration/separation or dispersion of various nanomaterials in the aqueous phase.
Dynamic Attack Tree Tool for Risk Assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Black, Karl
2012-03-13
DATT enables interactive visualization, qualitative analysis and recording of cyber and other forms of risk. It facilitates dynamic risk-based approaches (as opposed to static compliance-based ones) to security and risk management in general. DATT allows decision makers to consistently prioritize risk mitigation strategies and quickly see where attention is most needed across the enterprise.
Learning quadratic receptive fields from neural responses to natural stimuli.
Rajan, Kanaka; Marre, Olivier; Tkačik, Gašper
2013-07-01
Models of neural responses to stimuli with complex spatiotemporal correlation structure often assume that neurons are selective for only a small number of linear projections of a potentially high-dimensional input. In this review, we explore recent modeling approaches where the neural response depends on the quadratic form of the input rather than on its linear projection, that is, the neuron is sensitive to the local covariance structure of the signal preceding the spike. To infer this quadratic dependence in the presence of arbitrary (e.g., naturalistic) stimulus distribution, we review several inference methods, focusing in particular on two information theory-based approaches (maximization of stimulus energy and of noise entropy) and two likelihood-based approaches (Bayesian spike-triggered covariance and extensions of generalized linear models). We analyze the formal relationship between the likelihood-based and information-based approaches to demonstrate how they lead to consistent inference. We demonstrate the practical feasibility of these procedures by using model neurons responding to a flickering variance stimulus.
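A minimal sketch of the simplest of the reviewed estimators, spike-triggered covariance under a white Gaussian stimulus; the filter, rate scaling, and sizes are invented for illustration, and the information- and likelihood-based methods discussed above generalize this idea to non-Gaussian (naturalistic) stimuli.

```python
import numpy as np

rng = np.random.default_rng(0)
T, D = 50_000, 20                       # time bins, stimulus dimensions
X = rng.standard_normal((T, D))         # white Gaussian stimulus

# Hypothetical quadratic neuron: firing driven by energy along one filter.
w = np.sin(np.linspace(0, np.pi, D)); w /= np.linalg.norm(w)
spikes = rng.poisson(0.05 * (X @ w) ** 2)

n = spikes.sum()
sta = (spikes @ X) / n                                  # spike-triggered average
stc = (X * spikes[:, None]).T @ X / n - np.outer(sta, sta)

# Eigenvectors of STC minus the stimulus covariance expose quadratic filters.
evals, evecs = np.linalg.eigh(stc - np.cov(X.T))
print(abs(evecs[:, np.argmax(np.abs(evals))] @ w))      # ~1 if recovered
```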
Landmark-based elastic registration using approximating thin-plate splines.
Rohr, K; Stiehl, H S; Sprengel, R; Buzug, T M; Weese, J; Kuhn, M H
2001-06-01
We consider elastic image registration based on a set of corresponding anatomical point landmarks and approximating thin-plate splines. This approach is an extension of the original interpolating thin-plate spline approach and makes it possible to take landmark localization errors into account. The extension is important for clinical applications since landmark extraction is always prone to error. Our approach is based on minimizing a functional and can cope with isotropic as well as anisotropic landmark errors. In particular, in the latter case it is possible to include different types of landmarks, e.g., unique point landmarks as well as arbitrary edge points. Also, the scheme is general with respect to the image dimension and the order of smoothness of the underlying functional. Optimal affine transformations as well as interpolating thin-plate splines are special cases of this scheme. To localize landmarks we use a semi-automatic approach based on three-dimensional (3-D) differential operators. Experimental results are presented for two-dimensional as well as 3-D tomographic images of the human brain.
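In modern tooling, an approximating (regularizing) thin-plate spline of this general kind can be sketched with scipy; the smoothing parameter plays the role of the approximation weight, although scipy handles only the isotropic case, not the paper's anisotropic landmark errors. All coordinates below are synthetic.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)
src = rng.uniform(0, 100, (30, 2))                       # source landmarks
dst = src + [3.0, -2.0] + rng.normal(0, 0.5, src.shape)  # noisy targets

# smoothing > 0 yields an *approximating* spline tolerant of localization
# error; smoothing = 0 reduces to exact interpolation.
warp = RBFInterpolator(src, dst, kernel='thin_plate_spline', smoothing=5.0)
print(warp(np.array([[50.0, 50.0]])))                    # warped position
```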
A bounding-based solution approach for the continuous arc covering problem
NASA Astrophysics Data System (ADS)
Wei, Ran; Murray, Alan T.; Batta, Rajan
2014-04-01
Road segments, telecommunication wiring, water and sewer pipelines, canals and the like are important features of the urban environment. They are often conceived of and represented as network-based arcs. As a result of the usefulness and significance of arc-based features, there is a need to site facilities along arcs to serve demand. Examples of such facilities include surveillance equipment, cellular towers, refueling centers and emergency response stations, with the intent of being economically efficient as well as providing good service along the arcs. While this amounts to a continuous location problem by nature, various discretizations are generally relied upon to solve such problems. The result is potential for representation errors that negatively impact analysis and decision making. This paper develops a solution approach for the continuous arc covering problem that theoretically eliminates representation errors. The developed approach is applied to optimally place acoustic sensors and cellular base stations along a road network. The results demonstrate the effectiveness of this approach for ameliorating any error and uncertainty in the modeling process.
CHAMP: a locally adaptive unmixing-based hyperspectral anomaly detection algorithm
NASA Astrophysics Data System (ADS)
Crist, Eric P.; Thelen, Brian J.; Carrara, David A.
1998-10-01
Anomaly detection offers a means by which to identify potentially important objects in a scene without prior knowledge of their spectral signatures. As such, this approach is less sensitive to variations in target class composition, atmospheric and illumination conditions, and sensor gain settings than would be a spectral matched filter or similar algorithm. The best existing anomaly detectors generally fall into one of two categories: those based on local Gaussian statistics, and those based on linear mixing models. Unmixing-based approaches better represent the real distribution of data in a scene, but are typically derived and applied on a global or scene-wide basis. Locally adaptive approaches allow detection of more subtle anomalies by accommodating the spatial non-homogeneity of background classes in a typical scene, but provide a poorer representation of the true underlying background distribution. The CHAMP algorithm combines the best attributes of both approaches, applying a linear-mixing model approach in a spatially adaptive manner. The algorithm itself, and test results on simulated and actual hyperspectral image data, are presented in this paper.
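For contrast with the unmixing-based CHAMP, here is a minimal sketch of the local-Gaussian baseline the abstract mentions, a sliding-window RX-style detector; the window size and synthetic cube are assumptions, and CHAMP itself replaces the Gaussian model with a locally applied linear-mixing model.

```python
import numpy as np

def local_rx(cube, win=15):
    """Sliding-window Mahalanobis (RX) anomaly score per pixel."""
    H, W, B = cube.shape
    pad = win // 2
    scores = np.zeros((H, W))
    for i in range(pad, H - pad):
        for j in range(pad, W - pad):
            block = cube[i - pad:i + pad + 1, j - pad:j + pad + 1].reshape(-1, B)
            mu = block.mean(axis=0)
            cov = np.cov(block.T) + 1e-6 * np.eye(B)     # regularized
            d = cube[i, j] - mu
            scores[i, j] = d @ np.linalg.solve(cov, d)
    return scores

cube = np.random.default_rng(2).normal(size=(64, 64, 10))
cube[32, 32] += 4.0                                      # implanted anomaly
print(np.unravel_index(local_rx(cube).argmax(), (64, 64)))  # expect (32, 32)
```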
ERIC Educational Resources Information Center
Association of American Colleges and Universities, 2016
2016-01-01
This report summarizes key findings from a national survey among chief academic officers at Association of American Colleges and Universities (AAC&U) member institutions and explores how institutions are defining common learning outcomes, trends related to general education design and the use of emerging, evidence-based teaching and learning…
ERIC Educational Resources Information Center
Renberg, Ellinor Salander; Hjelmeland, Heidi; Koposov, Roman
2008-01-01
Our aim was to build a model delineating the relationship between attitudes toward suicide and suicidal behavior and to assess equivalence by applying the model on data from different countries. Representative samples from the general population were approached in Sweden, Norway, and Russia with the Attitudes Toward Suicide (ATTS) questionnaire.…
Developing a Gesture-Based Game for Mentally Disabled People to Teach Basic Life Skills
ERIC Educational Resources Information Center
Nazirzadeh, Mohammad Javad; Çagiltay, Kürsat; Karasu, Necdet
2017-01-01
It is understood that, for mentally disabled people, it is hard to generalize skills and concepts from one setting to another. One approach to teach generalization is solving the problems related to their daily lives, which helps them to reinforce some of their behaviors that would occur in the natural environment. The aim of this study is to…
Liu, Chunbo; Pan, Feng; Li, Yun
2016-07-29
Glutamate is of great importance in the food and pharmaceutical industries. There is still a lack of effective statistical approaches for fault diagnosis in the fermentation process of glutamate. To date, the statistical approach based on the generalized additive model (GAM) and bootstrap has not been used for fault diagnosis in fermentation processes, much less the fermentation process of glutamate with small sample sets. A combined approach of GAM and bootstrap was developed for online fault diagnosis in the fermentation process of glutamate with small sample sets. GAM was first used to model the relationship between glutamate production and different fermentation parameters using online data from four normal fermentation experiments of glutamate. The fitted GAM with fermentation time, dissolved oxygen, oxygen uptake rate and carbon dioxide evolution rate captured 99.6% of the variance of glutamate production during the fermentation process. Bootstrap was then used to quantify the uncertainty of the estimated production of glutamate from the fitted GAM using a 95% confidence interval. The proposed approach was then used for online fault diagnosis in abnormal fermentation processes of glutamate, with a fault defined as the estimated production of glutamate falling outside the 95% confidence interval. The online fault diagnosis based on the proposed approach identified not only the start of a fault in the fermentation process, but also the end of the fault when fermentation conditions returned to normal. The proposed approach required only a small sample set from normal fermentation experiments to establish the model, and then only online recorded data on fermentation parameters for fault diagnosis in the fermentation process of glutamate. The proposed approach based on GAM and bootstrap provides a new and effective way for fault diagnosis in the fermentation process of glutamate with small sample sets.
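A sketch of the two statistical ingredients, using the pygam package as one possible GAM implementation; the variable layout, bootstrap size, and fault rule below are illustrative assumptions rather than the paper's exact protocol.

```python
import numpy as np
from pygam import LinearGAM, s     # one possible GAM library

def bootstrap_band(X, y, n_boot=200, alpha=0.05, seed=0):
    """Fit a GAM of production on four fermentation parameters (columns of
    X: time, dissolved oxygen, oxygen uptake rate, CO2 evolution rate) and
    derive a bootstrap 95% band by case resampling."""
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))
        gam = LinearGAM(s(0) + s(1) + s(2) + s(3)).fit(X[idx], y[idx])
        preds.append(gam.predict(X))
    return np.quantile(preds, [alpha / 2, 1 - alpha / 2], axis=0)

# Online use: flag a fault whenever the estimate leaves the band.
# lo, hi = bootstrap_band(X_normal, y_normal)
# fault = (y_online < lo) | (y_online > hi)
```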
A knowledge-based approach to improving optimization techniques in system planning
NASA Technical Reports Server (NTRS)
Momoh, J. A.; Zhang, Z. Z.
1990-01-01
A knowledge-based (KB) approach to improving mathematical programming techniques used in the system planning environment is presented. The KB system assists in selecting appropriate optimization algorithms, objective functions, constraints and parameters. The scheme is implemented by integrating symbolic computation of rules, derived from operators' and planners' experience, with generalized optimization packages. The KB optimization software package is capable of improving the overall planning process, including the correction of given violations. The method was demonstrated on a large-scale power system discussed in the paper.
Shape-based approach for the estimation of individual facial mimics in craniofacial surgery planning
NASA Astrophysics Data System (ADS)
Gladilin, Evgeny; Zachow, Stefan; Deuflhard, Peter; Hege, Hans-Christian
2002-05-01
Besides the static soft tissue prediction, the estimation of basic facial emotion expressions is another important criterion for the evaluation of craniofacial surgery planning. For a realistic simulation of facial mimics, an adequate biomechanical model of soft tissue including the mimic musculature is needed. In this work, we present an approach for the modeling of arbitrarily shaped muscles and the estimation of basic individual facial mimics, which is based on the geometrical model derived from the individual tomographic data and the general finite element modeling of soft tissue biomechanics.
Collective learning modeling based on the kinetic theory of active particles.
Burini, D; De Lillo, S; Gibelli, L
2016-03-01
This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom. Copyright © 2015 Elsevier B.V. All rights reserved.
Development Issues on Linked Data Weblog Enrichment
NASA Astrophysics Data System (ADS)
Ruiz-Rube, Iván; Cornejo, Carlos M.; Dodero, Juan Manuel; García, Vicente M.
In this paper, we describe the issues found during the development of LinkedBlog, a Linked Data extension for WordPress blogs. This extension makes it possible to enrich text-based and video information contained in blog entries with RDF triples that are suitable to be stored, managed and exploited by other web-based applications. The issues have to do with the generality, usability, tracking, depth, security, trustworthiness and performance of the linked data enrichment process. The presented annotation approach aims at keeping web-based contents independent from the underlying ontological model, by providing a loosely coupled RDFa-based approach in the linked data application. Finally, we detail how the performance of annotations can be improved through a semantic reasoner.
Sanchon-Lopez, Beatriz; Everett, Jeremy R
2016-09-02
A new, simple-to-implement and quantitative approach to assessing the confidence in NMR-based identification of known metabolites is introduced. The approach is based on a topological analysis of metabolite identification information available from NMR spectroscopy studies and is a development of the metabolite identification carbon efficiency (MICE) method. New topological metabolite identification indices are introduced, analyzed, and proposed for general use, including topological metabolite identification carbon efficiency (tMICE). Because known metabolite identification is one of the key bottlenecks in either NMR-spectroscopy- or mass spectrometry-based metabonomics/metabolomics studies, and given the fact that there is no current consensus on how to assess metabolite identification confidence, it is hoped that these new approaches and the topological indices will find utility.
Understanding similarity of groundwater systems with empirical copulas
NASA Astrophysics Data System (ADS)
Haaf, Ezra; Kumar, Rohini; Samaniego, Luis; Barthel, Roland
2016-04-01
Within the classification framework for groundwater systems that aims at identifying similarity of hydrogeological systems and transferring information from a well-observed to an ungauged system (Haaf and Barthel, 2015; Haaf and Barthel, 2016), we propose a copula-based method for describing groundwater-system similarity. Copulas are an emerging method in the hydrological sciences that makes it possible to model the dependence structure of two groundwater level time series independently of the effects of their marginal distributions. This study builds on Samaniego et al. (2010), which described an approach for calculating dissimilarity measures from bivariate empirical copula densities of streamflow time series. Subsequently, streamflow is predicted in ungauged basins by transferring properties from similar catchments. The proposed approach is innovative because copula-based similarity has not yet been applied to groundwater systems. Here we estimate the pairwise dependence structure of 600 wells in Southern Germany using 10 years of weekly groundwater level observations. Based on these empirical copulas, dissimilarity measures are estimated, such as the copula's lower- and upper-corner cumulated probabilities and the copula-based Spearman's rank correlation, as proposed by Samaniego et al. (2010). For the characterization of groundwater systems, copula-based metrics are compared with dissimilarities obtained from precipitation signals corresponding to the presumed area of influence of each groundwater well. This promising approach provides a new tool for advancing similarity-based classification of groundwater system dynamics. Haaf, E., Barthel, R., 2015. Methods for assessing hydrogeological similarity and for classification of groundwater systems on the regional scale, EGU General Assembly 2015, Vienna, Austria. Haaf, E., Barthel, R., 2016. An approach for classification of hydrogeological systems at the regional scale based on groundwater hydrographs, EGU General Assembly 2016, Vienna, Austria. Samaniego, L., Bardossy, A., Kumar, R., 2010. Streamflow prediction in ungauged catchments using copula-based dissimilarity measures. Water Resources Research, 46. DOI:10.1029/2008wr007695
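The rank-transform step at the heart of the method is compact enough to sketch; the series below are synthetic stand-ins for two wells' weekly levels, and the corner probability threshold of 0.1 is an illustrative choice.

```python
import numpy as np
from scipy.stats import rankdata, spearmanr

def pseudo_observations(a, b):
    """Map two series to ranks on (0, 1); their joint distribution is the
    empirical copula, independent of the marginal distributions."""
    n = len(a)
    return rankdata(a) / (n + 1), rankdata(b) / (n + 1)

rng = np.random.default_rng(3)
a = rng.normal(size=520)                    # ~10 years of weekly levels
b = 0.7 * a + 0.3 * rng.normal(size=520)    # a dependent neighboring well
u, v = pseudo_observations(a, b)

# Two dissimilarity ingredients named in the abstract:
print(spearmanr(a, b).correlation)          # copula-based rank correlation
print(np.mean((u <= 0.1) & (v <= 0.1)))     # lower-corner cumulated probability
```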
Unger, Jakob; Schuster, Maria; Hecker, Dietmar J; Schick, Bernhard; Lohscheller, Jörg
2016-01-01
This work presents a computer-based approach to analyze the two-dimensional vocal fold dynamics of endoscopic high-speed videos, and constitutes an extension and generalization of a previously proposed wavelet-based procedure. While most approaches aim for analyzing sustained phonation conditions, the proposed method allows for a clinically adequate analysis of both dynamic as well as sustained phonation paradigms. The analysis procedure is based on a spatio-temporal visualization technique, the phonovibrogram, that facilitates the documentation of the visible laryngeal dynamics. From the phonovibrogram, a low-dimensional set of features is computed using a principal component analysis strategy that quantifies the type of vibration patterns, irregularity, lateral symmetry and synchronicity, as a function of time. Two different test bench data sets are used to validate the approach: (I) 150 healthy and pathologic subjects examined during sustained phonation. (II) 20 healthy and pathologic subjects that were examined twice: during sustained phonation and a glissando from a low to a higher fundamental frequency. In order to assess the discriminative power of the extracted features, a Support Vector Machine is trained to distinguish between physiologic and pathologic vibrations. The results for sustained phonation sequences are compared to the previous approach. Finally, the classification performance of the stationary analyzing procedure is compared to the transient analysis of the glissando maneuver. For the first test bench the proposed procedure outperformed the previous approach (proposed feature set: accuracy: 91.3%, sensitivity: 80%, specificity: 97%; previous approach: accuracy: 89.3%, sensitivity: 76%, specificity: 96%). Comparing the classification performance on the second test bench further corroborates that analyzing transient paradigms provides clear additional diagnostic value (glissando maneuver: accuracy: 90%, sensitivity: 100%, specificity: 80%; sustained phonation: accuracy: 75%, sensitivity: 80%, specificity: 70%). The incorporation of parameters describing the temporal evolution of vocal fold vibration clearly improves the automatic identification of pathologic vibration patterns. Furthermore, incorporating a dynamic phonation paradigm provides additional valuable information about the underlying laryngeal dynamics that cannot be derived from sustained conditions. The proposed generalized approach provides a better overall classification performance than the previous approach, and hence constitutes a new advantageous tool for an improved clinical diagnosis of voice disorders. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Hodijah, A.; Sundari, S.; Nugraha, A. C.
2018-05-01
As a local government agency that performs public services, the General Government Office already utilizes the Reporting Information System of Local Government Implementation (E-LPPD). However, E-LPPD has upgrade limitations: its integration processes cannot accommodate the General Government Office's needs in order to achieve Good Government Governance (GGG), while successful e-government implementation ultimately requires good governance practices. Citizens now demand public services of the quality the private sector provides, which calls for service innovation that builds on the legacy system toward a service-based e-government implementation. To this end, Service Oriented Architecture (SOA) is used to redefine business processes as a set of IT-enabled services, and enterprise architecture from The Open Group Architecture Framework (TOGAF) provides a comprehensive approach to redefining business processes as service innovation towards GGG. This paper takes a case study of the Performance Evaluation of Local Government Implementation (EKPPD) system at the General Government Office. The results show that TOGAF guides the development of integrated business processes of the EKPPD system that fit good governance practices to attain GGG, with SOA methodology as the technical approach.
Identification of Bouc-Wen hysteretic parameters based on enhanced response sensitivity approach
NASA Astrophysics Data System (ADS)
Wang, Li; Lu, Zhong-Rong
2017-05-01
This paper aims to identify the parameters of the Bouc-Wen hysteretic model using time-domain measured data. It follows a general inverse identification procedure; that is, identifying model parameters is treated as an optimization problem with a nonlinear least-squares objective function. Then, the enhanced response sensitivity approach, which has been shown to be convergent and appropriate for this kind of problem, is adopted to solve the optimization problem. Numerical tests are undertaken to verify the proposed identification approach.
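A sketch of the forward model and a plain nonlinear least-squares fit; note the trust-region solver below stands in for the paper's enhanced response sensitivity approach, and the model is simplified (unit stiffness, exponent n = 1, invented true parameters).

```python
import numpy as np
from scipy.optimize import least_squares

def bouc_wen_force(params, x, dt):
    """Restoring force of a Bouc-Wen element driven by displacement x(t),
    with n = 1 and forward-Euler integration of the hysteretic state z."""
    A, beta, gamma, alpha = params
    z = np.zeros_like(x)
    dx = np.gradient(x, dt)
    for k in range(len(x) - 1):
        dz = A * dx[k] - beta * abs(dx[k]) * z[k] - gamma * dx[k] * abs(z[k])
        z[k + 1] = z[k] + dt * dz
    return alpha * x + (1 - alpha) * z

dt = 0.01
t = np.arange(0, 20, dt)
x = np.sin(0.5 * t) * np.minimum(t / 5, 1)        # imposed displacement
true = (1.0, 0.5, 0.3, 0.6)
noise = 0.01 * np.random.default_rng(4).normal(size=t.size)
f_meas = bouc_wen_force(true, x, dt) + noise      # synthetic measurement

fit = least_squares(lambda p: bouc_wen_force(p, x, dt) - f_meas,
                    x0=(0.5, 0.2, 0.1, 0.5))
print(fit.x)                                      # should approach `true`
```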
NASA Technical Reports Server (NTRS)
Milman, M. H.
1985-01-01
A factorization approach is presented for deriving approximations to the optimal feedback gain for the linear regulator-quadratic cost problem associated with time-varying functional differential equations with control delays. The approach is based on a discretization of the state penalty which leads to a simple structure for the feedback control law. General properties of the Volterra factors of Hilbert-Schmidt operators are then used to obtain convergence results for the feedback kernels.
Control of solar energy systems
NASA Astrophysics Data System (ADS)
Sizov, Iu. M.; Zakhidov, R. A.; Baranov, V. G.
Two approaches to the control of large solar energy systems, i.e., programmed control and control systems relying on the use of orientation transducers and feedback, are briefly reviewed, with particular attention given to problems associated with these control systems. A new control system for large solar power plants is then proposed which is based on a combination of these approaches. The general design of the control system is shown and its principle of operation described. The efficiency and cost effectiveness of the approach proposed here are demonstrated.
Dudoit, Sandrine; Gilbert, Houston N.; van der Laan, Mark J.
2014-01-01
Summary: This article proposes resampling-based empirical Bayes multiple testing procedures for controlling a broad class of Type I error rates, defined as generalized tail probability (gTP) error rates, gTP(q, g) = Pr(g(Vn, Sn) > q), and generalized expected value (gEV) error rates, gEV(g) = E[g(Vn, Sn)], for arbitrary functions g(Vn, Sn) of the numbers of false positives Vn and true positives Sn. Of particular interest are error rates based on the proportion g(Vn, Sn) = Vn/(Vn + Sn) of Type I errors among the rejected hypotheses, such as the false discovery rate (FDR), FDR = E[Vn/(Vn + Sn)]. The proposed procedures offer several advantages over existing methods. They provide Type I error control for general data generating distributions, with arbitrary dependence structures among variables. Gains in power are achieved by deriving rejection regions based on guessed sets of true null hypotheses and null test statistics randomly sampled from joint distributions that account for the dependence structure of the data. The Type I error and power properties of an FDR-controlling version of the resampling-based empirical Bayes approach are investigated and compared to those of widely used FDR-controlling linear step-up procedures in a simulation study. The Type I error and power trade-off achieved by the empirical Bayes procedures under a variety of testing scenarios allows this approach to be competitive with or outperform the Storey and Tibshirani (2003) linear step-up procedure, as an alternative to the classical Benjamini and Hochberg (1995) procedure. PMID:18932138
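For orientation, the Benjamini-Hochberg comparator is short enough to state in full; this is the classical step-up rule, not the resampling-based empirical Bayes procedure of the article.

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """BH linear step-up procedure: reject the k smallest p-values, where k
    is the largest index with p_(k) <= q * k / m. Controls FDR at q under
    independence/PRDS; the article's procedures target general dependence."""
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    below = p[order] <= q * np.arange(1, m + 1) / m
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])
        reject[order[:k + 1]] = True
    return reject

print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.6]))  # 2 rejections
```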
Activity-based costing and its application in a Turkish university hospital.
Yereli, Ayşe Necef
2009-03-01
Resource management in hospitals is of increasing importance in today's global economy. Traditional accounting systems have become inadequate for managing hospital resources and accurately determining service costs. Conversely, the activity-based costing approach to hospital accounting is an effective cost management model that determines costs and evaluates financial performance across departments. Obtaining costs that are more accurate can enable hospitals to analyze and interpret costing decisions and make more accurate budgeting decisions. Traditional and activity-based costing approaches were compared using a cost analysis of gall bladder surgeries in the general surgery department of one university hospital in Manisa, Turkey. Copyright (c) AORN, Inc, 2009.
Dealing with office emergencies. Stepwise approach for family physicians.
Sempowski, Ian P.; Brison, Robert J.
2002-01-01
OBJECTIVE: To develop a simple stepwise approach to initial management of emergencies in family physicians' offices; to review how to prepare health care teams and equipment; and to illustrate a general approach to three of the most common office emergencies. QUALITY OF EVIDENCE: MEDLINE was searched from January 1980 to December 2001. Articles were selected based on their clinical relevance, quality of evidence, and date of publication. We reviewed American family medicine, pediatric, dental, and dermatologic articles, but found that the area has not been well studied from a Canadian family medicine perspective. Consensus statements by specialty professional groups were used to identify accepted emergency medical treatments. MAIN MESSAGE: Family medicine offices are frequently poorly equipped and inadequately prepared to deal with emergencies. Straightforward emergency response plans can be designed and tailored to an office's risk profile. A systematic team approach and effective use of skills, support staff, and equipment is important. The general approach can be modified for specific patients or conditions. CONCLUSION: Family physicians can plan ahead and use a team approach to develop a simple stepwise response to emergency situations in the office. PMID:12371305
NASA Astrophysics Data System (ADS)
Weng, B. S.; Yan, D. H.; Wang, H.; Liu, J. H.; Yang, Z. Y.; Qin, T. L.; Yin, J.
2015-08-01
Drought is first a resource issue and, as it develops, evolves into a disaster issue. Drought events usually occur in a determinate yet random manner. Drought has become one of the major factors affecting sustainable socioeconomic development. In this paper, we propose the generalized drought assessment index (GDAI), based on water resources systems, for assessing drought events. The GDAI considers water supply and water demand using a distributed hydrological model. We demonstrate the use of the proposed index in the Dongliao River basin in northeastern China. The results simulated by the GDAI are compared to observed drought disaster records in the Dongliao River basin. In addition, the temporal distribution of drought events and the spatial distribution of drought frequency from the GDAI are compared with traditional approaches (i.e., the standardized precipitation index, the Palmer drought severity index, and the rate of water deficit index). Then, generalized drought times, generalized drought duration, and generalized drought severity were calculated using the theory of runs. Application of run analysis at various drought levels (i.e., mild, moderate, severe, and extreme drought) during the period 1960-2010 shows that the centers of gravity of these events all lie in the middle reaches of the Dongliao River basin and change with time. The proposed methodology may help water managers in water-stressed regions to quantify the impact of drought and, consequently, to make decisions for coping with drought.
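The theory-of-runs step is simple to sketch: a drought event is a maximal spell of the index below a threshold, its duration is the spell length, and its severity the cumulated deficit. The index series and threshold below are invented; the GDAI itself is not reproduced.

```python
import numpy as np

def drought_runs(index, threshold):
    """Extract drought events (runs below threshold) with duration and
    severity, in the spirit of run theory."""
    below = index < threshold
    events, start = [], None
    for k, flag in enumerate(np.append(below, False)):
        if flag and start is None:
            start = k
        elif not flag and start is not None:
            deficit = threshold - index[start:k]
            events.append({'start': start, 'duration': k - start,
                           'severity': float(deficit.sum())})
            start = None
    return events

series = np.array([0.2, -0.5, -1.1, -0.4, 0.3, -0.9, -1.5, 0.1])
print(drought_runs(series, threshold=0.0))
```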
Patients who make terrible therapeutic choices.
Curzer, Howard J
2014-01-01
The traditional approaches to dental ethics include appeals to principles, duties (deontology), and consequences (utilitarianism). These approaches are often inadequate when faced with the case of a patient who refuses reasonable treatment and does not share the same ethical framework the dentist is using. An approach based on virtue ethics may be helpful in this and other cases. Virtue ethics is a tradition going back to Plato and Aristotle. It depends on forming a holistic character supporting general appropriate behavior. By correctly diagnosing the real issues at stake in a patient's inappropriate oral health choices and working to build effective habits, dentists can sometimes respond to ethical challenges that remain intractable given rule-based methods.
Pattern-Directed Attention in Uncertain Frequency Detection.
1983-10-14
…performance when compared to a single-frequency condition even if the listeners are aware that more than one signal can occur (Creelman, 1960; Green… be missed. On the other hand, the multiple-band approach, introduced by Green (1958) and modified by Creelman (1960), assumes that listeners base… multiple-band approaches (Creelman, 1960; Green, 1961; Macmillan & Schwartz, 1975). In general, the two views are difficult to distinguish empirically, and…
New Approaches to Coding Information using Inverse Scattering Transform
NASA Astrophysics Data System (ADS)
Frumin, L. L.; Gelash, A. A.; Turitsyn, S. K.
2017-06-01
Remarkable mathematical properties of the integrable nonlinear Schrödinger equation (NLSE) can offer advanced solutions for the mitigation of nonlinear signal distortions in optical fiber links. Fundamental optical soliton, continuous, and discrete eigenvalues of the nonlinear spectrum have already been considered for the transmission of information in fiber-optic channels. Here, we propose to apply signal modulation to the kernel of the Gelfand-Levitan-Marchenko equations that offers the advantage of a relatively simple decoder design. First, we describe an approach based on exploiting the general N-soliton solution of the NLSE for simultaneous coding of N symbols involving 4×N coding parameters. As a specific elegant subclass of the general schemes, we introduce a soliton orthogonal frequency division multiplexing (SOFDM) method. This method is based on the choice of identical imaginary parts of the N-soliton solution eigenvalues, corresponding to equidistant soliton frequencies, making it similar to the conventional OFDM scheme, thus allowing for the use of the efficient fast Fourier transform algorithm to recover the data. Then, we demonstrate how to use this new approach to control signal parameters in the case of the continuous spectrum.
Matsubara, Takashi; Torikai, Hiroyuki
2016-04-01
Modeling and implementation approaches for the reproduction of input-output relationships in biological nervous tissues contribute to the development of engineering and clinical applications. However, because of high nonlinearity, traditional modeling and implementation approaches encounter difficulties in terms of generalization ability (i.e., performance when reproducing an unknown data set) and computational resources (i.e., computation time and circuit elements). To overcome these difficulties, asynchronous cellular automaton-based neuron (ACAN) models, which are special kinds of cellular automata that can be implemented as small asynchronous sequential logic circuits, have been proposed. This paper presents a novel type of ACAN and a theoretical analysis of its excitability. This paper also presents a novel network of such neurons, which can mimic the input-output relationships of biological and nonlinear ordinary-differential-equation model neural networks. Numerical analyses confirm that the presented network has a higher generalization ability than other major modeling and implementation approaches. In addition, Field-Programmable Gate Array implementations confirm that the presented network requires lower computational resources.
Rational design of aptazyme riboswitches for efficient control of gene expression in mammalian cells
Zhong, Guocai; Wang, Haimin; Bailey, Charles C; Gao, Guangping; Farzan, Michael
2016-01-01
Efforts to control mammalian gene expression with ligand-responsive riboswitches have been hindered by the lack of a general method for generating efficient switches in mammalian systems. Here we describe a rational-design approach that enables rapid development of efficient cis-acting aptazyme riboswitches. We identified communication-module characteristics associated with aptazyme functionality through analysis of a 32-aptazyme test panel. We then developed a scoring system that predicts an aptazyme's activity by integrating three characteristics of communication-module bases: hydrogen bonding, base stacking, and distance to the enzymatic core. We validated the power and generality of this approach by designing aptazymes responsive to three distinct ligands, each with markedly wider dynamic ranges than any previously reported. These aptazymes efficiently regulated adeno-associated virus (AAV)-vectored transgene expression in cultured mammalian cells and mice, highlighting one application of these broadly usable regulatory switches. Our approach enables efficient, protein-independent control of gene expression by a range of small molecules. DOI: http://dx.doi.org/10.7554/eLife.18858.001 PMID:27805569
NASA Astrophysics Data System (ADS)
Gauthier, Jean-Christophe; Robichaud, Louis-Rafaël; Fortin, Vincent; Vallée, Réal; Bernier, Martin
2018-06-01
The quest for a compact and efficient broadband laser source able to probe the numerous fundamental molecular absorption lines in the mid-infrared (3-8 µm) for various applications has been going on for more than a decade. While robust commercial fiber-based supercontinuum (SC) systems have started to appear on the market, they still exhibit poor energy conversion into the mid-infrared (typically under 30%) and are generally not producing wavelengths exceeding 4.7 µm. Here, we present an overview of the results obtained from a novel approach to SC generation based on spectral broadening inside of an erbium-doped fluoride fiber amplifier seeded directly at 2.8 µm, allowing mid-infrared conversion efficiencies reaching up to 95% and spectral coverage approaching the transparency limit of ZrF4 (4.2 µm) and InF3 (5.5 µm) fibers. The general concept of the approach and the physical mechanisms involved are presented alongside the various configurations of the system to adjust the output characteristics in terms of spectral coverage and output power for different applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staschus, K.
1985-01-01
In this dissertation, efficient algorithms for electric-utility capacity expansion planning with renewable energy are developed. The algorithms include a deterministic phase that quickly finds a near-optimal expansion plan using derating and a linearized approximation to the time-dependent availability of nondispatchable energy sources. A probabilistic second phase needs comparatively few computer-time-consuming probabilistic simulation iterations to modify this solution towards the optimal expansion plan. For the deterministic first phase, two algorithms, based on a Lagrangian dual decomposition and a Generalized Benders Decomposition, are developed. The probabilistic second phase uses a Generalized Benders Decomposition approach. Extensive computational tests of the algorithms are reported. Among the deterministic algorithms, the one based on Lagrangian duality proves fastest. The two-phase approach is shown to save up to 80% in computing time as compared to a purely probabilistic algorithm. The algorithms are applied to determine the optimal expansion plan for the Tijuana-Mexicali subsystem of the Mexican electric utility system. A strong recommendation to push conservation programs in the desert city of Mexicali results from this implementation.
Henchion, M; McCarthy, M; Resconi, V C; Berry, D P; McParland, S
2016-05-01
The relative weightings of traits within breeding goals are generally determined by bio-economic models or profit functions. While such methods have generally delivered profitability gains to producers, and are being expanded to consider non-market values, current approaches generally do not consider the numerous and diverse stakeholders that affect, or are affected by, such tools. Based on principles of respondent anonymity, iteration, controlled feedback and statistical aggregation of feedback, a Delphi study was undertaken to gauge stakeholder opinion of the importance of detailed milk quality traits within an overall dairy breeding goal for profit, with the aim of assessing its suitability as a complementary, participatory approach to defining breeding goals. The questionnaires used over two survey rounds asked stakeholders: (a) their opinion on incorporating an explicit sub-index for milk quality into a national breeding goal; (b) the importance they would assign to a pre-determined list of milk quality traits and (c) the (relative) weighting they would give such a milk quality sub-index. Results from the survey highlighted a good degree of consensus among stakeholders on the issues raised. Similarly, revelation of the underlying assumptions and knowledge used by stakeholders to make their judgements illustrated their ability to consider a range of perspectives when evaluating traits, and to reconsider their answers based on the responses and rationales given by others, which demonstrated social learning. Finally, while the relative importance assigned by stakeholders in the Delphi survey (4% to 10%) and the results of calculations based on selection index theory of the relative emphasis that should be placed on milk quality to halt any deterioration (16%) are broadly in line, the difference indicates the benefit of considering more than one approach to determining breeding goals. This study thus illustrates the role of the Delphi technique, as a complementary approach to traditional approaches, to defining breeding goals. This has implications for how breeding goals will be defined and in determining who should be involved in the decision-making process.
Van Royen, Paul; Beyer, Martin; Chevallier, Patrick; Eilat-Tsanani, Sophia; Lionis, Christos; Peremans, Lieve; Petek, Davorina; Rurik, Imre; Soler, Jean Karl; Stoffers, Henri E J H; Topsever, Pinar; Ungan, Mehmet; Hummers-Pradier, Eva
2010-06-01
The recently published 'Research Agenda for General Practice/Family Medicine and Primary Health Care in Europe' summarizes the evidence relating to the core competencies and characteristics of the Wonca Europe definition of GP/FM, and its implications for general practitioners/family doctors, researchers and policy makers. The European Journal of General Practice publishes a series of articles based on this document. In a first article, background, objectives, and methodology were discussed. In a second article, the results for the two core competencies 'primary care management' and 'community orientation' were presented. This article reflects on the three core competencies that deal with person-related aspects of GP/FM, i.e. 'person centred care', 'comprehensive approach' and 'holistic approach'. Though there is an important body of opinion papers and (non-systematic) reviews, all person-related aspects remain poorly defined and researched. Validated instruments to measure these competencies are lacking. Concerning patient-centredness, most research examined patient and doctor preferences and experiences. Studies on comprehensiveness mostly focus on prevention/care of specific diseases. For all domains, there has been limited research conducted on implications or outcomes.
Sugihara, Masahiro
2010-01-01
In survival analysis, treatment effects are commonly evaluated based on survival curves and hazard ratios as causal treatment effects. In observational studies, these estimates may be biased due to confounding factors. The inverse probability of treatment weighted (IPTW) method based on the propensity score is one of the approaches utilized to adjust for confounding factors between binary treatment groups. As a generalization of this methodology, we developed an exact formula for an IPTW log-rank test based on the generalized propensity score for survival data. This makes it possible to compare the group differences of IPTW Kaplan-Meier estimators of survival curves using an IPTW log-rank test for multi-valued treatments. As causal treatment effects, the hazard ratio can be estimated using the IPTW approach. If the treatments correspond to ordered levels of a treatment, the proposed method can be easily extended to the analysis of treatment effect patterns with contrast statistics. In this paper, the proposed method is illustrated with data from the Kyushu Lipid Intervention Study (KLIS), which investigated the primary preventive effects of pravastatin on coronary heart disease (CHD). The results of the proposed method suggested that pravastatin treatment reduces the risk of CHD and that compliance to pravastatin treatment is important for the prevention of CHD. (c) 2009 John Wiley & Sons, Ltd.
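For the binary special case, the weighting step can be sketched as follows; the helper name is invented, the propensity model is an ordinary logistic regression, and the weighted Kaplan-Meier call relies on lifelines' per-subject weights (the paper's exact weighted log-rank formula is not reproduced here).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from lifelines import KaplanMeierFitter

def stabilized_iptw(X, treat):
    """Stabilized inverse-probability-of-treatment weights from a logistic
    propensity model; the generalized propensity score extends this idea
    to multi-valued treatments."""
    ps = LogisticRegression(max_iter=1000).fit(X, treat).predict_proba(X)[:, 1]
    p1 = treat.mean()
    return np.where(treat == 1, p1 / ps, (1 - p1) / (1 - ps))

# Weighted survival curve per arm (treat, time, event are numpy arrays):
# w = stabilized_iptw(X, treat)
# KaplanMeierFitter().fit(time[treat == 1], event[treat == 1],
#                         weights=w[treat == 1])
```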
Huberts, W; Donders, W P; Delhaas, T; van de Vosse, F N
2014-12-01
Patient-specific modeling requires model personalization, which can be achieved in an efficient manner by parameter fixing and parameter prioritization. An efficient variance-based method is using generalized polynomial chaos expansion (gPCE), but it has not been applied in the context of model personalization, nor has it ever been compared with standard variance-based methods for models with many parameters. In this work, we apply the gPCE method to a previously reported pulse wave propagation model and compare the conclusions for model personalization with that of a reference analysis performed with Saltelli's efficient Monte Carlo method. We furthermore differentiate two approaches for obtaining the expansion coefficients: one based on spectral projection (gPCE-P) and one based on least squares regression (gPCE-R). It was found that in general the gPCE yields similar conclusions as the reference analysis but at much lower cost, as long as the polynomial metamodel does not contain unnecessary high order terms. Furthermore, the gPCE-R approach generally yielded better results than gPCE-P. The weak performance of the gPCE-P can be attributed to the assessment of the expansion coefficients using the Smolyak algorithm, which might be hampered by the high number of model parameters and/or by possible non-smoothness in the output space. Copyright © 2014 John Wiley & Sons, Ltd.
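A sketch of the reference (Saltelli/Sobol) side of such a comparison using the SALib package; the three-parameter toy model and its names are placeholders for the pulse wave propagation model, and a gPCE metamodel would aim to reproduce the same indices from far fewer runs.

```python
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {'num_vars': 3,
           'names': ['p1', 'p2', 'p3'],        # illustrative parameters
           'bounds': [[0.5, 1.5]] * 3}

X = saltelli.sample(problem, 1024)             # N * (2*D + 2) model runs
Y = X[:, 0] ** 2 + 0.5 * X[:, 1] * X[:, 2]     # stand-in for the model
Si = sobol.analyze(problem, Y)
print(Si['S1'], Si['ST'])                      # first- and total-order indices
```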
A call for differentiated approaches to delivering HIV services to key populations.
Macdonald, Virginia; Verster, Annette; Baggaley, Rachel
2017-07-21
Key populations (KPs) are disproportionally affected by HIV and have low rates of access to HIV testing and treatment services compared to the broader population. WHO promotes the use of differentiated approaches for reaching and recruiting KPs into the HIV services continuum. These approaches may help increase access for KPs, who are often criminalized or stigmatized. By catering to the specific needs of each KP individual, differentiated approaches may increase service acceptability, quality and coverage, reduce costs and support KP members in leading the HIV response among their communities. WHO recommends the implementation of community-based and lay-provider-administered HIV testing services. Together, these approaches reduce barriers and costs associated with other testing strategies, allow greater ownership in HIV programmes for KP members and reach more people than do facility-based services. Despite this evidence, availability of and support for them remain limited. Peer-driven interventions have been shown to be effective in engaging, recruiting and supporting clients. Some programmes employ HIV-positive or non-PLHIV "peer navigators" and other staff to provide case management, enrolment and/or re-enrolment in care and treatment services. However, a better understanding of the impact, cost effectiveness and potential burden on peer volunteers is required. Task shifting and non-facility-based service locations for antiretroviral therapy (ART) initiation and maintenance and antiretroviral (ARV) distribution are recommended in both the consolidated HIV treatment and KP guidelines of WHO. These approaches are accepted in generalized epidemics and for the general population, where successful models exist; however, few organizations provide or initiate ART at KP community-based services. The application of a differentiated service approach for KPs could increase the number of people who know their status and receive effective and sustained prevention and treatment for HIV. However, while community-based and lay provider testing are effective and affordable, they are not implemented at scale. Furthermore, regulatory barriers to legitimizing lay and peer providers as part of healthcare delivery systems need to be overcome in many settings.
Representative Structural Element - A New Paradigm for Multi-Scale Structural Modeling
2016-07-05
…developed by NASA Glenn Research Center based on Aboudi's micromechanics theories [5] that provides a wide range of capabilities for modeling… Moreover, the analyses will not only give a general… interface of heterogeneous materials but also help engineers to use appropriate models for related problems based on the capability of corresponding approaches.
Acting on Information: Representing Actions That Manipulate Information
NASA Technical Reports Server (NTRS)
Golden, Keith
1999-01-01
Information manipulation is the creation of new information based on existing information sources. This paper discusses problems that arise when planning for information manipulation and proposes a novel action representation, called ADLIM, that addresses these problems, including: (1) how to represent information in a way sufficient to express the effects of actions that modify the information (I present a simple, yet expressive, representation of information goals and effects that generalizes earlier work on representing sensing actions); (2) how to concisely represent actions that copy information, or produce new information based on existing information sources (I show how this is a generalization of the frame problem and present a solution based on generalized frame effects); and (3) how to generate a pipeline of information-processing commands that will produce an output containing exactly the desired information (I present a new approach to goal regression).
SPIRiT: Iterative Self-consistent Parallel Imaging Reconstruction from Arbitrary k-Space
Lustig, Michael; Pauly, John M.
2010-01-01
A new approach to autocalibrating, coil-by-coil parallel imaging reconstruction is presented. It is a generalized reconstruction framework based on self-consistency. The reconstruction problem is formulated as an optimization that yields the solution most consistent with the calibration and acquisition data. The approach is general and can accurately reconstruct images from arbitrary k-space sampling patterns. The formulation can flexibly incorporate additional image priors such as off-resonance correction and regularization terms that appear in compressed sensing. Several iterative strategies to solve the posed reconstruction problem in both the image and k-space domains are presented. These are based on projection onto convex sets (POCS) and conjugate gradient (CG) algorithms. Phantom and in-vivo studies demonstrate efficient reconstructions from undersampled Cartesian and spiral trajectories. Reconstructions that include off-resonance correction and nonlinear ℓ1-wavelet regularization are also demonstrated. PMID:20665790
Model-based reconfiguration: Diagnosis and recovery
NASA Technical Reports Server (NTRS)
Crow, Judy; Rushby, John
1994-01-01
We extend Reiter's general theory of model-based diagnosis to a theory of fault detection, identification, and reconfiguration (FDIR). The generality of Reiter's theory readily supports an extension in which the problem of reconfiguration is viewed as a close analog of the problem of diagnosis. Using a reconfiguration predicate 'rcfg' analogous to the abnormality predicate 'ab,' we derive a strategy for reconfiguration by transforming the corresponding strategy for diagnosis. There are two obvious benefits of this approach: algorithms for diagnosis can be exploited as algorithms for reconfiguration and we have a theoretical framework for an integrated approach to FDIR. As a first step toward realizing these benefits we show that a class of diagnosis engines can be used for reconfiguration and we discuss algorithms for integrated FDIR. We argue that integrating recovery and diagnosis is an essential next step if this technology is to be useful for practical applications.
Derivation of Hunt equation for suspension distribution using Shannon entropy theory
NASA Astrophysics Data System (ADS)
Kundu, Snehasis
2017-12-01
In this study, the Hunt equation for computing suspension concentration in sediment-laden flows is derived using Shannon entropy theory. Considering the inverse of the void ratio as a random variable and using the principle of maximum entropy, the probability density function and cumulative distribution function of suspension concentration are derived. A new and more general cumulative distribution function for the flow domain is proposed, which includes several other specific CDF models reported in the literature. This general form of the cumulative distribution function also helps to derive the Rouse equation. The entropy-based approach helps to estimate model parameters using suspension data of sediment concentration, which shows the advantage of using entropy theory. Finally, model parameters in the entropy-based model are also expressed as functions of the Rouse number to establish a link between the parameters of the deterministic and probabilistic approaches.
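The entropy step has the familiar structure below; this is a sketch assuming one normalization and one mean constraint, whereas the paper's constraints and choice of random variable differ in detail.

```latex
\max_{f}\; H[f] = -\int_{0}^{c_{m}} f(c)\,\ln f(c)\,\mathrm{d}c
\quad \text{subject to} \quad
\int_{0}^{c_{m}} f(c)\,\mathrm{d}c = 1,
\qquad
\int_{0}^{c_{m}} c\, f(c)\,\mathrm{d}c = \bar{c}
```

The Lagrangian stationarity condition gives f(c) = exp(λ0 + λ1 c); integrating f yields the cumulative distribution function, and particular parameter choices recover Hunt- and Rouse-type concentration profiles.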
A Model-Based Prognostics Approach Applied to Pneumatic Valves
NASA Technical Reports Server (NTRS)
Daigle, Matthew J.; Goebel, Kai
2011-01-01
Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model that is derived from first principles. Uncertainty cannot be avoided in prediction, therefore, algorithms are employed that help in managing these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.
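The uncertainty-management core of such a method is the predict-weight-resample cycle of a bootstrap particle filter, sketched below for a scalar degradation state; the state and measurement models and all noise levels are invented stand-ins for the detailed valve physics.

```python
import numpy as np

def pf_step(particles, z, f, h, q_std, r_std, rng):
    """One bootstrap particle filter cycle: propagate through the state
    model f, weight by the Gaussian likelihood of measurement z under the
    measurement model h, then resample."""
    particles = f(particles) + rng.normal(0, q_std, particles.shape)
    w = np.exp(-0.5 * ((z - h(particles)) / r_std) ** 2)
    w /= w.sum()
    return particles[rng.choice(len(particles), len(particles), p=w)]

rng = np.random.default_rng(5)
pts = rng.normal(1.0, 0.2, 500)          # particles over a wear parameter
for z in [1.05, 1.11, 1.18]:             # synthetic degradation observations
    pts = pf_step(pts, z, lambda s: 1.02 * s, lambda s: s,
                  q_std=0.01, r_std=0.05, rng=rng)
print(pts.mean())                        # filtered wear estimate
```

Prognosis then follows by propagating the surviving particles forward with f alone until they cross a failure threshold, yielding a distribution over remaining useful life.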
Effective spatial database support for acquiring spatial information from remote sensing images
NASA Astrophysics Data System (ADS)
Jin, Peiquan; Wan, Shouhong; Yue, Lihua
2009-12-01
In this paper, a new approach to maintaining spatial information acquired from remote-sensing images is presented, based on an object-relational DBMS (ORDBMS). According to this approach, the results of target detection and recognition are stored in an ORDBMS-based spatial database system where they can be further accessed, and users can access the spatial information using the standard SQL interface. This approach differs from the traditional ArcSDE-based method, because the spatial information management module is totally integrated into the DBMS and becomes one of its core modules. We focus on three issues, namely the general framework for the ORDBMS-based spatial database system, the definitions of the add-in spatial data types and operators, and the process of developing a spatial DataBlade on Informix. The results show that ORDBMS-based spatial database support for image-based target detection and recognition is easy and practical to implement.
Escaping the healthcare leadership cul-de-sac.
Edmonstone, John Duncan
2017-02-06
Purpose: This paper aims to propose that healthcare is dominated by a managerialist ideology, powerfully shaped by business schools and embodied in the Masters in Business Administration. It suggests that there may be unconscious collusion between universities, healthcare employers and student leaders and managers. Design/methodology/approach: Based on a review of relevant literature, the paper examines critiques of managerialism generally and explores the assumptions behind leadership development. It draws upon work which suggests that leading in healthcare organisations is fundamentally different and proposes that leadership development should be more practice-based. Findings: The way forward for higher education institutions is to include work- or practice-based approaches alongside academic approaches. Practical implications: The paper suggests that there is a challenge for higher education institutions to adopt and integrate practice-based development methods into their programme designs. Originality/value: The paper provides a challenge to the future role of higher education institutions in developing leadership in healthcare.
Quantum description of light propagation in generalized media
NASA Astrophysics Data System (ADS)
Häyrynen, Teppo; Oksanen, Jani
2016-02-01
Linear quantum input-output relation based models are widely applied to describe the light propagation in a lossy medium. The details of the interaction and the associated added noise depend on whether the device is configured to operate as an amplifier or an attenuator. Using the traveling wave (TW) approach, we generalize the linear material model to simultaneously account for both the emission and absorption processes and to have point-wise defined noise field statistics and intensity dependent interaction strengths. Thus, our approach describes the quantum input-output relations of linear media with net attenuation, amplification or transparency without pre-selection of the operation point. The TW approach is then applied to investigate materials at thermal equilibrium, inverted materials, the transparency limit where losses are compensated, and the saturating amplifiers. We also apply the approach to investigate media in nonuniform states which can be e.g. consequences of a temperature gradient over the medium or a position dependent inversion of the amplifier. Furthermore, by using the generalized model we investigate devices with intensity dependent interactions and show how an initial thermal field transforms to a field having coherent statistics due to gain saturation.
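For reference, the fixed-operation-point input-output relations that such models generalize take the standard forms below, where τ is the transmittance, G the gain, and f̂ a reservoir noise operator; the TW approach replaces this single global choice with point-wise emission and absorption along the medium.

```latex
\hat{a}_{\mathrm{out}} = \sqrt{\tau}\,\hat{a}_{\mathrm{in}}
  + \sqrt{1-\tau}\,\hat{f}
  \qquad (\text{attenuator},\ 0 \le \tau \le 1),
\qquad
\hat{a}_{\mathrm{out}} = \sqrt{G}\,\hat{a}_{\mathrm{in}}
  + \sqrt{G-1}\,\hat{f}^{\dagger}
  \qquad (\text{amplifier},\ G \ge 1)
```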
Trait-based approaches for understanding microbial biodiversity and ecosystem functioning
Krause, Sascha; Le Roux, Xavier; Niklaus, Pascal A.; Van Bodegom, Peter M.; Lennon, Jay T.; Bertilsson, Stefan; Grossart, Hans-Peter; Philippot, Laurent; Bodelier, Paul L. E.
2014-01-01
In ecology, biodiversity-ecosystem functioning (BEF) research has seen a shift in perspective from taxonomy to function in the last two decades, with successful application of trait-based approaches. This shift offers opportunities for a deeper mechanistic understanding of the role of biodiversity in maintaining multiple ecosystem processes and services. In this paper, we highlight studies that have focused on BEF of microbial communities with an emphasis on integrating trait-based approaches to microbial ecology. In doing so, we explore some of the inherent challenges and opportunities of understanding BEF using microbial systems. For example, microbial biologists characterize communities using gene phylogenies that are often unable to resolve functional traits. Additionally, experimental designs of existing microbial BEF studies are often inadequate to unravel BEF relationships. We argue that combining eco-physiological studies with contemporary molecular tools in a trait-based framework can reinforce our ability to link microbial diversity to ecosystem processes. We conclude that such trait-based approaches are a promising framework to increase the understanding of microbial BEF relationships and thus generating systematic principles in microbial ecology and more generally ecology. PMID:24904563
An ensemble method for extracting adverse drug events from social media.
Liu, Jing; Zhao, Songzheng; Zhang, Xiaodi
2016-06-01
Because adverse drug events (ADEs) are a serious health problem and a leading cause of death, it is of vital importance to identify them correctly and in a timely manner. With the development of Web 2.0, social media has become a large data source for information on ADEs. The objective of this study is to develop a relation extraction system that uses natural language processing techniques to effectively distinguish between ADEs and non-ADEs in informal text on social media. We develop a feature-based approach that utilizes various lexical, syntactic, and semantic features. Information-gain-based feature selection is performed to address high-dimensional features. Then, we evaluate the effectiveness of four well-known kernel-based approaches (i.e., subset tree kernel, tree kernel, shortest dependency path kernel, and all-paths graph kernel) and several ensembles that are generated by adopting different combination methods (i.e., majority voting, weighted averaging, and stacked generalization). All of the approaches are tested using three data sets: two health-related discussion forums and one general social networking site (i.e., Twitter). When investigating the contribution of each feature subset, the feature-based approach attains the best area under the receiver operating characteristics curve (AUC) values, which are 78.6%, 72.2%, and 79.2% on the three data sets. When individual methods are used, we attain the best AUC values of 82.1%, 73.2%, and 77.0% using the subset tree kernel, shortest dependency path kernel, and feature-based approach on the three data sets, respectively. When using classifier ensembles, we achieve the best AUC values of 84.5%, 77.3%, and 84.5% on the three data sets, outperforming the baselines. Our experimental results indicate that ADE extraction from social media can benefit from feature selection. With respect to the effectiveness of different feature subsets, lexical features and semantic features can enhance the ADE extraction capability. Kernel-based approaches, which avoid the feature sparsity issue, are well suited to the ADE extraction problem. Combining different individual classifiers using suitable combination methods can further enhance the ADE extraction effectiveness.
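The combination methods named in the abstract map directly onto standard library components. A minimal scikit-learn sketch follows; the feature matrix, labels, and base learners below are toy stand-ins, not the paper's actual features or pipeline, and mutual information is used here as a proxy for information-gain feature selection.

```python
# Minimal sketch of ensemble ADE classification (not the paper's exact pipeline).
# X stands in for lexical/syntactic/semantic features; y in {0, 1} marks ADE vs non-ADE.
import numpy as np
from sklearn.ensemble import StackingClassifier, VotingClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = rng.normal(size=(200, 50)), rng.integers(0, 2, 200)  # toy stand-ins

select = SelectKBest(mutual_info_classif, k=20)  # information-gain-style selection

base = [("svm", SVC(probability=True)), ("lr", LogisticRegression(max_iter=1000))]
voting = make_pipeline(select, VotingClassifier(base, voting="soft"))  # weighted averaging
stacked = make_pipeline(select, StackingClassifier(base))              # stacked generalization
voting.fit(X, y)
stacked.fit(X, y)
```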
Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology
Gao, Fei; Li, Ye; Novak, Igor L.; Slepchenko, Boris M.
2016-01-01
Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium ‘sparks’ as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell. PMID:27959915
Nonequilibrium Statistical Operator Method and Generalized Kinetic Equations
NASA Astrophysics Data System (ADS)
Kuzemsky, A. L.
2018-01-01
We consider some principal problems of nonequilibrium statistical thermodynamics in the framework of the Zubarev nonequilibrium statistical operator approach. We present a brief comparative analysis of some approaches to describing irreversible processes based on the concept of nonequilibrium Gibbs ensembles and their applicability to describing nonequilibrium processes. We discuss the derivation of generalized kinetic equations for a system in a heat bath. We obtain and analyze a damped Schrödinger-type equation for a dynamical system in a heat bath. We study the dynamical behavior of a particle in a medium taking the dissipation effects into account. We consider the scattering problem for neutrons in a nonequilibrium medium and derive a generalized Van Hove formula. We show that the nonequilibrium statistical operator method is an effective, convenient tool for describing irreversible processes in condensed matter.
Using complex networks towards information retrieval and diagnostics in multidimensional imaging
NASA Astrophysics Data System (ADS)
Banerjee, Soumya Jyoti; Azharuddin, Mohammad; Sen, Debanjan; Savale, Smruti; Datta, Himadri; Dasgupta, Anjan Kr; Roy, Soumen
2015-12-01
We present a fresh and broad yet simple approach towards information retrieval in general and diagnostics in particular by applying the theory of complex networks on multidimensional, dynamic images. We demonstrate a successful use of our method with the time series generated from high content thermal imaging videos of patients suffering from the aqueous deficient dry eye (ADDE) disease. Remarkably, network analyses of thermal imaging time series of contact lens users and patients upon whom Laser-Assisted in situ Keratomileusis (Lasik) surgery has been conducted, exhibit pronounced similarity with results obtained from ADDE patients. We also propose a general framework for the transformation of multidimensional images to networks for futuristic biometry. Our approach is general and scalable to other fluctuation-based devices where network parameters derived from fluctuations, act as effective discriminators and diagnostic markers.
Portfolio optimization using median-variance approach
NASA Astrophysics Data System (ADS)
Wan Mohd, Wan Rosanisah; Mohamad, Daud; Mohamed, Zulkifli
2013-04-01
Optimization models have been applied in many decision-making problems, particularly in portfolio selection. Since the introduction of Markowitz's theory of portfolio selection, various approaches based on mathematical programming have been introduced, such as mean-variance, mean-absolute deviation, mean-variance-skewness and conditional value-at-risk (CVaR), mainly to maximize return and minimize risk. However, most of these approaches assume that the data are normally distributed, which is not generally true. As an alternative, in this paper, we employ the median-variance approach to improve portfolio optimization. This approach caters for both normal and non-normal data distributions. With this representation, we analyze and compare the rate of return and risk of mean-variance and median-variance based portfolios consisting of 30 stocks from Bursa Malaysia. The results in this study show that the median-variance approach is capable of producing a lower risk for each level of return compared to the mean-variance approach.
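The computational difference is small enough to show in a few lines: risk is measured as dispersion about the median rather than the mean. A minimal sketch, with simulated skewed returns standing in for the study's Bursa Malaysia data:

```python
# Sketch: mean-variance vs median-variance location/risk estimates for a portfolio
# (illustrative only; the returns below are simulated, not Bursa Malaysia data).
import numpy as np

rng = np.random.default_rng(1)
# 250 daily returns for 5 assets, skewed to mimic non-normal data
returns = rng.gamma(shape=2.0, scale=0.01, size=(250, 5)) - 0.015

w = np.full(5, 0.2)                              # equal-weight portfolio
port = returns @ w

mean_ret, median_ret = port.mean(), np.median(port)
risk_mean = np.mean((port - mean_ret) ** 2)      # classical variance
risk_median = np.mean((port - median_ret) ** 2)  # dispersion about the median
print(mean_ret, median_ret, risk_mean, risk_median)
```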
Multi Sensor Fusion Using Fitness Adaptive Differential Evolution
NASA Astrophysics Data System (ADS)
Giri, Ritwik; Ghosh, Arnob; Chowdhury, Aritra; Das, Swagatam
The rising popularity of multi-source, multi-sensor networks supporting real-life applications calls for an efficient and intelligent approach to information fusion. Traditional optimization techniques often fail to meet the demands. The evolutionary approach provides a valuable alternative due to its inherent parallel nature and its ability to deal with difficult problems. We present a new evolutionary approach based on a modified version of Differential Evolution (DE), called Fitness Adaptive Differential Evolution (FiADE). FiADE treats sensors in the network as distributed intelligent agents with various degrees of autonomy. Existing approaches based on intelligent agents cannot completely answer the question of how their agents could coordinate their decisions in a complex environment. The proposed approach is formulated to produce good results for problems that are high-dimensional, highly nonlinear, and random. The proposed approach gives better results in the case of optimal allocation of sensors. The performance of the proposed approach is compared with an evolutionary algorithm coordination generalized particle model (C-GPM).
A quantitative approach to measure road network information based on edge diversity
NASA Astrophysics Data System (ADS)
Wu, Xun; Zhang, Hong; Lan, Tian; Cao, Weiwei; He, Jing
2015-12-01
The measure of map information has been one of the key issues in assessing cartographic quality and map generalization algorithms. It is also important for developing efficient approaches to transfer geospatial information. The road network is the most common linear object in the real world. An approximate description of road network information will benefit road map generalization, navigation map production and urban planning. Most current approaches have focused on node diversities and supposed that all edges are the same, which is inconsistent with real-life conditions, and thus show limitations in measuring network information. As real-life traffic flows are directed and of different quantities, the original undirected vector road map was first converted to a directed topographic connectivity map. Then, in consideration of preferential attachment in complex network studies and the rich-club phenomenon in social networks, the from and to weights of each edge are assigned. The from weight of a given edge is defined as the ratio of the connectivity of its end node to the sum of the connectivities of all neighbors of the edge's from node. After obtaining the from and to weights of each edge, the edge information, node information and whole-network structure information entropies can be obtained based on information theory. The approach has been applied to several 1-square-mile road network samples. Results show that information entropies based on edge diversities can successfully describe the structural differences of road networks. This approach is complementary to current map information measurements, and can be extended to measure other kinds of geographical objects.
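Once edge weights are fixed, the entropy computation itself is standard Shannon entropy. A minimal sketch (the edges and weight values below are placeholders, and normalizing the weights into a probability distribution is a simplified reading of the paper's from/to weighting, not its exact formula):

```python
# Sketch: Shannon entropy of a road network computed over edge weights
# (a simplified reading of the paper's from/to weighting, not its exact formula).
import math

# directed edges with connectivity-based weights, e.g. from a topological map
edge_weights = {("a", "b"): 0.4, ("b", "c"): 0.3, ("c", "a"): 0.2, ("b", "a"): 0.1}

total = sum(edge_weights.values())
probs = [w / total for w in edge_weights.values()]     # normalize to a distribution
H_edges = -sum(p * math.log2(p) for p in probs)        # network-level entropy, in bits
print(f"edge information entropy: {H_edges:.3f} bits")
```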
NASA Astrophysics Data System (ADS)
Skrzypek, Josef; Mesrobian, Edmond; Gungner, David J.
1989-03-01
The development of autonomous land vehicles (ALV) capable of operating in an unconstrained environment has proven to be a formidable research effort. The unpredictability of events in such an environment calls for the design of a robust perceptual system, an impossible task requiring the programming of a system based on the expectation of future, unconstrained events. Hence, the need for a "general purpose" machine vision system that is capable of perceiving and understanding images in an unconstrained environment in real-time. The research undertaken at the UCLA Machine Perception Laboratory addresses this need by focusing on two specific issues: 1) the long-term goals for machine vision research as a joint effort between the neurosciences and computer science; and 2) a framework for evaluating progress in machine vision. In the past, vision research has been carried out independently within different fields including neurosciences, psychology, computer science, and electrical engineering. Our interdisciplinary approach to vision research is based on the rigorous combination of computational neuroscience, as derived from neurophysiology and neuropsychology, with computer science and electrical engineering. The primary motivation behind our approach is that the human visual system is the only existing example of a "general purpose" vision system and, using a neurally based computing substrate, it can complete all necessary visual tasks in real-time.
Chylek, Lily A.; Harris, Leonard A.; Tung, Chang-Shung; Faeder, James R.; Lopez, Carlos F.
2013-01-01
Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and post-translational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). PMID:24123887
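The scaling claim at the heart of this abstract (states multiply while rules add) is easy to make concrete. The toy calculation below is illustrative only; it assumes a protein with n independent binary sites whose kinetics are modular, so each site can be described by its own rule.

```python
# The combinatorial argument behind rule-based modeling, made concrete:
# a protein with n independent binary sites has 2**n possible states, but if
# the sites act modularly its kinetics need only on the order of n rules.
for n in (5, 10, 20):
    print(f"{n} sites -> {2**n} states, ~{n} rules")
```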
2012-01-01
systems. For some specific sensor requirements in the domains considered here, for example, assessing system behavior and component state in gas turbine ... Cost Objectives. In general, the implication of the suitability and life cycle cost [LCC] driven objectives for integrated instrumentation/sensor system ... section should be considered. In general, the systems engineering approach provided clear benefits in defining user significant IISS system requirements and
Dash, Tanya; Kar, Bhoomika R.
2014-01-01
Background. Bilingualism results in an added advantage with respect to cognitive control. The interaction between bilingual language control and general purpose cognitive control systems can also be understood by studying executive control among individuals with bilingual aphasia. Objectives. The current study examined the subcomponents of cognitive control in bilingual aphasia. A case study approach was used to investigate whether cognitive control and language control are two separate systems and how factors related to bilingualism interact with control processes. Methods. Four individuals with bilingual aphasia completed a language background questionnaire, a picture description task, and two experimental tasks (a nonlinguistic negative priming task and linguistic and nonlinguistic versions of a flanker task). Results. A descriptive approach was used to analyse the data using reaction time and accuracy measures. Cumulative distribution function plots were used to visualize the variations in performance across conditions. The results highlight the distinction between general purpose cognitive control and bilingual language control mechanisms. Conclusion. All participants showed predominant use of the reactive control mechanism to compensate for a limited-resources system. Independent yet interactive systems for bilingual language control and general purpose cognitive control were postulated based on the experimental data derived from individuals with bilingual aphasia. PMID:24982591
Vijayalaxmi; Scarfi, Maria R.
2014-01-01
The escalated use of various wireless communication devices, which emit non-ionizing radiofrequency (RF) fields, has raised concerns among the general public regarding the potential adverse effects on human health. During the last six decades, researchers have used different parameters to investigate the effects of in vitro and in vivo exposures of animals and humans or their cells to RF fields. Data reported in peer-reviewed scientific publications were contradictory: some indicated effects while others did not. International organizations have considered all of these data as well as the observations reported in human epidemiological investigations to set up guidelines or standards (based on the quality of published studies and the “weight of scientific evidence” approach) for RF exposures in occupationally exposed individuals and the general public. Scientists with relevant expertise in various countries have also considered the published data to provide the required scientific information for policy-makers to develop and disseminate authoritative health information to the general public regarding RF exposures. This paper is a compilation of the conclusions, on the biological effects of RF exposures, from various national and international expert groups, based on their analyses. In general, the expert groups suggested a reduction in exposure levels, a precautionary approach, and further research. PMID:25211777
Loya, Salvador Rodriguez; Kawamoto, Kensaku; Chatwin, Chris; Huser, Vojtech
2014-12-01
The use of a service-oriented architecture (SOA) has been identified as a promising approach for improving health care by facilitating reliable clinical decision support (CDS). A review of the literature through October 2013 identified 44 articles on this topic. The review suggests that SOA related technologies such as Business Process Model and Notation (BPMN) and Service Component Architecture (SCA) have not been generally adopted to impact health IT systems' performance for better care solutions. Additionally, technologies such as Enterprise Service Bus (ESB) and architectural approaches like Service Choreography have not been generally exploited among researchers and developers. Based on the experience of other industries and our observation of the evolution of SOA, we found that the greater use of these approaches have the potential to significantly impact SOA implementations for CDS.
Yang, S; Wang, D
2000-01-01
This paper presents a constraint satisfaction adaptive neural network, together with several heuristics, to solve the generalized job-shop scheduling problem, one of the NP-complete constraint satisfaction problems. The proposed neural network can be easily constructed and can adaptively adjust its connection weights and unit biases based on the sequence and resource constraints of the job-shop scheduling problem during its processing. Several heuristics that can be combined with the neural network are also presented. In the combined approaches, the neural network is used to obtain feasible solutions, while the heuristic algorithms are used to improve the performance of the neural network and the quality of the obtained solutions. Simulations have shown that the proposed neural network and its combined approaches are efficient with respect to the quality of solutions and the solving speed.
Walkowski, Slawomir; Lundin, Mikael; Szymas, Janusz; Lundin, Johan
2015-01-01
The way of viewing whole slide images (WSI) can be tracked and analyzed. In particular, it can be useful to learn how medical students view WSIs during exams and how their viewing behavior is correlated with the correctness of the answers they give. We used a software-based view path tracking method that enabled gathering data about the viewing behavior of multiple simultaneous WSI users. This approach was implemented and applied during two practical exams in oral pathology in 2012 (88 students) and 2013 (91 students), which were based on questions with attached WSIs. Gathered data were visualized and analyzed in multiple ways. As a part of extended analysis, we tried to use machine learning approaches to predict the correctness of students' answers based on how they viewed WSIs. We compared the results of analyses for years 2012 and 2013 - done for a single question, for student groups, and for a set of questions. The overall patterns were generally consistent across the two exam years. Moreover, viewing behavior data appeared to have certain potential for predicting answers' correctness, and some outcomes of the machine learning approaches were in the right direction. However, general prediction results were not satisfactory in terms of precision and recall. Our work confirmed that the view path tracking method is useful for discovering the viewing behavior of students analyzing WSIs. It provided multiple useful insights in this area, and the general results of our analyses were consistent across the two exams. On the other hand, predicting answers' correctness appeared to be a difficult task - students' answers often seem to be unpredictable.
ERIC Educational Resources Information Center
Mahaffy, Peter G.; Holme, Thomas A.; Martin-Visscher, Leah; Martin, Brian E.; Versprille, Ashley; Kirchhoff, Mary; McKenzie, Lallie; Town, Marcy
2017-01-01
As one approach to moving beyond transmitting "inert" ideas to chemistry students, we use the term "teaching from rich contexts" to describe implementations of case studies or context-based learning based on systems thinking that provide deep and rich opportunities for learning crosscutting concepts through contexts. This…
Prediction of Vehicle Mobility on Large-Scale Soft-Soil Terrain Maps Using Physics-Based Simulation
2016-08-04
soil type. The modeling approach is based on (i) a seamless integration of multibody dynamics and discrete element method (DEM) solvers, and (ii) ... ensure that the vehicle follows a desired path. The soil is modeled as a Discrete Element Model (DEM) with a general cohesive material model that is
ERIC Educational Resources Information Center
Nielson Vargas, Erika Koren
2017-01-01
Success in developmental education contexts requires support not just in cognitive skills, but also in affective areas. One approach showing promise in supporting students in affective areas is mindfulness training. Mindfulness-based interventions (MBIs) can support affective needs and provide coping strategies in general as well as in some…
ERIC Educational Resources Information Center
Gehring, Kathleen M.; Eastman, Deborah A.
2008-01-01
Many initiatives for the improvement of undergraduate science education call for inquiry-based learning that emphasizes investigative projects and reading of the primary literature. These approaches give students an understanding of science as a process and help them integrate content presented in courses. At the same time, general initiatives to…
A New Design Approach to Game-Based Learning
ERIC Educational Resources Information Center
Larsen, Lasse Juel
2012-01-01
This paper puts forward a new design perspective for game-based learning. The general idea is to abandon the long sought-after dream of designing a closed learning system, where students in both primary and secondary school could learn--without the interference of teachers--whatever subject they wanted while sitting in front of a computer. This…
Dharani, S; Rakkiyappan, R; Cao, Jinde; Alsaedi, Ahmed
2017-08-01
This paper explores the problem of synchronization of a class of generalized reaction-diffusion neural networks with mixed time-varying delays. The mixed time-varying delays under consideration comprise both discrete and distributed delays. Due to the development and merits of digital controllers, sampled-data control is a natural choice to establish synchronization in continuous-time systems. Using a newly introduced integral inequality, less conservative synchronization criteria that assure the global asymptotic synchronization of the considered generalized reaction-diffusion neural networks with mixed delays are established in terms of linear matrix inequalities (LMIs). The obtained easy-to-test LMI-based synchronization criteria depend on the delay bounds in addition to the reaction-diffusion terms, which is more practicable. Upon solving these LMIs with the Matlab LMI control toolbox, a desired sampled-data controller gain can be acquired without any difficulty. Finally, numerical examples are provided to demonstrate the validity of the derived LMI-based synchronization criteria.
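To show the shape of such feasibility problems, here is a toy Lyapunov-stability LMI posed in Python with cvxpy. This toolchain is an assumed stand-in (the paper uses the Matlab LMI control toolbox), and the paper's synchronization LMIs are far larger than this two-by-two example.

```python
# Toy LMI feasibility problem in the spirit of LMI-based stability criteria
# (a simple Lyapunov inequality, not the paper's synchronization LMIs;
# cvxpy's bundled SCS solver can handle this semidefinite program).
import cvxpy as cp
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -3.0]])             # a stable test matrix
P = cp.Variable((2, 2), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(2),                 # P positive definite
               A.T @ P + P @ A << -eps * np.eye(2)]  # Lyapunov inequality
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print(prob.status, P.value)
```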
Regional Frequency and Uncertainty Analysis of Extreme Precipitation in Bangladesh
NASA Astrophysics Data System (ADS)
Mortuza, M. R.; Demissie, Y.; Li, H. Y.
2014-12-01
Increased frequency of extreme precipitation, especially events of multiday duration, is responsible for recent urban floods and the associated significant losses of lives and infrastructure in Bangladesh. Reliable and routinely updated estimation of the frequency of occurrence of such extreme precipitation events is thus important for developing up-to-date hydraulic structures and stormwater drainage systems that can effectively minimize future risk from similar events. In this study, we have updated the intensity-duration-frequency (IDF) curves for Bangladesh using daily precipitation data from 1961 to 2010 and quantified the associated uncertainties. Regional frequency analysis based on L-moments is applied to 1-day, 2-day and 5-day annual maximum precipitation series due to its advantages over at-site estimation. The regional frequency approach pools the information from climatologically similar sites to make reliable estimates of quantiles, given that the pooling group is homogeneous and of reasonable size. We have used the region of influence (ROI) approach along with a homogeneity measure based on L-moments to identify the homogeneous pooling groups for each site. Five 3-parameter distributions (i.e., Generalized Logistic, Generalized Extreme Value, Generalized Normal, Pearson Type Three, and Generalized Pareto) are used for a thorough selection of appropriate models that fit the sample data. Uncertainties related to the selection of the distributions and the historical data are quantified using the Bayesian Model Averaging and balanced bootstrap approaches, respectively. The results from this study can be used to update the current design and management of hydraulic structures as well as to explore spatio-temporal variations of extreme precipitation and the associated risk.
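The basic building block of such an analysis, fitting one candidate distribution to annual maxima and reading off a return level, fits in a few lines. The sketch below fits a GEV by at-site maximum likelihood to synthetic data; the paper's regional L-moment pooling is more involved, so treat this as an illustration of the quantile estimation step only.

```python
# Sketch: at-site GEV fit and 100-year return level for annual maxima
# (synthetic data; the paper pools sites via L-moments rather than at-site MLE).
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
annual_max_precip = genextreme.rvs(c=-0.1, loc=80, scale=20,
                                   size=50, random_state=rng)  # mm, synthetic

c, loc, scale = genextreme.fit(annual_max_precip)
T = 100                                                  # return period, years
level_100yr = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
print(f"estimated 100-year daily maximum: {level_100yr:.1f} mm")
```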
NASA Astrophysics Data System (ADS)
Linker, Thomas M.; Lee, Glenn S.; Beekman, Matt
2018-06-01
The semi-analytical methods of thermoelectric energy conversion efficiency calculation based on the cumulative properties approach and the reduced variables approach are compared for 21 high performance thermoelectric materials. Both approaches account for the temperature dependence of the material properties as well as the Thomson effect; thus the predicted conversion efficiencies are generally lower than those based on the conventional thermoelectric figure of merit ZT for nearly all of the materials evaluated. The two methods also predict material energy conversion efficiencies that are in very good agreement with each other, even for large temperature differences (average percent difference of 4% with a maximum observed deviation of 11%). The tradeoff between obtaining a reliable assessment of a material's potential for thermoelectric applications and the complexity of implementation of the three models, as well as the advantages of using more accurate modeling approaches in evaluating new thermoelectric materials, are highlighted.
Pilling, Valerie K; Brannon, Laura A
2007-01-01
Health communication appeals were utilized through a Web site simulation to evaluate the potential effectiveness of 3 intervention approaches to promote responsible drinking among college students. Within the Web site simulation, participants were exposed to a persuasive message designed to represent either the generalized social norms advertising approach (based on others' behavior), the personalized behavioral feedback approach (tailored to the individual's behavior), or the schema-based approach (tailored to the individual's self-schema, or personality). A control group was exposed to a message that was designed to be neutral (it was designed to discourage heavy drinking, but it did not represent any of the previously mentioned approaches). It was hypothesized that the more personalized the message was to the individual, the more favorable college students' attitudes would be toward the responsible drinking message. Participants receiving the more personalized messages did report more favorable attitudes toward the responsible drinking message.
Devarajan, Karthik; Wang, Guoli; Ebrahimi, Nader
2015-04-01
Non-negative matrix factorization (NMF) is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into the product of two nonnegative matrices, W and H , such that V ∼ W H . It has been shown to have a parts-based, sparse representation of the data. NMF has been successfully applied in a variety of areas such as natural language processing, neuroscience, information retrieval, image processing, speech recognition and computational biology for the analysis and interpretation of large-scale data. There has also been simultaneous development of a related statistical latent class modeling approach, namely, probabilistic latent semantic indexing (PLSI), for analyzing and interpreting co-occurrence count data arising in natural language processing. In this paper, we present a generalized statistical approach to NMF and PLSI based on Renyi's divergence between two non-negative matrices, stemming from the Poisson likelihood. Our approach unifies various competing models and provides a unique theoretical framework for these methods. We propose a unified algorithm for NMF and provide a rigorous proof of monotonicity of multiplicative updates for W and H . In addition, we generalize the relationship between NMF and PLSI within this framework. We demonstrate the applicability and utility of our approach as well as its superior performance relative to existing methods using real-life and simulated document clustering data.
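The Poisson-likelihood (KL-divergence) case that anchors this unified framework has well-known multiplicative updates. The numpy sketch below implements that classical special case (the Lee-Seung KL updates), not the paper's generalized Renyi-divergence algorithm.

```python
# Classical multiplicative updates for NMF under the KL/Poisson objective
# (the special case contained in the paper's Renyi family; not its full algorithm).
import numpy as np

def nmf_kl(V, rank, iters=200, eps=1e-9):
    n, m = V.shape
    rng = np.random.default_rng(0)
    W, H = rng.random((n, rank)) + eps, rng.random((rank, m)) + eps
    for _ in range(iters):
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.sum(axis=0, keepdims=True).T + eps)
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (H.sum(axis=1, keepdims=True).T + eps)
    return W, H

V = np.abs(np.random.default_rng(1).random((30, 20)))   # toy nonnegative data
W, H = nmf_kl(V, rank=5)
print(np.linalg.norm(V - W @ H))                        # reconstruction error
```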
NASA Astrophysics Data System (ADS)
Balaji, Nidish Narayanaa; Krishna, I. R. Praveen; Padmanabhan, C.
2018-05-01
The Harmonic Balance Method (HBM) is a frequency-domain based approximation approach used for obtaining the steady state periodic behavior of forced dynamical systems. Intrinsically these systems are non-autonomous and the method offers many computational advantages over time-domain methods when the fundamental period of oscillation is known (generally fixed as the forcing period itself or a corresponding sub-harmonic if such behavior is expected). In the current study, a modified approach, based on He's Energy Balance Method (EBM), is applied to obtain the periodic solutions of conservative systems. It is shown that by this approach, periodic solutions of conservative systems on iso-energy manifolds in the phase space can be obtained very efficiently. The energy level provides the additional constraint on the HBM formulation, which enables the determination of the period of the solutions. The method is applied to the linear harmonic oscillator, a couple of nonlinear oscillators, the elastic pendulum and the Henon-Heiles system. The approach is used to trace the bifurcations of the periodic solutions of the last two, both being two-degree-of-freedom systems exhibiting very rich dynamical behavior. In the process, the advantages offered by the current formulation of the energy balance are brought out. A harmonic perturbation approach is used to evaluate the stability of the solutions for the bifurcation diagram.
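As a concrete instance of the energy balance idea (a standard textbook calculation, not the paper's iso-energy formulation), He's EBM recovers the amplitude-dependent frequency of the undamped Duffing oscillator in one line:

```latex
% He's energy balance for u'' + u + \epsilon u^3 = 0, u(0) = A, u'(0) = 0:
% insert the trial u = A\cos\omega t into the Hamiltonian
% H = \tfrac{1}{2}\dot{u}^2 + \tfrac{1}{2}u^2 + \tfrac{1}{4}\epsilon u^4
% and balance the energy at \omega t = \pi/4 against H(0):
\frac{A^2\omega^2}{4} + \frac{A^2}{4} + \frac{\epsilon A^4}{16}
  = \frac{A^2}{2} + \frac{\epsilon A^4}{4}
\;\Longrightarrow\;
\omega = \sqrt{1 + \tfrac{3}{4}\epsilon A^2}.
```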
Clark-Wilson, Jo; Baxter, Doreen M.
2014-01-01
Background: Introduced in the 1980s, the neurofunctional approach (NFA) is one of the few interventions designed primarily for clients with severe deficits following traumatic brain injury (TBI). Specifically, the NFA was intended for those individuals who were limited in their ability to solve novel problems or generalize skills from one setting to another and whose lack of insight limited their engagement in the rehabilitative process. Description of the approach: The NFA is a client-centred, goal-driven approach that incorporates the principles of skill learning and promotes the development of routines and competencies in practical activities required for everyday living. Programmes based on the NFA are developed specifically to meet each client’s unique needs, using a range of evidence-based interventions. Recent evidence: Recently the NFA has been found to be more effective than cognitive retraining for some individuals with moderate-to-severe TBI who have deficits in activities of daily living. This paper aims to define the core features of the NFA, outline the theoretical basis on which it is founded and consider implications of the findings for rehabilitation after TBI in general. The NFA is highly relevant for clients living in the community who require a case manager to direct an integrated rehabilitation programme or provide structured input for the long-term maintenance of skills. PMID:25153760
Mathematical models for non-parametric inferences from line transect data
Burnham, K.P.; Anderson, D.R.
1976-01-01
A general mathematical theory of line transects is developed which supplies a framework for nonparametric density estimation based on either right angle or sighting distances. The probability of observing a point given its right angle distance (y) from the line is generalized to an arbitrary function g(y). Given only that g(0) = 1, it is shown there are nonparametric approaches to density estimation using the observed right angle distances. The model is then generalized to include sighting distances (r). Let f(y | r) be the conditional distribution of right angle distance given sighting distance. It is shown that nonparametric estimation based only on sighting distances requires that we know the transformation of r given by f(0 | r).
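The practical payoff of the g(0) = 1 condition is the classical line-transect density estimator, stated below in its standard form (a textbook consequence consistent with, though not quoted from, this abstract):

```latex
% With n detections on a transect of length L and f(y) the probability
% density of observed right angle (perpendicular) distances, g(0) = 1 gives
\widehat{D} = \frac{n\,\widehat{f}(0)}{2L},
% so any nonparametric estimate of f at zero yields a density estimate.
```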
Turbelin, Clément; Boëlle, Pierre-Yves
2010-01-01
Web-based applications are a tool of choice for general practice based epidemiological surveillance; however, their use may disrupt the general practitioners' (GPs) work process. In this article, we propose an alternative approach based on a desktop client application. This was developed for use in the French General Practitioners Sentinel Network. We developed a Java application running as a client on the local GP computer. It allows reporting cases to a central server and provides feedback to the participating GPs. XML was used to describe surveillance protocols and questionnaires as well as instances of case descriptions. An evaluation of the users' opinions was carried out and the impact on the timeliness and completeness of surveillance data was measured. Better integration in the work process was reported, especially when the software was used at the time of consultation. Reports were received more frequently with less missing data. This study highlights the potential of allowing multiple ways of interaction with the surveillance system to increase the participation of GPs and the quality of surveillance.
Interpreting the International Right to Health in a Human Rights-Based Approach to Health.
Hunt, Paul
2016-12-01
This article tracks the shifting place of the international right to health, and human rights-based approaches to health, in the scholarly literature and United Nations (UN). From 1993 to 1994, the focus began to move from the right to health toward human rights-based approaches to health, including human rights guidance adopted by UN agencies in relation to specific health issues. There is a compelling case for a human rights-based approach to health, but it runs the risk of playing down the right to health, as evidenced by an examination of some UN human rights guidance. The right to health has important and distinctive qualities that are not provided by other rights-consequently, playing down the right to health can diminish rights-based approaches to health, as well as the right to health itself. Because general comments, the reports of UN Special Rapporteurs, and UN agencies' guidance are exercises in interpretation, I discuss methods of legal interpretation. I suggest that the International Covenant on Economic, Social and Cultural Rights permits distinctive interpretative methods within the boundaries established by the Vienna Convention on the Law of Treaties. I call for the right to health to be placed explicitly at the center of a rights-based approach and interpreted in accordance with public international law and international human rights law.
Pixel-based flood mapping from SAR imagery: a comparison of approaches
NASA Astrophysics Data System (ADS)
Landuyt, Lisa; Van Wesemael, Alexandra; Van Coillie, Frieke M. B.; Verhoest, Niko E. C.
2017-04-01
Due to their all-weather, day and night capabilities, SAR sensors have been shown to be particularly suitable for flood mapping applications. Thus, they can provide spatially distributed flood extent data which are valuable for calibrating, validating and updating flood inundation models. These models are an invaluable tool for water managers to take appropriate measures in times of high water levels. Image analysis approaches to delineate flood extent on SAR imagery are numerous. They can be classified into two categories, i.e. pixel-based and object-based approaches. Pixel-based approaches, e.g. thresholding, are abundant and in general computationally inexpensive. However, large discrepancies between these techniques exist and often subjective user intervention is needed. Object-based approaches require more processing but allow for the integration of additional object characteristics, like contextual information and object geometry, and thus have significant potential to provide an improved classification result. As a benchmark, a selection of pixel-based techniques is applied to an ERS-2 SAR image of the 2006 flood event of the River Dee, United Kingdom. This selection comprises Otsu thresholding, Kittler & Illingworth thresholding, the Fine To Coarse segmentation algorithm and active contour modelling. The different classification results are evaluated and compared by means of several accuracy measures, including binary performance measures.
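Among the compared techniques, Otsu thresholding is the simplest to reproduce. A minimal sketch follows; the backscatter array below is synthetic, and a real workflow would start from a calibrated, speckle-filtered SAR intensity image rather than random numbers.

```python
# Sketch: Otsu thresholding for pixel-based flood mapping
# (`backscatter` is a synthetic stand-in for a calibrated, despeckled
# SAR intensity image in dB; open water appears dark/smooth).
import numpy as np
from skimage.filters import threshold_otsu

rng = np.random.default_rng(0)
# toy bimodal image: open water (~-18 dB) vs land (~-8 dB)
backscatter = np.where(rng.random((256, 256)) < 0.3,
                       rng.normal(-18, 1.5, (256, 256)),
                       rng.normal(-8, 2.0, (256, 256)))

t = threshold_otsu(backscatter)
flood_mask = backscatter < t   # low backscatter -> smooth water surface
print(f"threshold: {t:.1f} dB, flooded fraction: {flood_mask.mean():.2%}")
```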
NASA Astrophysics Data System (ADS)
Camporesi, Roberto
2011-06-01
We present an approach to the impulsive response method for solving linear constant-coefficient ordinary differential equations based on the factorization of the differential operator. The approach is elementary; we assume only a basic knowledge of calculus and linear algebra. In particular, we avoid the use of distribution theory, as well as of the other more advanced approaches: the Laplace transform, linear systems, the general theory of linear equations with variable coefficients and the variation of constants method. The approach presented here can be used in a first course on differential equations for science and engineering majors.
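In the simplest second-order case the method reads as follows (a standard illustration consistent with the abstract, assuming distinct characteristic roots):

```latex
% For y'' + a y' + b y = f(t), factor the operator as
% p(D) = (D - \lambda_1)(D - \lambda_2), where \lambda_{1,2} are the roots of
% \lambda^2 + a\lambda + b = 0 with \lambda_1 \neq \lambda_2. The impulsive
% response and a particular solution are then
g(t) = \frac{e^{\lambda_1 t} - e^{\lambda_2 t}}{\lambda_1 - \lambda_2},
\qquad
y_p(t) = \int_0^t g(t - s)\, f(s)\, ds .
```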
Design for dependability: A simulation-based approach. Ph.D. Thesis, 1993
NASA Technical Reports Server (NTRS)
Goswami, Kumar K.
1994-01-01
This research addresses issues in simulation-based system level dependability analysis of fault-tolerant computer systems. The issues and difficulties of providing a general simulation-based approach for system level analysis are discussed and a methodology that addresses these issues is presented. The proposed methodology is designed to permit the study of a wide variety of architectures under various fault conditions. It permits detailed functional modeling of architectural features such as sparing policies, repair schemes, routing algorithms as well as other fault-tolerant mechanisms, and it allows the execution of actual application software. One key benefit of this approach is that the behavior of a system under faults does not have to be pre-defined as it normally is. Instead, a system can be simulated in detail and injected with faults to determine its failure modes. The thesis describes how object-oriented design is used to incorporate this methodology into a general purpose design and fault injection package called DEPEND. A software model is presented that uses abstractions of application programs to study the behavior and effect of software on hardware faults in the early design stage when actual code is not available. Finally, an acceleration technique that combines hierarchical simulation, time acceleration algorithms and hybrid simulation to reduce simulation time is introduced.
Molitor, John
2012-03-01
Bayesian methods have seen an increase in popularity in a wide variety of scientific fields, including epidemiology. One of the main reasons for their widespread application is the power of the Markov chain Monte Carlo (MCMC) techniques generally used to fit these models. As a result, researchers often implicitly associate Bayesian models with MCMC estimation procedures. However, Bayesian models do not always require Markov-chain-based methods for parameter estimation. This is important, as MCMC estimation methods, while generally quite powerful, are complex and computationally expensive and suffer from convergence problems related to the manner in which they generate correlated samples used to estimate probability distributions for parameters of interest. In this issue of the Journal, Cole et al. (Am J Epidemiol. 2012;175(5):368-375) present an interesting paper that discusses non-Markov-chain-based approaches to fitting Bayesian models. These methods, though limited, can overcome some of the problems associated with MCMC techniques and promise to provide simpler approaches to fitting Bayesian models. Applied researchers will find these estimation approaches intuitively appealing and will gain a deeper understanding of Bayesian models through their use. However, readers should be aware that other non-Markov-chain-based methods are currently in active development and have been widely published in other fields.
Semi-analytical approach to estimate railroad tank car shell puncture
DOT National Transportation Integrated Search
2011-03-16
This paper describes the development of engineering-based equations to estimate the puncture resistance of railroad tank cars under a generalized shell or side impact scenario. Resistance to puncture is considered in terms of puncture velocity, which...
Plant architecture, growth and radiative transfer for terrestrial and space environments
NASA Technical Reports Server (NTRS)
Norman, John M.; Goel, Narendra S.
1993-01-01
The overall objective of this research was to develop a hardware implemented model that would incorporate realistic and dynamic descriptions of canopy architecture in physiologically based models of plant growth and functioning, with an emphasis on radiative transfer while accommodating other environmental constraints. The general approach has five parts: a realistic mathematical treatment of canopy architecture, a methodology for combining this general canopy architectural description with a general radiative transfer model, the inclusion of physiological and environmental aspects of plant growth, inclusion of plant phenology, and integration.
Toward Generalization of Iterative Small Molecule Synthesis
Lehmann, Jonathan W.; Blair, Daniel J.; Burke, Martin D.
2018-01-01
Small molecules have extensive untapped potential to benefit society, but access to this potential is too often restricted by limitations inherent to the customized approach currently used to synthesize this class of chemical matter. In contrast, the “building block approach”, i.e., generalized iterative assembly of interchangeable parts, has now proven to be a highly efficient and flexible way to construct things ranging all the way from skyscrapers to macromolecules to artificial intelligence algorithms. The structural redundancy found in many small molecules suggests that they possess a similar capacity for generalized building block-based construction. It is also encouraging that many customized iterative synthesis methods have been developed that improve access to specific classes of small molecules. There has also been substantial recent progress toward the iterative assembly of many different types of small molecules, including complex natural products, pharmaceuticals, biological probes, and materials, using common building blocks and coupling chemistry. Collectively, these advances suggest that a generalized building block approach for small molecule synthesis may be within reach. PMID:29696152
Qu, Yongzhi; He, David; Yoon, Jae; Van Hecke, Brandon; Bechhoefer, Eric; Zhu, Junda
2014-01-01
In recent years, acoustic emission (AE) sensors and AE-based techniques have been developed and tested for gearbox fault diagnosis. In general, AE-based techniques require much higher sampling rates than vibration analysis-based techniques for gearbox fault diagnosis. Therefore, it is questionable whether an AE-based technique would give a better or at least the same performance as the vibration analysis-based techniques using the same sampling rate. To answer the question, this paper presents a comparative study for gearbox tooth damage level diagnostics using AE and vibration measurements, the first known attempt to compare the gearbox fault diagnostic performance of AE- and vibration analysis-based approaches using the same sampling rate. Partial tooth cut faults are seeded in a gearbox test rig and experimentally tested in a laboratory. Results have shown that the AE-based approach has the potential to differentiate gear tooth damage levels in comparison with the vibration-based approach. While vibration signals are easily affected by mechanical resonance, the AE signals show more stable performance. PMID:24424467
Beregovykh, V V; Spitskiy, O R
2014-01-01
A risk-based approach is used to examine the impact of different factors on the quality of medicinal products in technology transfer. A general diagram is offered for performing risk analysis in technology transfer from pharmaceutical development to production. When transferring technology to full-scale commercial production, it is necessary to investigate and simulate the application of the production process beforehand under the new, real conditions. The manufacturing process is the core factor for risk analysis, having the most impact on the quality attributes of a medicinal product. Further important factors are linked to the materials and products to be handled and to the manufacturing environment, such as premises, equipment and personnel. The use of the risk-based approach in the design of a multipurpose production facility for medicinal products is shown, where the quantitative risk analysis tool RAMM (Risk Analysis and Mitigation Matrix) was applied.
Nursing research: can a feminist perspective make any contribution?
Ehlers, V
1999-03-01
As more than 90% of the RSA's nurses are women and as at least 50% of the health care clients are also women, nursing research can definitely benefit by incorporating feminist research approaches. Specific feminist research issues which could be relevant to nursing research include: inherent themes in feminist research; feminist research methodology; gender stereotypes and nursing research; gender-based stereotypes of researchers; and the potential benefits of incorporating feminist research approaches in nursing research. Most formal models of nursing, and thus also most nursing research based on these models, ignore gender issues. Thus they ignore part of the social reality of nursing and might provide distorted images of nursing. A feminist approach to nursing research could enhance the reality-based gender issues relevant to nursing specifically, and health care generally, and contribute towards rendering effective health care within a multidisciplinary health care context.
Modeling of delays in PKPD: classical approaches and a tutorial for delay differential equations.
Koch, Gilbert; Krzyzanski, Wojciech; Pérez-Ruixo, Juan Jose; Schropp, Johannes
2014-08-01
In pharmacokinetics/pharmacodynamics (PKPD) the measured response is often delayed relative to drug administration, individuals in a population have a certain lifespan until they maturate, or the change of biomarkers does not immediately affect the primary endpoint. The classical approach in PKPD is to apply transit compartment models (TCM) based on ordinary differential equations to handle such delays. However, an alternative approach to deal with delays is delay differential equations (DDE). DDEs feature additional flexibility and properties, realize more complex dynamics and can be used complementarily with TCMs. We introduce several delay based PKPD models and investigate mathematical properties of general DDE based models, which serve as subunits in order to build larger PKPD models. Finally, we review current PKPD software with respect to the implementation of DDEs for PKPD analysis.
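As a concrete point of comparison, the classical transit compartment chain is only a few lines of code: with n compartments and rate k = n/tau it approximates a discrete delay of tau, which a DDE would instead encode directly through the delayed state x(t - tau). A minimal sketch with a toy input signal and illustrative parameters:

```python
# Sketch: a transit compartment chain, the classical PKPD delay device.
# With n compartments and k = n / tau, the output of the chain approximates
# the input delayed by tau; a DDE would use x(t - tau) directly instead.
import numpy as np
from scipy.integrate import solve_ivp

n, tau = 5, 10.0              # compartments, mean transit time (h)
k = n / tau

def tcm(t, a):
    inp = 1.0 if t < 1.0 else 0.0      # toy 1-hour input signal
    da = np.empty(n)
    da[0] = k * (inp - a[0])
    da[1:] = k * (a[:-1] - a[1:])
    return da

sol = solve_ivp(tcm, (0.0, 50.0), np.zeros(n), max_step=0.1)
print("peak of delayed response at t ~", sol.t[np.argmax(sol.y[-1])])
```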
Lai, Zhi-Hui; Leng, Yong-Gang
2015-08-28
A two-dimensional Duffing oscillator which can produce stochastic resonance (SR) is studied in this paper. We introduce its SR mechanism and present a generalized parameter-adjusted SR (GPASR) model of this oscillator for the necessity of parameter adjustments. The Kramers rate is chosen as the theoretical basis to establish a judgmental function for judging the occurrence of SR in this model; and to analyze and summarize the parameter-adjusted rules under unmatched signal amplitude, frequency, and/or noise-intensity. Furthermore, we propose the weak-signal detection approach based on this GPASR model. Finally, we employ two practical examples to demonstrate the feasibility of the proposed approach in practical engineering application.
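For reference, the Kramers rate that such a judgmental function builds on can be written for the textbook overdamped bistable quartic potential. This is the standard form, a plausible sketch of the ingredient rather than the paper's exact two-dimensional Duffing expressions:

```latex
% Kramers escape rate for the overdamped bistable potential
% U(x) = -(a/2) x^2 + (b/4) x^4 driven by noise of intensity D,
% with minima x_m = \pm\sqrt{a/b} and barrier top x_b = 0:
r_K = \frac{\sqrt{U''(x_m)\,\lvert U''(x_b)\rvert}}{2\pi}\,
      \exp\!\left(-\frac{\Delta U}{D}\right),
\qquad
\Delta U = \frac{a^2}{4b}.
% SR roughly occurs when r_K matches twice the driving frequency
% (the time-scale matching condition).
```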
Improving Grasp Skills Using Schema Structured Learning
NASA Technical Reports Server (NTRS)
Platt, Robert; Grupen, ROderic A.; Fagg, Andrew H.
2006-01-01
In the control-based approach to robotics, complex behavior is created by sequencing and combining control primitives. While it is desirable for the robot to autonomously learn the correct control sequence, searching through the large number of potential solutions can be time consuming. This paper constrains this search to variations of a generalized solution encoded in a framework known as an action schema. A new algorithm, SCHEMA STRUCTURED LEARNING, is proposed that repeatedly executes variations of the generalized solution in search of instantiations that satisfy action schema objectives. This approach is tested in a grasping task where Dexter, the UMass humanoid robot, learns which reaching and grasping controllers maximize the probability of grasp success.
A fractional approach to the Fermi-Pasta-Ulam problem
NASA Astrophysics Data System (ADS)
Machado, J. A. T.
2013-09-01
This paper studies the Fermi-Pasta-Ulam problem having in mind the generalization provided by Fractional Calculus (FC). The study starts by addressing the classical formulation, based on the standard integer order differential calculus and evaluates the time and frequency responses. A first generalization to be investigated consists in the direct replacement of the springs by fractional elements of the dissipative type. It is observed that the responses settle rapidly and no relevant phenomena occur. A second approach consists of replacing the springs by a blend of energy extracting and energy inserting elements of symmetrical fractional order with amplitude modulated by quadratic terms. The numerical results reveal a response close to chaotic behaviour.
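For reference, a compact sketch of the classical integer-order FPU-beta lattice that the paper takes as its starting point, integrated with a symplectic Euler step from an initial excitation of the lowest normal mode. Lattice size, coupling and step size are arbitrary illustrative choices, and the fractional generalizations are not reproduced here.

```python
import numpy as np

def fpu_beta(n=32, beta=1.0, dt=0.01, n_steps=50_000):
    """Symplectic Euler integration of the classical FPU-beta lattice
    m x_i'' = (x_{i+1} - 2 x_i + x_{i-1})
              + beta [(x_{i+1} - x_i)^3 - (x_i - x_{i-1})^3],
    with fixed ends and energy initially in the lowest normal mode."""
    x = np.sin(np.pi * np.arange(n + 2) / (n + 1))  # lowest-mode shape
    x[0] = x[-1] = 0.0                              # fixed boundary particles
    v = np.zeros_like(x)
    for _ in range(n_steps):
        d_right = x[2:] - x[1:-1]                   # right spring extensions
        d_left = x[1:-1] - x[:-2]                   # left spring extensions
        a = (d_right - d_left) + beta * (d_right ** 3 - d_left ** 3)
        v[1:-1] += a * dt
        x[1:-1] += v[1:-1] * dt
    return x, v
```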
QMR: A Quasi-Minimal Residual method for non-Hermitian linear systems
NASA Technical Reports Server (NTRS)
Freund, Roland W.; Nachtigal, Noel M.
1990-01-01
The biconjugate gradient (BCG) method is the natural generalization of the classical conjugate gradient algorithm for Hermitian positive definite matrices to general non-Hermitian linear systems. Unfortunately, the original BCG algorithm is susceptible to possible breakdowns and numerical instabilities. A novel BCG-like approach called the quasi-minimal residual (QMR) method is presented, which overcomes the problems of BCG. An implementation of QMR based on a look-ahead version of the nonsymmetric Lanczos algorithm is proposed. It is shown how BCG iterates can be recovered stably from the QMR process. Some further properties of the QMR approach are given and an error bound is presented. Finally, numerical experiments are reported.
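SciPy ships an implementation of this method; a minimal usage sketch on a small non-Hermitian system (the tridiagonal matrix here is arbitrary test data, not from the paper):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import qmr

n = 200
A = diags([-1.0, 2.0, -0.5], offsets=[-1, 0, 1],
          shape=(n, n), format="csr")     # non-Hermitian test matrix
b = np.ones(n)

x, info = qmr(A, b)                       # info == 0 signals convergence
print(info, np.linalg.norm(A @ x - b))    # residual check
```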
Modeling and Optimization for Morphing Wing Concept Generation
NASA Technical Reports Server (NTRS)
Skillen, Michael D.; Crossley, William A.
2007-01-01
This report consists of two major parts: 1) the approach to develop morphing wing weight equations, and 2) the approach to size morphing aircraft. Combined, these techniques allow the morphing aircraft to be sized with estimates of the morphing wing weight that are more credible than estimates currently available; aircraft sizing results prior to this study incorporated morphing wing weight estimates based on general heuristics for fixed-wing flaps (a comparable "morphing" component) but, in general, these results were unsubstantiated. This report will show that the method of morphing wing weight prediction does, in fact, drive the aircraft sizing code to different results and that accurate morphing wing weight estimates are essential to credible aircraft sizing results.
2015-03-26
albeit powerful, method available for exploring CAS. As discussed above, there are many useful mathematical tools appropriate for CAS modeling. Agent-based... cells, telephone calls, and sexual contacts approach power-law distributions. [48] Networks in general are robust against random failures, but... targeted failures can have powerful effects, provided the targeter has a good understanding of the network structure. Some argue (convincingly) that all
A kernel regression approach to gene-gene interaction detection for case-control studies.
Larson, Nicholas B; Schaid, Daniel J
2013-11-01
Gene-gene interactions are increasingly being addressed as a potentially important contributor to the variability of complex traits. Consequently, attention has moved beyond single-locus analyses of association to more complex genetic models. Although several single-marker approaches to interaction analysis have been developed, such methods suffer from very high testing dimensionality and do not take advantage of existing information, notably the definition of genes as functional units. Here, we propose a comprehensive family of gene-level score tests for identifying genetic elements of disease risk, in particular pairwise gene-gene interactions. Using kernel machine methods, we devise score-based variance component tests under a generalized linear mixed model framework. We conducted simulations based upon coalescent genetic models to evaluate the performance of our approach under a variety of disease models. These simulations indicate that our methods are generally higher powered than alternative gene-level approaches and at worst competitive with exhaustive SNP-level (where SNP is single-nucleotide polymorphism) analyses. Furthermore, we observe that simulated epistatic effects resulted in significant marginal testing results for the involved genes regardless of whether or not true main effects were present. We detail the benefits of our methods and discuss potential genome-wide analysis strategies for gene-gene interaction analysis in a case-control study design. © 2013 WILEY PERIODICALS, INC.
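A bare-bones sketch of the variance-component score statistic underlying such tests, in the SKAT style: fit the null model with covariates only, then form Q = (y - mu)' K (y - mu) with a gene-level kernel. The linear kernel and the logistic null below are illustrative simplifications; the paper's interaction kernels and null distribution are more involved.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def kernel_score_statistic(G, y, X):
    """Score statistic Q = (y - mu)' K (y - mu) for a variance-component
    test of a gene's SNP set G (n subjects x p SNPs), with case-control
    labels y and covariates X. K = G G' is a simple linear kernel."""
    null = LogisticRegression().fit(X, y)            # null model: covariates only
    resid = np.asarray(y, dtype=float) - null.predict_proba(X)[:, 1]
    K = G @ G.T                                      # gene-level similarity kernel
    return resid @ K @ resid
```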
A One-System Theory Which is Not Propositional.
Witnauer, James E; Urcelay, Gonzalo P; Miller, Ralph R
2009-04-01
We argue that the propositional and link-based approaches to human contingency learning represent different levels of analysis, because propositional reasoning requires a basis, which is plausibly provided by a link-based architecture. Moreover, in their attempt to compare two general classes of models (link-based and propositional), Mitchell et al. referred to only two generic models and ignored the large variety of different models within each class.
A dual Lewis base activation strategy for enantioselective carbene-catalyzed annulations.
Izquierdo, Javier; Orue, Ane; Scheidt, Karl A
2013-07-24
A dual activation strategy integrating N-heterocyclic carbene (NHC) catalysis and a second Lewis base has been developed. NHC-bound homoenolate equivalents derived from α,β-unsaturated aldehydes combine with transient reactive o-quinone methides in an enantioselective formal [4 + 3] fashion to access 2-benzoxopinones. The overall approach provides a general blueprint for the integration of carbene catalysis with additional Lewis base activation modes.
ERIC Educational Resources Information Center
Wu, Hsiao-Chi; Shen, Pei-Di; Chen, Yi-Fen; Tsai, Chia-Wen
2016-01-01
Web-based learning is generally a solitary process without teachers' on-the-spot assistance. In this study, a quasi-experiment was conducted to explore the effects of various combinations of Web-Based Cognitive Apprenticeship (WBCA) and Time Management (TM) on the development of students' computing skills. Three class cohorts of 124 freshmen in a…
Continuous data assimilation for downscaling large-footprint soil moisture retrievals
NASA Astrophysics Data System (ADS)
Altaf, Muhammad U.; Jana, Raghavendra B.; Hoteit, Ibrahim; McCabe, Matthew F.
2016-10-01
Soil moisture is a key component of the hydrologic cycle, influencing processes leading to runoff generation, infiltration and groundwater recharge, evaporation and transpiration. Generally, the measurement scale for soil moisture differs from the modeling scales of these processes. Reducing this mismatch between observation and model scales is necessary for improved hydrological modeling. An innovative approach to downscaling coarse resolution soil moisture data by combining continuous data assimilation and physically based modeling is presented. In this approach, we exploit the features of Continuous Data Assimilation (CDA), which was initially designed for general dissipative dynamical systems and later tested numerically on the incompressible Navier-Stokes equation and the Bénard equation. A nudging term, estimated as the misfit between interpolants of the assimilated coarse grid measurements and the fine grid model solution, is added to the model equations to constrain the model's large scale variability by the available measurements. Soil moisture fields generated at a fine resolution by a physically-based vadose zone model (HYDRUS) are subjected to data assimilation conditioned upon coarse resolution observations. This enables nudging of the model outputs towards values that honor the coarse resolution dynamics while still being generated at the fine scale. Results show that the approach is feasible for generating fine scale soil moisture fields across large extents based on coarse scale observations. A likely application is the generation of fine and intermediate resolution soil moisture fields conditioned on the radiometer-based, coarse resolution products from remote sensing satellites.
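The nudging idea can be sketched in a few lines: the misfit between an interpolant of the coarse observations and the fine-grid state enters the model tendency, scaled by a relaxation parameter mu. The function below is a one-dimensional illustrative sketch, not the HYDRUS implementation; the variable names and the linear interpolant are assumptions.

```python
import numpy as np

def nudge_step(u_fine, obs_coarse, coarse_idx, mu, dt, rhs):
    """One explicit step of CDA-style nudging (illustrative sketch).
    u_fine     : model state on the fine grid
    obs_coarse : observed values at the coarse-grid locations
    coarse_idx : fine-grid indices of the coarse observation points
    rhs        : callable returning the model tendency F(u)
    The interpolant misfit I_h(u) - I_h(u_obs) is built by linearly
    interpolating the coarse-point misfit onto the fine grid."""
    x_fine = np.arange(u_fine.size)
    misfit = np.interp(x_fine, coarse_idx, u_fine[coarse_idx] - obs_coarse)
    return u_fine + dt * (rhs(u_fine) - mu * misfit)
```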
Expected Utility Based Decision Making under Z-Information and Its Application.
Aliev, Rashad R; Mraiziq, Derar Atallah Talal; Huseynov, Oleg H
2015-01-01
Real-world decision-relevant information is often partially reliable. The reasons include partial reliability of the source of information, misperceptions, psychological biases, incompetence, and so forth. Z-numbers-based formalization of information (Z-information) represents a natural language (NL) based value of a variable of interest together with the related NL-based reliability. What is important is that Z-information is not only the most general representation of real-world imperfect information but also has the highest descriptive power from the human perception point of view as compared to a fuzzy number. In this study, we present an approach to decision making under Z-information based on direct computation over Z-numbers. This approach utilizes the expected utility paradigm and is applied to a benchmark decision problem in the field of economics.
Maslarska, Vania; Tencheva, Jasmina; Budevsky, Omortag
2003-01-01
Based on precise analysis of the acid-base equilibrium, a new approach to the treatment of experimental data from a potentiometric titration is proposed. A new general formula giving explicitly the relation V = f([H+]) is derived, valid for every acid-base titration, which includes mono- and polyfunctional protolytes and their mixtures. The present study is the first practical application of this formula to the simplest case, the analysis of one monofunctional protolyte. The mV data collected during the titration are converted into pH values by means of an auto pH-calibration procedure, thus avoiding preliminary preparation of the measuring system. This pH-calibration method is also applicable in water-organic mixtures and allows the quantitative determination of sparingly soluble substances (particularly pharmaceuticals). The treatment of the data is performed by means of ready-to-use software products, which makes the proposed approach accessible for a wide range of applications.
Gorzalczany, Marian B; Rudzinski, Filip
2017-06-07
This paper presents a generalization of self-organizing maps with 1-D neighborhoods (neuron chains) that can be effectively applied to complex cluster analysis problems. The essence of the generalization consists in introducing mechanisms that allow the neuron chain--during learning--to disconnect into subchains, to reconnect some of the subchains again, and to dynamically regulate the overall number of neurons in the system. These features enable the network--working in a fully unsupervised way (i.e., using unlabeled data without a predefined number of clusters)--to automatically generate collections of multiprototypes that are able to represent a broad range of clusters in data sets. First, the operation of the proposed approach is illustrated on some synthetic data sets. Then, this technique is tested using several real-life, complex, and multidimensional benchmark data sets available from the University of California at Irvine (UCI) Machine Learning repository and the Knowledge Extraction based on Evolutionary Learning data set repository. A sensitivity analysis of our approach to changes in control parameters and a comparative analysis with an alternative approach are also performed.
An overview of very high level software design methods
NASA Technical Reports Server (NTRS)
Asdjodi, Maryam; Hooper, James W.
1988-01-01
Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. Different approaches to higher level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods. Although a given approach does not always fall exactly into any specific class, this paper provides a classification for very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.
Poreba, M; Kasperkiewicz, P; Snipas, S J; Fasci, D; Salvesen, G S; Drag, M
2014-01-01
Traditional combinatorial peptidyl substrate library approaches generally utilize natural amino acids, limiting the usefulness of this tool in generating selective substrates for proteases that share similar substrate specificity profiles. To address this limitation, we synthesized a Hybrid Combinatorial Substrate Library (HyCoSuL) with the general formula Ac-P4-P3-P2-Asp-ACC, testing the approach on a family of closely related proteases, the human caspases. The power of this library for caspase discrimination extends far beyond the traditional PS-SCL approach, as in addition to 19 natural amino acids we also used 110 diverse unnatural amino acids that can more extensively explore the chemical space represented by caspase active sites. Using this approach we identified and employed peptide-based substrates that provided excellent discrimination between individual caspases, allowing us to simultaneously resolve, for the first time, the individual contributions of the apical caspase-9 and the executioner caspase-3 and caspase-7 in the development of cytochrome-c-dependent apoptosis. PMID:24832467
Continuing education for general practice. 2. Systematic learning from experience.
al-Shehri, A; Stanley, I; Thomas, P
1993-01-01
Prompted by evidence that the recently-adopted arrangements for ongoing education among established general practitioners are unsatisfactory, the first of a pair of papers examined the theoretical basis of continuing education for general practice and proposed a model of self-directed learning in which the experience of established practitioners is connected, through the media of reading, reflection and audit, with competence for the role. In this paper a practical, systematic approach to self-directed learning by general practitioners is described based on the model. The contribution which appropriate participation in continuing medical education can make to enhancing learning from experience is outlined. PMID:8373649
Geng, Elvin H; Glidden, David V; Bangsberg, David R; Bwana, Mwebesa Bosco; Musinguzi, Nicholas; Nash, Denis; Metcalfe, John Z; Yiannoutsos, Constantin T; Martin, Jeffrey N; Petersen, Maya L
2012-05-15
Although clinic-based cohorts are most representative of the "real world," they are susceptible to loss to follow-up. Strategies for managing the impact of loss to follow-up are therefore needed to maximize the value of studies conducted in these cohorts. The authors evaluated adult patients starting antiretroviral therapy at an HIV/AIDS clinic in Uganda, where 29% of patients were lost to follow-up after 2 years (January 1, 2004-September 30, 2007). Unweighted, inverse probability of censoring weighted (IPCW), and sampling-based approaches (using supplemental data from a sample of lost patients subsequently tracked in the community) were used to identify the predictive value of sex on mortality. Directed acyclic graphs (DAGs) were used to explore the structural basis for bias in each approach. Among 3,628 patients, unweighted and IPCW analyses found men to have higher mortality than women, whereas the sampling-based approach did not. DAGs encoding knowledge about the data-generating process, including the fact that death is a cause of being classified as lost to follow-up in this setting, revealed "collider" bias in the unweighted and IPCW approaches. In a clinic-based cohort in Africa, unweighted and IPCW approaches-which rely on the "missing at random" assumption-yielded biased estimates. A sampling-based approach can in general strengthen epidemiologic analyses conducted in many clinic-based cohorts, including those examining other diseases.
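As a sketch of the weighting step discussed here, IPCW fits a model for the probability of remaining under observation and weights the uncensored subjects by its inverse. The column names and logistic model below are illustrative assumptions; note the paper's DAG-based point is precisely that such weights cannot remove collider bias when death itself causes classification as lost to follow-up.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def ipcw_weights(df, covariates):
    """Inverse probability of censoring weights (illustrative sketch).
    df is a pandas DataFrame; df['censored'] flags loss to follow-up.
    Uncensored subjects get weight 1 / P(uncensored | covariates),
    censored subjects get weight 0. Column names are assumptions."""
    model = LogisticRegression().fit(df[covariates], df["censored"])
    p_uncensored = 1.0 - model.predict_proba(df[covariates])[:, 1]
    return np.where(df["censored"] == 0, 1.0 / p_uncensored, 0.0)
```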
Building Hierarchical Representations for Oracle Character and Sketch Recognition.
Jun Guo; Changhu Wang; Roman-Rangel, Edgar; Hongyang Chao; Yong Rui
2016-01-01
In this paper, we study oracle character recognition and general sketch recognition. First, a data set of oracle characters, which are the oldest hieroglyphs in China yet remain a part of modern Chinese characters, is collected for analysis. Second, typical visual representations in shape- and sketch-related works are evaluated. We analyze the problems suffered when addressing these representations and determine several representation design criteria. Based on the analysis, we propose a novel hierarchical representation that combines a Gabor-related low-level representation and a sparse-encoder-related mid-level representation. Extensive experiments show the effectiveness of the proposed representation in both oracle character recognition and general sketch recognition. The proposed representation is also complementary to convolutional neural network (CNN)-based models. We introduce a solution to combine the proposed representation with CNN-based models, and achieve better performances over both approaches. This solution has beaten humans at recognizing general sketches.
Effective domain-dependent reuse in medical knowledge bases.
Dojat, M; Pachet, F
1995-12-01
Knowledge reuse is now a critical issue for most developers of medical knowledge-based systems. As a rule, reuse is addressed from an ambitious, knowledge-engineering perspective that focuses on reusable general purpose knowledge modules, concepts, and methods. However, such a general goal fails to take into account the specific aspects of medical practice. From the point of view of the knowledge engineer, whose goal is to capture the specific features and intricacies of a given domain, this approach addresses the wrong level of generality. In this paper, we adopt a more pragmatic viewpoint, introducing the less ambitious goal of "domain-dependent limited reuse" and suggesting effective means of achieving it in practice. In a knowledge representation framework combining objects and production rules, we propose three mechanisms emerging from the combination of object-oriented programming and rule-based programming. We show that these mechanisms contribute to achieving limited reuse and to introducing useful limited variations in medical expertise.
A BRIEF ORAL OVERVIEW OF ENVIRONMENTAL ECONOMICS
A brief 1 hour oral presentation to professional staff of Cincinnati Nature Center is intended to provide a lay audience with a general understanding of how market-based approaches to environmental protection can meet (or exceed) regulatory efforts at enforcing pollution standard...
Corporate incentives for promoting safety belt use : rationale, guidelines, and examples
DOT National Transportation Integrated Search
1982-10-01
This manual was designed to teach the corporate executive successful strategies for implementing and evaluating a successful industry-based program to motivate employee safety belt use. A rationale is given for the general approach; and specific guid...
ERIC Educational Resources Information Center
LeRose, Barbara; And Others
1979-01-01
The project is based on a general systems approach. Developmental stage theory is employed as a starting point, and the developmental "minitasks" which act as stair risers from one developmental level to another are carried through with the use of Bloom's cognitive taxonomy. (Author/DLS)
A CONSISTENT APPROACH FOR THE APPLICATION OF PHARMACOKINETIC MODELING IN CANCER RISK ASSESSMENT
Physiologically based pharmacokinetic (PBPK) modeling provides important capabilities for improving the reliability of the extrapolations across dose, species, and exposure route that are generally required in chemical risk assessment regardless of the toxic endpoint being consid...
A PetriNet-Based Approach for Supporting Traceability in Cyber-Physical Manufacturing Systems
Huang, Jiwei; Zhu, Yeping; Cheng, Bo; Lin, Chuang; Chen, Junliang
2016-01-01
With the growing popularity of complex dynamic activities in manufacturing processes, traceability of the entire life of every product has drawn significant attention especially for food, clinical materials, and similar items. This paper studies the traceability issue in cyber-physical manufacturing systems from a theoretical viewpoint. Petri net models are generalized for formulating dynamic manufacturing processes, based on which a detailed approach for enabling traceability analysis is presented. Models as well as algorithms are carefully designed, which can trace back the lifecycle of a possibly contaminated item. A practical prototype system for supporting traceability is designed, and a real-life case study of a quality control system for bee products is presented to validate the effectiveness of the approach. PMID:26999141
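A toy sketch of the underlying formalism: a place/transition Petri net whose firing history provides exactly the kind of trace that can be walked backwards through an item's lifecycle. The class below is a generic textbook net, not the paper's generalized model.

```python
import numpy as np

class PetriNet:
    """Minimal place/transition net for tracing item flows (illustrative).
    pre[t, p] tokens consumed and post[t, p] tokens produced by
    transition t at place p; the marking is the current token vector."""
    def __init__(self, pre, post, marking):
        self.pre, self.post = np.asarray(pre), np.asarray(post)
        self.marking = np.asarray(marking)
        self.history = []                        # fired transitions = trace log

    def enabled(self, t):
        return np.all(self.marking >= self.pre[t])

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError(f"transition {t} not enabled")
        self.marking = self.marking - self.pre[t] + self.post[t]
        self.history.append(t)                   # record for later trace-back

# Two places, one transition moving a token from place 0 to place 1.
net = PetriNet(pre=[[1, 0]], post=[[0, 1]], marking=[1, 0])
net.fire(0)
print(net.marking, net.history)                  # [0 1] [0]
```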
Gold-standard evaluation of a folksonomy-based ontology learning model
NASA Astrophysics Data System (ADS)
Djuana, E.
2018-03-01
Folksonomy, as a result of the collaborative tagging process, has been acknowledged for its potential in improving the categorization and searching of web resources. However, folksonomy contains ambiguities such as synonymy and polysemy, as well as differing levels of abstraction (the generality problem). To maximize its potential, methods for associating folksonomy tags with semantics and structural relationships have been proposed, such as ontology learning. This paper evaluates our previous work on ontology learning using the gold-standard evaluation approach, in comparison to a notable state-of-the-art work and several baselines. The results show that our method is comparable to the state-of-the-art work, which further validates our approach, as previously validated using a task-based evaluation approach.
NASA Technical Reports Server (NTRS)
Fink, Pamela K.
1991-01-01
Two intelligent tutoring systems were developed. These tutoring systems are being used to study the effectiveness of intelligent tutoring systems in training high performance tasks and the interrelationship of high performance and cognitive tasks. The two tutoring systems, referred to as the Console Operations Tutors, were built using the same basic approach to the design of an intelligent tutoring system. This design approach allowed researchers to more rapidly implement the cognitively based tutor, the OMS Leak Detect Tutor, by using the foundation of code generated in the development of the high performance based tutor, the Manual Select Keyboard (MSK). It is believed that the approach can be further generalized to develop a generic intelligent tutoring system implementation tool.
Cell delivery in regenerative medicine: the cell sheet engineering approach.
Yang, Joseph; Yamato, Masayuki; Nishida, Kohji; Ohki, Takeshi; Kanzaki, Masato; Sekine, Hidekazu; Shimizu, Tatsuya; Okano, Teruo
2006-11-28
Recently, cell-based therapies have developed as a foundation for regenerative medicine. General approaches for cell delivery have thus far involved the use of direct injection of single cell suspensions into the target tissues. Additionally, tissue engineering with the general paradigm of seeding cells into biodegradable scaffolds has also evolved as a method for the reconstruction of various tissues and organs. With success in clinical trials, regenerative therapies using these approaches have therefore garnered significant interest and attention. As a novel alternative, we have developed cell sheet engineering using temperature-responsive culture dishes, which allows for the non-invasive harvest of cultured cells as intact sheets along with their deposited extracellular matrix. Using this approach, cell sheets can be directly transplanted to host tissues without the use of scaffolding or carrier materials, or used to create in vitro tissue constructs via the layering of individual cell sheets. In addition to simple transplantation, cell sheet engineered constructs have also been applied for alternative therapies such as endoscopic transplantation, combinatorial tissue reconstruction, and polysurgery to overcome limitations of regenerative therapies and cell delivery using conventional approaches.
NASA Astrophysics Data System (ADS)
Costanzi, Stefano; Tikhonova, Irina G.; Harden, T. Kendall; Jacobson, Kenneth A.
2009-11-01
Accurate in silico models for the quantitative prediction of the activity of G protein-coupled receptor (GPCR) ligands would greatly facilitate the process of drug discovery and development. Several methodologies have been developed based on the properties of the ligands, the direct study of the receptor-ligand interactions, or a combination of both approaches. Ligand-based three-dimensional quantitative structure-activity relationships (3D-QSAR) techniques, not requiring knowledge of the receptor structure, have been historically the first to be applied to the prediction of the activity of GPCR ligands. They are generally endowed with robustness and good ranking ability; however they are highly dependent on training sets. Structure-based techniques generally do not provide the level of accuracy necessary to yield meaningful rankings when applied to GPCR homology models. However, they are essentially independent from training sets and have a sufficient level of accuracy to allow an effective discrimination between binders and nonbinders, thus qualifying as viable lead discovery tools. The combination of ligand and structure-based methodologies in the form of receptor-based 3D-QSAR and ligand and structure-based consensus models results in robust and accurate quantitative predictions. The contribution of the structure-based component to these combined approaches is expected to become more substantial and effective in the future, as more sophisticated scoring functions are developed and more detailed structural information on GPCRs is gathered.
Performance evaluation of an automatic MGRF-based lung segmentation approach
NASA Astrophysics Data System (ADS)
Soliman, Ahmed; Khalifa, Fahmi; Alansary, Amir; Gimel'farb, Georgy; El-Baz, Ayman
2013-10-01
The segmentation of the lung tissues in chest Computed Tomography (CT) images is an important step for developing any Computer-Aided Diagnostic (CAD) system for lung cancer and other pulmonary diseases. In this paper, we introduce a new framework for validating the accuracy of our developed Joint Markov-Gibbs based lung segmentation approach using 3D realistic synthetic phantoms. These phantoms are created using a 3D Generalized Gauss-Markov Random Field (GGMRF) model of voxel intensities with pairwise interaction to model the 3D appearance of the lung tissues. Then, the appearance of the generated 3D phantoms is simulated based on iterative minimization of an energy function that is based on the learned 3D-GGMRF image model. These 3D realistic phantoms can be used to evaluate the performance of any lung segmentation approach. The performance of our segmentation approach is evaluated using three metrics, namely, the Dice Similarity Coefficient (DSC), the modified Hausdorff distance, and the Average Volume Difference (AVD) between our segmentation and the ground truth. Our approach achieves mean values of 0.994±0.003, 8.844±2.495 mm, and 0.784±0.912 mm3, for the DSC, Hausdorff distance, and the AVD, respectively.
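Two of the reported metrics are straightforward to compute from binary masks; a short sketch follows (the modified Hausdorff distance is omitted for brevity, and voxel_volume is an assumed scaling parameter):

```python
import numpy as np

def dice_coefficient(seg, gt):
    """Dice Similarity Coefficient between binary masks: 2|A ∩ B| / (|A| + |B|)."""
    seg, gt = seg.astype(bool), gt.astype(bool)
    return 2.0 * np.logical_and(seg, gt).sum() / (seg.sum() + gt.sum())

def absolute_volume_difference(seg, gt, voxel_volume=1.0):
    """Volume difference in physical units; voxel_volume in mm^3."""
    return abs(int(seg.sum()) - int(gt.sum())) * voxel_volume
```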
Fuggle, Peter; Bevington, Dickon; Cracknell, Liz; Hanley, James; Hare, Suzanne; Lincoln, John; Richardson, Garry; Stevens, Nina; Tovey, Heather; Zlotowitz, Sally
2015-07-01
AMBIT (Adolescent Mentalization-Based Integrative Treatment) is a developing team approach to working with hard-to-reach adolescents. The approach applies the principle of mentalization to relationships with clients, team relationships and working across agencies. It places a high priority on the need for locally developed evidence-based practice, and proposes that outcome evaluation needs to be explicitly linked with processes of team learning using a learning organization framework. A number of innovative methods of team learning are incorporated into the AMBIT approach, particularly a system of web-based wiki-formatted AMBIT manuals individualized for each participating team. The paper describes early development work of the model and illustrates ways of establishing explicit links between outcome evaluation, team learning and manualization by describing these methods as applied to two AMBIT-trained teams; one team working with young people on the edge of care (AMASS - the Adolescent Multi-Agency Support Service) and another working with substance use (CASUS - Child and Adolescent Substance Use Service in Cambridgeshire). Measurement of the primary outcomes for each team (which were generally very positive) facilitated team learning and adaptations of methods of practice that were consolidated through manualization. © The Author(s) 2014.
de Jong, Jan A Stavenga; Wierstra, Ronny F A; Hermanussen, José
2006-03-01
Research on individual learning approaches (or learning styles) is split in two traditions, one of which is biased towards academic learning, and the other towards learning from direct experience. In the reported study, the two traditions are linked by investigating the relationships between school-based (academic) and work-based (experiential) learning approaches of students in vocational education programs. Participants were 899 students of a Dutch school for secondary vocational education; 758 provided data on school-based learning, and 407 provided data on work-based learning, resulting in an overlap of 266 students from whom data were obtained on learning in both settings. Learning approaches in school and work settings were measured with questionnaires. Using factor analysis and cluster analysis, items and students were grouped, both with respect to school- and work-based learning. The study identified two academic learning dimensions (constructive learning and reproductive learning), and three experiential learning dimensions (analysis, initiative, and immersion). Construction and analysis were correlated positively, and reproduction and initiative negatively. Cluster analysis resulted in the identification of three school-based learning orientations and three work-based learning orientations. The relation between the two types of learning orientations, expressed in Cramér's V, appeared to be weak. It is concluded that learning approaches are relatively context specific, which implies that neither theoretical tradition can claim general applicability.
Emergence and space-time structure of lump solution to the (2+1)-dimensional generalized KP equation
NASA Astrophysics Data System (ADS)
Tan, Wei; Dai, Houping; Dai, Zhengde; Zhong, Wenyong
2017-11-01
A periodic breather-wave solution is obtained using the homoclinic test approach and Hirota's bilinear method with a small perturbation parameter u0 for the (2+1)-dimensional generalized Kadomtsev-Petviashvili equation. Based on the periodic breather-wave, a lump solution emerges through a limiting procedure. Finally, three different forms of the space-time structure of the lump solution are investigated and discussed using extreme value theory.
A systems approach to the physiology of weightlessness
NASA Technical Reports Server (NTRS)
White, Ronald J.; Leonard, Joel I.; Rummel, John A.; Leach, Carolyn S.
1991-01-01
A general systems approach to conducting and analyzing research on the human adaptation to weightlessness is presented. The research is aimed at clarifying the role that each of the major components of the human system plays following the transition to and from space. The approach utilizes a variety of mathematical models in order to pose and test alternative hypotheses concerned with the adaptation process. Certain aspects of the problem of fluid and electrolyte shifts in weightlessnes are considered, and an integrated hypothesis based on numerical simulation studies and experimental data is presented.
Matrix factorization-based data fusion for gene function prediction in baker's yeast and slime mold.
Zitnik, Marinka; Zupan, Blaž
2014-01-01
The development of effective methods for the characterization of gene functions that are able to combine diverse data sources in a sound and easily-extendible way is an important goal in computational biology. We have previously developed a general matrix factorization-based data fusion approach for gene function prediction. In this manuscript, we show that this data fusion approach can be applied to gene function prediction and that it can fuse various heterogeneous data sources, such as gene expression profiles, known protein annotations, interaction and literature data. The fusion is achieved by simultaneous matrix tri-factorization that shares matrix factors between sources. We demonstrate the effectiveness of the approach by evaluating its performance on predicting ontological annotations in slime mold D. discoideum and on recognizing proteins of baker's yeast S. cerevisiae that participate in the ribosome or are located in the cell membrane. Our approach achieves predictive performance comparable to that of the state-of-the-art kernel-based data fusion, but requires fewer data preprocessing steps.
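A bare-bones sketch of the tri-factorization R ≈ G1 S G2' with standard nonnegative multiplicative updates for a single relation matrix. The published fusion method shares factors across several relation matrices simultaneously, which this simplified single-matrix version does not show.

```python
import numpy as np

def nm_trifactorize(R, k1, k2, n_iter=500, eps=1e-9, seed=0):
    """Nonnegative matrix tri-factorization R ≈ G1 @ S @ G2.T via
    multiplicative updates (simplified sketch; R must be nonnegative)."""
    rng = np.random.default_rng(seed)
    n, m = R.shape
    G1, G2 = rng.random((n, k1)), rng.random((m, k2))
    S = rng.random((k1, k2))
    for _ in range(n_iter):
        G1 *= (R @ G2 @ S.T) / (G1 @ (S @ (G2.T @ G2) @ S.T) + eps)
        G2 *= (R.T @ G1 @ S) / (G2 @ (S.T @ (G1.T @ G1) @ S) + eps)
        S  *= (G1.T @ R @ G2) / ((G1.T @ G1) @ S @ (G2.T @ G2) + eps)
    return G1, S, G2
```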
Label-free functional nucleic acid sensors for detecting target agents
Lu, Yi; Xiang, Yu
2015-01-13
A general methodology to design label-free fluorescent functional nucleic acid sensors using a vacant site approach and an abasic site approach is described. In one example, a method for designing label-free fluorescent functional nucleic acid sensors (e.g., those that include a DNAzyme, aptamer or aptazyme) that have a tunable dynamic range through the introduction of an abasic site (e.g., dSpacer) or a vacant site into the functional nucleic acids. Also provided is a general method for designing label-free fluorescent aptamer sensors based on the regulation of malachite green (MG) fluorescence. A general method for designing label-free fluorescent catalytic and molecular beacons (CAMBs) is also provided. The methods demonstrated here can be used to design many other label-free fluorescent sensors to detect a wide range of analytes. Sensors and methods of using the disclosed sensors are also provided.
Mentoring medical students in your general practice.
Fraser, John
2016-05-01
Mentoring medical students in general practices is becoming more common in Australia due to formalised scholarship programs and informal approaches by students. This paper defines mentoring in Australian general practice. Practical suggestions are made on how to structure a mentorship program in your practice. Mentoring differs from leadership and teaching. It is a long-term relationship between a student and an experienced general practitioner. Avoiding summative assessment in mentorship is important to its success. Mentoring is about forming a safe place to confidentially discuss personal and professional issues between a mentor and student. This is based on defining roles and mutual trust. At the same time, students crave formative feedback. Unfortunately, present feedback models are based on teaching principles that can blur the differences between assessor, teacher and mentor. Mentorship can provide students with orientation and learning experiences so that they are prepared for practice as an intern.
On Target Localization Using Combined RSS and AoA Measurements
Beko, Marko; Dinis, Rui
2018-01-01
This work reviews existing solutions to the problem of target localization in wireless sensor networks (WSNs) utilizing integrated measurements, namely received signal strength (RSS) and angle of arrival (AoA). The problem of RSS/AoA-based target localization has recently become very popular in the research community, owing to its great applicability potential and relatively low implementation cost. A comprehensive study of the state-of-the-art (SoA) solutions and their detailed analysis is therefore presented. The work starts by considering the SoA approaches based on convex relaxation techniques (more computationally complex in general) and then moves to less computationally complex approaches, such as those based on the generalized trust region sub-problems framework and linear least squares. A detailed analysis of the computational complexity of each solution is reviewed, and an extensive set of simulation results is presented. Finally, the main conclusions are summarized, and a set of future aspects and trends that might be interesting for future research in this area is identified. PMID:29671832
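Among the reviewed families, the linear least squares approach is the simplest to sketch. Assuming the log-distance path-loss model RSS_i = P0 - 10*gamma*log10(d_i) with known P0 and gamma (both assumptions here), ranges are inverted from the RSS readings and the squared-range equations are linearized by subtracting a reference anchor:

```python
import numpy as np

def rss_lls_localize(anchors, rss, P0=-40.0, gamma=3.0):
    """Linear least-squares 2-D target localization from RSS only
    (AoA fusion omitted). anchors: (k, 2) known positions; rss: (k,)
    measurements. P0 and gamma are assumed known model parameters."""
    anchors = np.asarray(anchors, dtype=float)
    d = 10.0 ** ((P0 - np.asarray(rss)) / (10.0 * gamma))  # range estimates
    # Subtract the first equation ||x - a_i||^2 = d_i^2 to linearize in x.
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```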
Tzallas, A T; Karvelis, P S; Katsis, C D; Fotiadis, D I; Giannopoulos, S; Konitsiotis, S
2006-01-01
The aim of this paper is to analyze transient events in inter-ictal EEG recordings and classify epileptic activity into focal or generalized epilepsy using an automated method. A two-stage approach is proposed. In the first stage, the observed transient events of a single channel are classified into four categories: epileptic spike (ES), muscle activity (EMG), eye blinking activity (EOG), and sharp alpha activity (SAA). The process is based on an artificial neural network. Different artificial neural network architectures were tried, and the network with the lowest error was selected using the hold-out approach. In the second stage, a knowledge-based system is used to produce a diagnosis of focal or generalized epileptic activity. The classification of transient events achieved high overall accuracy (84.48%), while the knowledge-based system for epilepsy diagnosis correctly classified nine out of ten cases. The proposed method is advantageous since it effectively detects and classifies the undesirable activity into appropriate categories and produces a final outcome related to the existence of epilepsy.
Linear mixed model for heritability estimation that explicitly addresses environmental variation.
Heckerman, David; Gurdasani, Deepti; Kadie, Carl; Pomilla, Cristina; Carstensen, Tommy; Martin, Hilary; Ekoru, Kenneth; Nsubuga, Rebecca N; Ssenyomo, Gerald; Kamali, Anatoli; Kaleebu, Pontiano; Widmer, Christian; Sandhu, Manjinder S
2016-07-05
The linear mixed model (LMM) is now routinely used to estimate heritability. Unfortunately, as we demonstrate, LMM estimates of heritability can be inflated when using a standard model. To help reduce this inflation, we used a more general LMM with two random effects: one based on genomic variants and one based on easily measured spatial location as a proxy for environmental effects. We investigated this approach with simulated data and with data from a Uganda cohort of 4,778 individuals for 34 phenotypes including anthropometric indices, blood factors, glycemic control, blood pressure, lipid tests, and liver function tests. For the genomic random effect, we used identity-by-descent estimates from accurately phased genome-wide data. For the environmental random effect, we constructed a covariance matrix based on a Gaussian radial basis function. Across the simulated and Ugandan data, narrow-sense heritability estimates were lower using the more general model. Thus, our approach addresses, in part, the issue of "missing heritability" in the sense that much of the heritability previously thought to be missing was fictional. Software is available at https://github.com/MicrosoftGenomics/FaST-LMM.
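The environmental covariance described here is simple to construct; a sketch using a Gaussian radial basis function over spatial coordinates, where length_scale is a tuning assumption rather than a value from the paper:

```python
import numpy as np

def rbf_environment_kernel(coords, length_scale):
    """Gaussian radial basis covariance over spatial locations, usable as
    the environmental random-effect covariance in a two-random-effect LMM.
    coords: (n, d) array of locations; returns an (n, n) kernel matrix."""
    coords = np.asarray(coords, dtype=float)
    sq_dists = np.sum((coords[:, None, :] - coords[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * length_scale ** 2))
```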
Dynamics of Pure Shape, Relativity, and the Problem of Time
NASA Astrophysics Data System (ADS)
Barbour, Julian
A new approach to the dynamics of the universe based on work by Ó Murchadha, Foster, Anderson and the author is presented. The only kinematics presupposed is the spatial geometry needed to define configuration spaces in purely relational terms. A new formulation of the relativity principle based on Poincaré's analysis of the problem of absolute and relative motion (Mach's principle) is given. The entire dynamics is based on shape and nothing else. It leads to much stronger predictions than standard Newtonian theory. For the dynamics of Riemannian 3-geometries on which matter fields also evolve, implementation of the new relativity principle establishes unexpected links between special relativity, general relativity and the gauge principle. They all emerge together as a self-consistent complex from a unified and completely relational approach to dynamics. A connection between time and scale invariance is established. In particular, the representation of general relativity as evolution of the shape of space leads to a unique dynamical definition of simultaneity. This opens up the prospect of a solution of the problem of time in quantum gravity on the basis of a fundamental dynamical principle.
Responder analysis without dichotomization.
Zhang, Zhiwei; Chu, Jianxiong; Rahardja, Dewi; Zhang, Hui; Tang, Li
2016-01-01
In clinical trials, it is common practice to categorize subjects as responders and non-responders on the basis of one or more clinical measurements under pre-specified rules. Such a responder analysis is often criticized for the loss of information in dichotomizing one or more continuous or ordinal variables. It is worth noting that a responder analysis can be performed without dichotomization, because the proportion of responders for each treatment can be derived from a model for the original clinical variables (used to define a responder) and estimated by substituting maximum likelihood estimators of model parameters. This model-based approach can be considerably more efficient and more effective for dealing with missing data than the usual approach based on dichotomization. For parameter estimation, the model-based approach generally requires correct specification of the model for the original variables. However, under the sharp null hypothesis, the model-based approach remains unbiased for estimating the treatment difference even if the model is misspecified. We elaborate on these points and illustrate them with a series of simulation studies mimicking a study of Parkinson's disease, which involves longitudinal continuous data in the definition of a responder.
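A minimal sketch of the contrast being drawn, for a single normally distributed endpoint with responders defined by Y >= c: the model-based estimate plugs MLEs into P(Y >= c) instead of counting dichotomized responses. The simulated data, parameters, and threshold are illustrative, and a one-variable normal model is a simplification of the responder definitions discussed in the paper.

```python
import numpy as np
from scipy.stats import norm

def model_based_responder_rate(y, threshold):
    """Model-based estimate of P(Y >= threshold) under a normal model,
    using plug-in MLEs, versus the usual dichotomized proportion."""
    mu, sigma = np.mean(y), np.std(y)        # MLEs under normality
    return 1.0 - norm.cdf((threshold - mu) / sigma)

rng = np.random.default_rng(1)
y_treat = rng.normal(1.2, 2.0, 150)          # simulated clinical scores
print(model_based_responder_rate(y_treat, 2.0),   # model-based estimate
      np.mean(y_treat >= 2.0))                    # dichotomized estimate
```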
Game-Informed Learning: Applying Computer Game Processes to Higher Education
ERIC Educational Resources Information Center
Begg, Michael; Dewhurst, David; Macleod, Hamish
2005-01-01
The term "game-based learning" has emerged as a general name for the use of computer games in education. Despite early work showing rich inferential learning taking place as a result of gameplay, most game-based learning has been geared towards using a game as a host into which curricular content can be embedded. This approach can be problematic,…
Adaptivity in Game-Based Learning: A New Perspective on Story
NASA Astrophysics Data System (ADS)
Berger, Florian; Müller, Wolfgang
Game-based learning as a novel form of e-learning still has issues in fundamental questions, the lack of a general model for adaptivity being one of them. Since adaptive techniques in traditional e-learning applications bear close similarity to certain interactive storytelling approaches, we propose a new notion of story as the joining element of arbitrary learning paths.
A General Approach to Measuring Test-Taking Effort on Computer-Based Tests
ERIC Educational Resources Information Center
Wise, Steven L.; Gao, Lingyun
2017-01-01
There has been an increased interest in the impact of unmotivated test taking on test performance and score validity. This has led to the development of new ways of measuring test-taking effort based on item response time. In particular, Response Time Effort (RTE) has been shown to provide an assessment of effort down to the level of individual…
Trifocal Tensor-Based Adaptive Visual Trajectory Tracking Control of Mobile Robots.
Chen, Jian; Jia, Bingxi; Zhang, Kaixiang
2017-11-01
In this paper, a trifocal tensor-based approach is proposed for the visual trajectory tracking task of a nonholonomic mobile robot equipped with a roughly installed monocular camera. The desired trajectory is expressed by a set of prerecorded images, and the robot is regulated to track the desired trajectory using visual feedback. The trifocal tensor is exploited to obtain the orientation and scaled position information used in the control system, and it works for general scenes owing to the generality of the trifocal tensor. In previous works, the start, current, and final images were required to share enough visual information to estimate the trifocal tensor. However, this requirement can easily be violated for perspective cameras with a limited field of view. In this paper, a key frame strategy is proposed to loosen this requirement, extending the workspace of the visual servo system. Considering the unknown depth and extrinsic parameters (installing position of the camera), an adaptive controller is developed based on Lyapunov methods. The proposed control strategy works for almost all practical circumstances, including both trajectory tracking and pose regulation tasks. Simulations are made based on the virtual experimentation platform (V-REP) to evaluate the effectiveness of the proposed approach.
NASA Astrophysics Data System (ADS)
Benedek, Judit; Papp, Gábor; Kalmár, János
2018-04-01
Beyond the rectangular prism, the polyhedron can also be used as a discrete volume element to model the density distribution inside 3D geological structures. The evaluation of the closed formulae given for the gravitational potential and its higher-order derivatives, however, needs roughly twice the runtime of the corresponding rectangular prism computations. Although the "more detailed the better" principle is generally accepted, it is strictly true only for errorless data. As soon as errors are present, any forward gravitational calculation from the model is only a possible realization of the true force field on the significance level determined by the errors. So if one really considers the reliability of the input data used in the calculations, then sometimes "less" can be equivalent to "more" in a statistical sense. As a consequence, the processing time of the related complex formulae can be significantly reduced by optimizing the number of volume elements based on accuracy estimates of the input data. New algorithms are proposed to minimize the number of model elements defined both in local and in global coordinate systems. Common gravity field modelling programs generate optimized models for every computation point (dynamic approach), whereas the static approach provides only one optimized model for all. Based on the static approach, two different algorithms were developed. The grid-based algorithm starts with the maximum resolution polyhedral model defined by 3-3 points of each grid cell and generates a new polyhedral surface defined by points selected from the grid. The other algorithm is more general; it also works for irregularly distributed data (scattered points) connected by triangulation. Beyond the description of the optimization schemes, some applications of these algorithms in regional and local gravity field modelling are presented too. The efficiency of the static approaches may provide more than a 90% reduction in computation time in favourable situations, without loss of reliability of the calculated gravity field parameters.
The scientific learning approach using multimedia-based maze game to improve learning outcomes
NASA Astrophysics Data System (ADS)
Setiawan, Wawan; Hafitriani, Sarah; Prabawa, Harsa Wara
2016-02-01
The objective of curriculum 2013 is to improve the quality of education in Indonesia, which entails improving the quality of learning. The scientific approach, supported by empowering media, is one approach promoted by curriculum 2013. This research aims to design a maze-game-based multimedia application and apply it within a scientific learning approach. The study was conducted in a vocational school, in the subject of computer networks, with two classes (experimental and control). The method used was Mixed Methods Research (MMR), combining a qualitative component in the multimedia design with a quantitative component in the study of learning impact. A survey showed that vocational students generally like the network topology material (68%), like multimedia (74%), and in particular like interactive multimedia games and flash (84%). The multimedia-based maze game achieved good eligibility ratings on the media and material aspects, with values of 84% and 82%, respectively. Student learning outcomes from the scientific learning approach with the multimedia-based maze game improved, with an average gain index of about 0.58 (58%), higher than the average gain index of 0.41 (41%) obtained with conventional multimedia. Based on these results, the scientific approach to learning using a multimedia-based maze game can improve the quality of learning and increase students' understanding. The developed maze-game-based learning multimedia received a positive response from students, with a good qualification level (75%).
Integration of prior knowledge into dense image matching for video surveillance
NASA Astrophysics Data System (ADS)
Menze, M.; Heipke, C.
2014-08-01
Three-dimensional information from dense image matching is a valuable input for a broad range of vision applications. While reliable approaches exist for dedicated stereo setups they do not easily generalize to more challenging camera configurations. In the context of video surveillance the typically large spatial extent of the region of interest and repetitive structures in the scene render the application of dense image matching a challenging task. In this paper we present an approach that derives strong prior knowledge from a planar approximation of the scene. This information is integrated into a graph-cut based image matching framework that treats the assignment of optimal disparity values as a labelling task. Introducing the planar prior heavily reduces ambiguities together with the search space and increases computational efficiency. The results provide a proof of concept of the proposed approach. It allows the reconstruction of dense point clouds in more general surveillance camera setups with wider stereo baselines.
Nutrition care by general practitioners: Enhancing women's health during and after pregnancy.
Ball, Lauren; Wilkinson, Shelley
2016-08-01
The importance of healthy dietary behaviours during pregnancy and after birth is well recognised, given the short-term and long-term effects on the health of mothers and infants. Pregnancy is an ideal time to implement health behaviour changes, as women are receptive to health messages at this time. The majority of pregnant women have regular, ongoing contact with general practitioners (GPs), particularly during early pregnancy. This paper provides an overview of the latest evidence regarding the nutrition requirements of women during pregnancy and after birth, and describes simple ways that GPs can incorporate brief, effective nutrition care into standard consultations. Two approaches for enhancing the nutrition care provided by GPs are presented: helping GPs feel confident in raising the topic of nutrition in standard consultations, and equipping them with effective, evidence-based messages that can be incorporated into consultations. Collectively, these approaches promote healthy dietary behaviours for intergenerational benefits.
NASA Technical Reports Server (NTRS)
Ustinov, Eugene A.
2006-01-01
In a recent publication (Ustinov, 2002), we proposed an analytic approach to the evaluation of radiative and geophysical weighting functions for remote sensing of a blackbody planetary atmosphere, based on a general linearization approach applied to the case of nadir viewing geometry. In this presentation, the general linearization approach is applied to the limb viewing geometry. Expressions similar to those obtained in (Ustinov, 2002) are obtained for weighting functions with respect to the distance along the line of sight. These expressions are then converted into expressions for weighting functions with respect to the vertical coordinate in the atmosphere. Finally, the numerical representation of the weighting functions, in the form of matrices of partial derivatives of grid limb radiances with respect to the grid values of atmospheric parameters, is used for convolution with the finite field of view of the instrument.
Grounding Robot Autonomy in Emotion and Self-awareness
NASA Astrophysics Data System (ADS)
Sanz, Ricardo; Hernández, Carlos; Hernando, Adolfo; Gómez, Jaime; Bermejo, Julita
Much is being done in an attempt to transfer emotional mechanisms from reverse-engineered biology into social robots. There are two basic approaches: the imitative display of emotion, e.g. to make robots appear more human-like, and the provision of architectures with intrinsic emotion, in the hope of enhancing behavioral aspects. This paper focuses on the second approach, describing a core vision regarding the integration of cognitive, emotional and autonomic aspects in social robot systems. This vision has evolved as a result of the efforts in consolidating the models extracted from rat emotion research and their implementation in technical use cases, based on a general systemic analysis in the framework of the ICEA and C3 projects. The approach strives for generality, with the aim of obtaining universal theories of integrated (autonomic, emotional, cognitive) behavior. The proposed conceptualizations and architectural principles are then captured in a theoretical framework: ASys, the Autonomous Systems Framework.
Reed, Richard L; Barton, Christopher A; Isherwood, Linda M; Baxter, Jodie M Oliver; Roeger, Leigh
2013-08-28
A robust research base is required in General Practice. The research output for General Practice is much less than those of other clinical disciplines. A major impediment to more research in this sector is difficulty with recruitment. Much of the research in this area focuses on barriers to effective recruitment and many projects have great difficulty with this process. This paper seeks to describe a systematic approach to recruitment for a randomized controlled trial that allowed the study team to recruit a substantial number of subjects from General Practice over a brief time period. A systematic approach to recruitment in this setting based on prior literature and the experience of the investigator team was incorporated into the design and implementation of the study. Five strategies were used to facilitate this process. These included designing the study to minimize the impact of the research on the day-to-day operations of the clinics, engagement of general practitioners in the research, making the research attractive to subjects, minimizing attrition and ensuring recruitment was a major focus of the management of the study. Outcomes of the recruitment process were measured as the proportion of practices that agreed to participate, the proportion of potentially eligible subjects who consented to take part in the trial and the attrition rate of subjects. Qualitative interviews with a subset of successfully recruited participants were done to determine why they chose to participate in the study; data were analyzed using thematic analysis. Five out of the six general practices contacted agreed to take part in the study. Thirty-eight per cent of the 1663 subjects who received a letter of invitation contacted the university study personnel regarding their interest in the project. Recruitment of the required number of eligible participants (n = 256) was accomplished in seven months. Thematic analysis of interviews with 30 participants regarding key factors in their study participation identified a personalised letter of endorsement from their general practitioner, expectation of personal benefit and altruism as important factors in their decision to participate. Recruitment can be successfully achieved in General Practice through design of the research project to facilitate recruitment, minimize the impact on general practice operations and ensure special care in enrolling and maintaining subjects in the project.
Ouari, Kamel; Rekioua, Toufik; Ouhrouche, Mohand
2014-01-01
To make wind power generation truly cost-effective and reliable, advanced control techniques must be used. In this paper, we develop a new control strategy for a DFIG-based wind turbine using a nonlinear generalized predictive control (NGPC) approach. The proposed control law rests on two components: an NGPC-based torque-current control loop generating the rotor reference voltage, and an NGPC-based speed control loop that provides the torque reference. To enhance the robustness of the controller, a disturbance observer is designed to estimate the aerodynamic torque, which is treated as an unknown perturbation. Finally, a real-time simulation is carried out to illustrate the performance of the proposed controller.
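To make the two-loop idea concrete, the following is a minimal sketch of the outer speed loop only: a one-step predictive torque law paired with an extended-state disturbance observer estimating the aerodynamic torque, on a simplified one-mass drive-train model. All names, parameter values and gains are illustrative assumptions, not the paper's controller.

# Minimal sketch (assumed simplification, not the paper's NGPC): one-step
# predictive speed loop plus extended-state disturbance observer for the
# one-mass model  J*dw/dt = T_aero - T_em.  All values are illustrative.
import numpy as np

J, dt = 50.0, 1e-3            # inertia [kg*m^2], control period [s] (assumed)
L1, L2 = 100.0, J * 2500.0    # observer gains placing both error poles at s = -50

w, w_hat, T_aero_hat = 0.0, 0.0, 0.0
w_ref = 150.0                 # speed reference [rad/s] (assumed)

for k in range(5000):
    # unknown aerodynamic torque acting on the plant
    T_aero = 2000.0 + 500.0 * np.sin(2 * np.pi * 0.5 * k * dt)

    # predictive law: choose T_em so the one-step-ahead prediction
    # w + dt*(T_aero_hat - T_em)/J equals w_ref (closed form in the scalar
    # case), then saturate to an assumed generator torque limit
    T_em = np.clip(T_aero_hat - J * (w_ref - w) / dt, -2e4, 2e4)

    # plant update (Euler)
    w += dt * (T_aero - T_em) / J

    # extended-state observer: correct both the speed estimate and the
    # aerodynamic-torque estimate from the measured speed error
    e = w - w_hat
    w_hat += dt * ((T_aero_hat - T_em) / J + L1 * e)
    T_aero_hat += dt * L2 * e

print(f"final speed {w:.1f} rad/s, torque estimate {T_aero_hat:.0f} N*m")

A full NGPC would optimize over a multi-step prediction horizon of the nonlinear DFIG model; the one-step scalar law above is only the degenerate horizon-one case, kept short to show how the observer feeds the predictive loop.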
Applications of Landsat data and the data base approach
Lauer, D.T.
1986-01-01
A generalized methodology for applying digital Landsat data to resource inventory and assessment tasks is currently being used by several bureaus and agencies within the US Department of the Interior. The methodology includes definition of project objectives and outputs, identification of source materials, construction of the digital data base, performance of computer-assisted analyses, and generation of output products. The USGS, Bureau of Land Management, US Fish and Wildlife Service, Bureau of Indian Affairs, Bureau of Reclamation, and National Park Service have used this generalized methodology to assemble comprehensive digital data bases for resource management. Advanced information-processing techniques have been applied to these data bases for making regional environmental surveys of millions of acres of public lands at costs ranging from $0.01 to $0.08 per acre.
Spherically-symmetric solutions in general relativity using a tetrad-based approach
NASA Astrophysics Data System (ADS)
Kim, Do Young; Lasenby, Anthony N.; Hobson, Michael P.
2018-03-01
We present a tetrad-based method for solving the Einstein field equations for spherically-symmetric systems and compare it with the widely-used Lemaître-Tolman-Bondi (LTB) model. In particular, we focus on the issues of gauge ambiguity and the use of comoving versus `physical' coordinate systems. We also clarify the correspondences between the two approaches, and illustrate their differences by applying them to the classic examples of the Schwarzschild and Friedmann-Lemaître-Robertson-Walker spacetimes. We demonstrate that the tetrad-based method does not suffer from the gauge freedoms inherent to the LTB model, naturally accommodates non-uniform pressure and has a more transparent physical interpretation. We further apply our tetrad-based method to a generalised form of `Swiss cheese' model, which consists of an interior spherical region surrounded by a spherical shell of vacuum that is embedded in an exterior background universe. In general, we allow the fluid in the interior and exterior regions to support pressure, and do not demand that the interior region be compensated. We pay particular attention to the form of the solution in the intervening vacuum region and illustrate the validity of Birkhoff's theorem at both the metric and tetrad level. We then reconsider critically the original theoretical arguments underlying the so-called Rh = ct cosmological model, which has recently received considerable attention. These considerations in turn illustrate the interesting behaviour of a number of `horizons' in general cosmological models.
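For reference, the comoving form of the LTB model against which the tetrad method is compared can be written as below; the signature and symbol conventions here are standard textbook choices, not necessarily those of the paper.

% Comoving form of the LTB (dust) metric, for reference only:
\[
  ds^2 = dt^2 - \frac{R'(r,t)^2}{1 + 2E(r)}\,dr^2
         - R(r,t)^2 \left( d\theta^2 + \sin^2\theta \, d\phi^2 \right),
\]
% for which the Einstein equations reduce to
\[
  \dot{R}^2 = \frac{2M(r)}{R} + 2E(r), \qquad
  8\pi\rho = \frac{2M'(r)}{R^2\,R'},
\]
% where M(r) is the effective gravitational mass inside comoving radius r,
% E(r) is the free curvature (energy) function whose choice constitutes the
% gauge freedom discussed above, a prime denotes d/dr and a dot d/dt.

Since the LTB solution assumes pressureless dust, the non-uniform pressure accommodated by the tetrad-based method is precisely what this comoving form cannot represent.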
Rule Extracting based on MCG with its Application in Helicopter Power Train Fault Diagnosis
NASA Astrophysics Data System (ADS)
Wang, M.; Hu, N. Q.; Qin, G. J.
2011-07-01
To extract decision rules for fault diagnosis from incomplete historical test records, for knowledge-based damage assessment of helicopter power-train structures, a method was proposed that can directly extract optimal generalized decision rules from incomplete information based on granular computing (GrC). Based on a semantic analysis of unknown attribute values, the granule was extended to handle incomplete information. The maximum characteristic granule (MCG) was defined based on the characteristic relation, and MCGs were used to construct the resolution function matrix. The optimal generalized decision rule was introduced and, using the basic equivalent forms of propositional logic, rules were extracted and reduced from the incomplete information table. Combined with a fault diagnosis example for a power train, the application of the method was presented and its validity for knowledge acquisition was demonstrated.
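As a toy illustration of granules over incomplete records, the sketch below builds characteristic granules with a simple tolerance-style relation in which '*' marks an unknown attribute value; the paper's exact MCG construction, resolution function matrix and rule reduction are not reproduced, and all data values are invented.

# Toy sketch: characteristic granules over an incomplete decision table.
# '*' = unknown value, treated as matching anything (an assumed,
# tolerance-style simplification of the paper's characteristic relation).
table = [
    # (condition attribute values,        decision)
    (("high", "*",    "yes"), "fault"),
    (("high", "low",  "yes"), "fault"),
    (("low",  "low",  "*"  ), "normal"),
    (("low",  "high", "no" ), "normal"),
]

def similar(x, y):
    """Objects are indiscernible if every attribute agrees or is unknown."""
    return all(a == b or a == "*" or b == "*" for a, b in zip(x, y))

def granule(i):
    """Characteristic granule of object i: everything it cannot be told apart from."""
    return {j for j, (attrs, _) in enumerate(table) if similar(table[i][0], attrs)}

for i, (attrs, dec) in enumerate(table):
    g = granule(i)
    decisions = {table[j][1] for j in g}
    kind = "certain" if decisions == {dec} else "possible"
    print(f"object {i}: granule {sorted(g)} -> {kind} rule for '{dec}'")

A granule entirely contained in one decision class yields a certain rule; granules straddling classes yield only possible rules, which is where the optimal generalized decision rules of the paper come in.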
Herbold, Craig W.; Pelikan, Claus; Kuzyk, Orest; Hausmann, Bela; Angel, Roey; Berry, David; Loy, Alexander
2015-01-01
High-throughput sequencing of phylogenetic and functional gene amplicons provides tremendous insight into the structure and functional potential of complex microbial communities. Here, we introduce a highly adaptable and economical PCR approach to barcoding and pooling libraries of numerous target genes. In this approach, we replace gene- and sequencing-platform-specific fusion primers with general, interchangeable barcoding primers, enabling nearly limitless customized barcode-primer combinations. Compared to barcoding with long fusion primers, our multiple-target-gene approach is more economical because it requires a lower overall number of primers and is based on short primers with generally lower synthesis and purification costs. To highlight our approach, we pooled over 900 different small-subunit rRNA and functional gene amplicon libraries obtained from various environmental or host-associated microbial community samples into a single, paired-end Illumina MiSeq run. Although the amplicon regions ranged in size from approximately 290 to 720 bp, we found no significant systematic sequencing bias related to amplicon length or gene target. Our results indicate that this flexible multiplexing approach produces large, diverse, and high-quality sets of amplicon sequence data for modern studies in microbial ecology. PMID:26236305
Abuassba, Adnan O M; Zhang, Dezheng; Luo, Xiong; Shaheryar, Ahmad; Ali, Hazrat
2017-01-01
Extreme Learning Machine (ELM) is a fast-learning algorithm for a single-hidden-layer feedforward neural network (SLFN). It often has good generalization performance, but there is a risk that it overfits the training data by having more hidden nodes than needed. To address generalization performance, we use a heterogeneous ensemble approach. We propose an Advanced ELM Ensemble (AELME) for classification, which includes Regularized-ELM, L2-norm-optimized ELM (ELML2), and Kernel-ELM. The ensemble is constructed by training each randomly chosen ELM classifier on a subset of the training data selected through random resampling. The AELM-Ensemble is evolved by employing an objective function that increases diversity and accuracy among the final ensemble. Finally, the class label of unseen data is predicted using a majority-vote approach. Splitting the training data into subsets and incorporating heterogeneous ELM classifiers result in higher prediction accuracy, better generalization, and a lower number of base classifiers, compared to other models (Adaboost, Bagging, Dynamic ELM ensemble, data-splitting ELM ensemble, and ELM ensemble). The validity of AELME is confirmed through classification on several real-world benchmark datasets. PMID:28546808
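A minimal sketch of the general idea, heterogeneous regularized ELMs trained on bootstrap resamples and combined by majority vote, is given below. It omits AELME's diversity/accuracy objective, uses different activations to stand in for the Regularized-, L2- and Kernel-ELM members, and all data, sizes and hyper-parameters are invented for illustration.

# Sketch of a heterogeneous ELM ensemble with majority voting (assumed
# simplification of AELME; the evolutionary selection step is omitted).
import numpy as np

rng = np.random.default_rng(0)

def train_elm(X, y, n_hidden, C, activation):
    """Regularized ELM: random input weights, ridge solve for output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = activation(X @ W + b)                       # hidden-layer outputs
    T = np.column_stack([(y == c).astype(float) for c in (0, 1)])
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ T)
    return W, b, beta

def predict_elm(model, X, activation):
    W, b, beta = model
    return np.argmax(activation(X @ W + b) @ beta, axis=1)

# synthetic two-class problem (illustrative only)
X = rng.normal(size=(400, 2))
y = (X[:, 0] ** 2 + X[:, 1] > 0.5).astype(int)

# heterogeneous members, each trained on a bootstrap resample
activations = [np.tanh, lambda z: 1.0 / (1.0 + np.exp(-z)), np.sin]
models = []
for act in activations:
    idx = rng.integers(0, len(X), size=len(X))      # random resampling
    models.append((train_elm(X[idx], y[idx], 40, C=10.0, activation=act), act))

# majority vote over the three members
votes = np.stack([predict_elm(m, X, act) for m, act in models])
pred = (votes.sum(axis=0) >= 2).astype(int)
print("training accuracy:", (pred == y).mean())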
NASA Technical Reports Server (NTRS)
Muscettola, Nicola; Smith, Steven S.
1996-01-01
This final report summarizes research performed under NASA contract NCC 2-531 toward generalization of constraint-based scheduling theories and techniques for application to space telescope observation scheduling problems. Our work on theories and techniques for solving this class of problems has led to the development of the Heuristic Scheduling Testbed System (HSTS), a software system for integrated planning and scheduling. Within HSTS, planning and scheduling are treated as two complementary aspects of the more general process of constructing a feasible set of behaviors of a target system. We have validated the HSTS approach by applying it to the generation of observation schedules for the Hubble Space Telescope. This report summarizes the HSTS framework and its application to the Hubble Space Telescope domain. First, the HSTS software architecture is described, indicating (1) how the structure and dynamics of a system are modeled in HSTS, (2) how schedules are represented at multiple levels of abstraction, and (3) the problem-solving machinery that is provided. Next, the specific scheduler developed within this software architecture for detailed management of Hubble Space Telescope operations is presented. Finally, experimental performance results are given that confirm the utility and practicality of the approach.
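As a toy illustration of the timeline-and-token style of representation used by integrated planning/scheduling systems of this kind, consider the sketch below; the class names and the overlap and precedence checks are invented simplifications, not HSTS's actual machinery.

# Toy sketch of timelines holding non-overlapping tokens, with a simple
# temporal constraint between tokens (assumed simplification, not HSTS).
from dataclasses import dataclass

@dataclass
class Token:
    activity: str
    start: float
    end: float

class Timeline:
    """One state variable or resource; its tokens must not overlap in time."""
    def __init__(self, name):
        self.name, self.tokens = name, []

    def place(self, token):
        for t in self.tokens:
            if token.start < t.end and t.start < token.end:
                raise ValueError(f"{token.activity} overlaps {t.activity}")
        self.tokens.append(token)

def before(a, b):
    """Temporal constraint: token a must finish before token b starts."""
    return a.end <= b.start

pointing = Timeline("telescope-pointing")
camera = Timeline("camera")
slew = Token("slew-to-target", 0.0, 5.0)
expose = Token("exposure", 5.0, 20.0)
pointing.place(slew)
camera.place(expose)
assert before(slew, expose)   # exposure may only start once slewing is done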
Meerpohl, Joerg J; Schell, Lisa K; Bassler, Dirk; Gallus, Silvano; Kleijnen, Jos; Kulig, Michael; La Vecchia, Carlo; Marušić, Ana; Ravaud, Philippe; Reis, Andreas; Schmucker, Christine; Strech, Daniel; Urrútia, Gerard; Wager, Elizabeth; Antes, Gerd
2015-05-05
Dissemination bias in clinical research severely impedes informed decision-making, not only for healthcare professionals and patients but also for funders, research ethics committees, regulatory bodies and other stakeholder groups that make health-related decisions. Decisions based on incomplete and biased evidence can not only harm people but may also have huge financial implications by wasting resources on ineffective or harmful diagnostic and therapeutic measures, and on unnecessary research. Because multiple stakeholders are involved, it remains easy for any single group to assign responsibility for resolving the problem to others. Our objective was to develop evidence-informed general and targeted recommendations, addressing the various stakeholders involved in knowledge generation and dissemination, to help overcome the problem of dissemination bias on the basis of previously collated evidence. Based on findings from systematic reviews, document analyses and surveys, we developed general and targeted draft recommendations. During a 2-day workshop in summer 2013, these draft recommendations were discussed with external experts and key stakeholders, and refined following a rigorous and transparent methodological approach. Four general, overarching recommendations applicable to all or most stakeholder groups were formulated, addressing (1) awareness raising, (2) implementation of targeted recommendations, (3) trial registration and results posting, and (4) systematic approaches to evidence synthesis. These general recommendations are complemented and specified by 47 targeted recommendations tailored towards funding agencies, pharmaceutical and device companies, research institutions, researchers (systematic reviewers and trialists), research ethics committees, trial registries, journal editors and publishers, regulatory agencies, benefit (health technology) assessment institutions and legislators. Despite various recent examples of dissemination bias and several initiatives to reduce it, the problem has not been resolved. Tailored recommendations based on a comprehensive approach will hopefully help increase transparency in biomedical research by overcoming the failure to disseminate negative findings.