Sample records for actual features problems

  1. [Life-cycles, psychopathology and suicidal behaviour].

    PubMed

    Osváth, Péter

    2012-12-01

    According to modern psychological theories, human life implies continuous development: the efficient solution of age-specific problems is necessary for a successful transition between age periods. The phases of transition are very vulnerable to accidental stressors and negative life events. The problem-solving capacity may thus run out, which impairs the chance of coping successfully with stressful events. This can result in negative consequences such as various psychopathological symptoms (depression, anxiety, psychosis) or even suicidal behaviour. For that reason we have to pay special attention to the symptoms of psychological crisis and the presuicidal syndrome. In certain life-cycle transitions (such as adolescence, middle age or old age) the personality is especially vulnerable to the development of psychological and psychopathological problems. In this article the most important features of life-cycles and psychopathological symptoms are reviewed. The developmental and age-specific characteristics are especially important for understanding the background of the actual psychological crisis and for improving the efficacy of treatment. Using a complex bio-psycho-socio-spiritual approach, not only the actual psychopathological problems but also the individual psychological features can be recognised. Thus effective treatment relieves not only the actual symptoms but also increases the chance of solving further crises.

  2. Can I cut the Gordian tnok? The impact of pronounceability, actual solvability, and length on intuitive problem assessments of anagrams.

    PubMed

    Topolinski, Sascha; Bakhtiari, Giti; Erle, Thorsten M

    2016-01-01

    When assessing a problem, many cues can be used to predict solvability and solving effort. Some of these cues, however, can be misleading. The present approach shows that a feature of a problem that is actually related to solving difficulty is used as a cue for solving ease when assessing the problem in the first place. For anagrams, it is an established effect that easy-to-pronounce anagrams (e.g., NOGAL) take more time to be solved than hard-to-pronounce anagrams (e.g., HNWEI). However, when assessing an anagram in the first place, individuals use pronounceability to predict solving ease, because pronounceability is an instantiation of the general mechanism of processing fluency. Participants (total N=536) received short and long anagrams and nonanagrams and judged solvability and solving ease intuitively, without actually solving the items. Easy-to-pronounce letter strings were more frequently judged as being solvable than hard-to-pronounce letter strings (Experiment 1), and were estimated to require less effort (Experiments 2, 4-7) and time to be solved (Experiment 3). This effect was robust for short and long items, anagrams and nonanagrams, and presentation timings from 4 s down to 0.5 s, and affected novices and experts alike. Spontaneous solutions did not mediate this effect. Participants were sensitive to actual solvability even for long anagrams (6-11 letters long) presented for only 500 ms. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Feature binding, attention and object perception.

    PubMed Central

    Treisman, A

    1998-01-01

    The seemingly effortless ability to perceive meaningful objects in an integrated scene actually depends on complex visual processes. The 'binding problem' concerns the way in which we select and integrate the separate features of objects in the correct combinations. Experiments suggest that attention plays a central role in solving this problem. Some neurological patients show a dramatic breakdown in the ability to see several objects; their deficits suggest a role for the parietal cortex in the binding process. However, indirect measures of priming and interference suggest that more information may be implicitly available than we can consciously access. PMID:9770223

  4. Formulations and algorithms for problems on rock mass and support deformation during mining

    NASA Astrophysics Data System (ADS)

    Seryakov, VM

    2018-03-01

    The analysis of problem formulations used to calculate the stress-strain state of mine support and surrounding rock mass in rock mechanics shows that such formulations incompletely describe the mechanical features of joint deformation in the rock mass–support system. The present paper proposes an algorithm that takes into account the actual conditions of rock mass and support interaction, together with an implementation method that ensures efficient calculation of stresses in rocks and support.

  5. Transformation of Infrastructure Projects for the Sustainable Development of the Transport Complex

    NASA Astrophysics Data System (ADS)

    Polyakova, Irina; Vasilyeva, Elena; Vorontsova, Natalya

    2017-10-01

    The article reviews current data on the performance of the transport infrastructure in Russia. The problems and restrictions affecting its sustainable development are identified, and their interactions and interrelations are traced. The authors argue that the majority of the revealed restrictions are internal in character and stem mainly from the state contract scheme. According to the authors, the scheme of public-and-private partnership is an effective mechanism that can be suggested for solving the existing problems.

  6. The Model-Based Study of the Effectiveness of Reporting Lists of Small Feature Sets Using RNA-Seq Data.

    PubMed

    Kim, Eunji; Ivanov, Ivan; Hua, Jianping; Lampe, Johanna W; Hullar, Meredith Aj; Chapkin, Robert S; Dougherty, Edward R

    2017-01-01

    Ranking feature sets for phenotype classification based on gene expression is a challenging issue in cancer bioinformatics. When the number of samples is small, all feature selection algorithms are known to be unreliable, producing significant error, and error estimators suffer from different degrees of imprecision. The problem is compounded by the fact that the accuracy of classification depends on the manner in which the phenomena are transformed into data by the measurement technology. Because next-generation sequencing technologies amount to a nonlinear transformation of the actual gene or RNA concentrations, they can potentially produce less discriminative data relative to the actual gene expression levels. In this study, we compare the performance of ranking feature sets derived from a model of RNA-Seq data with that of a multivariate normal model of gene concentrations using 3 measures: (1) ranking power, (2) length of extensions, and (3) Bayes features. This model-based study examines the effectiveness of reporting lists of small feature sets using RNA-Seq data and the effects of different model parameters and error estimators. The results demonstrate that the general trends of the parameter effects on the ranking power of the underlying gene concentrations are preserved in the RNA-Seq data, whereas the power of finding a good feature set becomes weaker when gene concentrations are transformed by the sequencing machine.

  7. [Morphological verification problems of Chernobyl factor influence on the testis of coal miners of Donbas-liquidators of Chernobyl accident].

    PubMed

    Danylov, Iu V; Motkov, K V; Shevchenko, T I

    2013-01-01

    The diagnosis of Chernobyl-factor effects on various organs and systems of Chernobyl accident liquidators remains a pressing problem. The morbid background that develops under the unfavourable working conditions of underground coal miners prevents the objective identification of features attributable to the Chernobyl factor. The qualitative and quantitative histological and immunohistochemical patterns of morphogenetic changes in the testes of Donbas coal miners who were not liquidators of the Chernobyl accident, compared with the group of Donbas coal miners who were liquidators, remain an undetermined problem. This motivates the development and practical use of a mathematical model of the morphogenetic changes in the testis.

  8. [Morphological verification problems of Chernobyl factor influence on the prostate of coalminers of Donbas--liquidators of Chernobyl accident].

    PubMed

    Danylov, Iu V; Motkov, K V; Shevchenko, T I

    2013-12-01

    The diagnosis of Chernobyl-factor effects on various organs and systems of Chernobyl accident liquidators remains a pressing problem. The morbid background that develops under the unfavourable working conditions of underground coal miners prevents the objective identification of features attributable to the Chernobyl factor. The qualitative and quantitative histological and immunohistochemical patterns of morphogenetic changes in the prostate of Donbas coal miners who were not liquidators of the Chernobyl accident, compared with the group of Donbas coal miners who were liquidators, remain an undetermined problem. This motivates the development and practical use of a mathematical model of the morphogenetic changes in the prostatic gland.

  9. The method for homography estimation between two planes based on lines and points

    NASA Astrophysics Data System (ADS)

    Shemiakina, Julia; Zhukovsky, Alexander; Nikolaev, Dmitry

    2018-04-01

    The paper considers the problem of estimating a transform connecting two images of one planar object. A RANSAC-based method is proposed for calculating the parameters of the projective transform that uses point and line correspondences simultaneously. A series of experiments was performed on synthesized data. The presented results show that the convergence rate of the algorithm is significantly higher when actual lines are used instead of points of line intersection. When both lines and feature points are used, the convergence rate is shown not to depend on the ratio between lines and feature points in the input dataset.
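
    For orientation, the point-only baseline is easy to reproduce with OpenCV; the paper's joint point-and-line RANSAC is not part of standard libraries, so the sketch below shows only the conventional point-correspondence variant, with made-up coordinates.

    ```python
    # Point-correspondence RANSAC homography (baseline only, not the
    # paper's joint point-and-line estimator). Coordinates are made up.
    import cv2
    import numpy as np

    src_pts = np.array([[0, 0], [100, 0], [100, 150], [0, 150], [50, 75]], dtype=np.float32)
    dst_pts = np.array([[12, 8], [115, 3], [118, 160], [9, 152], [63, 80]], dtype=np.float32)

    # RANSAC fits H to random 4-point samples and keeps the model with the
    # largest consensus set (reprojection error below the threshold).
    H, inliers = cv2.findHomography(src_pts, dst_pts, cv2.RANSAC, ransacReprojThreshold=3.0)
    print(H)                  # 3x3 projective transform
    print(inliers.ravel())    # 1 = inlier, 0 = outlier
    ```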

  10. Application of Hybrid Real-Time Power System Simulator for Designing and Researching of Relay Protection and Automation

    NASA Astrophysics Data System (ADS)

    Borovikov, Yu S.; Sulaymanov, A. O.; Andreev, M. V.

    2015-10-01

    The development, research and operation of smart grids (SG) with active-adaptive networks (AAS) are pressing tasks today. The planned integration of high-speed FACTS devices greatly complicates the dynamic properties of power systems, and as a result the operating conditions of power-system equipment change significantly. This situation creates a new problem: developing and studying relay protection and automation (RPA) that can operate adequately in SGs and adapt to their regimes. The effectiveness of any solution to this problem depends on the tools used, namely simulators of electric power systems. An analysis of the best-known and most widely used simulators led to the conclusion that they cannot be used for this problem. Tomsk Polytechnic University has developed a prototype hybrid multiprocessor software and hardware system, the Hybrid Real-Time Power System Simulator (HRTSim). Because of its unique features, this simulator can be used for the tasks mentioned. This article introduces the concept of developing and studying relay protection and automation with the use of HRTSim.

  11. Defeating feature fatigue.

    PubMed

    Rust, Roland T; Thompson, Debora Viana; Hamilton, Rebecca W

    2006-02-01

    Consider a coffeemaker that offers 12 drink options, a car with more than 700 features on the dashboard, and a mouse pad that's also a clock, calculator, and FM radio. All are examples of "feature bloat", or "featuritis", the result of an almost irresistible temptation to load products with lots of bells and whistles. The problem is that the more features a product boasts, the harder it is to use. Manufacturers that increase a product's capability--the number of useful functions it can perform--at the expense of its usability are exposing their customers to feature fatigue. The authors have conducted three studies to gain a better understanding of how consumers weigh a product's capability relative to its usability. They found that even though consumers know that products with more features are harder to use, they initially choose high-feature models. They also pile on more features when given the chance to customize a product for their needs. Once consumers have actually worked with a product, however, usability starts to matter more to them than capability. For managers in consumer products companies, these findings present a dilemma: Should they maximize initial sales by designing high-feature models, which consumers consistently choose, or should they limit the number of features in order to enhance the lifetime value of their customers? The authors' analytical model guides companies toward a happy middle ground: maximizing the net present value of the typical customer's profit stream. The authors also advise companies to build simpler products, help consumers learn which products suit their needs, develop products that do one thing very well, and design market research in which consumers use actual products or prototypes.

  12. Regulation of Renewable Energy Sources to Optimal Power Flow Solutions Using ADMM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yijian; Hong, Mingyi; Dall'Anese, Emiliano

    This paper considers power distribution systems featuring renewable energy sources (RESs), and develops a distributed optimization method to steer the RES output powers to solutions of AC optimal power flow (OPF) problems. The design of the proposed method leverages suitable linear approximations of the AC-power flow equations, and is based on the Alternating Direction Method of Multipliers (ADMM). Convergence of the RES-inverter output powers to solutions of the OPF problem is established under suitable conditions on the stepsize as well as mismatches between the commanded setpoints and actual RES output powers. In a broad sense, the methods and results proposed here are also applicable to other distributed optimization problem setups with ADMM and inexact dual updates.

  13. Representing nested semantic information in a linear string of text using XML.

    PubMed

    Krauthammer, Michael; Johnson, Stephen B; Hripcsak, George; Campbell, David A; Friedman, Carol

    2002-01-01

    XML has been widely adopted as an important data interchange language. The structure of XML enables sharing of data elements with variable degrees of nesting as long as the elements are grouped in a strict tree-like fashion. This requirement potentially restricts the usefulness of XML for marking up written text, which often includes features that do not properly nest within other features. We encountered this problem while marking up medical text with structured semantic information from a Natural Language Processor. Traditional approaches to this problem separate the structured information from the actual text mark up. This paper introduces an alternative solution, which tightly integrates the semantic structure with the text. The resulting XML markup preserves the linearity of the medical texts and can therefore be easily expanded with additional types of information.
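
    As a purely hypothetical illustration of the inline-markup idea (the tag names below are invented, not the paper's actual schema): semantic elements are wrapped around the words of the sentence itself, so that stripping the tags recovers the original linear text.

    ```python
    # Inline semantic markup that preserves the linearity of the source text.
    import xml.etree.ElementTree as ET

    xml = (
        '<sentence>There is a '
        '<finding><certainty v="high">probable</certainty> '
        '<bodyloc>left lower lobe</bodyloc> infiltrate</finding>.'
        '</sentence>'
    )
    root = ET.fromstring(xml)
    print("".join(root.itertext()))        # reconstructs the linear sentence
    print(root.find(".//bodyloc").text)    # structured access to a semantic element
    ```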

  14. Representing nested semantic information in a linear string of text using XML.

    PubMed Central

    Krauthammer, Michael; Johnson, Stephen B.; Hripcsak, George; Campbell, David A.; Friedman, Carol

    2002-01-01

    XML has been widely adopted as an important data interchange language. The structure of XML enables sharing of data elements with variable degrees of nesting as long as the elements are grouped in a strict tree-like fashion. This requirement potentially restricts the usefulness of XML for marking up written text, which often includes features that do not properly nest within other features. We encountered this problem while marking up medical text with structured semantic information from a Natural Language Processor. Traditional approaches to this problem separate the structured information from the actual text mark up. This paper introduces an alternative solution, which tightly integrates the semantic structure with the text. The resulting XML markup preserves the linearity of the medical texts and can therefore be easily expanded with additional types of information. PMID:12463856

  15. Supporting Scientific Modeling Practices in Atmospheric Sciences: Intended and Actual Affordances of a Computer-Based Modeling Tool

    ERIC Educational Resources Information Center

    Wu, Pai-Hsing; Wu, Hsin-Kai; Kuo, Che-Yu; Hsu, Ying-Shao

    2015-01-01

    Computer-based learning tools include design features to enhance learning, but learners may not always perceive the existence of these features and use them in desirable ways. There might be a gap between what the tool features are designed to offer (intended affordance) and how they are actually used (actual affordance). This study thus aims at…

  16. Regulation of Renewable Energy Sources to Optimal Power Flow Solutions Using ADMM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall-Anese, Emiliano; Zhang, Yijian; Hong, Mingyi

    This paper considers power distribution systems featuring renewable energy sources (RESs), and develops a distributed optimization method to steer the RES output powers to solutions of AC optimal power flow (OPF) problems. The design of the proposed method leverages suitable linear approximations of the AC-power flow equations, and is based on the Alternating Direction Method of Multipliers (ADMM). Convergence of the RES-inverter output powers to solutions of the OPF problem is established under suitable conditions on the stepsize as well as mismatches between the commanded setpoints and actual RES output powers. In a broad sense, the methods and results proposed here are also applicable to other distributed optimization problem setups with ADMM and inexact dual updates.
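
    To make the ADMM pattern concrete, here is a self-contained toy sketch on a different, much simpler convex problem (the lasso); it shows the x-update / z-update / dual-update cycle the paper builds on, and is not the paper's AC-OPF formulation.

    ```python
    # Generic ADMM illustration on the lasso; not the paper's OPF method.
    import numpy as np

    def admm_lasso(A, b, lam=0.1, rho=1.0, iters=200):
        n = A.shape[1]
        x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)   # u: scaled dual variable
        AtA, Atb = A.T @ A, A.T @ b
        K = np.linalg.inv(AtA + rho * np.eye(n))          # cached x-update solve
        for _ in range(iters):
            x = K @ (Atb + rho * (z - u))                                    # x-update
            z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)  # z-update (soft threshold)
            u += x - z                                                       # dual update on x - z = 0
        return z

    rng = np.random.default_rng(0)
    A = rng.normal(size=(30, 10))
    x_true = np.zeros(10); x_true[[1, 4]] = [2.0, -1.5]
    b = A @ x_true + 0.01 * rng.normal(size=30)
    print(np.round(admm_lasso(A, b), 2))   # recovers the sparse coefficients
    ```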

  17. Online signature recognition using principal component analysis and artificial neural network

    NASA Astrophysics Data System (ADS)

    Hwang, Seung-Jun; Park, Seung-Je; Baek, Joong-Hwan

    2016-12-01

    In this paper, we propose an algorithm for on-line signature recognition using the fingertip point in the air, taken from the depth image acquired by a Kinect. We extract 10 statistical features from each of the X, Y and Z axes, which are invariant to shifting and scaling of the signature trajectories in three-dimensional space. An artificial neural network is adopted to solve the complex signature classification problem. The 30-dimensional features are converted into 10 principal components using principal component analysis, which retain 99.02% of the total variance. We implement the proposed algorithm and test it on actual on-line signatures. In experiments, we verify that the proposed method successfully classifies 15 different on-line signatures. Experimental results show a recognition rate of 98.47% when using only 10 feature vectors.
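
    The feature pipeline described here (30 statistical features reduced to 10 principal components, then a neural-network classifier) maps directly onto standard tooling. A minimal sketch with scikit-learn follows; the random data, class count and network size are placeholders, not the authors' dataset or architecture.

    ```python
    # Sketch: 30 features -> PCA(10) -> neural-network classifier.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(42)
    X = rng.normal(size=(150, 30))       # 150 signatures x 30 features (10 per axis)
    y = rng.integers(0, 15, size=150)    # 15 different signers (placeholder labels)

    clf = make_pipeline(
        PCA(n_components=10),            # keeps ~99% of the variance in the paper
        MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    )
    clf.fit(X, y)
    print(clf.score(X, y))               # training accuracy on the toy data
    ```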

  18. Nearest-neighbor guided evaluation of data reliability and its applications.

    PubMed

    Boongoen, Tossapon; Shen, Qiang

    2010-12-01

    The intuition of data reliability has recently been incorporated into the mainstream of research on ordered weighted averaging (OWA) operators. Instead of relying on human-guided variables, the aggregation behavior is determined in accordance with the underlying characteristics of the data being aggregated. However, data-oriented operators such as the dependent OWA (DOWA) utilize centralized data structures to generate reliable weights. Despite their simplicity, the approach taken by these operators neglects entirely any local data structure that represents a strong agreement or consensus. To address this issue, the cluster-based OWA (Clus-DOWA) operator has been proposed. It employs a cluster-based reliability measure that is effective in differentiating the accountability of different input arguments. Yet its actual application is constrained by its high computational requirements. This paper presents a more efficient nearest-neighbor-based reliability assessment for which an expensive clustering process is not required. The proposed measure can be perceived as a stress function, from which the OWA weights and associated decision-support explanations can be generated. To illustrate the potential of this measure, it is applied to both the problem of information aggregation for alias detection and the problem of unsupervised feature selection (in which unreliable features are excluded from an actual learning process). Experimental results demonstrate that these techniques usually outperform their conventional state-of-the-art counterparts.
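
    The contrast between centralized and neighborhood-based reliability is easy to sketch. Below, a centroid-based weighting (in the spirit of DOWA) is compared with a simple k-nearest-neighbor reliability; both weighting functions are illustrative stand-ins, not the paper's exact formulas.

    ```python
    # Data-driven OWA weighting: centroid-based vs. nearest-neighbor-based.
    import numpy as np

    def centroid_weights(v):
        # Centralized reliability: arguments close to the mean get more weight.
        dev = np.abs(v - v.mean())
        sim = 1.0 - dev / dev.sum() if dev.sum() > 0 else np.ones_like(v)
        return sim / sim.sum()

    def knn_weights(v, k=2):
        # Local reliability: tight agreement with the k nearest neighbors -> high weight.
        d = np.abs(v[:, None] - v[None, :])
        nn = np.sort(d, axis=1)[:, 1:k + 1].mean(axis=1)   # mean distance to k-NN
        sim = 1.0 / (1.0 + nn)
        return sim / sim.sum()

    def owa(values, weight_fn):
        v = np.sort(values)[::-1]        # OWA weights attach to ordered positions
        return v @ weight_fn(v)

    x = np.array([0.82, 0.79, 0.80, 0.15])    # one outlying, unreliable argument
    print(owa(x, centroid_weights))           # centralized (DOWA-style) weighting
    print(owa(x, knn_weights))                # neighborhood weighting, no clustering
    ```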

  19. Optimal spatial filtering and transfer function for SAR ocean wave spectra

    NASA Technical Reports Server (NTRS)

    Goldfinger, A. D.; Beal, R. C.; Tilley, D. G.

    1981-01-01

    The Seasat Synthetic Aperture Radar (SAR) has proved to be an instrument of great utility in the sensing of ocean conditions on a global scale. An analysis of oceanographic and atmospheric aspects of Seasat data has shown that the features observed in the imagery are linked to ocean phenomena such as storm sources and their resulting swell systems. However, there remains one central problem which has not been satisfactorily solved to date. This problem is related to the accurate measurement of wind-generated ocean wave spectra. Investigations addressing this problem are currently being conducted. The problem has two parts, including the accurate measurement of the image spectra and the inference of actual surface wave spectra from these measurements. A description is presented of the progress made towards solving the first part of the problem, taking into account a digital rather than optical computation of the image transforms.

  20. Network reconstruction via graph blending

    NASA Astrophysics Data System (ADS)

    Estrada, Rolando

    2016-05-01

    Graphs estimated from empirical data are often noisy and incomplete due to the difficulty of faithfully observing all the components (nodes and edges) of the true graph. This problem is particularly acute for large networks where the number of components may far exceed available surveillance capabilities. Errors in the observed graph can render subsequent analyses invalid, so it is vital to develop robust methods that can minimize these observational errors. Errors in the observed graph may include missing and spurious components, as well as fused (multiple nodes are merged into one) and split (a single node is misinterpreted as many) nodes. Traditional graph reconstruction methods are only able to identify missing or spurious components (primarily edges, and to a lesser degree nodes), so we developed a novel graph blending framework that allows us to cast the full estimation problem as a simple edge addition/deletion problem. Armed with this framework, we systematically investigate the viability of various topological graph features, such as the degree distribution or the clustering coefficients, and existing graph reconstruction methods for tackling the full estimation problem. Our experimental results suggest that incorporating any topological feature as a source of information actually hinders reconstruction accuracy. We provide a theoretical analysis of this phenomenon and suggest several avenues for improving this estimation problem.

  1. An Analysis of the Formal Features of "Reality-Based" Television Programs.

    ERIC Educational Resources Information Center

    Neapolitan, D. M.

    Reality-based television programs showcase actual footage or recreate actual events, and include programs such as "America's Most Wanted" and "Rescue 911." To identify the features that typify reality-based television programs, this study conducted an analysis of formal features used in reality-based programs. Formal features…

  2. [Physiological features of skin ageing in human].

    PubMed

    Tikhonova, I V; Tankanag, A V; Chemeris, N K

    2013-01-01

    This review deals with a pressing problem of gerontology, namely the physiological features of human skin ageing. The authors consider the kinds of ageing, the main factors affecting the ageing process (ultraviolet radiation and oxidative stress), and the research directions on ageing changes in skin structure and function: studies of mechanical properties, microcirculation, pH and skin thickness. Special attention is paid to methods for assessing skin blood flow and to the results of investigations of age-related features of peripheral microhemodynamics. Laser Doppler flowmetry, one of the modern, noninvasive and extensively used methods for assessing the skin blood-flow microcirculation system, is described in detail. The main results of studies of ageing changes in skin blood perfusion obtained with this method are also presented.

  3. Modern European monographs for quality control of Chinese herbs.

    PubMed

    Bauer, Rudolf; Franz, Gerhard

    2010-12-01

    The current concern about the safety and efficacy of herbal drugs originating from traditional Chinese medicine (TCM) is based on observations that these medicinal plants may carry a high risk potential due to insufficient definitions, problems with identity and purity, and falsifications. No uniform legal status for these groups of herbal drugs currently exists in the European Union. For quality control, monographs for TCM herbs can mainly be found in the Pharmacopoeia of the People's Republic of China. Based on these facts, the Commission of the European Pharmacopoeia decided in 2005 to establish TCM herbal drug monographs for the most important medicinal plants imported from the Far East. These new monographs had to be established and evaluated on the basis of existing monographs in the Chinese Pharmacopoeia (ChP), English edition 2005. Due to important differences in the overall features of the EP and ChP, a simple adapt/adopt procedure was not feasible. Therefore, specialist groups were mandated with a corresponding working programme. Some results and current problems related to this working programme are presented and discussed. © Georg Thieme Verlag KG Stuttgart · New York.

  4. MuSCoWERT: multi-scale consistence of weighted edge Radon transform for horizon detection in maritime images.

    PubMed

    Prasad, Dilip K; Rajan, Deepu; Rachmawati, Lily; Rajabally, Eshan; Quek, Chai

    2016-12-01

    This paper addresses the problem of horizon detection, a fundamental process in numerous object detection algorithms, in a maritime environment. The maritime environment is characterized by the absence of fixed features, the presence of numerous linear features in dynamically changing objects and background and constantly varying illumination, rendering the typically simple problem of detecting the horizon a challenging one. We present a novel method called multi-scale consistence of weighted edge Radon transform, abbreviated as MuSCoWERT. It detects the long linear features consistent over multiple scales using multi-scale median filtering of the image followed by Radon transform on a weighted edge map and computing the histogram of the detected linear features. We show that MuSCoWERT has excellent performance, better than seven other contemporary methods, for 84 challenging maritime videos, containing over 33,000 frames, and captured using visible range and near-infrared range sensors mounted onboard, onshore, or on floating buoys. It has a median error of about 2 pixels (less than 0.2%) from the center of the actual horizon and a median angular error of less than 0.4 deg. We are also sharing a new challenging horizon detection dataset of 65 videos of visible, infrared cameras for onshore and onboard ship camera placement.
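
    A single-scale sketch of the core step is shown below: detect the dominant line in an edge-weighted image as the peak of its Radon transform. MuSCoWERT's multi-scale median filtering and cross-scale histogram vote are omitted, and the synthetic image stands in for a maritime frame.

    ```python
    # One scale of a Radon-based horizon detector (simplified sketch).
    import numpy as np
    from scipy.ndimage import median_filter, sobel
    from skimage.transform import radon

    img = np.zeros((200, 200))
    img[120:, :] = 1.0                                    # synthetic sky/sea split at row 120
    img += 0.05 * np.random.default_rng(0).normal(size=img.shape)

    smooth = median_filter(img, size=5)                   # one scale of the median pyramid
    edges = np.hypot(sobel(smooth, 0), sobel(smooth, 1))  # weighted edge map

    theta = np.linspace(0.0, 180.0, 181)
    sinogram = radon(edges, theta=theta, circle=False)    # line strength per (offset, angle)
    offset_idx, angle_idx = np.unravel_index(np.argmax(sinogram), sinogram.shape)
    print("dominant line angle (deg):", theta[angle_idx])
    ```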

  5. An unsupervised video foreground co-localization and segmentation process by incorporating motion cues and frame features

    NASA Astrophysics Data System (ADS)

    Zhang, Chao; Zhang, Qian; Zheng, Chi; Qiu, Guoping

    2018-04-01

    Video foreground segmentation is one of the key problems in video processing. In this paper, we propose a novel, fully unsupervised approach for foreground object co-localization and segmentation in unconstrained videos. We first compute both the actual edges and the motion boundaries of the video frames, and then align them by their HOG feature maps. Then, by filling the occlusions generated by the aligned edges, we obtain more precise masks of the foreground object. These motion-based masks serve as a motion-based likelihood. Moreover, a color-based likelihood is adopted for the segmentation process. Experimental results show that our approach outperforms most state-of-the-art algorithms.

  6. Three-dimensional fingerprint recognition by using convolution neural network

    NASA Astrophysics Data System (ADS)

    Tian, Qianyu; Gao, Nan; Zhang, Zonghua

    2018-01-01

    With the development of science and technology and the growth of social information, fingerprint recognition has become a hot research direction and has been widely applied in many practical fields because of its feasibility and reliability. The traditional two-dimensional (2D) fingerprint recognition method relies on matching feature points. This method is not only time-consuming but also loses the three-dimensional (3D) information of the fingerprint, and its robustness declines seriously under fingerprint rotation, scaling, damage and other issues. To solve these problems, 3D fingerprints have been used for recognition. Because this is a new research field, there are still many challenging problems in 3D fingerprint recognition. This paper presents a new 3D fingerprint recognition method using a convolutional neural network (CNN). The method feeds the 2D fingerprint and the fingerprint depth map into a CNN, fuses their features through another CNN, and completes 3D fingerprint recognition by classifying the fused features. This method not only preserves the 3D information of fingerprints but also solves the problem of CNN input. Moreover, the recognition process is simpler than traditional feature-point matching algorithms. The 3D fingerprint recognition rate using a CNN is compared with other fingerprint recognition algorithms. The experimental results show that the proposed 3D fingerprint recognition method has a good recognition rate and robustness.
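
    The fusion architecture described (one branch per modality, a second stage fusing the two feature sets) can be sketched with the Keras functional API. Input sizes, layer widths and the number of enrolled subjects below are illustrative assumptions, not the paper's configuration.

    ```python
    # Two-branch CNN with feature-level fusion (illustrative sketch).
    from tensorflow.keras import layers, Model

    def branch(name):
        inp = layers.Input(shape=(128, 128, 1), name=name)
        x = layers.Conv2D(16, 3, activation="relu")(inp)
        x = layers.MaxPooling2D()(x)
        x = layers.Conv2D(32, 3, activation="relu")(x)
        x = layers.GlobalAveragePooling2D()(x)
        return inp, x

    img_in, img_feat = branch("fingerprint_2d")   # 2D fingerprint image branch
    dep_in, dep_feat = branch("depth_map")        # fingerprint depth-map branch
    fused = layers.concatenate([img_feat, dep_feat])      # feature fusion stage
    out = layers.Dense(10, activation="softmax")(fused)   # e.g. 10 enrolled subjects
    model = Model([img_in, dep_in], out)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    model.summary()
    ```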

  7. Research of infrared laser based pavement imaging and crack detection

    NASA Astrophysics Data System (ADS)

    Hong, Hanyu; Wang, Shu; Zhang, Xiuhua; Jing, Genqiang

    2013-08-01

    Road crack detection is seriously affected by many factors in actual applications, such as shadows, road signs, oil stains and high-frequency noise. Because of these factors, current crack detection methods cannot distinguish cracks in complex scenes. To solve this problem, a novel method based on infrared laser pavement imaging is proposed. First, a single-sensor laser pavement imaging system is adopted to obtain pavement images, with a high-power laser line projector used to suppress various shadows. Second, a crack extraction algorithm that intelligently merges multiple features is proposed to extract crack information. In this step, the non-negative feature and the contrast feature are used to extract basic crack information, and a circular projection based on a linearity feature is applied to enhance the crack area and eliminate noise. A series of experiments has been performed to test the proposed method, which shows that the proposed automatic extraction method is effective and advanced.

  8. Prototype Effect and the Persuasiveness of Generalizations.

    PubMed

    Dahlman, Christian; Sarwar, Farhan; Bååth, Rasmus; Wahlberg, Lena; Sikström, Sverker

    An argument that makes use of a generalization activates the prototype for the category used in the generalization. We conducted two experiments that investigated how the activation of the prototype affects the persuasiveness of the argument. The results of the experiments suggest that the features of the prototype overshadow and partly overwrite the actual facts of the case. The case is, to some extent, judged as if it had the features of the prototype instead of the features it actually has. This prototype effect increases the persuasiveness of the argument in situations where the audience finds the judgment more warranted for the prototype than for the actual case (positive prototype effect), but decreases persuasiveness in situations where the audience finds the judgment less warranted for the prototype than for the actual case (negative prototype effect).

  9. Studying PubMed usages in the field for complex problem solving: Implications for tool design

    PubMed Central

    Song, Jean; Tonks, Jennifer Steiner; Meng, Fan; Xuan, Weijian; Ameziane, Rafiqa

    2012-01-01

    Many recent studies on MEDLINE-based information seeking have shed light on scientists’ behaviors and associated tool innovations that may improve efficiency and effectiveness. Few if any studies, however, examine scientists’ problem-solving uses of PubMed in actual contexts of work and corresponding needs for better tool support. Addressing this gap, we conducted a field study of novice scientists (14 upper level undergraduate majors in molecular biology) as they engaged in a problem solving activity with PubMed in a laboratory setting. Findings reveal many common stages and patterns of information seeking across users as well as variations, especially variations in cognitive search styles. Based on findings, we suggest tool improvements that both confirm and qualify many results found in other recent studies. Our findings highlight the need to use results from context-rich studies to inform decisions in tool design about when to offer improved features to users. PMID:24376375

  10. Radiative corrections in the (varying power)-law modified gravity

    NASA Astrophysics Data System (ADS)

    Hammad, Fayçal

    2015-06-01

    Although the (varying power)-law modified gravity toy model has the attractive feature of unifying the early- and late-time expansions of the Universe, thanks to the peculiar dependence of the scalar field's potential on the scalar curvature, the model still suffers from the fine-tuning problem when used to explain the actually observed Hubble parameter. Indeed, a more correct estimate of the mass of the scalar field needed to comply with actual observations gives an unnaturally small value. On the other hand, for a massless scalar field the potential would have no minimum and hence the field would always remain massless. What solves these issues are the radiative corrections that modify the field's effective potential. These corrections raise the field's effective mass, rendering the model free from fine-tuning, immune against positive fifth-force tests, and better suited to tackle the dark matter sector.

  11. Recorded Behavior as a Valuable Resource for Diagnostics in Mobile Phone Addiction: Evidence from Psychoinformatics.

    PubMed

    Montag, Christian; Błaszkiewicz, Konrad; Lachmann, Bernd; Sariyska, Rayna; Andone, Ionut; Trendafilov, Boris; Markowetz, Alexander

    2015-10-19

    Psychologists and psychiatrists commonly rely on self-reports or interviews to diagnose or treat behavioral addictions. The present study introduces a novel source of data: recordings of the actual problem behavior under investigation. A total of N = 58 participants were asked to fill in a questionnaire measuring problematic mobile phone behavior featuring several questions on weekly phone usage. After filling in the questionnaire, all participants received an application to be installed on their smartphones, which recorded their phone usage for five weeks. The analyses revealed that weekly phone usage in hours was overestimated; in contrast, numbers of call and text message related variables were underestimated. Importantly, several associations between actual usage and being addicted to mobile phones could be derived exclusively from the recorded behavior, but not from self-report variables. The study demonstrates the potential benefit to include methods of psychoinformatics in the diagnosis and treatment of problematic mobile phone use.

  12. C++QEDv2 Milestone 10: A C++/Python application-programming framework for simulating open quantum dynamics

    NASA Astrophysics Data System (ADS)

    Sandner, Raimar; Vukics, András

    2014-09-01

    The v2 Milestone 10 release of C++QED is primarily a feature release, which also corrects some problems of the previous release, especially as regards the build system. The adoption of C++11 features has led to many simplifications in the codebase. A full doxygen-based API manual [1] is now provided together with updated user guides. A largely automated, versatile new testsuite directed both towards computational and physics features allows for quickly spotting arising errors. The states of trajectories are now savable and recoverable with full binary precision, allowing for trajectory continuation regardless of evolution method (single/ensemble Monte Carlo wave-function or Master equation trajectory). As the main new feature, the framework now presents Python bindings to the highest-level programming interface, so that actual simulations for given composite quantum systems can now be performed from Python.

    Catalogue identifier: AELU_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELU_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: yes
    No. of lines in distributed program, including test data, etc.: 492422
    No. of bytes in distributed program, including test data, etc.: 8070987
    Distribution format: tar.gz
    Programming language: C++/Python
    Computer: i386-i686, x86_64
    Operating system: In principle cross-platform; as yet tested only on UNIX-like systems (including Mac OS X)
    RAM: The framework itself takes about 60MB, which is fully shared. The additional memory taken by the program which defines the actual physical system (script) is typically less than 1MB. The memory storing the actual data scales with the system dimension for state-vector manipulations, and with the square of the dimension for density-operator manipulations. This might easily be GBs, and often the memory of the machine limits the size of the simulated system.
    Classification: 4.3, 4.13, 6.2
    External routines: Boost C++ libraries, GNU Scientific Library, Blitz++, FLENS, NumPy, SciPy
    Catalogue identifier of previous version: AELU_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 1381
    Does the new version supersede the previous version?: Yes
    Nature of problem: Definition of (open) composite quantum systems out of elementary building blocks [2,3]. Manipulation of such systems, with emphasis on dynamical simulations such as Master-equation evolution [4] and Monte Carlo wave-function simulation [5].
    Solution method: Master equation, Monte Carlo wave-function method
    Reasons for new version: The new version is mainly a feature release, but it does correct some problems of the previous version, especially as regards the build system.
    Summary of revisions: A typical Python script implementing the ring-cavity system presented in Sec. 3.3 of Ref. [2] is given as an example.
    Restrictions: Total dimensionality of the system. Master equation: a few thousand. Monte Carlo wave-function trajectory: several million.
    Unusual features: Because of the heavy use of compile-time algorithms, compilation of programs written in the framework may take a long time and much memory (up to several GBs).
    Additional comments: The framework is not a program, but provides and implements an application-programming interface for developing simulations in the indicated problem domain. We use several C++11 features, which limits the range of supported compilers (g++ 4.7, clang++ 3.1). Documentation: http://cppqed.sourceforge.net/
    Running time: Depending on the magnitude of the problem, can vary from a few seconds to weeks.
    References:
    [1] Entry point: http://cppqed.sf.net
    [2] A. Vukics, C++QEDv2: The multi-array concept and compile-time algorithms in the definition of composite quantum systems, Comput. Phys. Comm. 183 (2012) 1381.
    [3] A. Vukics, H. Ritsch, C++QED: an object-oriented framework for wave-function simulations of cavity QED systems, Eur. Phys. J. D 44 (2007) 585.
    [4] H. J. Carmichael, An Open Systems Approach to Quantum Optics, Springer, 1993.
    [5] J. Dalibard, Y. Castin, K. Mølmer, Wave-function approach to dissipative processes in quantum optics, Phys. Rev. Lett. 68 (1992) 580.

  13. Min-Cut Based Segmentation of Airborne LIDAR Point Clouds

    NASA Astrophysics Data System (ADS)

    Ural, S.; Shan, J.

    2012-07-01

    Introducing an organization to the unstructured point cloud before extracting information from airborne lidar data is common in many applications. Aggregating the points with similar features into segments in 3-D which comply with the nature of actual objects is affected by the neighborhood, scale, features and noise, among other aspects. In this study, we present a min-cut based method for segmenting the point cloud. We first assess the neighborhood of each point in 3-D by investigating the local geometric and statistical properties of the candidates. Neighborhood selection is essential since point features are calculated within their local neighborhood. Following neighborhood determination, we calculate point features and determine the clusters in the feature space. We adapt a graph representation from image processing, especially used in pixel labeling problems, and establish it for unstructured 3-D point clouds. The edges of the graph that connect the points with each other and with nodes representing feature clusters hold the smoothness costs in the spatial domain and the data costs in the feature domain. Smoothness costs ensure spatial coherence, while data costs control the consistency with the representative feature clusters. This graph representation formalizes the segmentation task as an energy minimization problem. It allows the implementation of an approximate solution by min-cuts for a global minimum of this NP-hard minimization problem in low-order polynomial time. We test our method with an airborne lidar point cloud acquired with a maximum planned post spacing of 1.4 m and a vertical accuracy of 10.5 cm RMSE. We present the effects of neighborhood and feature determination on the segmentation results and assess the accuracy and efficiency of the implemented min-cut algorithm as well as its sensitivity to the parameters of the smoothness and data cost functions. We find that a smoothness cost that considers only a simple distance parameter does not conform well to the natural structure of the points. Including shape information within the energy function by assigning costs based on local properties may help to achieve a better representation for segmentation.
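
    The energy-minimization setup maps labels to an s-t cut: terminal edges carry the data costs, edges between neighboring points carry the smoothness costs, and the minimum cut gives the lowest-energy labeling. The toy sketch below illustrates this for a binary labeling of four points; real systems use dedicated max-flow solvers and alpha-expansion for multiple labels.

    ```python
    # Binary labeling of points via s-t min-cut (toy energy-minimization sketch).
    import networkx as nx

    data_cost = {0: (0.1, 0.9), 1: (0.2, 0.8), 2: (0.8, 0.2), 3: (0.9, 0.1)}  # (cost_fg, cost_bg)
    neighbors = [(0, 1), (1, 2), (2, 3)]
    LAMBDA = 0.3                                   # smoothness strength

    G = nx.DiGraph()
    for p, (cost_fg, cost_bg) in data_cost.items():
        G.add_edge("s", p, capacity=cost_bg)       # cut if p is labeled background
        G.add_edge(p, "t", capacity=cost_fg)       # cut if p is labeled foreground
    for a, b in neighbors:
        G.add_edge(a, b, capacity=LAMBDA)          # penalty when neighbors disagree
        G.add_edge(b, a, capacity=LAMBDA)

    energy, (fg_side, bg_side) = nx.minimum_cut(G, "s", "t")
    print("energy:", energy)                       # 0.9 for this toy instance
    print("foreground:", sorted(fg_side - {"s"}))  # [0, 1]
    print("background:", sorted(bg_side - {"t"}))  # [2, 3]
    ```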

  14. The inverse problem of using the information of historical data to estimate model errors is one of the science frontier research topics.

    NASA Astrophysics Data System (ADS)

    Wan, S.; He, W.

    2016-12-01

    The inverse problem of using the information of historical data to estimate model errors is one of the science frontier research topics. In this study, we investigate such a problem using the classic Lorenz (1963) equation as the prediction model and the Lorenz equation with a periodic evolutionary function as an accurate representation of reality, used to generate "observational data." On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted automatically by computer. Thereby, a new approach to estimating model errors based on EM is proposed in the present paper. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In effect, it realizes a combination of statistics and dynamics to a certain extent.
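
    The twin-experiment setup is straightforward to reproduce: "reality" is a Lorenz-63 system with a periodic term added, the forecast model is the classic Lorenz-63, and their divergence is the model error to be estimated. In the sketch below, the amplitude, frequency and placement of the periodic term are illustrative guesses, not the paper's exact configuration.

    ```python
    # Twin experiment: classic Lorenz-63 model vs. "reality" with a periodic term.
    import numpy as np
    from scipy.integrate import solve_ivp

    SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

    def lorenz(t, s):                        # forecast model (Lorenz 1963)
        x, y, z = s
        return [SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z]

    def reality(t, s, a=2.0, omega=1.0):     # "truth" with a periodic model error
        dx, dy, dz = lorenz(t, s)
        return [dx, dy + a * np.sin(omega * t), dz]

    t_eval = np.linspace(0, 10, 1001)
    obs = solve_ivp(reality, (0, 10), [1.0, 1.0, 1.0], t_eval=t_eval)   # observations
    fcst = solve_ivp(lorenz, (0, 10), [1.0, 1.0, 1.0], t_eval=t_eval)   # forecast
    print("RMS x-error:", np.sqrt(np.mean((obs.y[0] - fcst.y[0]) ** 2)))
    ```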

  15. The Use of Photo-projects and Term Projects in Large-Format (200+ Students) Introductory Geology Courses.

    NASA Astrophysics Data System (ADS)

    Giles, A. N.; Wilkie, K. M.

    2008-12-01

    Photo-projects have long been utilized as a way of getting students in introductory geology courses to apply what they have learned in lecture to the outcrop and landscape. While the projects have many benefits, we have found that with large-format classes of 200+ students, where a mandatory field trip is logistically impossible, many problems can arise. One problem has been that of consistent and timely grading, which can be addressed by a project that can be turned in throughout the course of the semester and by utilizing a grading rubric. Also, in many cases, students simply take photographs of "scenery" and then try to identify features/processes with little thought as to whether that particular feature/process can occur in that geologic setting (such as identifying features as having a glacial origin in a non-glaciated terrain.) These types of problem can be attributed to the student's lack of knowledge of the geology of the area within which the photographs were taken and having little to no field instruction. Many of these problems can be addressed by utilizing a term project that combines elements of both research and the traditional photo project. The student chooses a specific area/region (i.e. a national park) that the student will/has actually visit(ed) and is then required to do background research before attempting to identify features and processes in photographs they have taken from the area. Here we present details of such a project that involves students performing research activities in three stages: The history/geologic setting of the area, the specific lithology of the area, and then the hydrology of the area, with each being completed at specified times throughout the semester. The final stage is the photo project component where the student identifies and interprets the features/processes in photographs from the area. The research provides the student with a framework within which they can identify and interpret the features/processes that are likely to be seen in their area.

  16. Reinforcement of timber beams with carbon fibers reinforced plastics

    NASA Astrophysics Data System (ADS)

    Gugutsidze, G.; Draškovič, F.

    2010-06-01

    Wood is a polymeric material with many valuable features, though it also has some negative ones. Owing to high construction rates and the need to minimize negative effects, wood has become one of the most valuable materials in modern engineering. Using timber economically is also a pressing problem for protecting the environment and improving natural surroundings. Many scientists are interested in solving these problems and in creating rational structures in which timber can be used efficiently. These constructions include glue-laminated (glulam), composite and reinforced wooden constructions. Composite and reinforced wooden constructions have been examined less, but according to research already carried out, it is clear that significant progress can be made in creating rational, highly effective and economical timber constructions. The paper deals with research on the formation of composite timber beams reinforced with carbon fiber reinforced plastics (CFRP) and provides evidence of their effectiveness. The aim of the paper is to investigate the cross-bending of CFRP-reinforced glue-laminated timber beams. According to the results, we were able to determine the additional effect of CFRP reinforcement (which depends on the CFRP material's quality, quantity and modulus of elasticity) on the mechanical features of the timber and of the whole beam.

  17. Theory of the Trojan-Horse Method - From the Original Idea to Actual Applications

    NASA Astrophysics Data System (ADS)

    Typel, Stefan

    2018-01-01

    The origin and the main features of the Trojan-horse (TH) method are delineated, starting with the original idea of Gerhard Baur. Basic theoretical considerations, general experimental conditions and possible problems are discussed. Significant steps in experimental studies towards the implementation of the TH method and the development of its theoretical description are presented. This led to the successful application of the TH approach by Claudio Spitaleri and his group to determine low-energy cross sections that are relevant for astrophysics. An outlook on possible future developments is given.

  18. Quantum Computing Architectural Design

    NASA Astrophysics Data System (ADS)

    West, Jacob; Simms, Geoffrey; Gyure, Mark

    2006-03-01

    Large scale quantum computers will invariably require scalable architectures in addition to high fidelity gate operations. Quantum computing architectural design (QCAD) addresses the problems of actually implementing fault-tolerant algorithms given physical and architectural constraints beyond those of basic gate-level fidelity. Here we introduce a unified framework for QCAD that enables the scientist to study the impact of varying error correction schemes, architectural parameters including layout and scheduling, and physical operations native to a given architecture. Our software package, aptly named QCAD, provides compilation, manipulation/transformation, multi-paradigm simulation, and visualization tools. We demonstrate various features of the QCAD software package through several examples.

  19. Inference of Vohradský's Models of Genetic Networks by Solving Two-Dimensional Function Optimization Problems

    PubMed Central

    Kimura, Shuhei; Sato, Masanao; Okada-Hatakeyama, Mariko

    2013-01-01

    The inference of a genetic network is a problem in which mutual interactions among genes are inferred from time-series of gene expression levels. While a number of models have been proposed to describe genetic networks, this study focuses on a mathematical model proposed by Vohradský. Because of its advantageous features, several researchers have proposed the inference methods based on Vohradský's model. When trying to analyze large-scale networks consisting of dozens of genes, however, these methods must solve high-dimensional non-linear function optimization problems. In order to resolve the difficulty of estimating the parameters of the Vohradský's model, this study proposes a new method that defines the problem as several two-dimensional function optimization problems. Through numerical experiments on artificial genetic network inference problems, we showed that, although the computation time of the proposed method is not the shortest, the method has the ability to estimate parameters of Vohradský's models more effectively with sufficiently short computation times. This study then applied the proposed method to an actual inference problem of the bacterial SOS DNA repair system, and succeeded in finding several reasonable regulations. PMID:24386175
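
    For context, Vohradský's rate law for gene i is commonly written as dx_i/dt = k1_i * sigmoid(Σ_j w_ij x_j + b_i) - k2_i x_i, so each gene carries synthesis, degradation, weight and bias parameters. The sketch below only simulates a hypothetical 3-gene instance of this model form; the paper's actual contribution, estimating the parameters via two-dimensional subproblems, is not shown.

    ```python
    # Forward simulation of a small Vohradsky-type gene network (illustrative).
    import numpy as np
    from scipy.integrate import solve_ivp

    W = np.array([[0.0, -2.0,  0.0],
                  [3.0,  0.0, -1.0],
                  [0.0,  2.0,  0.0]])    # W[i, j]: regulatory effect of gene j on gene i
    b = np.array([0.5, -0.5, 0.0])       # basal activation offsets
    k1 = np.array([1.0, 1.2, 0.8])       # maximal synthesis rates
    k2 = np.array([0.5, 0.4, 0.6])       # degradation rates

    def vohradsky(t, x):
        return k1 / (1.0 + np.exp(-(W @ x + b))) - k2 * x

    sol = solve_ivp(vohradsky, (0, 20), [0.1, 0.1, 0.1],
                    t_eval=np.linspace(0, 20, 201))
    print(sol.y[:, -1])                  # near-steady-state expression levels
    ```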

  20. Feature detection on 3D images of dental imprints

    NASA Astrophysics Data System (ADS)

    Mokhtari, Marielle; Laurendeau, Denis

    1994-09-01

    A computer vision approach for the extraction of feature points on 3D images of dental imprints is presented. The position of feature points are needed for the measurement of a set of parameters for automatic diagnosis of malocclusion problems in orthodontics. The system for the acquisition of the 3D profile of the imprint, the procedure for the detection of the interstices between teeth, and the approach for the identification of the type of tooth are described, as well as the algorithm for the reconstruction of the surface of each type of tooth. A new approach for the detection of feature points, called the watershed algorithm, is described in detail. The algorithm is a two-stage procedure which tracks the position of local minima at four different scales and produces a final map of the position of the minima. Experimental results of the application of the watershed algorithm on actual 3D images of dental imprints are presented for molars, premolars and canines. The segmentation approach for the analysis of the shape of incisors is also described in detail.
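
    The multi-scale minima-tracking idea is easy to prototype: find local minima of the height map at several smoothing scales and keep the locations that persist across most of them. The sketch below uses a random surface and Gaussian smoothing as stand-ins for the dental range data and the paper's actual scale space.

    ```python
    # Multi-scale local-minima tracking (simplified watershed-style sketch).
    import numpy as np
    from scipy.ndimage import gaussian_filter, maximum_filter, minimum_filter

    rng = np.random.default_rng(1)
    depth = gaussian_filter(rng.normal(size=(64, 64)), 4)   # stand-in 3D profile

    votes = np.zeros(depth.shape, dtype=int)
    for sigma in (1, 2, 3, 4):                   # four scales, as in the paper
        smooth = gaussian_filter(depth, sigma)
        minima = smooth == minimum_filter(smooth, size=5)            # local minima mask
        votes += maximum_filter(minima.astype(np.uint8), size=3)     # tolerate small drift

    feature_points = np.argwhere(votes >= 3)     # minima persisting across scales
    print(len(feature_points), "candidate feature points")
    ```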

  1. Learning high-level features for chord recognition using Autoencoder

    NASA Astrophysics Data System (ADS)

    Phongthongloa, Vilailukkana; Kamonsantiroj, Suwatchai; Pipanmaekaporn, Luepol

    2016-07-01

    Chord transcription is valuable in itself, but manual transcription of chords is very tiresome and time-consuming, and it requires musical knowledge. Automatic chord recognition has therefore recently attracted a number of researchers in the Music Information Retrieval field. The pitch class profile (PCP) is the most common signal representation for musical harmonic analysis. However, the PCP may contain additional non-harmonic noise such as harmonic overtones and transient noise. These non-harmonic components may generate sound energy at frequencies beyond the actual notes of the respective chord. An autoencoder neural network can be trained to learn a mapping from low-level features to one or more higher-level representations. These high-level representations can capture dependencies of the inputs and reduce the effect of non-harmonic noise. The improved features are then fed into a neural network classifier. The proposed high-level musical features achieve 80.90% accuracy. The experimental results show that the proposed approach can achieve better performance in comparison with other baseline methods.
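
    The pipeline (train an autoencoder on PCP vectors, then reuse its encoder output as denoised high-level features for the chord classifier) can be sketched in a few lines of Keras. The random PCP data and layer sizes below are placeholders, not the paper's corpus or architecture.

    ```python
    # Autoencoder on 12-bin pitch class profiles; encoder output = features.
    import numpy as np
    from tensorflow.keras import layers, Model

    rng = np.random.default_rng(0)
    pcp = rng.random((500, 12)).astype("float32")    # placeholder PCP vectors

    inp = layers.Input(shape=(12,))
    code = layers.Dense(24, activation="relu")(inp)       # higher-level features
    recon = layers.Dense(12, activation="sigmoid")(code)  # reconstruction of the PCP
    autoencoder = Model(inp, recon)
    autoencoder.compile(optimizer="adam", loss="mse")
    autoencoder.fit(pcp, pcp, epochs=5, batch_size=32, verbose=0)

    encoder = Model(inp, code)
    features = encoder.predict(pcp, verbose=0)       # inputs to the chord classifier
    print(features.shape)                            # (500, 24)
    ```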

  2. A Survey on the Feasibility of Sound Classification on Wireless Sensor Nodes

    PubMed Central

    Salomons, Etto L.; Havinga, Paul J. M.

    2015-01-01

    Wireless sensor networks are suitable to gain context awareness for indoor environments. As sound waves form a rich source of context information, equipping the nodes with microphones can be of great benefit. The algorithms to extract features from sound waves are often highly computationally intensive. This can be problematic as wireless nodes are usually restricted in resources. In order to be able to make a proper decision about which features to use, we survey how sound is used in the literature for global sound classification, age and gender classification, emotion recognition, person verification and identification and indoor and outdoor environmental sound classification. The results of the surveyed algorithms are compared with respect to accuracy and computational load. The accuracies are taken from the surveyed papers; the computational loads are determined by benchmarking the algorithms on an actual sensor node. We conclude that for indoor context awareness, the low-cost algorithms for feature extraction perform equally well as the more computationally-intensive variants. As the feature extraction still requires a large amount of processing time, we present four possible strategies to deal with this problem. PMID:25822142

  3. Effects of image compression and degradation on an automatic diabetic retinopathy screening algorithm

    NASA Astrophysics Data System (ADS)

    Agurto, C.; Barriga, S.; Murray, V.; Pattichis, M.; Soliz, P.

    2010-03-01

    Diabetic retinopathy (DR) is one of the leading causes of blindness among adult Americans. Automatic methods for detection of the disease have been developed in recent years, most of them addressing the segmentation of bright and red lesions. In this paper we present an automatic DR screening system that approaches the problem through textural features rather than lesion segmentation. The algorithm separates non-diseased retinal images from those with pathology based on textural features obtained using multiscale Amplitude Modulation-Frequency Modulation (AM-FM) decompositions. The decomposition is represented as features that are the inputs to a classifier. The algorithm achieves 0.88 area under the ROC curve (AROC) for a set of 280 images from the MESSIDOR database. The algorithm is then used to analyze the effects of image compression and degradation, which will be present in most actual clinical or screening environments. Results show that the algorithm is insensitive to illumination variations, but high rates of compression and large blurring effects degrade its performance.
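
    Where AM-FM features come from can be illustrated in one dimension: the analytic signal yields an instantaneous amplitude (AM) and an instantaneous frequency (FM) whose statistics serve as texture descriptors. The paper uses 2D multiscale AM-FM decompositions of retinal images; the chirp below is only a stand-in.

    ```python
    # 1D AM-FM demodulation via the analytic (Hilbert) signal.
    import numpy as np
    from scipy.signal import hilbert

    fs = 1000.0
    t = np.arange(0, 1, 1 / fs)
    sig = (1 + 0.5 * np.sin(2 * np.pi * 2 * t)) * np.cos(2 * np.pi * (50 * t + 20 * t ** 2))

    analytic = hilbert(sig)
    am = np.abs(analytic)                                           # instantaneous amplitude
    fm = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)  # inst. frequency, Hz
    # Simple summary statistics of AM/FM act as texture features for a classifier.
    print(am.mean(), am.std(), fm.mean(), fm.std())
    ```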

  4. Relationships among employees' working conditions, mental health, and intention to leave in nursing homes.

    PubMed

    Zhang, Yuan; Punnett, Laura; Gore, Rebecca

    2014-02-01

    Employee turnover is a large and expensive problem in the long-term care environment. Stated intention to leave is a reliable indicator of likely turnover, but actual predictors, especially for nursing assistants, have been incompletely investigated. This quantitative study identifies the relationships among employees' working conditions, mental health, and intention to leave. Self-administered questionnaires were collected from 1,589 employees in 18 for-profit nursing homes. A working condition index counting the number of beneficial job features was constructed. Poisson regression modeling found that employees who reported four positive features were 77% less likely to state strong intention to leave (PR = 0.23, p < .001). The strength of the relationship between working conditions and intention to leave was slightly mediated by employee mental health. Effective workplace intervention programs must address work organization features to reduce employee intention to leave. Healthy workplaces should build better interpersonal relationships, show respect for employee work, and involve employees in decision-making processes.
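
    A minimal sketch of the modeling step, assuming simulated data: Poisson regression of a binary "strong intention to leave" indicator on a beneficial-feature count, with the exponentiated coefficient read as a prevalence ratio. The variable names and assumed effect size are illustrative, not the study's dataset.

    ```python
    # Poisson regression with robust (HC0) errors for prevalence ratios.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    n = 1589
    features = rng.integers(0, 5, n)            # beneficial-feature count 0..4
    p_leave = 0.4 * np.exp(-0.35 * features)    # assumed true relationship
    leave = rng.binomial(1, p_leave)            # 1 = strong intention to leave

    X = sm.add_constant(features.astype(float))
    model = sm.GLM(leave, X, family=sm.families.Poisson()).fit(cov_type="HC0")
    print("prevalence ratio per feature:", np.exp(model.params[1]))
    ```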

  5. The mass media destabilizes the cultural homogenous regime in Axelrod's model

    NASA Astrophysics Data System (ADS)

    Peres, Lucas R.; Fontanari, José F.

    2010-02-01

    An important feature of Axelrod's model for culture dissemination or social influence is the emergence of many multicultural absorbing states, despite the fact that the local rules that specify the agents' interactions are explicitly designed to decrease the cultural differences between agents. Here we re-examine the problem of introducing an external, global interaction—the mass media—into the rules of Axelrod's model: in addition to their nearest neighbors, each agent has a certain probability p to interact with a virtual neighbor whose cultural features are fixed from the outset. Most surprisingly, this apparently homogenizing effect actually increases the cultural diversity of the population. We show that, contrary to previous claims in the literature, even a vanishingly small value of p is sufficient to destabilize the homogeneous regime for very large lattice sizes.
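
    The dynamics just described are compact enough to simulate directly. Below is a minimal sketch of Axelrod's model with a mass-media virtual neighbor, assuming F features, q traits per feature, and a periodic L x L lattice; all parameter values are illustrative, not those of the paper's experiments.

    ```python
    # Axelrod culture model with a fixed "mass media" virtual neighbor.
    import numpy as np

    rng = np.random.default_rng(0)
    L, F, q, p, steps = 20, 5, 10, 0.01, 200_000
    culture = rng.integers(0, q, (L, L, F))      # trait vector per agent
    media = rng.integers(0, q, F)                # fixed media culture vector

    for _ in range(steps):
        i, j = rng.integers(L, size=2)
        if rng.random() < p:                     # interact with the media
            partner = media
        else:                                    # or a random lattice neighbor
            di, dj = [(0, 1), (0, -1), (1, 0), (-1, 0)][rng.integers(4)]
            partner = culture[(i + di) % L, (j + dj) % L]
        shared = culture[i, j] == partner
        overlap = shared.mean()                  # interaction prob = overlap
        if 0 < overlap < 1 and rng.random() < overlap:
            k = rng.choice(np.flatnonzero(~shared))  # copy a differing feature
            culture[i, j, k] = partner[k]

    print("distinct cultures:", len({tuple(c) for c in culture.reshape(-1, F)}))
    ```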

  6. Salient Key Features of Actual English Instructional Practices in Saudi Arabia

    ERIC Educational Resources Information Center

    Al-Seghayer, Khalid

    2015-01-01

    This is a comprehensive review of the salient key features of the actual English instructional practices in Saudi Arabia. The goal of this work is to gain insights into the practices and pedagogic approaches to English as a foreign language (EFL) teaching currently employed in this country. In particular, we identify the following central features…

  7. Interactive object recognition assistance: an approach to recognition starting from target objects

    NASA Astrophysics Data System (ADS)

    Geisler, Juergen; Littfass, Michael

    1999-07-01

    Recognition of target objects in remotely sensed imagery requires detailed knowledge about the target object domain as well as about the mapping properties of the sensing system. The art of object recognition is to combine both worlds appropriately and to provide models of target appearance with respect to sensor characteristics. Common approaches to supporting interactive object recognition are either driven from the sensor point of view, addressing the problem of displaying images in a manner adequate to the sensing system, or they focus on target objects and provide exhaustive encyclopedic information about that domain. Our paper discusses an approach to assisting interactive object recognition that is based on knowledge about target objects and takes into account the significance of object features with respect to characteristics of the sensed imagery, e.g. spatial and spectral resolution. An `interactive recognition assistant' takes the image analyst through the interpretation process by indicating step by step the most significant features of the objects in the current set of candidates. The significance of object features is expressed by pregenerated trees of significance and by the dynamic computation of decision relevance for every feature at each step of the recognition process. In the context of this approach we discuss the question of modeling and storing the multisensorial/multispectral appearances of target objects and object classes, as well as the problem of an adequate dynamic human-machine interface that takes into account the various mental models of human image interpretation.

  8. Online Case-Based Discussions: Examining Coverage of the Afforded Problem Space

    ERIC Educational Resources Information Center

    Ertmer, Peggy A.; Koehler, Adrie A.

    2014-01-01

    Case studies hold great potential for engaging students in disciplinary content. However, little is known about the extent to which students actually cover the problem space afforded by a particular case study. In this research, we compared the problem space afforded by an instructional design case study with the actual content covered by 16…

  9. Probable errors in width distributions of sea ice leads measured along a transect

    NASA Technical Reports Server (NTRS)

    Key, J.; Peckham, S.

    1991-01-01

    The degree of error expected in the measurement of widths of sea ice leads along a single transect is examined in a probabilistic sense under assumed orientation and width distributions, where both isotropic and anisotropic lead orientations are considered. Methods are developed for estimating the distribution of 'actual' widths (measured perpendicular to the local lead orientation) knowing the 'apparent' width distribution (measured along the transect), and vice versa. The distribution of errors, defined as the difference between the actual and apparent lead width, can be estimated from the two width distributions, and all moments of this distribution can be determined. The problem is illustrated with Landsat imagery and the procedure is applied to a submarine sonar transect. Results are determined for a range of geometries, and indicate the importance of orientation information if data sampled along a transect are to be used for the description of lead geometries. While the application here is to sea ice leads, the methodology can be applied to measurements of any linear feature.
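
    The geometry behind the apparent/actual distinction is simple: a transect crossing a lead at angle theta to the lead's long axis sees the width stretched to w / |sin(theta)|. The Monte Carlo sketch below illustrates the resulting error distribution under assumed width and orientation distributions, which are illustrative only, not fitted to sea ice data.

    ```python
    # Monte Carlo sketch of apparent vs. actual lead widths along a transect.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000
    actual = rng.exponential(scale=50.0, size=n)      # actual widths, m
    theta = rng.uniform(0.05, np.pi - 0.05, size=n)   # lead/transect angle, rad
    apparent = actual / np.sin(theta)                 # width seen on transect

    errors = apparent - actual
    print("mean actual %.1f m, mean apparent %.1f m, mean error %.1f m"
          % (actual.mean(), apparent.mean(), errors.mean()))
    ```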

  10. Cross-domain expression recognition based on sparse coding and transfer learning

    NASA Astrophysics Data System (ADS)

    Yang, Yong; Zhang, Weiyi; Huang, Yong

    2017-05-01

    Traditional facial expression recognition methods usually assume that the training set and the test set are independent and identically distributed. However, in actual expression recognition applications, the conditions of independent and identical distribution are hardly satisfied for the training set and test set because of differences in lighting, shade, race, and so on. In order to solve this problem and improve the performance of expression recognition in actual applications, a novel method based on transfer learning and sparse coding is applied to facial expression recognition. First, a common primitive model, i.e., a dictionary, is learned. Then, based on the idea of transfer learning, the learned primitive pattern is transferred to facial expressions and the corresponding feature representation is obtained by sparse coding. The experimental results on the CK+, JAFFE and NVIE databases show that the transfer learning method based on sparse coding can effectively improve the expression recognition rate in the cross-domain expression recognition task and is suitable for practical facial expression recognition applications.
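
    A hedged sketch of the dictionary-transfer step with scikit-learn: learn a dictionary on source-domain data, then encode target-domain data against it so the sparse codes serve as the transferred feature representation. The random arrays stand in for face-image patches; sizes and hyperparameters are assumptions.

    ```python
    # Sparse coding against a dictionary learned on another domain.
    import numpy as np
    from sklearn.decomposition import MiniBatchDictionaryLearning

    rng = np.random.default_rng(3)
    source = rng.normal(size=(500, 64))    # e.g., patches from source domain
    target = rng.normal(size=(100, 64))    # patches from target domain

    dico = MiniBatchDictionaryLearning(n_components=32, alpha=1.0,
                                       random_state=0).fit(source)
    codes = dico.transform(target)         # sparse codes = transferred features
    print(codes.shape, "nonzeros per sample:", (codes != 0).sum(1).mean())
    ```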

  11. Recorded Behavior as a Valuable Resource for Diagnostics in Mobile Phone Addiction: Evidence from Psychoinformatics

    PubMed Central

    Montag, Christian; Błaszkiewicz, Konrad; Lachmann, Bernd; Sariyska, Rayna; Andone, Ionut; Trendafilov, Boris; Markowetz, Alexander

    2015-01-01

    Psychologists and psychiatrists commonly rely on self-reports or interviews to diagnose or treat behavioral addictions. The present study introduces a novel source of data: recordings of the actual problem behavior under investigation. A total of N = 58 participants were asked to fill in a questionnaire measuring problematic mobile phone behavior featuring several questions on weekly phone usage. After filling in the questionnaire, all participants received an application to be installed on their smartphones, which recorded their phone usage for five weeks. The analyses revealed that weekly phone usage in hours was overestimated; in contrast, numbers of call and text message related variables were underestimated. Importantly, several associations between actual usage and being addicted to mobile phones could be derived exclusively from the recorded behavior, but not from self-report variables. The study demonstrates the potential benefit to include methods of psychoinformatics in the diagnosis and treatment of problematic mobile phone use. PMID:26492275

  12. STS-11/41-B Post Flight Press Conference

    NASA Technical Reports Server (NTRS)

    1984-01-01

    This NASA KSC video release begins with opening remarks from Mission Commander Vance D. Brand, followed by the other four space crew panel members (Robert L. Gibson, Pilot, and Mission Specialists Bruce McCandless II, Ronald E. McNair, and Robert L. Stewart) commenting on a home-video that includes highlights of the entire flight from take-off to landing. This video includes actual footage of the deployment of the Westar-VI and PALAPA-B2 satellites, as well as preparation for the EVAs and the actual EVAs themselves, which featured a Spacepak that enabled the astronauts to move outside the orbiter untethered. This video is followed by a slide presentation made up of images taken from approximately 2000 still photographs taken during the mission. All of the slides are described by members of the space crew and include images of the Earth seen from Challenger. A question and answer period rounds out the video, which includes discussion of problems encountered with the deployment of the satellites as well as the possibilities of sending civilians into space.

  13. Research on flight stability performance of rotor aircraft based on visual servo control method

    NASA Astrophysics Data System (ADS)

    Yu, Yanan; Chen, Jing

    2016-11-01

    A control method based on visual servo feedback is proposed, which is used to improve the attitude of a quad-rotor aircraft and to enhance its flight stability. Ground target images are obtained by a visual platform fixed on the aircraft. The scale-invariant feature transform (SIFT) algorithm is used to extract image feature information. Based on the image feature analysis, fast motion estimation is completed and used as an input signal to a PID flight control system to realize real-time attitude adjustment in flight. Imaging tests and simulation results show that the proposed method performs well in terms of flight stability compensation and attitude adjustment. The response speed and control precision meet the requirements of actual use, and the method is able to reduce or even eliminate the influence of environmental disturbance. The proposed method therefore has research value for solving the problem of aircraft disturbance rejection.
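
    An illustrative OpenCV sketch of the feature step: SIFT keypoints matched between consecutive frames, with the median keypoint displacement as a robust global motion estimate of the kind that could feed a PID attitude loop. It assumes an OpenCV build with SIFT available (cv2.SIFT_create) and uses synthetic frames rather than an actual camera feed.

    ```python
    # SIFT matching between two frames to estimate global image motion.
    import cv2
    import numpy as np

    rng = np.random.default_rng(9)
    prev = rng.integers(0, 255, (240, 320), dtype=np.uint8)  # synthetic frame
    prev = cv2.GaussianBlur(prev, (5, 5), 0)     # give SIFT blob structure
    curr = np.roll(prev, 4, axis=1)              # simulate a 4-pixel shift

    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(prev, None)
    kp2, des2 = sift.detectAndCompute(curr, None)

    matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(des1, des2)
    shifts = np.array([np.subtract(kp2[m.trainIdx].pt, kp1[m.queryIdx].pt)
                       for m in matches])
    dx, dy = np.median(shifts, axis=0)           # robust motion estimate
    print("estimated shift:", dx, dy)            # ~ (4, 0); input to PID loop
    ```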

  14. The role of corridors in conservation: Solution or bandwagon?

    PubMed

    Hobbs, R J

    1992-11-01

    Corridors are currently a major buzzword in conservation biology and landscape ecology. These linear landscape features may perform numerous functions, but it is their role in facilitating movement of fauna that has attracted much recent debate. The database supporting the idea of corridors acting as faunal conduits is remarkably small, and few studies have actually demonstrated that movement along corridors is important for any given species. Such data are very difficult to obtain, and conservation biologists are thus faced with the problem of whether to recommend the allocation of resources to corridors on the assumption that they may be important. Copyright © 1992. Published by Elsevier Ltd.

  15. Short-memory traders and their impact on group learning in financial markets

    PubMed Central

    LeBaron, Blake

    2002-01-01

    This article highlights several issues arising from simulating agent-based financial markets. These all center around the issue of learning in a multiagent setting, and specifically the question of whether the trading behavior of short-memory agents could interfere with the learning process of the market as a whole. It is shown in a simple example that short-memory traders persist in generating excess volatility and other features common to actual markets. Problems related to short-memory trader behavior can be eliminated by using several different methods. These are discussed along with their relevance to agent-based models in general. PMID:11997443

  16. Validation of Biomarkers for Prostate Cancer Prognosis

    DTIC Science & Technology

    2015-11-01

    It is well known that a significant fraction of low-risk prostate cancer cases are misclassified and actually have occult high-risk features or are destined to progress to high-risk disease. Therefore a critical need in localized prostate cancer is the development of biomarkers that predict occult or incipient aggressive disease in the low-risk population.

  17. Profugus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas; Hamilton, Steven; Slattery, Stuart

    Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is a production code with a substantial user base. Furthermore, Exnihilo is export controlled. This makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated code base that is open-source, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis. The computational kernels in Profugus are designed to analyze performance, not correctness. Nonetheless, users of Profugus can set up and run problems with enough real-world features to be useful as proof-of-concept for actual production work.

  18. Centralization of dairy farming facilities for improved economics and environmental quality.

    PubMed

    Inaba, Rokuta; Furuichi, Tohru; Komatsu, Toshihiro; Tanikawa, Noboru; Ishii, Kazuei

    2009-01-01

    In Japan, most farm animal excreta has been stored directly on farmland. Runoff from this storage has often caused water pollution. Biogasification is anticipated as an important technology to manage excreta properly, but complex problems hinder its introduction. Economic aspects of management have been especially difficult for dairy farmers. For this study, structural problems regarding introduction of biogasification into dairy farming were identified. Subsequently, a desirable system of dairy farming including biogasification was suggested, and an evaluation model of the financial balance was constructed. A case study using current financial balances of several systems of dairy farming was evaluated using the constructed model and actual data. The systems were based on several policy alternatives including the suggested system mentioned above. Results show that a farmer can obtain sufficient income from a system featuring centralization of dairy housing and biogasification facilities and coordinated management by over six farmers.

  19. From simple receptors to complex multimodal percepts: a first global picture on the mechanisms involved in perceptual binding.

    PubMed

    Velik, Rosemarie

    2012-01-01

    The binding problem in perception is concerned with answering the question of how information from millions of sensory receptors, processed by millions of neurons working in parallel, can be merged into a unified percept. Binding in perception reaches from the lowest levels of feature binding up to the levels of multimodal binding of information coming from the different sensory modalities and also from other functional systems. The last 40 years of research have shown that the binding problem cannot be solved easily. Today, it is considered one of the key questions in understanding the brain. To date, various solutions have been suggested to the binding problem, including: (1) combination coding, (2) binding by synchrony, (3) population coding, (4) binding by attention, (5) binding by knowledge, expectation, and memory, (6) hardwired vs. on-demand binding, (7) bundling and binding of features, (8) the feature-integration theory of attention, and (9) synchronization through top-down processes. Each of these hypotheses addresses important aspects of binding. However, each of them also suffers from certain weak points and can never give a complete explanation. This article gives a brief overview of the solutions suggested so far for perceptual binding and then shows that they are actually not mutually exclusive but can complement each other. A computationally verified model is presented which shows that, most likely, the different mechanisms described act (1) at different hierarchical levels and (2) in different stages of "perceptual knowledge acquisition." The model furthermore considers and explains a number of inhibitory "filter mechanisms" that suppress the activation of inappropriate or currently irrelevant information.

  1. Ethnic/racial misidentification in death: a problem which may distort suicide statistics.

    PubMed

    Andres, V R

    1977-01-01

    Since the majority of suicide studies are ex post facto studies of demographic data collected by pathologists and coroner's investigators, the role of the forensic scientist in determining the accuracy of statistical analyses of death is extremely important. This paper discusses how two salient features of a decedent, surname and residence location, can be misleading in determining the ethnic/racial classification of the deceased. Because many Southern California Indians have Spanish surnames and most do not reside on an Indian reservation, it is shown that the suicide statistics may represent an over-estimation of actual Mexican-American suicidal deaths while simultaneously representing an under-estimation of the suicides among American Indians of the region.

  2. An Intelligent Actuator Fault Reconstruction Scheme for Robotic Manipulators.

    PubMed

    Xiao, Bing; Yin, Shen

    2018-02-01

    This paper investigates the difficult problem of reconstructing actuator faults for robotic manipulators. An intelligent approach with a fast reconstruction property is developed, achieved by using an observer technique. The scheme is capable of precisely reconstructing the actual actuator fault. It is shown by Lyapunov stability analysis that the reconstruction error converges to zero in finite time. A reconstruction performance that is both precise and fast is thus provided for the actuator fault. The most important feature of the scheme is that it does not depend on the control law, the dynamic model of the actuator, the type of the faults, or their time-profile. This reconstruction performance and the capability of the proposed approach are further validated by simulation and experimental results.

  3. Laser-induced regeneration of cartilage

    NASA Astrophysics Data System (ADS)

    Sobol, Emil; Shekhter, Anatoly; Guller, Anna; Baum, Olga; Baskov, Andrey

    2011-08-01

    Laser radiation provides a means to control the fields of temperature and thermomechanical stress, mass transfer, and modification of the fine structure of the cartilage matrix. The aim of this outlook paper is to review the physical and biological aspects of laser-induced regeneration of cartilage and to discuss the possibilities and prospects of its clinical applications. The problems and pathways of tissue regeneration and the types and features of cartilage are introduced first. We then review various actual and prospective approaches for cartilage repair and consider possible mechanisms of laser-induced regeneration. Finally, we present results on laser regeneration of joint and spinal disk cartilage and discuss some future applications of lasers in regenerative medicine.

  4. AN EFFICIENT HIGHER-ORDER FAST MULTIPOLE BOUNDARY ELEMENT SOLUTION FOR POISSON-BOLTZMANN BASED MOLECULAR ELECTROSTATICS

    PubMed Central

    Bajaj, Chandrajit; Chen, Shun-Chuan; Rand, Alexander

    2011-01-01

    In order to compute the polarization energy of biomolecules, we describe a boundary element approach to solving the linearized Poisson-Boltzmann equation. Our approach combines several important features, including the derivative boundary formulation of the problem and a smooth approximation of the molecular surface based on the algebraic spline molecular surface. State-of-the-art software for numerical linear algebra and the kernel-independent fast multipole method is used for both simplicity and efficiency of our implementation. We perform a variety of computational experiments, testing our method on a number of actual proteins involved in molecular docking and demonstrating the effectiveness of our solver for computing molecular polarization energy. PMID:21660123

  5. The Use of Artificial Neural Networks for Forecasting the Electric Demand of Stand-Alone Consumers

    NASA Astrophysics Data System (ADS)

    Ivanin, O. A.; Direktor, L. B.

    2018-05-01

    The problem of short-term forecasting of electric power demand of stand-alone consumers (small inhabited localities) situated outside centralized power supply areas is considered. The basic approaches to modeling the electric power demand depending on the forecasting time frame and the problems set, as well as the specific features of such modeling, are described. The advantages and disadvantages of the methods used for the short-term forecast of the electric demand are indicated, and difficulties involved in the solution of the problem are outlined. The basic principles of arranging artificial neural networks are set forth; it is also shown that the proposed method is preferable when the input information necessary for prediction is lacking or incomplete. The selection of the parameters that should be included into the list of the input data for modeling the electric power demand of residential areas using artificial neural networks is validated. The structure of a neural network is proposed for solving the problem of modeling the electric power demand of residential areas. The specific features of generation of the training dataset are outlined. The results of test modeling of daily electric demand curves for some settlements of Kamchatka and Yakutia based on known actual electric demand curves are provided. The reliability of the test modeling has been validated. A high value of the deviation of the modeled curve from the reference curve obtained in one of the four reference calculations is explained. The input data and the predicted power demand curves for the rural settlement of Kuokuiskii Nasleg are provided. The power demand curves were modeled for four characteristic days of the year, and they can be used in the future for designing a power supply system for the settlement. To enhance the accuracy of the method, a series of measures based on specific features of a neural network's functioning are proposed.
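
    A hedged sketch of the forecasting idea with scikit-learn: a small multilayer perceptron mapping day-level features to a 24-point hourly demand curve. The feature list, network size, and synthetic data are illustrative assumptions, not the paper's configuration for the Kamchatka and Yakutia settlements.

    ```python
    # Small neural network forecasting an hourly demand curve per day.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(5)
    # Features per day: [day-of-week, month, mean temperature] (assumed list).
    X = np.column_stack([rng.integers(0, 7, 365),
                         rng.integers(1, 13, 365),
                         rng.normal(-5, 10, 365)])
    base = 100 + 30 * np.sin(np.linspace(0, 2 * np.pi, 24))  # daily shape, kW
    y = base + rng.normal(0, 5, (365, 24))                   # hourly targets

    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                         random_state=0).fit(X[:300], y[:300])
    pred = model.predict(X[300:])                            # forecast curves
    print("MAE, kW:", np.abs(pred - y[300:]).mean())
    ```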

  6. Parallel Tetrahedral Mesh Adaptation with Dynamic Load Balancing

    NASA Technical Reports Server (NTRS)

    Oliker, Leonid; Biswas, Rupak; Gabow, Harold N.

    1999-01-01

    The ability to dynamically adapt an unstructured grid is a powerful tool for efficiently solving computational problems with evolving physical features. In this paper, we report on our experience parallelizing an edge-based adaptation scheme, called 3D_TAG, using message passing. Results show excellent speedup when a realistic helicopter rotor mesh is randomly refined. However, performance deteriorates when the mesh is refined using a solution-based error indicator, since mesh adaptation for practical problems occurs in a localized region, creating a severe load imbalance. To address this problem, we have developed PLUM, a global dynamic load balancing framework for adaptive numerical computations. Even though PLUM primarily balances processor workloads for the solution phase, it reduces the load imbalance problem within mesh adaptation by repartitioning the mesh after targeting edges for refinement but before the actual subdivision. This dramatically improves the performance of parallel 3D_TAG since refinement occurs in a more load balanced fashion. We also present optimal and heuristic algorithms that, when applied to the default mapping of a parallel repartitioner, significantly reduce the data redistribution overhead. Finally, portability is examined by comparing performance on three state-of-the-art parallel machines.

  7. Exploring students’ perceived and actual ability in solving statistical problems based on Rasch measurement tools

    NASA Astrophysics Data System (ADS)

    Azila Che Musa, Nor; Mahmud, Zamalia; Baharun, Norhayati

    2017-09-01

    One of the important skills required of any student learning statistics is knowing how to solve statistical problems correctly using appropriate statistical methods. This enables them to arrive at a conclusion and make a significant contribution and decision for society. In this study, a group of 22 students majoring in statistics at UiTM Shah Alam were given problems relating to topics on hypothesis testing which required them to solve the problems using the confidence interval, traditional, and p-value approaches. Hypothesis testing is one of the techniques used in solving real problems and it is listed as one of the concepts students find difficult to grasp. The objectives of this study are to explore students' perceived and actual ability in solving statistical problems and to determine which items in statistical problem solving students find difficult to grasp. Students' perceived and actual ability were measured based on instruments developed from the respective topics. Rasch measurement tools such as the Wright map and item measures for fit statistics were used to accomplish the objectives. Data were collected and analysed using the Winsteps 3.90 software, which is developed based on the Rasch measurement model. The results showed that students perceived themselves as moderately competent in solving the statistical problems using the confidence interval and p-value approaches even though their actual performance showed otherwise. Item measures for fit statistics also showed that the maximum estimated measures were found on two problems. These measures indicate that none of the students attempted these problems correctly, for reasons which include their lack of understanding of confidence intervals and probability values.

  8. WHAMII - An enumeration and insertion procedure with binomial bounds for the stochastic time-constrained traveling salesman problem

    NASA Technical Reports Server (NTRS)

    Dahl, Roy W.; Keating, Karen; Salamone, Daryl J.; Levy, Laurence; Nag, Barindra; Sanborn, Joan A.

    1987-01-01

    This paper presents an algorithm (WHAMII) designed to solve the Artificial Intelligence Design Challenge at the 1987 AIAA Guidance, Navigation and Control Conference. The problem under consideration is a stochastic generalization of the traveling salesman problem in which travel costs can incur a penalty with a given probability. The variability in travel costs leads to a probability constraint with respect to violating the budget allocation. Given the small size of the problem (eleven cities), an approach is considered that combines partial tour enumeration with a heuristic city insertion procedure. For computational efficiency during both the enumeration and insertion procedures, precalculated binomial probabilities are used to determine an upper bound on the actual probability of violating the budget constraint for each tour. The actual probability is calculated for the final best tour, and additional insertions are attempted until the actual probability exceeds the bound.
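
    The binomial bound described above is cheap to compute. A minimal sketch, assuming each leg of a candidate tour incurs a penalty independently with probability q and the budget can absorb a fixed number of penalties; the numbers are illustrative, not the design-challenge data.

    ```python
    # Binomial tail bound on the probability of violating the budget.
    from scipy.stats import binom

    k = 10          # legs in a candidate tour
    q = 0.2         # per-leg penalty probability (assumed)
    slack = 3       # how many penalties the budget can absorb

    p_violate = binom.sf(slack, k, q)   # P(more than `slack` penalties)
    print("P(budget violated) <=", round(p_violate, 4))
    ```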

  9. Research-based active-learning instruction in physics

    NASA Astrophysics Data System (ADS)

    Meltzer, David E.; Thornton, Ronald K.

    2013-04-01

    The development of research-based active-learning instructional methods in physics has significantly altered the landscape of U.S. physics education during the past 20 years. Based on a recent review [D.E. Meltzer and R.K. Thornton, Am. J. Phys. 80, 478 (2012)], we define these methods as those (1) explicitly based on research in the learning and teaching of physics, (2) that incorporate classroom and/or laboratory activities that require students to express their thinking through speaking, writing, or other actions that go beyond listening and the copying of notes, or execution of prescribed procedures, and (3) that have been tested repeatedly in actual classroom settings and have yielded objective evidence of improved student learning. We describe some key features common to methods in current use. These features focus on (a) recognizing and addressing students' physics ideas, and (b) guiding students to solve problems in realistic physical settings, in novel and diverse contexts, and to justify or explain the reasoning they have used.

  10. Analysis of dynamic thresholds for the normalized difference water index

    USGS Publications Warehouse

    Ji, Lei; Zhang, Li; Wylie, Bruce K.

    2009-01-01

    The normalized difference water index (NDWI) has been successfully used to delineate surface water features. However, two major problems have often been encountered: (a) NDWIs calculated from different band combinations [visible, near-infrared, or shortwave-infrared (SWIR)] can generate different results, and (b) NDWI thresholds vary depending on the proportions of subpixel water/non-water components. We need to evaluate all the NDWIs to determine the best-performing index and to establish appropriate thresholds for clearly identifying water features. We used spectral data obtained from a spectral library to simulate the satellite sensors Landsat ETM+, SPOT-5, ASTER, and MODIS, and calculated the simulated NDWI in its different forms. We found that the NDWI calculated from (green - SWIR)/(green + SWIR), where SWIR is the shorter-wavelength region (1.2 to 1.8 μm), has the most stable threshold. We recommend this NDWI be employed for mapping water, but adjustment of the threshold based on actual situations is necessary. © 2009 American Society for Photogrammetry and Remote Sensing.
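
    The recommended index is a one-line computation per pixel. A minimal sketch, with toy reflectance values and an assumed starting threshold of 0.0 that, as the paper stresses, must be adjusted to the actual scene:

    ```python
    # NDWI = (green - SWIR) / (green + SWIR), with a boolean water mask.
    import numpy as np

    def ndwi(green, swir, threshold=0.0):
        green, swir = np.asarray(green, float), np.asarray(swir, float)
        index = (green - swir) / (green + swir + 1e-12)
        return index, index > threshold     # index and water mask

    green = np.array([[0.30, 0.05], [0.28, 0.04]])   # toy reflectances
    swir = np.array([[0.05, 0.20], [0.06, 0.18]])
    index, water = ndwi(green, swir)
    print(index.round(2), water, sep="\n")
    ```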

  11. Y Marks the Spot

    NASA Image and Video Library

    2016-04-18

    A sinuous feature snakes northward from Enceladus' south pole like a giant tentacle in this image from NASA's Cassini spacecraft. This feature is actually tectonic in nature, created by stresses in Enceladus' icy shell.

  12. Fast detection of vascular plaque in optical coherence tomography images using a reduced feature set

    NASA Astrophysics Data System (ADS)

    Prakash, Ammu; Ocana Macias, Mariano; Hewko, Mark; Sowa, Michael; Sherif, Sherif

    2018-03-01

    Optical coherence tomography (OCT) images are capable of detecting vascular plaque by using the full set of 26 Haralick textural features and a standard K-means clustering algorithm. However, the use of the full set of 26 textural features is computationally expensive and may not be feasible for real-time implementation. In this work, we identified a reduced set of three textural features which characterizes vascular plaque and used a generalized fuzzy C-means clustering algorithm. Our work involves three steps: 1) the reduction of the full set of 26 textural features to a reduced set of three features using a genetic algorithm (GA) optimization method, 2) the implementation of an unsupervised generalized clustering algorithm (fuzzy C-means) on the reduced feature space, and 3) the validation of our results using histology and actual photographic images of vascular plaque. Our results show an excellent match with histology and actual photographic images of vascular tissue. Therefore, our results could provide an efficient pre-clinical tool for the detection of vascular plaque in real-time OCT imaging.
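
    A hedged sketch of the two computational pieces: gray-level co-occurrence texture properties via scikit-image (a small subset related to the Haralick set, not the full 26 and not the GA-selected three) and a basic fuzzy c-means membership-update loop in NumPy. The image patch and cluster count are illustrative.

    ```python
    # GLCM texture features plus a minimal fuzzy c-means loop.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(2)
    img = rng.integers(0, 256, (64, 64), dtype=np.uint8)  # stand-in OCT patch
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    feats = np.array([graycoprops(glcm, p)[0, 0]
                      for p in ("contrast", "homogeneity", "energy")])

    # Fuzzy c-means on a batch of such 3-D feature vectors (random here).
    X = rng.normal(size=(200, 3))
    c, m = 2, 2.0                                    # clusters, fuzzifier
    U = rng.dirichlet(np.ones(c), size=len(X))       # membership matrix
    for _ in range(50):
        centers = (U.T ** m @ X) / (U.T ** m).sum(1, keepdims=True)
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
        p = 2.0 / (m - 1.0)                          # standard FCM update
        U = 1.0 / (d ** p * (1.0 / d ** p).sum(1, keepdims=True))
    print(feats, U.argmax(1)[:10])
    ```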

  13. Overview of Digital Forensics Algorithms in Dslr Cameras

    NASA Astrophysics Data System (ADS)

    Aminova, E.; Trapeznikov, I.; Priorov, A.

    2017-05-01

    The widespread use of mobile technologies and improvements in digital photo devices have led to more frequent cases of image falsification, including in judicial practice. Consequently, an important task for up-to-date digital image processing tools is the development of algorithms for determining the source and model of a DSLR (Digital Single Lens Reflex) camera and for improving image formation algorithms. Most research in this area builds on the observation that a unique sensor trace of a DSLR camera can be extracted at a certain stage of the imaging process inside the camera. This study focuses on the problem of determining unique features of DSLR cameras based on optical subsystem artifacts and sensor noise.

  14. A semi-Lagrangian advection scheme for radioactive tracers in a regional spectral model

    NASA Astrophysics Data System (ADS)

    Chang, E.-C.; Yoshimura, K.

    2015-06-01

    In this study, the non-iteration dimensional-split semi-Lagrangian (NDSL) advection scheme is applied to the National Centers for Environmental Prediction (NCEP) regional spectral model (RSM) to alleviate the Gibbs phenomenon. The Gibbs phenomenon is a problem wherein negative values of positive-definite quantities (e.g., moisture and tracers) are generated by the spectral space transformation in a spectral model system. To solve this problem, the spectral prognostic specific humidity and radioactive tracer advection scheme is replaced by the NDSL advection scheme, which considers advection of tracers in a grid system without spectral space transformations. A regional version of the NDSL is developed in this study and is applied to the RSM. Idealized experiments show that the regional version of the NDSL is successful. The model runs for an actual case study suggest that the NDSL can successfully advect radioactive tracers (iodine-131 and cesium-137) without noise from the Gibbs phenomenon. The NDSL can also remove negative specific humidity values produced in spectral calculations without losing detailed features.
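
    To see why a semi-Lagrangian treatment avoids the Gibbs problem, consider the toy 1-D step below: each grid value is interpolated from its upstream departure point, so a non-negative tracer stays non-negative, with no spectral ringing. The grid, wind, and time step are illustrative; this is not the NDSL scheme itself.

    ```python
    # Toy 1-D semi-Lagrangian advection on a periodic grid.
    import numpy as np

    nx, dx, u, dt = 200, 1.0, 2.5, 0.2
    x = np.arange(nx) * dx
    tracer = np.exp(-0.5 * ((x - 50) / 5.0) ** 2)   # positive-definite blob

    for _ in range(100):
        departure = (x - u * dt) % (nx * dx)        # upstream departure points
        tracer = np.interp(departure, x, tracer, period=nx * dx)

    print("min after advection:", tracer.min())     # stays >= 0, no ringing
    ```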

  15. Postdecisional counterfactual thinking by actors and readers.

    PubMed

    Girotto, Vittorio; Ferrante, Donatella; Pighin, Stefania; Gonzalez, Michel

    2007-06-01

    How do individuals think counterfactually about the outcomes of their decisions? Most previous studies have investigated how readers think about fictional stories, rather than how actors think about events they have actually experienced. We assumed that differences in individuals' roles (actor vs. reader) can make different information available, which in turn can affect counterfactual thinking. Hence, we predicted an effect of role on postdecisional counterfactual thinking. Reporting the results of eight studies, we show that readers undo the negative outcome of a story by undoing the protagonist's choice to tackle a given problem, rather than the protagonist's unsuccessful attempt to solve it. But actors who make the same choice and experience the same negative outcome as the protagonist undo this outcome by altering features of the problem. We also show that this effect does not depend on motivational factors. These results contradict current accounts of counterfactual thinking and demonstrate the necessity of investigating the counterfactual thoughts of individuals in varied roles.

  16. Cadastral data model established and perfected with 4S technology

    NASA Astrophysics Data System (ADS)

    He, Beijing; He, Jiang; He, Jianpeng

    1998-08-01

    Considering China's social system and the actual state of cadastral information in urban and rural areas, and based on 4S technology and the theory and method of the canton GPS geodetic data bench developed by the authors, we thoroughly investigate several related technical problems in establishing and perfecting multi-level microcosmic cadastral data models (called "models" in the following). These problems include: cadastral, feature, and topographic information, together with the forms and methods of expressing it; classifying and grading the models; selection of the coordinate system; the data basis for the models; methods of collecting and digitizing information; the structural model of the database; the mathematical model and the technology for establishing models of three or more dimensions; dynamic monitoring; and the development and application of the models. Finally, the domestic and overseas application prospects are outlined, including cooperation with the 'data bench' technology and with all-analytical digital surveying and mapping technology for RS image maps.

  17. Method of monaural localization of the acoustic source direction from the standpoint of the active perception theory

    NASA Astrophysics Data System (ADS)

    Gai, V. E.; Polyakov, I. V.; Krasheninnikov, M. S.; Koshurina, A. A.; Dorofeev, R. A.

    2017-01-01

    Currently, the scientific and educational center "Transport" of NNSTU is working on the creation of a universal rescue vehicle. This vehicle is a robot intended to reduce the number of human casualties in accidents on offshore oil platforms. A pressing problem is the development of a method for determining the location of a person overboard in low-visibility conditions, when traditional vision is not effective. One of the most important sensory systems of the robot is the acoustic sensor system, because it is omnidirectional and does not require the acoustic source to be within the field of view. The acoustic sensor system can thus complement the capabilities of the video sensor in solving the problem of localizing a person or an event in the environment. This paper describes a method for determining the direction of an acoustic source using just one microphone. The proposed method is based on the active perception theory.

  18. A novel automated spike sorting algorithm with adaptable feature extraction.

    PubMed

    Bestel, Robert; Daus, Andreas W; Thielemann, Christiane

    2012-10-15

    To study the electrophysiological properties of neuronal networks, in vitro studies based on microelectrode arrays have become a viable tool for analysis. Although in constant progress, a challenging task still remains in this area: the development of an efficient spike sorting algorithm that allows an accurate signal analysis at the single-cell level. Most sorting algorithms currently available only extract a specific feature type, such as the principal components or Wavelet coefficients of the measured spike signals, in order to separate different spike shapes generated by different neurons. However, due to the great variety in the obtained spike shapes, the derivation of an optimal feature set is still a very complex issue that current algorithms struggle with. To address this problem, we propose a novel algorithm that (i) extracts a variety of geometric, Wavelet and principal component-based features and (ii) automatically derives a feature subset most suitable for sorting an individual set of spike signals. The new approach evaluates the probability distribution of the obtained spike features and consequently determines the candidates most suitable for the actual spike sorting. These candidates are formed into an individually adjusted set of spike features, allowing a separation of the various shapes present in the obtained neuronal signal by a subsequent expectation maximisation clustering algorithm. Test results with simulated data files and data obtained from chick embryonic neurons cultured on microelectrode arrays showed an excellent classification result, indicating the superior performance of the described algorithm approach. Copyright © 2012 Elsevier B.V. All rights reserved.
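
    A hedged sketch of such a pipeline's last two stages: principal-component features extracted from spike waveforms, clustered by an expectation-maximisation (Gaussian mixture) algorithm. The waveforms are simulated two-unit data, and the adaptive feature-subset selection described in the paper is omitted.

    ```python
    # PCA feature extraction + EM (Gaussian mixture) spike clustering.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(11)
    t = np.linspace(0, 1, 48)
    unit_a = -np.exp(-((t - 0.3) / 0.05) ** 2)        # two template shapes
    unit_b = -0.6 * np.exp(-((t - 0.4) / 0.08) ** 2)
    spikes = np.vstack([unit_a + 0.05 * rng.normal(size=(150, 48)),
                        unit_b + 0.05 * rng.normal(size=(150, 48))])

    features = PCA(n_components=3).fit_transform(spikes)  # feature extraction
    labels = GaussianMixture(n_components=2,
                             random_state=0).fit_predict(features)
    print("cluster sizes:", np.bincount(labels))          # ~150 / 150 expected
    ```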

  19. State of the field: Are the results of science contingent or inevitable?

    PubMed

    Kinzel, Katherina

    2015-08-01

    This paper presents a survey of the literature on the problem of contingency in science. The survey is structured around three challenges faced by current attempts at understanding the conflict between "contingentist" and "inevitabilist" interpretations of scientific knowledge and practice. First, the challenge of definition: it proves hard to define the positions that are at stake in a way that is both conceptually rigorous and does justice to the plethora of views on the issue. Second, the challenge of distinction: some features of the debate suggest that the contingency issue may not be sufficiently distinct from other philosophical debates to constitute a genuine, independent philosophical problem. And third, the challenge of decidability: it remains unclear whether and how the conflict could be settled on the basis of empirical evidence from the actual history of science. The paper argues that in order to make progress in the present debate, we need to distinguish more systematically between different expressions that claims about contingency and inevitability in science can take. To this end, it introduces a taxonomy of different contingency and inevitability claims. The taxonomy has the structure of an ordered quadruple. Each contingency and each inevitability claim contains an answer to the following four questions: (how) are alternatives to current science possible, what types of alternatives are we talking about, how should the alternatives be assessed, and how different are they from actual science? Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Virtual Environment Training on Mobile Devices

    DTIC Science & Technology

    2013-09-01

    [Report excerpt; table-of-contents residue (nonfunctional requirements, product features, software production, limitations) removed.] The final product shall include interactive 3D graphics with simulated representation of actual … on Android and iOS tablets.

  1. Analyzing and Detecting Problems in Systems of Systems

    NASA Technical Reports Server (NTRS)

    Lindvall, Mikael; Ackermann, Christopher; Stratton, William C.; Sibol, Deane E.; Godfrey, Sally

    2008-01-01

    Many software systems are evolving into complex systems of systems (SoS) for which inter-system communication is mission-critical. Evidence indicates that transmission failures and performance issues are not uncommon occurrences. In a NASA-supported Software Assurance Research Program (SARP) project, we are researching a new approach to addressing such problems. In this paper, we present an approach for analyzing inter-system communications with the goal of uncovering both transmission errors and performance problems. Our approach consists of a visualization component and an evaluation component. While the visualization of the observed communication aims to facilitate understanding, the evaluation component automatically checks the conformance of an observed communication (actual) to a desired one (planned). The actual and the planned are represented as sequence diagrams. The evaluation algorithm checks the conformance of the actual to the planned diagram. We have applied our approach to the communication of aerospace systems and were successful in detecting and resolving even subtle and long-existing transmission problems.
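
    The conformance check reduces to comparing message orderings. A minimal sketch, assuming the planned diagram is flattened to an ordered message list and extra observed messages are tolerated; the message names are hypothetical.

    ```python
    # Check that a planned message ordering is preserved in an actual trace.
    from typing import List

    def conforms(actual: List[str], planned: List[str]) -> bool:
        """True if every planned message appears, in order, in the actual
        trace (extra messages in the actual trace are tolerated here)."""
        it = iter(actual)
        return all(msg in it for msg in planned)  # consumes `it` in order

    planned = ["REQ_TELEMETRY", "ACK", "DATA", "DONE"]
    actual = ["REQ_TELEMETRY", "ACK", "HEARTBEAT", "DATA", "DONE"]
    print(conforms(actual, planned))           # True: planned order preserved
    print(conforms(["ACK", "DATA"], planned))  # False: missing REQ_TELEMETRY
    ```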

  2. Study of coherent synchrotron radiation effects by means of a new simulation code based on the non-linear extension of the operator splitting method

    NASA Astrophysics Data System (ADS)

    Dattoli, G.; Migliorati, M.; Schiavi, A.

    2007-05-01

    Coherent synchrotron radiation (CSR) is one of the main problems limiting the performance of high-intensity electron accelerators. The complexity of the physical mechanisms underlying the onset of instabilities due to CSR demands accurate descriptions, capable of including the large number of features of an actual accelerating device. A code devoted to the analysis of these types of problems should be fast and reliable, conditions that are rarely achieved at the same time. In the past, codes based on Lie algebraic techniques have been very efficient at treating transport problems in accelerators. The extension of these methods to the non-linear case is ideally suited to treating CSR instability problems. We report on the development of a numerical code, based on the solution of the Vlasov equation, with the inclusion of the non-linear contribution due to wake field effects. The proposed solution method exploits an algebraic technique based on exponential operators. We show that the integration procedure is capable of reproducing the onset of instability and the effects associated with bunching mechanisms leading to the growth of the instability itself. In addition, considerations on the threshold of the instability are also developed.

  3. Cynefin as Reference Framework to Facilitate Insight and Decision-Making in Complex Contexts of Biomedical Research.

    PubMed

    Kempermann, Gerd

    2017-01-01

    The Cynefin scheme is a concept of knowledge management, originally devised to support decision making in management but more generally applicable to situations in which complexity challenges the quality of insight, prediction, and decision. Despite the fact that life itself, and especially the brain and its diseases, are complex to the extent that complexity could be considered their cardinal feature, complex problems in biomedicine are often treated as if they were actually no more than the complicated sum of solvable sub-problems. Because of the emergent properties of complex contexts, this is not correct. With a set of clear criteria, Cynefin helps to set complex problems apart from "simple/obvious," "complicated," "chaotic," and "disordered" contexts in order to avoid misinterpreting the relevant causality structures. The distinction comes with the insight of which specific kind of knowledge is possible in each of these categories and what the consequences are for resulting decisions and actions. From students' theses through the publication and grant-writing process to research politics, misinterpretation of complexity can have problematic or even dangerous consequences, especially in clinical contexts. Conceptualization of problems within a straightforward reference language like Cynefin improves clarity and stringency within projects and facilitates communication and decision-making about them.

  4. The point-spread function measure of resolution for the 3-D electrical resistivity experiment

    NASA Astrophysics Data System (ADS)

    Oldenborger, Greg A.; Routh, Partha S.

    2009-02-01

    The solution appraisal component of the inverse problem involves investigation of the relationship between our estimated model and the actual model. However, full appraisal is difficult for large 3-D problems such as electrical resistivity tomography (ERT). We tackle the appraisal problem for 3-D ERT via the point-spread functions (PSFs) of the linearized resolution matrix. The PSFs represent the impulse response of the inverse solution and quantify our parameter-specific resolving capability. We implement an iterative least-squares solution of the PSF for the ERT experiment, using on-the-fly calculation of the sensitivity via an adjoint integral equation with stored Green's functions and subgrid reduction. For a synthetic example, analysis of individual PSFs demonstrates the truly 3-D character of the resolution. The PSFs for the ERT experiment are Gaussian-like in shape, with directional asymmetry and significant off-diagonal features. Computation of attributes representative of the blurring and localization of the PSF reveal significant spatial dependence of the resolution with some correlation to the electrode infrastructure. Application to a time-lapse ground-water monitoring experiment demonstrates the utility of the PSF for assessing feature discrimination, predicting artefacts and identifying model dependence of resolution. For a judicious selection of model parameters, we analyse the PSFs and their attributes to quantify the case-specific localized resolving capability and its variability over regions of interest. We observe approximate interborehole resolving capability of less than 1-1.5m in the vertical direction and less than 1-2.5m in the horizontal direction. Resolving capability deteriorates significantly outside the electrode infrastructure.

  5. Social responsibility tools in online gambling: a survey of attitudes and behavior among Internet gamblers.

    PubMed

    Griffiths, Mark D; Wood, Richard T A; Parke, Jonathan

    2009-08-01

    To date, little empirical research has focused on social responsibility in gambling. This study examined players' attitudes and behavior toward using the social responsibility tool PlayScan designed by the Swedish gaming company Svenska Spel. Via PlayScan, players have the option to utilize various social responsibility control tools (e.g., personal gaming budgets, self-diagnostic tests of gambling habits, self-exclusion options). A total of 2,348 participants took part in an online questionnaire study. Participants were clientele of the Svenska Spel online gambling Web site. Results showed that just over a quarter of players (26%) had used PlayScan. The vast majority of those who had activated PlayScan (almost 9 in 10 users) said that PlayScan was easy to use. Over half of PlayScan users (52%) said it was useful; 19% said it was not. Many features were seen as useful by online gamblers, including limit setting (70%), viewing their gambling profile (49%), self-exclusion facilities (42%), self-diagnostic problem gambling tests (46%), information and support for gambling issues (40%), and gambling profile predictions (36%). In terms of actual (as opposed to theoretical) use, over half of PlayScan users (56%) had set spending limits, 40% had taken a self-diagnostic problem gambling test, and 17% had used a self-exclusion feature.

  6. Computer-based mechanical design of overhead lines

    NASA Astrophysics Data System (ADS)

    Rusinaru, D.; Bratu, C.; Dinu, R. C.; Manescu, L. G.

    2016-02-01

    Besides performance, a safety level meeting current standards is a compulsory condition for distribution grid operation. Some of the measures leading to improvement of overhead line reliability call for modernization of the installations. The constraints imposed on the new line components refer to technical aspects such as thermal stress or voltage drop, and to economic efficiency as well. The mechanical sizing of overhead lines is, after all, an optimization problem. More precisely, the task in designing the overhead line profile is to size poles, cross-arms, and stays and to locate poles along a line route so that the total cost of the line's structure is minimized and the technical and safety constraints are fulfilled. The authors present in this paper an application for the computer-based mechanical design of overhead lines and the features of the corresponding Visual Basic program, adapted to distribution lines. The constraints of the optimization problem are adjusted to the weather and loading conditions of Romania. The outputs of the software application for mechanical design of overhead lines are: the list of components chosen for the line (poles, cross-arms, stays); the list of conductor tensions and forces for each pole, cross-arm, and stay under different weather conditions; and the line profile drawings. The main features of the mechanical overhead line design software are interactivity, a local optimization function, and a high-level user interface.

  7. Developing a CD-CBM Anticipatory Approach for Cavitation - Defining a Model-Based Descriptor Consistent Across Processes, Phase 1 Final Report Context-Dependent Prognostics and Health Assessment: A New Paradigm for Condition-based Maintenance SBIR Topic No. N98-114

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allgood, G.O.; Dress, W.B.; Kercel, S.W.

    1999-06-01

    The objective of this research, and subsequent testing, was to identify specific features of cavitation that could be used as a model-based descriptor in a context-dependent condition-based maintenance (CD-CBM) anticipatory prognostic and health assessment model. This descriptor is based on the physics of the phenomena, capturing the salient features of the process dynamics. The test methodology and approach were developed to make the cavitation features the dominant effect in the process and collected signatures. This would allow the accurate characterization of the salient cavitation features at different operational states. By developing such an abstraction, these attributes can be used as a general diagnostic for a system or any of its components. In this study, the particular focus is pumps. As many as 90% of pump failures are catastrophic: the pumps seem to be operating normally and fail abruptly without warning. This is true whether the failure is sudden hardware damage requiring repair, such as a gasket failure, or a transition into an undesired operating mode, such as cavitation. This means that conventional diagnostic methods fail to predict 90% of incipient failures and that, in addressing this problem, model-based methods can add value where it is actually needed.

  8. Vertically integrated medical education and the readiness for practice of graduates.

    PubMed

    Wijnen-Meijer, Marjo; Ten Cate, Olle; van der Schaaf, Marieke; Burgers, Chantalle; Borleffs, Jan; Harendza, Sigrid

    2015-12-21

    Medical curricula are becoming more and more vertically integrated (VI) to prepare graduates better for clinical practice. VI curricula feature early clinical education, integration of biomedical sciences, and increasing levels of clinical responsibility for trainees. Results of earlier questionnaire-based studies indicate that the type of curriculum can affect preparedness for work as perceived by students or supervisors. The aim of the present study was to determine differences in the actual performance of graduates from VI and non-VI curricula. We developed and implemented an authentic performance assessment, based on different facets of competence, for medical near-graduates acting in the role of beginning residents on a very busy day. Fifty-nine candidates participated: 30 VI (Utrecht, The Netherlands) and 29 non-VI (Hamburg, Germany). Two physicians, one nurse and five standardized patients independently assessed each candidate on different facets of competence. Afterwards, the physicians indicated how much supervision they estimated each candidate would require on nine so-called "Entrustable Professional Activities" (EPAs) unrelated to the observed scenarios. Graduates from a VI curriculum received significantly higher scores from the physicians for the facet of competence "active professional development", with features such as 'reflection' and 'asking for feedback'. In addition, VI graduates scored better on the EPA "solving a management problem", while non-VI graduates received higher scores for the EPA "breaking bad news". This study gives an impression of the actual performance of medical graduates from VI and non-VI curricula. Even though few differences were found, VI graduates received higher scores for features of professional development, which is important for postgraduate training and continuing education.

  9. Evolutionary and mechanistic theories of aging.

    PubMed

    Hughes, Kimberly A; Reynolds, Rose M

    2005-01-01

    Senescence (aging) is defined as a decline in performance and fitness with advancing age. Senescence is a nearly universal feature of multicellular organisms, and understanding why it occurs is a long-standing problem in biology. Here we present a concise review of both evolutionary and mechanistic theories of aging. We describe the development of the general evolutionary theory, along with the mutation accumulation, antagonistic pleiotropy, and disposable soma versions of the evolutionary model. The review of the mechanistic theories focuses on the oxidative stress resistance, cellular signaling, and dietary control mechanisms of life span extension. We close with a discussion of how an approach that makes use of both evolutionary and molecular analyses can address a critical question: Which of the mechanisms that can cause variation in aging actually do cause variation in natural populations?

  10. Effects of atmospheric aerosols on scattering reflected visible light from earth resource features

    NASA Technical Reports Server (NTRS)

    Noll, K. E.; Tschantz, B. A.; Davis, W. T.

    1972-01-01

    The vertical variations in atmospheric light attenuation under ambient conditions were identified, and a method through which aerial photographs of earth features might be corrected to yield quantitative information about the actual features was provided. A theoretical equation was developed based on the Bouguer-Lambert extinction law and basic photographic theory.
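
    For reference, the Bouguer-Lambert extinction law that the derivation builds on has the standard textbook form (the paper's specific photographic correction equation is not reproduced here):

        I(x) = I_0 \exp\left( -\int_0^x k(s)\, ds \right)

    where I_0 is the radiance entering the path, k(s) is the local extinction coefficient of the aerosol-laden atmosphere, and I(x) is the radiance remaining after traversing path length x.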

  11. Joint hypermobility syndrome in childhood. A not so benign multisystem disorder?

    PubMed

    Adib, N; Davies, K; Grahame, R; Woo, P; Murray, K J

    2005-06-01

    Joint hypermobility (JH) or "ligamentous laxity" is felt to be an underlying risk factor for many types of musculoskeletal presentation in paediatrics, and joint hypermobility syndrome (JHS) describes such disorders where symptoms become chronic, often more generalized, and associated with functional impairment. Clinical features are felt to have much in common with more severe disorders, including Ehlers-Danlos syndrome (EDS), osteogenesis imperfecta and Marfan syndrome, although this has not been formally studied in children. We defined the clinical characteristics of all patients with joint hypermobility-related presentations seen from 1999 to 2002 in a tertiary referral paediatric rheumatology unit. Patients were identified and recruited from the paediatric rheumatology clinic and ward, and a dedicated paediatric rheumatology hypermobility clinic at Great Ormond Street Hospital. Data were collected retrospectively on the patients from the paediatric rheumatology clinics (1999-2002) and prospectively on patients seen in the hypermobility clinic (2000-2002). Specifically, historical details of developmental milestones, musculoskeletal or soft tissue diagnoses and symptoms, and significant past medical history were recorded. Examination features sought included measurements of joint and soft tissue laxity, and associated conditions such as scoliosis, dysmorphic features, cardiac murmurs and eye problems. One hundred and twenty-five children (64 females) with sufficient identifiable clinical data and with clinical problems ascribed to JH for longer than 3 months were included. Sixty-four were from the paediatric rheumatology clinic and 61 from the hypermobility clinic. No differences were found in any of the measures between the two populations, and results are presented in a combined fashion. Three-quarters of referrals came from paediatricians and general practitioners, but in only 10% was hypermobility recognized as a possible cause of the joint complaint. The average age at onset of symptoms was 6.2 yr and the age at diagnosis 9.0 yr, indicating a 2- to 3-yr delay in diagnosis. The major presenting complaint was arthralgia in 74%, abnormal gait in 10%, apparent joint deformity in 10% and back pain in 6%. Mean age at first walking was 15.0 months; 48% were considered "clumsy" and 36% as having poor coordination in early childhood. Twelve per cent had "clicky" hips at birth and 4% a congenitally dislocatable hip. Urinary tract infections were present in 13% and 6% of the female and male cases, respectively. Thirteen per cent and 14%, respectively, had diagnosed speech and learning difficulties. A history of recurrent joint sprains was seen in 20% and actual subluxation/dislocation of joints in 10%. Forty per cent had experienced problems with handwriting tasks, 48% had major limitations of school-based physical education activities and 67% of other physical activities, and 41% had missed significant periods of schooling because of symptoms. Forty-three per cent described a history of easy bruising. Examination revealed that 94% scored ≥4/9 on the Beighton scale for generalized hypermobility, with knees (92%), elbows (87%), wrists (82%), hand metacarpophalangeal joints (79%) and ankles (75%) being most frequently involved. JHS is poorly recognized in children, with a long delay in the time to diagnosis. Although there is a referral bias towards joint symptoms, a surprisingly large proportion is associated with significant neuromuscular and motor development problems. Our patients with JHS also show many overlap features with genetic disorders such as EDS and Marfan syndrome. The delay in diagnosis results in poor control of pain and disruption of normal home life, schooling and physical activities. Knowledge of the diagnosis and simple interventions are likely to be highly effective in reducing the morbidity and cost to the health and social services.

  12. Dermoscopy for common skin problems in Chinese children using a novel Hong Kong-made dermoscope.

    PubMed

    Luk, David C K; Lam, Sam Y Y; Cheung, Patrick C H; Chan, Bill H B

    2014-12-01

    To evaluate the dermoscopic features of common skin problems in Chinese children. A case series with retrospective qualitative analysis of dermoscopic features of common skin problems in Chinese children. A regional hospital in Hong Kong. Dermoscopic image database, from 1 May 2013 to 31 October 2013, of 185 Chinese children (aged 0 to 18 years). Dermoscopic features of common paediatric skin problems in Chinese children were identified. These features corresponded with the known dermoscopic features reported in the western medical literature. New dermoscopic features were identified in café-au-lait macules. Dermoscopic features of common skin problems in Chinese children were consistent with those reported in western medical literature. Dermoscopy has a role in managing children with skin problems.

  13. Word Problems: A "Meme" for Our Times.

    ERIC Educational Resources Information Center

    Leamnson, Robert N.

    1996-01-01

    Discusses a novel approach to word problems that involves linear relationships between variables. Argues that working stepwise through intermediates is the way our minds actually work and therefore this should be used in solving word problems. (JRH)

  14. An Automated Method to Compute Orbital Re-entry Trajectories with Heating Constraints

    NASA Technical Reports Server (NTRS)

    Zimmerman, Curtis; Dukeman, Greg; Hanson, John; Fogle, Frank R. (Technical Monitor)

    2002-01-01

    Determining how to properly manipulate the controls of a re-entering re-usable launch vehicle (RLV) so that it is able to safely return to Earth and land involves the solution of a two-point boundary value problem (TPBVP). This problem, which can be quite difficult, is traditionally solved on the ground prior to flight. If necessary, a nearly unlimited amount of time is available to find the 'best' solution using a variety of trajectory design and optimization tools. The role of entry guidance during flight is to follow the pre-determined reference solution while correcting for any errors encountered along the way. This guidance method is both highly reliable and very efficient in terms of onboard computer resources. There is a growing interest in a style of entry guidance that places the responsibility of solving the TPBVP in the actual entry guidance flight software. Here there is very limited computer time. The powerful, but finicky, mathematical tools used by trajectory designers on the ground cannot in general be converted to do the job. Non-convergence or slow convergence can result in disaster. The challenges of designing such an algorithm are numerous and difficult. Yet the payoff (in the form of decreased operational costs and increased safety) can be substantial. This paper presents an algorithm that incorporates features of both types of guidance strategies. It takes an initial RLV orbital re-entry state and finds a trajectory that will safely transport the vehicle to Earth. During actual flight, the computed trajectory is used as the reference to be flown by a more traditional guidance method.

  15. A model based on feature objects aided strategy to evaluate the methane generation from food waste by anaerobic digestion.

    PubMed

    Yu, Meijuan; Zhao, Mingxing; Huang, Zhenxing; Xi, Kezhong; Shi, Wansheng; Ruan, Wenquan

    2018-02-01

    A model based on a feature objects (FOs) aided strategy was used to evaluate methane generation from food waste by anaerobic digestion. The kinetics of the feature objects were tested with the modified Gompertz model and the first-order kinetic model, and the first-order hydrolysis constants were used to estimate the reaction rate of homemade and actual food waste. The results showed that the methane yields of the four feature objects were significantly different. The anaerobic digestion of homemade food waste and actual food waste gave various methane yields and kinetic constants due to the different contents of FOs in the food waste. Combining the kinetic equations with a multiple linear regression equation could express the methane yield of food waste well, as the R² for food waste was more than 0.9. The predicted methane yields of the two actual food wastes were 528.22 mL g⁻¹ TS and 545.29 mL g⁻¹ TS with the model, while the experimental values were 527.47 mL g⁻¹ TS and 522.1 mL g⁻¹ TS, respectively. The relative errors between the experimental and predicted cumulative methane yields were both less than 5%. Copyright © 2017 Elsevier Ltd. All rights reserved.
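
    The two kinetic models named here have standard forms: first-order, M(t) = P(1 - e^(-kt)), and modified Gompertz, M(t) = P exp(-exp(Rmax * e / P * (lambda - t) + 1)), where P is the methane yield potential, k the hydrolysis constant, Rmax the maximum production rate and lambda the lag time. A minimal fitting sketch with invented cumulative-yield data (the paper's measurements are not reproduced here):

        import numpy as np
        from scipy.optimize import curve_fit

        def first_order(t, P, k):
            # Cumulative methane yield under first-order hydrolysis kinetics
            return P * (1.0 - np.exp(-k * t))

        def gompertz(t, P, Rmax, lam):
            # Modified Gompertz: P = yield potential, Rmax = max rate, lam = lag
            return P * np.exp(-np.exp(Rmax * np.e / P * (lam - t) + 1.0))

        t = np.array([0, 2, 5, 10, 15, 20, 30])           # days (hypothetical)
        y = np.array([0, 80, 210, 380, 460, 500, 525.0])  # mL CH4 g^-1 TS (hypothetical)

        (P1, k), _ = curve_fit(first_order, t, y, p0=[500, 0.1])
        (P2, Rmax, lam), _ = curve_fit(gompertz, t, y, p0=[500, 30, 1])
        print("first-order: P = %.1f, k = %.3f d^-1" % (P1, k))
        print("Gompertz:    P = %.1f, Rmax = %.1f, lag = %.2f d" % (P2, Rmax, lam))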

  16. Automated method for the systematic interpretation of resonance peaks in spectrum data

    DOEpatents

    Damiano, B.; Wood, R.T.

    1997-04-22

    A method is described for spectral signature interpretation. The method includes the creation of a mathematical model of a system or process. A neural network training set is then developed based upon the mathematical model. The neural network training set is developed by using the mathematical model to generate measurable phenomena of the system or process based upon model input parameters that correspond to the physical condition of the system or process. The neural network training set is then used to adjust internal parameters of a neural network. The physical condition of an actual system or process represented by the mathematical model is then monitored by extracting spectral features from measured spectra of the actual process or system. The spectral features are then input into said neural network to determine the physical condition of the system or process represented by the mathematical model. More specifically, the neural network correlates the spectral features (i.e. measurable phenomena) of the actual process or system with the corresponding model input parameters. The model input parameters relate to specific components of the system or process, and, consequently, correspond to the physical condition of the process or system. 1 fig.

  17. Automated method for the systematic interpretation of resonance peaks in spectrum data

    DOEpatents

    Damiano, Brian; Wood, Richard T.

    1997-01-01

    A method for spectral signature interpretation. The method includes the creation of a mathematical model of a system or process. A neural network training set is then developed based upon the mathematical model. The neural network training set is developed by using the mathematical model to generate measurable phenomena of the system or process based upon model input parameters that correspond to the physical condition of the system or process. The neural network training set is then used to adjust internal parameters of a neural network. The physical condition of an actual system or process represented by the mathematical model is then monitored by extracting spectral features from measured spectra of the actual process or system. The spectral features are then input into said neural network to determine the physical condition of the system or process represented by the mathematical model. More specifically, the neural network correlates the spectral features (i.e. measurable phenomena) of the actual process or system with the corresponding model input parameters. The model input parameters relate to specific components of the system or process, and, consequently, correspond to the physical condition of the process or system.
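
    A minimal sketch of the patent's workflow under stated assumptions: the "mathematical model" below is an invented two-parameter toy, and scikit-learn's MLPRegressor stands in for the neural network; only the structure (training set generated from the model, then inversion from spectral features back to condition parameters) mirrors the method.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)

        def simulate_spectrum_features(params):
            # Stand-in for the mathematical model: maps physical condition
            # parameters to measurable spectral features (peak position,
            # height and width). Purely illustrative.
            stiffness, damping = params
            peak_freq = np.sqrt(stiffness)             # resonance location
            peak_height = 1.0 / damping                # resonance amplitude
            peak_width = damping * np.sqrt(stiffness)  # resonance broadening
            return np.array([peak_freq, peak_height, peak_width])

        # Build the training set from the model rather than from measured data
        params = rng.uniform([1.0, 0.05], [100.0, 1.0], size=(2000, 2))
        features = np.array([simulate_spectrum_features(p) for p in params])

        net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
        net.fit(features, params)  # learn the inverse map: features -> condition

        measured = simulate_spectrum_features(np.array([40.0, 0.3]))
        print("inferred condition parameters:", net.predict(measured.reshape(1, -1)))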

  18. 41 CFR 302-6.105 - What is a “compelling reason” warranting extension of my authorized period for claiming an actual...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... reason” warranting extension of my authorized period for claiming an actual TQSE reimbursement? 302-6.105... is a “compelling reason” warranting extension of my authorized period for claiming an actual TQSE... problems (e.g., delay in settlement on the new residence, or short-term delay in construction of the...

  19. 41 CFR 302-6.105 - What is a “compelling reason” warranting extension of my authorized period for claiming an actual...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... reason” warranting extension of my authorized period for claiming an actual TQSE reimbursement? 302-6.105... is a “compelling reason” warranting extension of my authorized period for claiming an actual TQSE... problems (e.g., delay in settlement on the new residence, or short-term delay in construction of the...

  20. 41 CFR 302-6.105 - What is a “compelling reason” warranting extension of my authorized period for claiming an actual...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... reason” warranting extension of my authorized period for claiming an actual TQSE reimbursement? 302-6.105... is a “compelling reason” warranting extension of my authorized period for claiming an actual TQSE... problems (e.g., delay in settlement on the new residence, or short-term delay in construction of the...

  1. 41 CFR 302-6.105 - What is a “compelling reason” warranting extension of my authorized period for claiming an actual...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... reason” warranting extension of my authorized period for claiming an actual TQSE reimbursement? 302-6.105... is a “compelling reason” warranting extension of my authorized period for claiming an actual TQSE... problems (e.g., delay in settlement on the new residence, or short-term delay in construction of the...

  2. The Social Problem-Solving Questionnaire: Evaluation of Psychometric Properties among Turkish Primary School Students

    ERIC Educational Resources Information Center

    Dereli Iman, Esra

    2013-01-01

    Problem Statement: Children, like adults, face numerous problems and conflicts in their everyday lives, including issues with peers, siblings, older children, parents, teachers, and other adults. The methods children use to solve such problems are more important than actually facing the problems. The lack of effective social problem-solving skills…

  3. Physicians' messages in problematic sickness certification: a narrative analysis of case reports

    PubMed Central

    2011-01-01

    Background Many physicians find sickness certification tasks problematic. There is some knowledge about situations that are experienced as problematic, whereas less is understood about how physicians respond to the problems they face. One way to acquire such knowledge is to consider "reflection-in-action", aspects of which are expressed in the physician's interpretation of the patient's story. The aim of this study was to gain knowledge about the meaning content of case reports about problematic sickness certification. Specifically, we looked for possible messages to the colleagues intended to read the reports. Methods A narrative approach was used to analyse reports about problematic sickness certification cases that had been written by GPs and occupational health service physicians as part of a sickness insurance course. The analysis included elements from both thematic and structural analysis. Nineteen case reports were used in the actual analysis and 25 in the validation of the results. Main narrative qualities and structural features of the written case reports were explored. Results Five types of messages were identified in the case reports, here classified as "a call for help", "a call for understanding", "hidden worries", "in my opinion", and "appearing neutral". In the reports, the physicians tried to achieve neutrality in their writing, and the patients' stories tended to be interpreted within a traditional biomedical framework. In some cases there was an open request for help, in others it was not obvious that the physician had any problems. Overall, the messages were about having problems as such, rather than the specific features of the problems. Conclusions The case reports clearly demonstrated different ways of writing about problems that arise during sickness certification, from being neutral and not mentioning the problems to being emotionally involved and asking for help. The general character of the messages suggests that they are also relevant for case reports in problematic areas other than sickness certification. If pertinent relationships can be found between reflection-in-practice and the narrative writing about practice, they will provide an approach to further research concerning consultations perceived as problematic and also to medical education. PMID:21481257

  4. Investigation of interactions between limb-manipulator dynamics and effective vehicle roll control characteristics

    NASA Technical Reports Server (NTRS)

    Johnston, D. E.; Mcruer, D. T.

    1986-01-01

    A fixed-base simulation was performed to identify and quantify interactions between the pilot's hand/arm neuromuscular subsystem and such features of typical modern fighter aircraft roll rate command control system mechanization as: (1) force sensing side-stick type manipulator; (2) vehicle effective roll time constant; and (3) flight control system effective time delay. The simulation results provide insight into high frequency pilot induced oscillations (PIO) (roll ratchet), low frequency PIO, and roll-to-right control and handling problems previously observed in experimental and production fly-by-wire control systems. The simulation configurations encompass and/or duplicate actual flight situations, reproduce control problems observed in flight, and validate the concept that the high frequency nuisance mode known as roll ratchet derives primarily from the pilot's neuromuscular subsystem. The simulations show that force-sensing side-stick manipulator force/displacement/command gradients, command prefilters, and flight control system time delays need to be carefully adjusted to minimize neuromuscular mode amplitude peaking (roll ratchet tendency) without restricting roll control bandwidth (with resulting sluggish or PIO-prone control).

  5. A semi-Lagrangian advection scheme for radioactive tracers in the NCEP Regional Spectral Model (RSM)

    NASA Astrophysics Data System (ADS)

    Chang, E.-C.; Yoshimura, K.

    2015-10-01

    In this study, the non-iteration dimensional-split semi-Lagrangian (NDSL) advection scheme is applied to the National Centers for Environmental Prediction (NCEP) Regional Spectral Model (RSM) to alleviate the Gibbs phenomenon. The Gibbs phenomenon is a problem wherein negative values of positive-definite quantities (e.g., moisture and tracers) are generated by the spectral space transformation in a spectral model system. To solve this problem, the spectral prognostic specific humidity and radioactive tracer advection scheme is replaced by the NDSL advection scheme, which considers advection of tracers in a grid system without spectral space transformations. A regional version of the NDSL is developed in this study and is applied to the RSM. Idealized experiments show that the regional version of the NDSL is successful. The model runs for an actual case study suggest that the NDSL can successfully advect radioactive tracers (iodine-131 and cesium-137) without noise from the Gibbs phenomenon. The NDSL can also remove negative specific humidity values produced in spectral calculations without losing detailed features.
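
    A one-dimensional sketch of the semi-Lagrangian idea under stated assumptions (constant wind, periodic domain, linear interpolation; this is not the NDSL or RSM code): each grid point is traced back along the flow and the tracer is interpolated at the departure point, so no spectral transform is involved and a positive-definite field cannot acquire negative values.

        import numpy as np

        def semi_lagrangian_step(q, u, dx, dt):
            # Trace departure points back along the flow and interpolate the
            # tracer there; linear interpolation is monotone, so a non-negative
            # field stays non-negative (no Gibbs-type undershoots).
            n = q.size
            x = np.arange(n) * dx
            x_dep = (x - u * dt) % (n * dx)  # periodic domain
            return np.interp(x_dep, x, q, period=n * dx)

        n, dx, dt, u = 200, 1.0, 0.5, 1.3
        q = np.zeros(n)
        q[40:60] = 1.0                       # positive-definite tracer slab
        for _ in range(400):
            q = semi_lagrangian_step(q, u, dx, dt)
        print("minimum after 400 steps: %.3e (never negative)" % q.min())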

  6. Science, philosophy, and society: some recent books.

    PubMed

    Holtzman, E

    1981-01-01

    The essay discusses a number of issues developed in several recent books on philosophical and ethical problems in the natural sciences, both pure (especially biology) and applied (especially medicine). The scaffolding of the discussion can be outlined as follows: Science is most coherently portrayed as a set of activities through which societies deal with a distinctive, but continually evolving set of interwoven practical, empirical, and conceptual problems. Consequently, approaches which attempt to delineate universal features of "scientific methods" or to depict the sciences as providing an approximation to an "objective" view of reality are much less enlightening than are analyses rooted directly in concrete scientific history and in the actual interplay of science with other social configurations. However, scientists are granted some meaningful autonomy in exercising their "curiosity" and there is a real sense in which scientific ideas and activities do possess momentum of their own. In other words, as is also true for other spheres, such as the arts, it is important not to fall into mechanical viewpoints which treat the movement of science as simply a derivative of forces generated elsewhere.

  7. Multigroup Propensity Score Approach to Evaluating an Effectiveness Trial of the New Beginnings Program.

    PubMed

    Tein, Jenn-Yun; Mazza, Gina L; Gunn, Heather J; Kim, Hanjoe; Stuart, Elizabeth A; Sandler, Irwin N; Wolchik, Sharlene A

    2018-06-01

    We used a multigroup propensity score approach to evaluate a randomized effectiveness trial of the New Beginnings Program (NBP), an intervention targeting divorced or separated families. Two features of effectiveness trials, high nonattendance rates and inclusion of an active control, make program effects harder to detect. To estimate program effects based on actual intervention participation, we created a synthetic inactive control comprised of nonattenders and assessed the impact of attending the NBP or active control relative to no intervention (inactive control). We estimated propensity scores using generalized boosted models and applied inverse probability of treatment weighting for the comparisons. Relative to the inactive control, NBP strengthened parenting quality as well as reduced child exposure to interparental conflict, parent psychological distress, and child internalizing problems. Some effects were moderated by parent gender, parent ethnicity, or child age. On the other hand, the effects of active versus inactive control were minimal for parenting and in the unexpected direction for child internalizing problems. Findings from the propensity score approach complement and enhance the interpretation of findings from the intention-to-treat approach.
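
    A minimal sketch of the inverse-probability-of-treatment-weighting step for three groups, under stated assumptions: the covariates, group labels and outcome below are synthetic, and scikit-learn's GradientBoostingClassifier stands in for the generalized boosted models used in the study.

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier

        rng = np.random.default_rng(1)
        n = 1000
        X = rng.normal(size=(n, 3))         # baseline covariates (synthetic)
        group = rng.integers(0, 3, size=n)  # 0 = inactive, 1 = active control, 2 = NBP

        # Multigroup propensity scores: P(group | covariates)
        gbm = GradientBoostingClassifier().fit(X, group)
        p = gbm.predict_proba(X)            # n x 3 matrix of propensities

        # Each participant is weighted by the inverse propensity of the
        # group they actually belong to
        w = 1.0 / p[np.arange(n), group]

        # Hypothetical outcome; weighted group means estimate marginal effects
        y = X[:, 0] + 0.5 * (group == 2) + rng.normal(size=n)
        for g, name in enumerate(["inactive", "active control", "NBP"]):
            m = group == g
            print(name, round(np.average(y[m], weights=w[m]), 3))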

  8. Science in a New Mode: Good Old (Theoretical) Science Versus Brave New (Commodified) Knowledge Production?

    NASA Astrophysics Data System (ADS)

    Knuuttila, Tarja

    2013-10-01

    The present transformation of the university system is conceptualized in terms of such terminologies as "Mode-2 knowledge production" and the "entrepreneurial university." What is remarkable about these analyses is how closely they link the generally accepted requirement of more socially relevant knowledge to the commercialization of university research. This paper critically examines the Mode-1/Mode-2 distinction through a combination of philosophical and empirical analysis. It argues that, from the perspective of actual scientific practice, the Mode-1/Mode-2 distinction and the related transition thesis do not stand up to closer scrutiny. Theoretical "Mode-1" science shares "Mode-2" features in being also problem-oriented, interventive and transdisciplinary. On the other hand, the empirical case on language technology demonstrates that even in "Mode-2"-like research, undertaken in the "context of application," scientists make a distinction between more difficult scientific problems and those that are considered more applied or commercial. Moreover, the case shows that the need to make such distinctions may even become more acute due to the compromises imposed by the commercialization of research.

  9. Direct position determination for digital modulation signals based on improved particle swarm optimization algorithm

    NASA Astrophysics Data System (ADS)

    Yu, Wan-Ting; Yu, Hong-yi; Du, Jian-Ping; Wang, Ding

    2018-04-01

    The Direct Position Determination (DPD) algorithm has been demonstrated to achieve better accuracy when the signal waveforms are known. However, the signal waveform is difficult to know completely in the actual positioning process. To solve this problem, we propose a DPD method for digital modulation signals based on an improved particle swarm optimization algorithm. First, a DPD model is established for known modulation signals and a cost function based on symbol estimation is obtained. Second, as the optimization of the cost function is a nonlinear integer optimization problem, an improved Particle Swarm Optimization (PSO) algorithm is considered for the optimal symbol search. Simulations are carried out to show the higher position accuracy of the proposed DPD method and the convergence of the fitness function under different inertia weights and population sizes. On the one hand, the proposed algorithm can take full advantage of the signal features to improve the positioning accuracy. On the other hand, the improved PSO algorithm can improve the efficiency of the symbol search by nearly one hundred times while achieving a globally optimal solution.
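
    As background, a standard continuous PSO loop is sketched below with the inertia weight and acceleration coefficients the abstract alludes to (the paper's improved variant searches over integer symbol sequences; here a toy quadratic cost stands in for the symbol-estimation objective):

        import numpy as np

        def pso_minimize(cost, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
            # w = inertia weight, c1/c2 = cognitive and social coefficients
            rng = np.random.default_rng(0)
            x = rng.uniform(-5, 5, (n_particles, dim))  # particle positions
            v = np.zeros_like(x)                        # particle velocities
            pbest = x.copy()
            pbest_val = np.apply_along_axis(cost, 1, x)
            gbest = pbest[pbest_val.argmin()].copy()
            for _ in range(iters):
                r1, r2 = rng.random((2, n_particles, dim))
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
                x = x + v
                val = np.apply_along_axis(cost, 1, x)
                better = val < pbest_val
                pbest[better], pbest_val[better] = x[better], val[better]
                gbest = pbest[pbest_val.argmin()].copy()
            return gbest, pbest_val.min()

        # Toy cost function standing in for the symbol-estimation objective
        sol, val = pso_minimize(lambda z: np.sum((z - 1.7) ** 2), dim=4)
        print(sol, val)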

  10. Qualitative Analysis for Maintenance Process Assessment

    NASA Technical Reports Server (NTRS)

    Brand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor

    1996-01-01

    In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.

  11. An Exploration of Strategies Used by Students To Solve Problems with Multiple Ways of Solution.

    ERIC Educational Resources Information Center

    Santos-Trigo, Manuel

    1996-01-01

    Describes a study that provides information about the extent to which students actually use their mathematical resources and strategies to solve problems. Interviews were used to analyze the problem solving abilities of high school students (N=35) as they solved five problems. (DDR)

  12. A Random Variable Approach to Nuclear Targeting and Survivability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Undem, Halvor A.

    We demonstrate a common mathematical formalism for analyzing problems in nuclear survivability and targeting. This formalism, beginning with a random variable approach, can be used to interpret past efforts in nuclear-effects analysis, including targeting analysis. It can also be used to analyze new problems brought about by the post Cold War Era, such as the potential effects of yield degradation in a permanently untested nuclear stockpile. In particular, we illustrate the formalism through four natural case studies or illustrative problems, linking these to actual past data, modeling, and simulation, and suggesting future uses. In the first problem, we illustrate the case of a deterministically modeled weapon used against a deterministically responding target. Classic "Cookie Cutter" damage functions result. In the second problem, we illustrate, with actual target test data, the case of a deterministically modeled weapon used against a statistically responding target. This case matches many of the results of current nuclear targeting modeling and simulation tools, including the result of distance damage functions as complementary cumulative lognormal functions in the range variable. In the third problem, we illustrate the case of a statistically behaving weapon used against a deterministically responding target. In particular, we show the dependence of target damage on weapon yield for an untested nuclear stockpile experiencing yield degradation. Finally, and using actual unclassified weapon test data, we illustrate in the fourth problem the case of a statistically behaving weapon used against a statistically responding target.
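
    The distance damage function named in the second case study has the standard complementary cumulative lognormal form, P(r) = 1 - Phi((ln r - ln r50) / beta), where r50 is the range at which the damage probability is 50% and beta the logarithmic spread. A sketch with hypothetical parameters:

        import numpy as np
        from scipy.stats import norm

        def damage_probability(r, r50, beta):
            # Complementary cumulative lognormal distance damage function
            return norm.sf((np.log(r) - np.log(r50)) / beta)

        r = np.array([200.0, 500.0, 1000.0, 2000.0])  # ground range, metres (hypothetical)
        print(damage_probability(r, r50=1000.0, beta=0.3))
        # at r = r50 the damage probability is exactly 0.5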

  13. A self-adaptive algorithm for traffic sign detection in motion image based on color and shape features

    NASA Astrophysics Data System (ADS)

    Zhang, Ka; Sheng, Yehua; Gong, Zhijun; Ye, Chun; Li, Yongqiang; Liang, Cheng

    2007-06-01

    As an important sub-system of intelligent transportation systems (ITS), the detection and recognition of traffic signs from mobile images is becoming one of the hot spots in international ITS research. To address the problem of automatic traffic sign detection in motion images, a new self-adaptive algorithm based on color and shape features is proposed in this paper. Firstly, global statistical color features of different images are computed. Secondly, self-adaptive thresholds and special segmentation rules for image segmentation are designed according to these global color features. Then, for red, yellow and blue traffic signs, the color image is segmented into three binary images using these thresholds and rules. Thirdly, if the number of white pixels in a segmented binary image exceeds the filtering threshold, the binary image is further filtered. Fourthly, gray-value projection is used to determine the top, bottom, left and right boundaries of candidate traffic sign regions in the segmented binary image. Lastly, if the shape features of a candidate region satisfy those of a real traffic sign, the candidate region is confirmed as a detected traffic sign region. The new algorithm was applied to actual motion images of natural scenes taken by a CCD camera of the mobile photogrammetry system in Nanjing at different times. The experimental results show that the algorithm is not only simple, robust and well adapted to natural scene images, but also reliable and fast in real traffic sign detection.
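
    A much-simplified sketch of the two core steps, colour-rule segmentation followed by grey-value projection to bound the candidate region (the paper derives its thresholds adaptively from global image statistics; the fixed red-sign rule and all numbers here are illustrative only):

        import numpy as np

        def detect_sign_region(rgb, min_pixels=50):
            # Step 1: crude red-sign segmentation by a global colour rule
            r = rgb[..., 0].astype(int)
            g = rgb[..., 1].astype(int)
            b = rgb[..., 2].astype(int)
            mask = (r > 120) & (r - g > 40) & (r - b > 40)
            if mask.sum() < min_pixels:
                return None                  # too few pixels: filtered out
            # Step 2: projections of the binary mask onto rows and columns
            # give the top/bottom/left/right boundaries of the candidate
            rows = mask.sum(axis=1).nonzero()[0]
            cols = mask.sum(axis=0).nonzero()[0]
            return rows[0], rows[-1], cols[0], cols[-1]

        img = np.zeros((100, 100, 3), dtype=np.uint8)
        img[30:60, 40:70, 0] = 200           # synthetic red blob
        print(detect_sign_region(img))       # -> (30, 59, 40, 69)

    A real detector would then check the shape of the bounded region (circle, triangle, rectangle) before accepting it as a sign, as the abstract describes.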

  14. Perceived Motor Competence Differs From Actual Performance in 8-Year-Old Neonatal ECMO Survivors.

    PubMed

    Toussaint, Leontien C C; van der Cammen-van Zijp, Monique H M; Janssen, Anjo J; Tibboel, Dick; van Heijst, Arno F; IJsselstijn, Hanneke

    2016-03-01

    To assess perceived motor competence, social competence, self-worth, health-related quality of life, and actual motor performance in 8-year-old survivors of neonatal extracorporeal membrane oxygenation (ECMO). In a prospective nationwide study, 135 children completed the extended version of the "athletic competence" domain of the Self Perception Profile for Children (SPPC) called the m-CBSK (Motor supplement of the Competentie BelevingsSchaal voor Kinderen) to assess perceived motor competence, the SPPC, and the Pediatric Quality of Life Inventory (PedsQL), and were tested with the Movement Assessment Battery for Children. SD scores (SDS) were used to compare with the norm. The mean (SD) SDS for perceived motor competence, social competence, and self-worth were all significantly higher than the norm: 0.18 (0.94), P = .03; 0.35 (1.03), P < .001; and 0.32 (1.08), P < .001, respectively. The total PedsQL score was significantly below the norm: mean (SD) SDS: -1.26 (1.53), P < .001. Twenty-two percent of children had actual motor problems. The SDS m-CBSK and actual motor performance did not correlate (r = 0.12; P = .17). The SDS m-CBSK significantly correlated with the athletic competence domain of the SPPC (r = 0.63; P < .001). Eight-year-old ECMO survivors feel satisfied with their motor and social competence, despite impaired PedsQL scores and motor problems. Because motor problems in ECMO survivors deteriorate throughout childhood, clinicians should be aware that these patients may tend to "overrate" their actual motor performance. Education and strict monitoring of actual motor performance are important to enable timely intervention. Copyright © 2016 by the American Academy of Pediatrics.

  15. The Economics of the Duration of the Baseball World Series

    ERIC Educational Resources Information Center

    Cassuto, Alexander E.; Lowenthal, Franklin

    2007-01-01

    This note examines some statistical features of the major league baseball World Series. We show that, based upon actual historical data, we cannot reject the hypothesis that the two World Series teams are evenly matched. Yet, we can also calculate the relative strengths of the teams that would best match the actual outcomes, and we find that those…

  16. Developing a CD-CBM Anticipatory Approach for Cavitation - Defining a Model Descriptor Consistent Between Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allgood, G.O.; Dress, W.B.; Kercel, S.W.

    1999-05-10

    A major problem with cavitation in pumps and other hydraulic devices is that there is no effective method for detecting or predicting its inception. The traditional approach is to declare the pump in cavitation when the total head pressure drops by some arbitrary value (typically 3%) in response to a reduction in pump inlet pressure. However, the pump is already cavitating at this point. A method is needed in which cavitation events are captured as they occur and characterized by their process dynamics. The object of this research was to identify specific features of cavitation that could be used as a model-based descriptor in a context-dependent condition-based maintenance (CD-CBM) anticipatory prognostic and health assessment model. This descriptor was based on the physics of the phenomena, capturing the salient features of the process dynamics. An important element of this concept is the development and formulation of the extended process feature vector (EPFV), or model vector. This model-based descriptor encodes the specific information that describes the phenomena and its dynamics and is formulated as a data structure consisting of several elements. The first is a descriptive model abstracting the phenomena. The second is the parameter list associated with the functional model. The third is a figure of merit, a single number between [0,1] representing a confidence factor that the functional model and parameter list actually describe the observed data. Using this as a basis and applying it to the cavitation problem, any given location in a flow loop will have this data structure, differing in value but not content. The extended process feature vector is formulated as follows: EPFV => [<model>, {parameter list}, confidence factor]. (1) For this study, the model that characterized cavitation was a chirped, exponentially decaying sinusoid. Using the parameters defined by this model, the parameter list included frequency, decay, and chirp rate. Based on this, the process feature vector has the form: EPFV => [<chirped, exponentially decaying sinusoid>, {frequency = a, decay = b, chirp rate = c}, cf = 0.80]. (2) In this experiment a reversible catastrophe was examined. The reason for this is that the same catastrophe could be repeated to ensure the statistical significance of the data.

  17. Toxoplasma gondii infection in humans in China

    PubMed Central

    2011-01-01

    Toxoplasmosis is a zoonotic infection of humans and animals caused by the opportunistic protozoan Toxoplasma gondii, a parasite belonging to the phylum Apicomplexa. Infection in pregnant women may lead to abortion, stillbirth or other serious consequences in newborns. Infection in immunocompromised patients can be fatal if not treated. On average, one third of people worldwide are chronically infected. Although very limited information from China has been published in English-language journals, T. gondii infection is actually a significant human health problem in China. In the present article, we review the clinical features, transmission and prevalence of T. gondii infection in humans in China, and summarize the genetic characterizations of reported T. gondii isolates. Educating the public about the risks associated with unhealthy food and lifestyle habits, extending serological screening to special populations, and measures to strengthen food and occupational safety are discussed. PMID:21864327

  18. A training paradigm to enhance performance and safe use of an innovative neuroendovascular device

    PubMed Central

    Ricci, Donald R.; Marotta, Thomas R.; Riina, Howard A.; Wan, Martina; De Vries, Joost

    2016-01-01

    Training has been important to facilitate the safe use of new devices designed to repair vascular structures. This paper outlines the generic elements of a training program for vascular devices and uses as an example the actual training requirements for a novel device developed for the treatment of bifurcation intracranial aneurysms. Critical elements of the program include awareness of the clinical problem, technical features of device, case selection, and use of a simulator. Formal proctoring, evaluation of the training, and recording the clinical outcomes complement these elements. Interventional physicians should embrace the merits of a training module to improve the user experience, and vendors, physicians, and patients alike should be aligned in the goal of device training to improve its success rate and minimize complications of the procedure. PMID:27867466

  19. Tubal telocytes: factor infertility reason?

    PubMed

    Aleksandrovych, Veronika; Sajewicz, Marek; Walocha, Jerzy A; Gil, Krzysztof

    Infertility is a widespread pathological condition, affecting one in every four couples in developing countries. Approximately one third of all cases are connected with tubal factor infertility, often accompanied by endometriosis, acute salpingitis, urogenital infections, etc. The newly identified telocytes (TCs) have multiple potential bio-functions and might participate in fertility problems. They influence the structural and functional integrity of oviduct tissue. Despite their recent discovery, the involvement of TCs in the majority of physiological and pathological processes is still unclear and requires many more detailed observations and further data analysis. Focusing on the female reproductive system helps to better understand the main causes of infertility, while evaluation of the impact of TCs on Fallopian tube and uterus contractility might be a key point for its correction. The article summarizes the main features of telocytes in Fallopian tubes, emphasizing their involvement in pathophysiological processes and tubal factor infertility.

  20. Bootstrap and Counter-Bootstrap approaches for formation of the cortege of Informative indicators by Results of Measurements

    NASA Astrophysics Data System (ADS)

    Artemenko, M. V.; Chernetskaia, I. E.; Kalugina, N. M.; Shchekina, E. N.

    2018-04-01

    This article describes a solution to the practical problem of forming a cortege of informative measured features of an object under observation and/or control, using the authors' algorithms based on bootstrap and counter-bootstrap technologies for processing measurements of the object's various states with training samples of different sizes. The work presented in this paper considers aggregation of specific indicators of informative capacity by linear, majority, logical and "greedy" methods, applied both individually and integrally. The results of a computational experiment are discussed, and it is concluded that the application of the proposed methods increases the efficiency of classifying the states of the object from measurement results.

  1. Development of a Quantitative Decision Metric for Selecting the Most Suitable Discretization Method for SN Transport Problems

    NASA Astrophysics Data System (ADS)

    Schunert, Sebastian

    In this work we develop a quantitative decision metric for spatial discretization methods of the SN equations. The quantitative decision metric utilizes performance data from selected test problems for computing a fitness score that is used for the selection of the most suitable discretization method for a particular SN transport application. The fitness score is aggregated as a weighted geometric mean of single performance indicators representing various performance aspects relevant to the user. Thus, the fitness function can be adjusted to the particular needs of the code practitioner by adding/removing single performance indicators or changing their importance via the supplied weights. Within this work a special, broad class of methods is considered, referred to as nodal methods. This class is naturally comprised of the DGFEM methods of all function space families. Within this work it is also shown that the Higher Order Diamond Difference (HODD) method is a nodal method. Building on earlier findings that the Arbitrarily High Order Method of the Nodal type (AHOTN) is also a nodal method, a generalized finite-element framework is created to yield as special cases various methods that were developed independently using profoundly different formalisms. A selection of test problems related to a certain performance aspect are considered: a Method of Manufactured Solutions (MMS) test suite for assessing accuracy and execution time, Lathrop's test problem for assessing resilience against the occurrence of negative fluxes, and a simple, homogeneous cube test problem to verify whether a method possesses the thick diffusive limit. The contending methods are implemented as efficiently as possible under a common SN transport code framework to level the playing field for a fair comparison of their computational load. Numerical results are presented for all three test problems and a qualitative rating of each method's performance is provided for each aspect: accuracy/efficiency, resilience against negative fluxes, and possession of the thick diffusion limit, separately. The choice of the most efficient method depends on the utilized error norm: in Lp error norms higher order methods such as the AHOTN method of order three perform best, while for computing integral quantities the linear nodal (LN) method is most efficient. The most resilient method against the occurrence of negative fluxes is the simple corner balance (SCB) method. A validation of the quantitative decision metric is performed based on the NEA box-in-box suite of test problems. The validation exercise comprises two stages: first, prediction of the contending methods' performance via the decision metric, and second, computing the actual scores based on data obtained from the NEA benchmark problem. The comparison of predicted and actual scores via a penalty function (ratio of the predicted best performer's score to the actual best score) completes the validation exercise. It is found that the decision metric is capable of very accurate predictions (penalty < 10%) in more than 83% of the considered cases and features penalties up to 20% for the remaining cases. An exception to this rule is the third test case, NEA-III, intentionally set up to incorporate a poor match between the benchmark and the "data" problems. However, even under these worst-case conditions the decision metric's suggestions are never detrimental.
Suggestions for improving the decision metric's accuracy are to increase the pool of employed data, to refine the mapping of a given configuration to a case in the database, and to better characterize the desired target quantities.
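
    The aggregation at the heart of the metric is a weighted geometric mean, F = prod_i p_i^(w_i) with sum_i w_i = 1. A minimal sketch (the method names echo the abstract, but every score and weight is invented):

        import numpy as np

        def fitness(indicators, weights):
            # Weighted geometric mean, computed in log space for robustness;
            # weights are normalized to sum to 1
            p = np.asarray(indicators, dtype=float)
            w = np.asarray(weights, dtype=float)
            w = w / w.sum()
            return float(np.exp(np.sum(w * np.log(p))))

        # Hypothetical per-method indicator scores in (0, 1]: accuracy,
        # execution time, resilience to negative fluxes, thick-limit behaviour
        methods = {"AHOTN-3": [0.95, 0.60, 0.70, 0.90],
                   "LN":      [0.80, 0.90, 0.75, 0.85],
                   "SCB":     [0.70, 0.85, 0.98, 0.80]}
        weights = [0.4, 0.2, 0.2, 0.2]  # user-supplied importance

        for name, p in methods.items():
            print(name, round(fitness(p, weights), 3))

    One property worth noting: a geometric mean collapses towards zero when any single indicator is near zero, so a method that fails badly on one weighted aspect cannot be rescued by strength elsewhere.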

  2. Parents’ Aggressive Influences and Children's Aggressive Problem Solutions with Peers

    PubMed Central

    Duman, Sarah; Margolin, Gayla

    2009-01-01

    This study examined children's aggressive and assertive solutions to hypothetical peer scenarios in relation to parents’ responses to similar hypothetical social scenarios and parents’ actual marital aggression. The study included 118 9- to 10-year-old children and their mothers and fathers. Children's aggressive solutions correlated with the same-sex parent's actual marital aggression. For children whose mothers exhibited low actual marital aggression, mothers’ aggressive solutions to hypothetical situations corresponded with children's tendencies to propose aggressive, but not assertive, solutions. In a 3-way interaction, fathers’ aggressive solutions to peer scenarios and marital aggression, combined, exacerbated girls’ aggressive problem solving but had the opposite effect for boys. The discussion addresses the complexity, particularly with respect to parent and child gender combinations, of understanding parents’ aggressive influences on children's peer relationships. PMID:17206880

  3. LiDAR DTMs and anthropogenic feature extraction: testing the feasibility of geomorphometric parameters in floodplains

    NASA Astrophysics Data System (ADS)

    Sofia, G.; Tarolli, P.; Dalla Fontana, G.

    2012-04-01

    In floodplains, massive investments in land reclamation have always played an important role in flood protection. In these contexts, human alteration is reflected by artificial ('anthropogenic') features, such as banks, levees or road scarps, that constantly increase and change in response to the rapid growth of human populations. For these areas, various existing and emerging applications require up-to-date, accurate and sufficiently attributed digital data, but such information is usually lacking, especially when dealing with large-scale applications. More recently, national and local mapping agencies in Europe have been moving towards the generation of digital topographic information that conforms to reality and is highly reliable and up to date. LiDAR Digital Terrain Models (DTMs) covering large areas are readily available to public authorities, and there is a greater and more widespread interest, among agencies responsible for land management, in applying such information to develop automated methods for solving geomorphological and hydrological problems. Automatic feature recognition based upon DTMs can offer, for large-scale applications, a quick and accurate method that can help improve topographic databases, and that can overcome some of the problems associated with traditional, field-based geomorphological mapping, such as restrictions on access and constraints of time or cost. Although anthropogenic features such as levees and road scarps are artificial structures that do not actually belong to what is usually defined as the bare ground surface, they are implicitly embedded in digital terrain models (DTMs). Automatic feature recognition based upon DTMs, therefore, can offer a quick and accurate method that does not require additional data, and that can help improve flood defense asset information, flood modeling and other applications. In natural contexts, morphological indicators derived from high-resolution topography have proven reliable in practical applications. The use of statistical operators as thresholds for these geomorphic parameters, furthermore, has shown high reliability for feature extraction in mountainous environments. The goal of this research is to test whether these morphological indicators and objective thresholds are also feasible in floodplains, where features assume different characteristics and other artificial disturbances may be present. In this work, three different geomorphic parameters are tested and applied at different scales on a LiDAR DTM of a typical alluvial plain area in the north-east of Italy. The box-plot is applied to identify the threshold for feature extraction, and a filtering procedure is proposed to improve the quality of the final results. The effectiveness of the different geomorphic parameters is analyzed by comparing automatically derived features with surveyed ones. The results highlight the capability of high-resolution topography, geomorphic indicators and statistical thresholds for anthropogenic feature extraction and characterization in a floodplain context.
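
    A minimal sketch of the box-plot thresholding step on a synthetic curvature field (the choice of geomorphic parameter, the k = 1.5 whisker factor and the data are illustrative, not the paper's values):

        import numpy as np

        def boxplot_threshold(values, k=1.5):
            # Box-plot outlier rule: values above Q3 + k * IQR are flagged
            q1, q3 = np.percentile(values, [25, 75])
            return q3 + k * (q3 - q1)

        # Hypothetical per-cell curvature from a LiDAR DTM: a mostly flat
        # floodplain with a few high-curvature cells along an embankment
        rng = np.random.default_rng(2)
        curvature = rng.normal(0.0, 0.02, size=10_000)
        curvature[:150] += 0.5                   # levee-crest cells

        thr = boxplot_threshold(np.abs(curvature))
        feature_mask = np.abs(curvature) > thr   # candidate anthropogenic cells
        print("threshold = %.3f, flagged = %d cells" % (thr, feature_mask.sum()))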

  4. Six Sigma Approach to Improve Stripping Quality of Automotive Electronics Component – a case study

    NASA Astrophysics Data System (ADS)

    Razali, Noraini Mohd; Murni Mohamad Kadri, Siti; Con Ee, Toh

    2018-03-01

    A lack of problem-solving skills and of cooperation between support groups are two obstacles frequently faced on an actual production line. Insufficiently detailed analysis and inappropriate problem-solving techniques can cause recurring issues that affect organizational performance. This study applies a well-structured six sigma DMAIC approach, in combination with other problem-solving tools, to solve a product quality problem in the manufacturing of an automotive electronics component. The study concentrates on the stripping process, a critical process step with the highest rejection rate, which contributes to scrap and rework performance. A detailed analysis is conducted in the analyse phase to identify the actual root cause of the problem. Several improvement activities are then implemented, and the results show that the rejection rate due to stripping defects decreased tremendously while the process capability index improved from 0.75 to 1.67. These results demonstrate that the six sigma approach used to tackle the quality problem is substantially effective.
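
    For reference, the process capability index quoted here is conventionally computed as Cpk = min(USL - mu, mu - LSL) / (3 * sigma). The sketch below uses invented specification limits and data chosen so that the index moves from roughly 0.75 to roughly 1.67, mirroring the reported improvement:

        import numpy as np

        def cpk(samples, lsl, usl):
            # Distance from the process mean to the nearest specification
            # limit, expressed in units of three standard deviations
            mu = np.mean(samples)
            sigma = np.std(samples, ddof=1)
            return min(usl - mu, mu - lsl) / (3.0 * sigma)

        rng = np.random.default_rng(3)
        before = rng.normal(5.00, 0.40, 500)  # stripping dimension, mm (hypothetical)
        after = rng.normal(5.00, 0.18, 500)   # after improvement (hypothetical)
        print("Cpk before: %.2f" % cpk(before, lsl=4.1, usl=5.9))
        print("Cpk after:  %.2f" % cpk(after, lsl=4.1, usl=5.9))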

  5. Electronic medical records for otolaryngology office-based practice.

    PubMed

    Chernobilsky, Boris; Boruk, Marina

    2008-02-01

    Pressure is mounting on physicians to adopt electronic medical records. The field of health information technology is evolving rapidly with innovations and policies often outpacing science. We sought to review research and discussions about electronic medical records from the past year to keep abreast of these changes. Original scientific research, especially from otolaryngologists, is lacking in this field. Adoption rates are slowly increasing, but more of the burden is shouldered by physicians despite policy efforts and the clear benefits to third-party payers. Scientific research from the past year suggests lack of improvements and even decreasing quality of healthcare with electronic medical record adoption in the ambulatory care setting. The increasing prevalence and standardization of electronic medical record systems results in a new set of problems including rising costs, audits, difficulties in transition and public concerns about security of information. As major players in healthcare continue to push for adoption, increased effort must be made to demonstrate actual improvements in patient care in the ambulatory care setting. More scientific studies are needed to demonstrate what features of electronic medical records actually improve patient care. Otolaryngologists should help each other by disseminating research about improvement in patient outcomes with their systems since current adoption and outcomes policies do not apply to specialists.

  6. An evaluation of the lap-shear test for Sn-rich solder/Cu couples: Experiments and simulation

    NASA Astrophysics Data System (ADS)

    Chawla, N.; Shen, Y.-L.; Deng, X.; Ege, E. S.

    2004-12-01

    The lap-shear technique is commonly used to evaluate the shear, creep, and thermal fatigue behavior of solder joints. We have conducted a parametric experimental and modeling study on the effect of testing and geometrical parameters on solder/copper joint response in lap-shear. It was shown that the far-field applied strain is quite different from the actual solder strain (measured optically). Subtraction of the deformation of the Cu substrate provides a reasonable approximation of the solder strain in the elastic regime, but not in the plastic regime. Solder joint thickness has a profound effect on joint response. The solder response moves progressively closer to “true” shear response with increasing joint thickness. Numerical modeling using finite-element analysis was performed to rationalize the experimental findings. The same lap-shear configuration was used in the simulation. The input response for solder was based on the experimental tensile test result on bulk specimens. The calculated shear response, using both the commonly adopted far-field measure and the actual shear strain in the solder, was found to be consistent with the trends observed in the lap-shear experiments. The geometric features were further explored to provide physical insight into the problem. Deformation of the substrate was found to greatly influence the shear behavior of the solder.

  7. 3D Capturing Performances of Low-Cost Range Sensors for Mass-Market Applications

    NASA Astrophysics Data System (ADS)

    Guidi, G.; Gonizzi, S.; Micoli, L.

    2016-06-01

    Since the advent of the first Kinect as a motion controller device for the Microsoft XBOX platform (November 2010), several similar active and low-cost range sensing devices have been introduced on the mass market for various purposes, including gesture-based interfaces, 3D multimedia interaction, robot navigation, finger tracking, 3D body scanning for garment design, and proximity sensors for automotive applications. However, given their capability to generate a real-time stream of range images, these devices have also been used in some projects as general-purpose range devices, with performances that for some applications might be satisfying. This paper describes the working principle of the various devices and analyzes them in terms of systematic and random errors in order to explore their applicability to standard 3D capturing problems. Five actual devices have been tested, featuring three different technologies: i) Kinect V1 by Microsoft, Structure Sensor by Occipital, and Xtion PRO by ASUS, all based on different implementations of the Primesense sensor; ii) F200 by Intel/Creative, implementing the Realsense pattern projection technology; iii) Kinect V2 by Microsoft, equipped with the Canesta TOF camera. A critical analysis of the results first compares the devices and then identifies the range of applications for which such devices could actually work as a viable solution.

  8. Single-Word Recognition Need Not Depend on Single-Word Features: Narrative Coherence Counteracts Effects of Single-Word Features That Lexical Decision Emphasizes

    ERIC Educational Resources Information Center

    Teng, Dan W.; Wallot, Sebastian; Kelty-Stephen, Damian G.

    2016-01-01

    Research on reading comprehension of connected text emphasizes reliance on single-word features that organize a stable, mental lexicon of words and that speed or slow the recognition of each new word. However, the time needed to recognize a word might not actually be as fixed as previous research indicates, and the stability of the mental lexicon…

  9. Classification of Partial Discharge Measured under Different Levels of Noise Contamination.

    PubMed

    Jee Keen Raymond, Wong; Illias, Hazlee Azil; Abu Bakar, Ab Halim

    2017-01-01

    Cable joint insulation breakdown may cause a huge loss to power companies. Therefore, it is vital to diagnose the insulation quality to detect early signs of insulation failure. It is well known that there is a correlation between partial discharge (PD) and insulation quality. Although much work has been done on PD pattern recognition, it is usually performed in a noise-free environment. Also, work on PD pattern recognition in actual cable joints is scarce in the literature. Therefore, in this work, classification of actual cable joint defect types from partial discharge data contaminated by noise was performed. Five cross-linked polyethylene (XLPE) cable joints with artificially created defects were prepared based on the defects commonly encountered on site. Three different types of input features were extracted from the PD pattern under an artificially created noisy environment. These include statistical features, fractal features and principal component analysis (PCA) features. These input features were used to train the classifiers to classify each PD defect type. Classifications were performed using three different artificial intelligence classifiers: Artificial Neural Networks (ANN), Adaptive Neuro-Fuzzy Inference System (ANFIS) and Support Vector Machine (SVM). It was found that classification accuracy decreases with higher noise levels, but PCA features used with SVM and ANN showed the strongest tolerance against noise contamination.
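
    The best-performing combination reported here (PCA features with an SVM) is straightforward to reproduce in outline. A hedged Python sketch with scikit-learn; the random stand-in data, component count, and kernel are assumptions, since the actual phase-resolved PD patterns are not given:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 256))      # stand-in flattened PD patterns
        y = rng.integers(0, 5, size=500)     # five defect-type labels

        clf = make_pipeline(StandardScaler(), PCA(n_components=10),
                            SVC(kernel="rbf", C=1.0))
        print(cross_val_score(clf, X, y, cv=5).mean())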

  10. Time-Frequency Distribution of Seismocardiographic Signals: A Comparative Study

    PubMed Central

    Taebi, Amirtaha; Mansy, Hansen A.

    2017-01-01

    Accurate estimation of seismocardiographic (SCG) signal features can help successful signal characterization and classification in health and disease. This may lead to new methods for diagnosing and monitoring heart function. Time-frequency distributions (TFDs) are often used to estimate spectrotemporal signal features. In this study, the performance of different TFDs (e.g., the short-time Fourier transform (STFT), polynomial chirplet transform (PCT), and continuous wavelet transform (CWT) with different mother functions) was assessed using simulated signals and then utilized to analyze actual SCGs. The instantaneous frequency (IF) was determined from the TFD, and the error in estimating IF was calculated for simulated signals. Results suggested that the lowest IF error depended on the TFD and the test signal. STFT had lower error than CWT methods for most test signals. For a simulated SCG, Morlet CWT estimated IF more accurately than other CWTs, but Morlet did not provide noticeable advantages over STFT or PCT. PCT had the most consistently accurate IF estimations and appeared more suited for estimating the IF of actual SCG signals. PCT analysis showed that actual SCGs from eight healthy subjects had multiple spectral peaks at 9.20 ± 0.48, 25.84 ± 0.77, and 50.71 ± 1.83 Hz (mean ± SEM). These may prove useful features for SCG characterization and classification. PMID:28952511
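
    The IF-error comparison can be illustrated with the STFT alone. A minimal Python sketch using scipy, assuming a test chirp with known instantaneous frequency and taking the TFD ridge as the IF estimate; window length and sampling rate are arbitrary choices:

        import numpy as np
        from scipy.signal import stft

        fs = 1000.0
        t = np.arange(0, 2.0, 1 / fs)
        x = np.cos(2 * np.pi * (10 * t + 5 * t**2))  # chirp: IF = 10 + 10*t Hz

        f, tt, Z = stft(x, fs=fs, nperseg=256, noverlap=224)
        if_est = f[np.argmax(np.abs(Z), axis=0)]     # ridge of the TFD
        if_true = 10 + 10 * tt
        print("mean |IF error| =", np.mean(np.abs(if_est - if_true)), "Hz")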

  11. An Optimization-Based Method for Feature Ranking in Nonlinear Regression Problems.

    PubMed

    Bravi, Luca; Piccialli, Veronica; Sciandrone, Marco

    2017-04-01

    In this paper, we consider the feature ranking problem, where, given a set of training instances, the task is to associate a score with the features in order to assess their relevance. Feature ranking is a very important tool for decision support systems, and may be used as an auxiliary step of feature selection to reduce the high dimensionality of real-world data. We focus on regression problems by assuming that the process underlying the generated data can be approximated by a continuous function (for instance, a feedforward neural network). We formally state the notion of relevance of a feature by introducing a minimum zero-norm inversion problem of a neural network, which is a nonsmooth, constrained optimization problem. We employ a concave approximation of the zero-norm function, and we define a smooth, global optimization problem to be solved in order to assess the relevance of the features. We present the new feature ranking method based on the solution of instances of the global optimization problem depending on the available training data. Computational experiments on both artificial and real data sets are performed, and point out that the proposed feature ranking method is a valid alternative to existing methods in terms of effectiveness. The obtained results also show that the method is costly in terms of CPU time, and this may be a limitation in the solution of large-dimensional problems.
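
    The concave surrogate at the core of the method can be sketched in isolation. A hedged Python example: the exponential approximation of the zero norm is one common choice, and the linear constraint below is only a stand-in for the paper's network inversion problem:

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(1)
        A, b = rng.normal(size=(5, 20)), rng.normal(size=5)
        alpha = 5.0

        # Smooth concave surrogate: ||x||_0 ~ sum(1 - exp(-alpha * |x_i|)).
        def surrogate(x):
            return np.sum(1.0 - np.exp(-alpha * np.abs(x)))

        cons = {"type": "eq", "fun": lambda x: A @ x - b}
        x0 = np.linalg.lstsq(A, b, rcond=None)[0]
        res = minimize(surrogate, x0, constraints=cons, method="SLSQP")
        print("near-zero entries:", int(np.sum(np.abs(res.x) < 1e-3)))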

  12. Qualitative Research? Quantitative Research? What's the Problem? Resolving the Dilemma via a Postconstructivist Approach.

    ERIC Educational Resources Information Center

    Shank, Gary

    It is argued that the debate between qualitative and quantitative research for educational researchers is actually an argument between constructivism and positivism. Positivism has been the basis for most quantitative research in education. Two different things are actually meant when constructivism is discussed (constructivism and…

  13. The Self-Actualizing Case Method.

    ERIC Educational Resources Information Center

    Gunn, Bruce

    1980-01-01

    Presents a case procedure designed to assist trainees in perfecting their problem-solving skills. Elements of that procedure are the rationale behind this "self-actualizing" case method; the role that the instructor, case leaders, and participants play in its execution; and the closed-loop grading system used for peer evaluation. (CT)

  14. US Marine Corps assault amphibious vehicle suspension system analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammonds, C.J.; Jones, J.K.; Mayhall, J.A.

    1988-11-01

    In response to a request from the US Marine Corps (USMC), the Oak Ridge National Laboratory investigated a problem with the suspension system of the assault amphibious vehicle (AAV), Personnel Model 7A1. In the course of the investigation, drawings of the AAV and field survey data on bearing failures provided by VSE Corporation were used. The analysis approach taken was to model the suspension system and the vehicle hull and support structure using finite element techniques. This provided stress and deflection information for the system. To determine the loads imparted to the system as the AAV traversed terrain features, a dynamics model was developed to provide loads to the finite element analysis (FEA). Because the primary indication of a problem was frequent suspension-system bearing failure, an analysis of the suspension-system bearings was conducted. Finally, to check the accuracy of the models and to provide actual load data for bearing analysis, an instrumented AAV was tested over a surveyed course at Camp Pendleton, California. Initially the dynamics model assumed the interface between the hull and the suspension system to be fixed. Later improvements incorporating the flexibility of the vehicle hull into the analysis by linking the two models resulted in improved accuracy. Actual measurements of the front road-arm displacement and vertical acceleration of the chassis are compared with predictions from the model. The correlation is quite good and indicates that the model can accurately predict the dynamic load on each road wheel for input into finite element analyses. The dynamics model can be expanded to study the effects of adding weight to the vehicle, traversing other terrains, or evaluating inputs such as weapons firing or drop tests. 7 refs., 75 figs., 10 tabs.
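
    The load-generation step can be caricatured with a single-wheel model. A hedged Python sketch, emphatically not the ORNL dynamics model: one road wheel as a spring-damper on a sprung mass, driven by a terrain profile, with the spring force as the kind of load history an FEA would consume; all parameters are illustrative:

        import numpy as np
        from scipy.integrate import solve_ivp

        m, k, c = 2000.0, 1.5e5, 8e3           # sprung mass, spring, damper

        def terrain(t):                        # 0.1 m, 1 Hz washboard profile
            return 0.1 * np.sin(2 * np.pi * t)

        def rhs(t, y):
            z, zdot = y
            zr, zrdot = terrain(t), 0.1 * 2 * np.pi * np.cos(2 * np.pi * t)
            force = k * (zr - z) + c * (zrdot - zdot)
            return [zdot, force / m]

        sol = solve_ivp(rhs, (0, 5), [0.0, 0.0], max_step=1e-3)
        load = k * (terrain(sol.t) - sol.y[0])  # spring load handed to the FEA
        print("peak suspension load [N]:", np.abs(load).max())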

  15. Osmoregulation in the Halophilic Bacterium Halomonas elongata: A Case Study for Integrative Systems Biology.

    PubMed

    Kindzierski, Viktoria; Raschke, Silvia; Knabe, Nicole; Siedler, Frank; Scheffer, Beatrix; Pflüger-Grau, Katharina; Pfeiffer, Friedhelm; Oesterhelt, Dieter; Marin-Sanguino, Alberto; Kunte, Hans-Jörg

    2017-01-01

    Halophilic bacteria use a variety of osmoregulatory methods, such as the accumulation of one or more compatible solutes. The wide diversity of compounds that can act as compatible solutes complicates the task of understanding the different strategies that halophilic bacteria use to cope with salt. This is especially challenging when attempting to go beyond the pathway that produces a certain compatible solute towards an understanding of how the metabolic network as a whole addresses the problem. Metabolic reconstruction based on genomic data together with Flux Balance Analysis (FBA) is a promising tool to gain insight into this problem. However, as more of these reconstructions become available, it becomes clear that processes predicted by genome annotation may not reflect the processes that are active in vivo. As a case in point, E. coli is unable to grow aerobically on citrate in spite of having all the necessary genes to do so. It has also been shown that the realization of this genetic potential into an actual capability to metabolize citrate is an extremely unlikely event under normal evolutionary conditions. Moreover, many marine bacteria seem to have the same pathways to metabolize glucose, but each species uses a different one. In this work, a metabolic network inferred from genomic annotation of the halophilic bacterium Halomonas elongata and proteomic profiling experiments are used as a starting point to motivate targeted experiments in order to find out some of the defining features of the osmoregulatory strategies of this bacterium. This new information is then used to refine the network in order to describe the actual capabilities of H. elongata, rather than its genetic potential.
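
    The FBA step mentioned above reduces to a linear program. A toy Python sketch with scipy, assuming a three-reaction stand-in network rather than the actual H. elongata reconstruction:

        import numpy as np
        from scipy.optimize import linprog

        # R1: uptake -> A,  R2: A -> B,  R3 (biomass): B ->
        S = np.array([[1, -1,  0],     # metabolite A balance
                      [0,  1, -1]])    # metabolite B balance
        c = [0, 0, -1]                 # maximize v3 by minimizing -v3
        bounds = [(0, 10), (0, None), (0, None)]
        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
        print("optimal fluxes:", res.x)   # expect [10, 10, 10]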

  16. An Automated Method to Compute Orbital Re-Entry Trajectories with Heating Constraints

    NASA Technical Reports Server (NTRS)

    Zimmerman, Curtis; Dukeman, Greg; Hanson, John; Fogle, Frank R. (Technical Monitor)

    2002-01-01

    Determining how to properly manipulate the controls of a re-entering re-usable launch vehicle (RLV) so that it is able to safely return to Earth and land involves the solution of a two-point boundary value problem (TPBVP). This problem, which can be quite difficult, is traditionally solved on the ground prior to flight. If necessary, a nearly unlimited amount of time is available to find the "best" solution using a variety of trajectory design and optimization tools. The role of entry guidance during flight is to follow the pre-determined reference solution while correcting for any errors encountered along the way. This guidance method is both highly reliable and very efficient in terms of onboard computer resources. There is a growing interest in a style of entry guidance that places the responsibility of solving the TPBVP in the actual entry guidance flight software. Here there is very limited computer time. The powerful, but finicky, mathematical tools used by trajectory designers on the ground cannot in general be made to do the job. Nonconvergence or slow convergence can result in disaster. The challenges of designing such an algorithm are numerous and difficult. Yet the payoff (in the form of decreased operational costs and increased safety) can be substantial. This paper presents an algorithm that incorporates features of both types of guidance strategies. It takes an initial RLV orbital re-entry state and finds a trajectory that will safely transport the vehicle to a Terminal Area Energy Management (TAEM) region. During actual flight, the computed trajectory is used as the reference to be flown by a more traditional guidance method.
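
    The TPBVP structure is easy to show on a toy problem. A hedged Python sketch of a shooting method, where an unknown initial condition is iterated until the terminal boundary condition is met; this is a projectile stand-in, not an entry-guidance algorithm, and its reliance on iteration illustrates exactly the convergence risk discussed above:

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import brentq

        def miss(theta, v0=100.0, x_target=800.0, g=9.81):
            def rhs(t, y):                     # y = [x, z, vx, vz]
                return [y[2], y[3], 0.0, -g]
            y0 = [0.0, 0.0, v0 * np.cos(theta), v0 * np.sin(theta)]
            hit = lambda t, y: y[1] if t > 0.1 else 1.0   # z = 0 on descent
            sol = solve_ivp(rhs, (0, 60), y0, events=hit, max_step=0.01)
            return sol.y_events[0][0][0] - x_target       # landing x error

        theta_star = brentq(miss, 0.1, np.pi / 4)  # enforce the boundary condition
        print("required launch angle [deg]:", np.degrees(theta_star))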

  17. A Comparison of Actual and Perceived Problem Drinking among Driving while Intoxicated (DWI) Offenders

    ERIC Educational Resources Information Center

    Barry, Adam E.; Dennis, Maurice

    2011-01-01

    Problem drinkers account for a large proportion of those convicted of driving while intoxicated (DWI). Nevertheless, specific rates of problem drinking among DWI offenders have been shown to exhibit wide variability. Therefore, we seek to (a) present the rate and severity of problem drinking among a sample of DWI offenders, (b) contrast…

  18. Progressive biparietal atrophy: an atypical presentation of Alzheimer's disease.

    PubMed Central

    Ross, S J; Graham, N; Stuart-Green, L; Prins, M; Xuereb, J; Patterson, K; Hodges, J R

    1996-01-01

    OBJECTIVES: To define the clinical, neuropsychological, and radiological features of bilateral parietal lobe atrophy. METHODS: Four patients underwent a comprehensive longitudinal neuropsychological assessment, as well as MRI and HMPAO-SPECT. RESULTS: The consistent findings in the patients were early visuospatial problems, agraphia of a predominantly peripheral (or apraxic) type, and difficulty with bimanual tasks, all of which outweighed deficits in memory and language until later in the course of the illness. As the disease progressed, impairments in the phonological aspects of language and in auditory-verbal short term memory were often striking, perhaps reflecting spread from the parietal lobe to perisylvian language areas. Three patients went on to develop a global dementia and fulfilled the criteria for a clinical diagnosis of probable Alzheimer's disease; the fourth patient has only recently been identified. Neuroimaging disclosed bilateral parietal lobe atrophy (MRI) and hypoperfusion (SPECT), which was out of proportion to that seen elsewhere in the brain. One patient has died and had pathologically confirmed Alzheimer's disease with particular concentration in both superior parietal lobes. CONCLUSIONS: Bilateral biparietal atrophy is a recognisable clinical syndrome which can be the presenting feature of Alzheimer's disease. Although the label "posterior cortical atrophy" has been applied to such cases, review of the medical literature suggests that this broad rubric actually consists of two main clinical syndromes with features reflecting involvement of the occipitotemporal (ventral) and biparietal (dorsal) cortical areas respectively. PMID:8890778

  19. Pre-produsage and the remediation of virtual products

    NASA Astrophysics Data System (ADS)

    Skågeby, Jörgen

    2011-04-01

    This paper introduces and explores cycles of pre-produsage and produsage. It reports on the results from an online ethnographical study of the Apple iPad conducted before the public release of the material product. Consequently, most users had not physically interacted with the device in question. Nevertheless, the release of the technical specifications and marketing material generated a massive amount of produsage-related online discussion. As such, this paper explores the concept of pre-produsage. Pre-produsage is a form of predicted or expected use, relating to products or services that are accessible to users only as a form of representation (e.g. technical specification, virtual prototype, and design sketch), but with an added element of user-generated design suggestions, conflict coordination, and software development. Remediation, the process by which new digital media technologies reuse qualities of previous technologies and enter an existing media ecology, is a prevalent theme in pre-produsage and involves a tension between features that support protracted use and features that provide total innovation. The paper argues that an analysis of pre-produsage can provide insights that relate to both anticipated and actual user experience (UX). More specifically, pre-produsage analysis can trace the underlying reasons for a certain problem, intention, or concern and connect it to a specific set of features and potential solutions. Finally, the paper shows how proprietary products become subject to produsage, resulting in artifacts negotiated by cycles of produsage.

  20. [Electromagnetic interference in the current era of cardiac implantable electronic devices designed for magnetic resonance environment].

    PubMed

    Ribatti, Valentina; Santini, Luca; Forleo, Giovanni B; Della Rocca, Domenico; Panattoni, Germana; Scali, Marta; Schirripa, Valentina; Danisi, Nicola; Ammirati, Fabrizio; Santini, Massimo

    2017-04-01

    In recent decades we have observed a continuous increase in the number of patients with cardiac implantable electronic devices (CIEDs). At the same time, we deal daily with domestic and public environments increasingly characterized by the presence and use of new emitters, and more and more medical procedures are based on electromagnetic fields as well. Therefore, the interaction of devices with electromagnetic interference (EMI) is increasingly a real and pressing problem. In the medical environment most attention is paid to magnetic resonance; nevertheless, the risk of interaction is also present with ionizing radiation, electrical nerve stimulation, and electrosurgery. In the non-medical environment, most studies reported in the literature have focused on mobile phones, metal detectors, headphones, and digital players as potential EMI sources, but many other instruments and tools may be intentional or non-intentional sources of electromagnetic fields. CIED manufacturers are increasingly focusing on new technological features in order to make implantable devices less susceptible to EMI. However, patients and emitter manufacturers should be aware that limitations exist and that there is no complete immunity to EMI.

  1. Modeling chromatic instrumental effects for a better model fitting of optical interferometric data

    NASA Astrophysics Data System (ADS)

    Tallon, M.; Tallon-Bosc, I.; Chesneau, O.; Dessart, L.

    2014-07-01

    Current interferometers often collect data simultaneously in many spectral channels by using dispersed fringes. Such polychromatic data provide powerful insights into various physical properties where the observed objects show particular spectral features. Furthermore, one can measure spectral differential visibilities that do not directly depend on any calibration by a reference star. But such observations may be sensitive to instrumental artifacts that must be taken into account in order to fully exploit the polychromatic information of interferometric data. As a test case, we consider here an observation of P Cygni with the VEGA visible combiner on the CHARA interferometer. Indeed, although P Cygni is particularly well modeled by the radiative transfer code CMFGEN, we observe questionable discrepancies between expected and actual interferometric data. The problem is to determine their origin and disentangle possible instrumental effects from the astrophysical information. By using an expanded model fitting, which includes several instrumental features, we show that the differential visibilities are well explained by instrumental effects that could otherwise be attributed to the object. Although this approach leads to more reliable results, it assumes a fit specific to a particular instrument, and makes it more difficult to develop a generic model fitting independent of any instrument.
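
    The "expanded model fitting" idea, an astrophysical model multiplied by an explicit instrumental term, can be sketched generically. A hedged Python example with scipy; the sinc-shaped stand-in for the stellar visibility and the linear chromatic instrumental slope are assumptions, not the VEGA/CHARA reduction:

        import numpy as np
        from scipy.optimize import curve_fit

        lam = np.linspace(0.63e-6, 0.66e-6, 50)           # wavelength grid [m]

        def vis_model(lam, theta, a0, a1):
            B = 100.0                                     # baseline length [m]
            x = np.pi * B * theta / lam
            astro = np.abs(np.sinc(x / np.pi))            # slit-like stand-in model
            instrum = a0 + a1 * (lam - lam.mean()) * 1e6  # instrumental slope
            return astro * instrum

        rng = np.random.default_rng(2)
        data = vis_model(lam, 2e-9, 1.0, 0.05) + rng.normal(0, 0.01, lam.size)
        popt, _ = curve_fit(vis_model, lam, data, p0=[1e-9, 1.0, 0.0])
        print("fitted [theta, a0, a1]:", popt)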

  2. Adaptive image inversion of contrast 3D echocardiography for enabling automated analysis.

    PubMed

    Shaheen, Anjuman; Rajpoot, Kashif

    2015-08-01

    Contrast 3D echocardiography (C3DE) is commonly used to enhance the visual quality of ultrasound images in comparison with non-contrast 3D echocardiography (3DE). Although the image quality in C3DE is perceived to be improved for visual analysis, it actually deteriorates for the purposes of automatic or semi-automatic analysis due to higher speckle noise and intensity inhomogeneity. Therefore, LV endocardial feature extraction and segmentation from C3DE images remains a challenging problem. To address this challenge, this work proposes an adaptive pre-processing method that inverts the appearance of the C3DE image. The image inversion is based on an image intensity threshold that is automatically estimated through image histogram analysis. In the inverted appearance, the LV cavity appears dark while the myocardium appears bright, making it similar in appearance to a 3DE image. Moreover, the resulting inverted image has a high-contrast, low-noise appearance, yielding a strong LV endocardium boundary and facilitating feature extraction for segmentation. Our results demonstrate that the inverted appearance of the contrast image enables subsequent LV segmentation. Copyright © 2015 Elsevier Ltd. All rights reserved.
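
    The pre-processing step can be sketched compactly. A hedged Python version, using Otsu's criterion as a stand-in for the paper's histogram analysis and a simple intensity flip for the inversion:

        import numpy as np

        def otsu_threshold(img, nbins=256):
            hist, edges = np.histogram(img, bins=nbins)
            p = hist / hist.sum()
            centers = 0.5 * (edges[:-1] + edges[1:])
            w0, m = np.cumsum(p), np.cumsum(p * centers)
            with np.errstate(divide="ignore", invalid="ignore"):
                between = (m[-1] * w0 - m) ** 2 / (w0 * (1 - w0))
            return centers[np.nanargmax(between)]

        img = np.random.rand(64, 64)           # stand-in for a C3DE slice
        t = otsu_threshold(img)
        if img.mean() > t:                     # bright-cavity appearance detected
            img = img.max() + img.min() - img  # invert: cavity dark, wall bright
        print("threshold used:", t)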

  3. Study of Double-Weighted Graph Model and Optimal Path Planning for Tourist Scenic Area Oriented Intelligent Tour Guide

    NASA Astrophysics Data System (ADS)

    Shi, Y.; Long, Y.; Wi, X. L.

    2014-04-01

    When tourists visit multiple scenic spots, the route actually taken is usually the most efficient path through the road network, and it may differ from the planned route. In the field of navigation, a proposed route is normally generated automatically by a path-planning algorithm that considers the scenic spots' positions and the road network. But when a scenic spot covers a sizeable area and has multiple entrances or exits, the traditional representation of a spot by a single point coordinate cannot reflect these structural features. To solve this problem, this paper focuses on the influence of scenic spots' structural features, such as multiple entrances or exits, on path planning, and proposes a double-weighted graph model in which the weights of both vertices and edges can be selected dynamically. It then discusses the model-building method and an optimal path-planning algorithm based on the Dijkstra and Prim algorithms. Experimental results show that the optimal planned route derived from the proposed model and algorithm is more reasonable, and that the visiting order and distance are further optimized.
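
    The core of the proposal, letting path cost accumulate vertex weights (e.g., entrance/exit traversal costs) as well as edge weights, is a small change to Dijkstra's algorithm. A hedged Python sketch with an illustrative three-node graph:

        import heapq

        def dijkstra_double_weighted(adj, vweight, src, dst):
            dist = {src: vweight[src]}
            pq = [(dist[src], src)]
            while pq:
                d, u = heapq.heappop(pq)
                if u == dst:
                    return d
                if d > dist.get(u, float("inf")):
                    continue
                for v, w in adj[u]:
                    nd = d + w + vweight[v]   # edge weight plus vertex weight
                    if nd < dist.get(v, float("inf")):
                        dist[v] = nd
                        heapq.heappush(pq, (nd, v))
            return float("inf")

        adj = {"A": [("B", 2), ("C", 5)], "B": [("C", 1)], "C": []}
        vweight = {"A": 0, "B": 3, "C": 1}    # e.g., entrance traversal costs
        print(dijkstra_double_weighted(adj, vweight, "A", "C"))
        # -> 6: direct A-C (0+5+1) beats A-B-C (0+2+3+1+1)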

  4. The Sensitivity of Orographic Precipitation to Flow Direction

    NASA Astrophysics Data System (ADS)

    Mass, C.; Picard, L.

    2015-12-01

    An area of substantial interest is the sensitivity of orographic precipitation to the characteristics of the incoming flow and to the surrounding environment. Some studies have suggested substantial sensitivity of precipitation within individual river drainages to relatively small directional or stability variations of the incoming flow. A characterization of such flow sensitivity would be of great value for hydrometeorological prediction, the determination of Probable Maximum Precipitation statistics, and for quantifying the uncertainty in precipitation and hydrological forecasts. To gain insight into this problem, an idealized version of the Weather Research and Forecasting (WRF) modeling system was created in which simulations are driven by a single vertical sounding, with the assumption of thermal wind balance. The actual terrain and the full physics complement of the modeling system are used. The presentation will show how precipitation over the Olympic Mountains of Washington State varies as flow direction changes. This analysis will include both the aggregate precipitation over the barrier and the precipitation within individual drainages or areas. The roles of surrounding terrain and the nearby coastline are also examined by removing these features from simulations. Finally, the impact of varying flow stability and speed on the precipitation over this orographic feature will be described.

  5. Automation--planning to implementation; the problems en route.

    PubMed Central

    Pizer, I H

    1976-01-01

    Once the major decision to automate library processes is made, there are a variety of problems which may be encountered before the planned system becomes operational. These include problems of personnel, budget, procurement of adjunct services, institutional priorities, and manufacturing uncertainties. Actual and potential difficulties are discussed. PMID:1247703

  6. Why Adolescent Problem Gamblers Do Not Seek Treatment

    ERIC Educational Resources Information Center

    Ladouceur, Robert; Blaszczynski, Alexander; Pelletier, Amelie

    2004-01-01

    Prevalence studies indicate that approximately 40% of adolescents participate in regular gambling with rates of problem gambling up to four times greater than that found in adult populations. However, it appears that few adolescents actually seek treatment for such problems. The purpose of this study was to explore potential reasons why…

  7. E-Coli and Other Problems

    ERIC Educational Resources Information Center

    Scott, Paul

    2009-01-01

    In applied mathematics particularly, one is interested in modeling real-life situations; that is, one tries to express some actual phenomenon mathematically, and then uses mathematics to determine future outcomes. It may be that one actually wishes to change the future outcome. Mathematics will not do this, but at least it tells one what to…

  8. Digital Photography as a Tool to Measure School Cafeteria Consumption

    ERIC Educational Resources Information Center

    Swanson, Mark

    2008-01-01

    Background: Assessing actual consumption of school cafeteria meals presents challenges, given recall problems of children, the cost of direct observation, and the time constraints in the school cafeteria setting. This study assesses the use of digital photography as a technique to measure what elementary-aged students select and actually consume…

  9. The Problem of Self-Censorship

    ERIC Educational Resources Information Center

    Hill, Rebecca

    2010-01-01

    Self-censorship, not to be confused with actual censorship, is the most complicated, but least understood form of censorship. In most cases of actual censorship, objections to a book are based on offensive language, sexual content, or unsuitability by age, and a complaint is filed to suppress the book. Often an internal review is undertaken, and a…

  10. Accuracy of self-reported versus actual online gambling wins and losses.

    PubMed

    Braverman, Julia; Tom, Matthew A; Shaffer, Howard J

    2014-09-01

    This study is the first to compare the accuracy of self-reported with actual monetary outcomes of online fixed odds sports betting, live action sports betting, and online casino gambling at the individual level of analysis. Subscribers to bwin.party digital entertainment's online gambling service volunteered to respond to the Brief Bio-Social Gambling Screen and questions about their estimated gambling results on specific games for the last 3 or 12 months. We compared the estimated results of each subscriber with his or her actual betting results data. On average, between 34% and 40% of the participants expressed a favorable distortion of their gambling outcomes (i.e., they underestimated losses or overestimated gains) depending on the time period and game. The size of the discrepancy between actual and self-reported results was consistently associated with the self-reported presence of gambling-related problems. However, the specific direction of the reported discrepancy (i.e., favorable vs. unfavorable bias) was not associated with gambling-related problems. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  11. The moment problem and vibrations damping of beams and plates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atamuratov, Andrey G.; Mikhailov, Igor E.; Muravey, Leonid A.

    2016-06-08

    Beams and plates are elements of various complex mechanical structures, for example, pipelines and aerospace platforms. That is why the problem of damping their vibrations caused by unwanted perturbations is a pressing one.

  12. Estimating effective data density in a satellite retrieval or an objective analysis

    NASA Technical Reports Server (NTRS)

    Purser, R. J.; Huang, H.-L.

    1993-01-01

    An attempt is made to formulate consistent objective definitions of the concept of 'effective data density' applicable both in the context of satellite soundings and more generally in objective data analysis. The definitions based upon various forms of Backus-Gilbert 'spread' functions are found to be seriously misleading in satellite soundings where the model resolution function (expressing the sensitivity of retrieval or analysis to changes in the background error) features sidelobes. Instead, estimates derived by smoothing the trace components of the model resolution function are proposed. The new estimates are found to be more reliable and informative in simulated satellite retrieval problems and, for the special case of uniformly spaced perfect observations, agree exactly with their actual density. The new estimates integrate to the 'degrees of freedom for signal', a diagnostic that is invariant to changes of units or coordinates used.
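
    The proposed estimate can be shown on a synthetic case. A hedged numpy sketch, assuming the model resolution (averaging kernel) matrix R is available; the Gaussian R and boxcar smoother below are stand-ins for the forms used in the paper:

        import numpy as np

        n, dx = 100, 1.0
        x = np.arange(n) * dx
        K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 5.0) ** 2)
        R = 0.6 * K / K.sum(axis=1, keepdims=True)  # imperfect-data kernel rows

        kernel = np.ones(9) / 9.0                   # boxcar smoother
        density = np.convolve(np.diag(R), kernel, mode="same") / dx
        print("degrees of freedom for signal:", np.trace(R))
        print("integrated effective density:", density.sum() * dx)  # ~trace(R)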

  13. OMICS: Current and future perspectives in reproductive medicine and technology

    PubMed Central

    Egea, Rocío Rivera; Puchalt, Nicolás Garrido; Escrivá, Marcos Meseguer; Varghese, Alex C.

    2014-01-01

    Many couples experience fertility problems during their reproductive years, and although the efficiency of assisted reproduction techniques has increased in recent years, these are still far from being 100% effective. A key issue in this field is the proper assessment of germ cell, embryo, and endometrium quality in order to determine the actual likelihood of success. Currently available analysis is mainly based on the morphological features of oocytes, sperm, and embryos, and although these strategies have improved results, there is an urgent need for new diagnostic and therapeutic tools. The emergence of the -OMICS technologies (epigenomics, genomics, transcriptomics, proteomics, and metabolomics) has improved knowledge in this field by providing a huge amount of information regarding the biological processes involved in reproductive success, thereby giving a broader view of complex biological systems at relatively low cost and effort. PMID:25191020

  14. Parallel implementation of an adaptive scheme for 3D unstructured grids on the SP2

    NASA Technical Reports Server (NTRS)

    Strawn, Roger C.; Oliker, Leonid; Biswas, Rupak

    1996-01-01

    Dynamic mesh adaption on unstructured grids is a powerful tool for computing unsteady flows that require local grid modifications to efficiently resolve solution features. For this work, we consider an edge-based adaption scheme that has shown good single-processor performance on the C90. We report on our experience parallelizing this code for the SP2. Results show a 47.0X speedup on 64 processors when 10 percent of the mesh is randomly refined. Performance deteriorates to 7.7X when the same number of edges are refined in a highly-localized region. This is because almost all the mesh adaption is confined to a single processor. However, this problem can be remedied by repartitioning the mesh immediately after targeting edges for refinement but before the actual adaption takes place. With this change, the speedup improves dramatically to 43.6X.

  15. Parallel Implementation of an Adaptive Scheme for 3D Unstructured Grids on the SP2

    NASA Technical Reports Server (NTRS)

    Oliker, Leonid; Biswas, Rupak; Strawn, Roger C.

    1996-01-01

    Dynamic mesh adaption on unstructured grids is a powerful tool for computing unsteady flows that require local grid modifications to efficiently resolve solution features. For this work, we consider an edge-based adaption scheme that has shown good single-processor performance on the C90. We report on our experience parallelizing this code for the SP2. Results show a 47.0X speedup on 64 processors when 10% of the mesh is randomly refined. Performance deteriorates to 7.7X when the same number of edges are refined in a highly-localized region. This is because almost all mesh adaption is confined to a single processor. However, this problem can be remedied by repartitioning the mesh immediately after targeting edges for refinement but before the actual adaption takes place. With this change, the speedup improves dramatically to 43.6X.

  16. Recurrent cerebellar architecture solves the motor-error problem.

    PubMed Central

    Porrill, John; Dean, Paul; Stone, James V.

    2004-01-01

    Current views of cerebellar function have been heavily influenced by the models of Marr and Albus, who suggested that the climbing fibre input to the cerebellum acts as a teaching signal for motor learning. It is commonly assumed that this teaching signal must be motor error (the difference between actual and correct motor command), but this approach requires complex neural structures to estimate unobservable motor error from its observed sensory consequences. We have proposed elsewhere a recurrent decorrelation control architecture in which Marr-Albus models learn without requiring motor error. Here, we prove convergence for this architecture and demonstrate important advantages for the modular control of systems with multiple degrees of freedom. These results are illustrated by modelling adaptive plant compensation for the three-dimensional vestibular ocular reflex. This provides a functional role for recurrent cerebellar connectivity, which may be a generic anatomical feature of projections between regions of cerebral and cerebellar cortex. PMID:15255096

  17. Elevator mode convection in flows with strong magnetic fields

    NASA Astrophysics Data System (ADS)

    Liu, Li; Zikanov, Oleg

    2015-04-01

    Instability modes in the form of axially uniform vertical jets, also called "elevator modes," are known to be the solutions of thermal convection problems for vertically unbounded systems. Typically, their relevance to the actual flow state is limited by three-dimensional breakdown caused by rapid growth of secondary instabilities. We consider a flow of a liquid metal in a vertical duct with a heated wall and strong transverse magnetic field and find elevator modes that are stable and, thus, not just relevant, but a dominant feature of the flow. We then explore the hypothesis suggested by recent experimental data that an analogous instability to modes of slow axial variation develops in finite-length ducts, where it causes large-amplitude fluctuations of temperature. The implications for liquid metal blankets for tokamak fusion reactors that potentially invalidate some of the currently pursued design concepts are discussed.

  18. Automatic Clustering Using Multi-objective Particle Swarm and Simulated Annealing

    PubMed Central

    Abubaker, Ahmad; Baharum, Adam; Alrefaei, Mahmoud

    2015-01-01

    This paper puts forward a new automatic clustering algorithm based on Multi-Objective Particle Swarm Optimization and Simulated Annealing, “MOPSOSA”. The proposed algorithm is capable of automatic clustering which is appropriate for partitioning datasets to a suitable number of clusters. MOPSOSA combines the features of the multi-objective based particle swarm optimization (PSO) and the Multi-Objective Simulated Annealing (MOSA). Three cluster validity indices were optimized simultaneously to establish the suitable number of clusters and the appropriate clustering for a dataset. The first cluster validity index is centred on Euclidean distance, the second on the point symmetry distance, and the last cluster validity index is based on short distance. A number of algorithms have been compared with the MOPSOSA algorithm in resolving clustering problems by determining the actual number of clusters and optimal clustering. Computational experiments were carried out to study fourteen artificial and five real life datasets. PMID:26132309
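
    MOPSOSA's multi-objective machinery is beyond a short sketch, but the underlying idea, scoring candidate cluster counts with validity indices, can be illustrated with a single index. A hedged Python example with scikit-learn, where the silhouette index and k-means are stand-ins for the paper's three indices and hybrid optimizer:

        from sklearn.cluster import KMeans
        from sklearn.datasets import make_blobs
        from sklearn.metrics import silhouette_score

        X, _ = make_blobs(n_samples=300, centers=4, random_state=0)
        scores = {k: silhouette_score(
                      X, KMeans(n_clusters=k, n_init=10,
                                random_state=0).fit_predict(X))
                  for k in range(2, 9)}
        print("chosen number of clusters:", max(scores, key=scores.get))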

  19. The media effect in Axelrod's model explained

    NASA Astrophysics Data System (ADS)

    Peres, L. R.; Fontanari, J. F.

    2011-11-01

    We revisit the problem of introducing an external global field, the mass media, in Axelrod's model of social dynamics, where in addition to their nearest neighbors, the agents can interact with a virtual neighbor whose cultural features are fixed from the outset. The finding that this apparently homogenizing field actually increases the cultural diversity has been considered a puzzle since the phenomenon was first reported more than a decade ago. Here we offer a simple explanation for it, which is based on the pedestrian observation that Axelrod's model exhibits more cultural diversity, i.e., more distinct cultural domains, when the agents are allowed to interact solely with the media field than when they can interact with their neighbors as well. In this perspective, it is the local homogenizing interactions that work towards making the absorbing configurations less fragmented as compared with the extreme situation in which the agents interact with the media only.
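
    The setup described here is compact enough to simulate directly. A hedged Python sketch of Axelrod's model with a fixed media vector; lattice size, feature/trait counts, and the media-interaction probability are illustrative:

        import numpy as np

        rng = np.random.default_rng(0)
        L, F, Q, p_media = 10, 5, 10, 0.3
        grid = rng.integers(0, Q, size=(L, L, F))
        media = rng.integers(0, Q, size=F)        # fixed "virtual neighbor"

        def interact(a, b):
            overlap = np.mean(a == b)
            if 0 < overlap < 1 and rng.random() < overlap:
                k = rng.choice(np.flatnonzero(a != b))
                a[k] = b[k]                       # copy one differing feature

        for _ in range(200000):
            i, j = rng.integers(L, size=2)
            if rng.random() < p_media:
                interact(grid[i, j], media)
            else:
                di, dj = [(0, 1), (0, -1), (1, 0), (-1, 0)][rng.integers(4)]
                interact(grid[i, j], grid[(i + di) % L, (j + dj) % L])

        print("distinct cultures:",
              len({tuple(v) for v in grid.reshape(-1, F)}))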

  20. Convenience of Statistical Approach in Studies of Architectural Ornament and Other Decorative Elements Specific Application

    NASA Astrophysics Data System (ADS)

    Priemetz, O.; Samoilov, K.; Mukasheva, M.

    2017-11-01

    Ornament is a live phenomenon in modern architectural theory and a common element in the practice of design and construction. It has been an important aspect of shaping for millennia, and the description of the methods of its application occupies a large place in studies on the theory and practice of architecture. However, the saturation of compositions with ornamentation, and the specificity of its themes and forms, have not yet been sufficiently studied; this aspect requires the accumulation of additional knowledge. Applying quantitative methods to the types of plastic solutions and the thematic diversity of facade compositions of buildings constructed in different periods creates another tool for an objective analysis of ornament development. The paper demonstrates this approach in studying the features of architectural development in Kazakhstan from the end of the XIX century to the XXI century.

  1. GATA: A graphic alignment tool for comparative sequence analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nix, David A.; Eisen, Michael B.

    2005-01-01

    Several problems exist with current methods used to align DNA sequences for comparative sequence analysis. Most dynamic programming algorithms assume that conserved sequence elements are collinear. This assumption appears valid when comparing orthologous protein coding sequences. Functional constraints on proteins provide strong selective pressure against sequence inversions, and minimize sequence duplications and feature shuffling. For non-coding sequences this collinearity assumption is often invalid. For example, enhancers contain clusters of transcription factor binding sites that change in number, orientation, and spacing during evolution yet the enhancer retains its activity. Dot plot analysis is often used to estimate non-coding sequence relatedness. Yet dot plots do not actually align sequences and thus cannot account well for base insertions or deletions. Moreover, they lack an adequate statistical framework for comparing sequence relatedness and are limited to pairwise comparisons. Lastly, dot plots and dynamic programming text outputs fail to provide an intuitive means for visualizing DNA alignments.

  2. Simple predictions of maximum transport rate in unsaturated soil and rock

    USGS Publications Warehouse

    Nimmo, John R.

    2007-01-01

    In contrast with the extreme variability expected for water and contaminant fluxes in the unsaturated zone, evidence from 64 field tests of preferential flow indicates that the maximum transport speed Vmax, adjusted for episodicity of infiltration, deviates little from a geometric mean of 13 m/d. A model based on constant-speed travel during infiltration pulses of actual or estimated duration can predict Vmax with approximate order-of-magnitude accuracy, irrespective of medium or travel distance, thereby facilitating such problems as the prediction of worst-case contaminant traveltimes. The lesser variability suggests that preferential flow is subject to rate-limiting mechanisms analogous to those that impose a terminal velocity on objects in free fall and to rate-compensating mechanisms analogous to Le Chatelier's principle. A critical feature allowing such mechanisms to dominate may be the presence of interfacial boundaries confined by neither solid material nor capillary forces.
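
    The constant-speed estimate turns worst-case travel time into simple arithmetic. A minimal Python example, where the 13 m/d speed is the geometric mean reported above and the depth and pulse interpretation are hypothetical:

        v_max = 13.0                  # m per day of active infiltration
        depth = 50.0                  # m, hypothetical travel distance
        days_active = depth / v_max   # cumulative pulse time to arrival
        print(f"active-infiltration travel time: {days_active:.1f} d")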

  3. Elevator mode convection in flows with strong magnetic fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Li; Zikanov, Oleg, E-mail: zikanov@umich.edu

    2015-04-15

    Instability modes in the form of axially uniform vertical jets, also called “elevator modes,” are known to be the solutions of thermal convection problems for vertically unbounded systems. Typically, their relevance to the actual flow state is limited by three-dimensional breakdown caused by rapid growth of secondary instabilities. We consider a flow of a liquid metal in a vertical duct with a heated wall and strong transverse magnetic field and find elevator modes that are stable and, thus, not just relevant, but a dominant feature of the flow. We then explore the hypothesis suggested by recent experimental data that an analogous instability to modes of slow axial variation develops in finite-length ducts, where it causes large-amplitude fluctuations of temperature. The implications for liquid metal blankets for tokamak fusion reactors that potentially invalidate some of the currently pursued design concepts are discussed.

  4. Registration using natural features for augmented reality systems.

    PubMed

    Yuan, M L; Ong, S K; Nee, A Y C

    2006-01-01

    Registration is one of the most difficult problems in augmented reality (AR) systems. In this paper, a simple registration method using natural features based on the projective reconstruction technique is proposed. This method consists of two steps: embedding and rendering. Embedding involves specifying four points to build the world coordinate system on which a virtual object will be superimposed. In rendering, the Kanade-Lucas-Tomasi (KLT) feature tracker is used to track the natural feature correspondences in the live video. The natural features that have been tracked are used to estimate the corresponding projective matrix in the image sequence. Next, the projective reconstruction technique is used to transfer the four specified points to compute the registration matrix for augmentation. This paper also proposes a robust method for estimating the projective matrix, where the natural features that have been tracked are normalized (translation and scaling) and used as the input data. The estimated projective matrix is then used as an initial estimate for a nonlinear optimization method that minimizes the actual residual errors based on the Levenberg-Marquardt (LM) minimization method, thus making the results more robust and stable. The proposed registration method has three major advantages: 1) It is simple, as no predefined fiducials or markers are used for registration in either indoor or outdoor AR applications. 2) It is robust, because it remains effective as long as at least six natural features are tracked during the entire augmentation, and the existence of the corresponding projective matrices in the live video is guaranteed. Meanwhile, the robust method of estimating the projective matrix can obtain stable results even when there are some outliers during the tracking process. 3) Virtual objects can still be superimposed on the specified areas, even if some parts of the areas are occluded during the entire process. Some indoor and outdoor experiments have been conducted to validate the performance of this proposed method.
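
    The normalization step (translation and scaling) mentioned above is standard enough to sketch. A hedged Python version in the spirit of Hartley-style normalization; the target mean distance of sqrt(2) is a common convention and an assumption here:

        import numpy as np

        def normalize_points(pts):
            pts = np.asarray(pts, dtype=float)
            centroid = pts.mean(axis=0)
            shifted = pts - centroid
            scale = np.sqrt(2.0) / np.mean(np.linalg.norm(shifted, axis=1))
            T = np.array([[scale, 0.0, -scale * centroid[0]],
                          [0.0, scale, -scale * centroid[1]],
                          [0.0, 0.0, 1.0]])      # maps homogeneous [x, y, 1]
            return shifted * scale, T

        pts = np.array([[100, 200], [150, 80], [400, 300], [250, 260]])
        norm_pts, T = normalize_points(pts)
        print(np.mean(np.linalg.norm(norm_pts, axis=1)))  # -> ~1.414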

  5. The Use of a Parametric Feature Based CAD System to Teach Introductory Engineering Graphics.

    ERIC Educational Resources Information Center

    Howell, Steven K.

    1995-01-01

    Describes the use of a parametric-feature-based computer-aided design (CAD) System, AutoCAD Designer, in teaching concepts of three dimensional geometrical modeling and design. Allows engineering graphics to go beyond the role of documentation and communication and allows an engineer to actually build a virtual prototype of a design idea and…

  6. Connection and Commitment: Exploring the Generation and Experience of Emotion in a Participatory Drama

    ERIC Educational Resources Information Center

    Dunn, Julie; Bundy, Penny; Stinson, Madonna

    2015-01-01

    Emotion is a complex and important aspect of participatory drama experience. This is because drama work of this kind provokes emotional responses to both actual and dramatic worlds. This paper identifies two key features of participatory drama that influence the generation and experience of emotion: commitment and connection. These features are…

  7. A practical guide to assessing clinical decision-making skills using the key features approach.

    PubMed

    Farmer, Elizabeth A; Page, Gordon

    2005-12-01

    This paper in the series on professional assessment provides a practical guide to writing key features problems (KFPs). Key features problems test clinical decision-making skills in written or computer-based formats. They are based on the concept of critical steps or 'key features' in decision making and represent an advance on the older, less reliable patient management problem (PMP) formats. The practical steps in writing these problems are discussed and illustrated by examples. Steps include assembling problem-writing groups, selecting a suitable clinical scenario or problem and defining its key features, writing the questions, selecting question response formats, preparing scoring keys, reviewing item quality and item banking. The KFP format provides educators with a flexible approach to testing clinical decision-making skills with demonstrated validity and reliability when constructed according to the guidelines provided.

  8. Topological numbering of features on a mesh

    NASA Technical Reports Server (NTRS)

    Atallah, Mikhail J.; Hambrusch, Susanne E.; Tewinkel, Lynn E.

    1988-01-01

    Assume an n x n binary image is given containing horizontally convex features; i.e., for each feature, each row's pixels form an interval on that row. The problem of assigning topological numbers to such features is considered; i.e., assign a number to every feature f so that all features to the left of f have a smaller number assigned to them. This problem arises in solutions to the stereo matching problem. A parallel algorithm to solve the topological numbering problem in O(n) time on an n x n mesh of processors is presented. The key idea of the solution is to create a tree from which the topological numbers can be obtained even though the tree does not uniquely represent the 'to the left of' relationship of the features.
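
    A sequential caricature of the numbering (the paper's contribution is the O(n) parallel mesh algorithm, which this does not reproduce): build the "left of" relation from the per-row intervals and topologically sort it. The feature data in this Python sketch are illustrative:

        from collections import defaultdict

        features = {                  # feature -> {row: (col_start, col_end)}
            "a": {0: (0, 2), 1: (0, 1)},
            "b": {0: (4, 6)},
            "c": {1: (3, 5)},
        }
        edges, indeg = defaultdict(set), {f: 0 for f in features}
        for f in features:
            for g in features:
                if f != g and any(r in features[g] and
                                  features[f][r][1] < features[g][r][0]
                                  for r in features[f]):
                    edges[f].add(g)
                    indeg[g] += 1
        order = []
        ready = [f for f in features if indeg[f] == 0]
        while ready:
            f = ready.pop()
            order.append(f)
            for g in edges[f]:
                indeg[g] -= 1
                if indeg[g] == 0:
                    ready.append(g)
        print(order)   # any order placing 'a' before 'b' and 'c' is valid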

  9. Understanding the determinants of problem-solving behavior in a complex environment

    NASA Technical Reports Server (NTRS)

    Casner, Stephen A.

    1994-01-01

    It is often argued that problem-solving behavior in a complex environment is determined as much by the features of the environment as by the goals of the problem solver. This article explores a technique to determine the extent to which measured features of a complex environment influence problem-solving behavior observed within that environment. In this study, the technique is used to determine how the complex flight deck and air traffic control environment influences the strategies used by airline pilots when controlling the flight path of a modern jetliner. Data collected aboard 16 commercial flights are used to measure selected features of the task environment. A record of the pilots' problem-solving behavior is analyzed to determine to what extent behavior is adapted to the environmental features that were measured. The results suggest that the measured features of the environment account for as much as half of the variability in the pilots' problem-solving behavior and provide estimates of the probable effects of each environmental feature.

  10. An exploratory study of clinical measures associated with subsyndromal pathological gambling in patients with binge eating disorder.

    PubMed

    Yip, Sarah W; White, Marney A; Grilo, Carlos M; Potenza, Marc N

    2011-06-01

    Both binge eating disorder (BED) and pathological gambling (PG) are characterized by impairments in impulse control. Subsyndromal levels of PG have been associated with measures of adverse health. The nature and significance of PG features in individuals with BED is unknown. Ninety-four patients with BED (28 men and 66 women) were classified by gambling group based on inclusionary criteria for Diagnostic and Statistical Manual-IV (DSM-IV) PG and compared on a range of behavioral, psychological and eating disorder (ED) psychopathology variables. One individual (1.1% of the sample) met criteria for PG, although 18.7% of patients with BED displayed one or more DSM-IV criteria for PG, hereafter referred to as problem gambling features. Men were more likely than women to have problem gambling features. BED patients with problem gambling features were distinguished by lower self-esteem and greater substance problem use. After controlling for gender, findings of reduced self-esteem and increased substance problem use among patients with problem gambling features remained significant. In patients with BED, problem gambling features are associated with a number of heightened clinical problems.

  11. Classification of Partial Discharge Measured under Different Levels of Noise Contamination

    PubMed Central

    2017-01-01

    Cable joint insulation breakdown may cause a huge loss to power companies. Therefore, it is vital to diagnose the insulation quality to detect early signs of insulation failure. It is well known that there is a correlation between partial discharge (PD) and insulation quality. Although much work has been done on PD pattern recognition, it is usually performed in a noise-free environment. Also, work on PD pattern recognition in actual cable joints is scarce in the literature. Therefore, in this work, classification of actual cable joint defect types from partial discharge data contaminated by noise was performed. Five cross-linked polyethylene (XLPE) cable joints with artificially created defects were prepared based on the defects commonly encountered on site. Three different types of input features were extracted from the PD pattern under an artificially created noisy environment. These include statistical features, fractal features and principal component analysis (PCA) features. These input features were used to train the classifiers to classify each PD defect type. Classifications were performed using three different artificial intelligence classifiers: Artificial Neural Networks (ANN), Adaptive Neuro-Fuzzy Inference System (ANFIS) and Support Vector Machine (SVM). It was found that classification accuracy decreases with higher noise levels, but PCA features used with SVM and ANN showed the strongest tolerance against noise contamination. PMID:28085953

  12. Problem Posing at All Levels in the Calculus Classroom

    ERIC Educational Resources Information Center

    Perrin, John Robert

    2007-01-01

    This article explores the use of problem posing in the calculus classroom using investigative projects. Specially, four examples of student work are examined, each one differing in originality of problem posed. By allowing students to explore actual questions that they have about calculus, coming from their own work or class discussion, or…

  13. Mechanism problems

    NASA Technical Reports Server (NTRS)

    Riedel, J. K.

    1972-01-01

    It is pointed out that too frequently during the design and development of mechanisms, problems occur that could have been avoided if the right question had been asked before, rather than after, the fact. Several typical problems, drawn from actual experience, are discussed and analyzed. The lessons learned are used to generate various suggestions for minimizing mistakes in mechanism design.

  14. Three-Dimensional Profiles Using a Spherical Cutting Bit: Problem Solving in Practice

    ERIC Educational Resources Information Center

    Ollerton, Richard L.; Iskov, Grant H.; Shannon, Anthony G.

    2002-01-01

    An engineering problem concerned with relating the coordinates of the centre of a spherical cutting tool to the actual cutting surface leads to a potentially rich example of problem-solving techniques. Basic calculus, Lagrange multipliers and vector calculus techniques are employed to produce solutions that may be compared to better understand…
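
    The geometric heart of the problem is an offset-surface relation: the centre of a ball-nose cutter of radius R sits at the contact point displaced by R along the unit surface normal. A hedged Python sketch with an illustrative surface (not the article's worked problem):

        import numpy as np

        R = 3.0                                   # cutter radius, mm

        def surface(x, y):                        # sample profile z = f(x, y)
            return 0.05 * (x**2 + y**2)

        def unit_normal(x, y, h=1e-6):            # numerical surface normal
            fx = (surface(x + h, y) - surface(x - h, y)) / (2 * h)
            fy = (surface(x, y + h) - surface(x, y - h)) / (2 * h)
            n = np.array([-fx, -fy, 1.0])
            return n / np.linalg.norm(n)

        p = np.array([2.0, 1.0, surface(2.0, 1.0)])
        centre = p + R * unit_normal(2.0, 1.0)    # tool-centre coordinates
        print(centre)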

  15. Problem Solving in Swedish Mathematics Textbooks for Upper Secondary School

    ERIC Educational Resources Information Center

    Brehmer, Daniel; Ryve, Andreas; Van Steenbrugge, Hendrik

    2016-01-01

    The aim of this study is to analyse how mathematical problem solving is represented in mathematical textbooks for Swedish upper secondary school. The analysis comprises dominating Swedish textbook series, and relates to uncovering (a) the quantity of tasks that are actually mathematical problems, (b) their location in the chapter, (c) their…

  16. Mspire-Simulator: LC-MS shotgun proteomic simulator for creating realistic gold standard data.

    PubMed

    Noyce, Andrew B; Smith, Rob; Dalgleish, James; Taylor, Ryan M; Erb, K C; Okuda, Nozomu; Prince, John T

    2013-12-06

    The most important step in any quantitative proteomic pipeline is feature detection (aka peak picking). However, generating quality hand-annotated data sets to validate the algorithms, especially for lower abundance peaks, is nearly impossible. An alternative for creating gold standard data is to simulate it with features closely mimicking real data. We present Mspire-Simulator, a free, open-source shotgun proteomic simulator that goes beyond previous simulation attempts by generating LC-MS features with realistic m/z and intensity variance along with other noise components. It also includes machine-learned models for retention time and peak intensity prediction and a genetic algorithm to custom fit model parameters for experimental data sets. We show that these methods are applicable to data from three different mass spectrometers, including two fundamentally different types, and show visually and analytically that simulated peaks are nearly indistinguishable from actual data. Researchers can use simulated data to rigorously test quantitation software, and proteomic researchers may benefit from overlaying simulated data on actual data sets.
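
    The core simulation idea can be caricatured in a few lines. A hedged Python toy, assuming a Gaussian elution profile crossed with an isotope pattern plus multiplicative intensity noise; Mspire-Simulator's learned retention-time models and genetic-algorithm fitting are not reproduced here:

        import numpy as np

        rng = np.random.default_rng(3)
        rt = np.linspace(0, 60, 300)                        # retention time [s]
        mz_iso = 500.25 + 1.00335 * np.arange(3)            # isotope m/z values
        iso_abund = np.array([1.0, 0.55, 0.18])             # relative abundances

        elution = np.exp(-0.5 * ((rt - 30.0) / 4.0) ** 2)   # chromatographic peak
        feature = np.outer(iso_abund, elution) * 1e6
        feature *= rng.normal(1.0, 0.05, size=feature.shape)  # intensity variance
        print(feature.shape, feature.max())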

  17. Classification of simulated and actual NOAA-6 AVHRR data for hydrologic land-surface feature definition. [Advanced Very High Resolution Radiometer

    NASA Technical Reports Server (NTRS)

    Ormsby, J. P.

    1982-01-01

    An examination of the possibilities of using Landsat data to simulate NOAA-6 Advanced Very High Resolution Radiometer (AVHRR) data on two channels, as well as using actual NOAA-6 imagery, for large-scale hydrological studies is presented. A running average of 18 consecutive pixels taken by the Landsat scanners was used to approximate 1 km resolution; the averaged data were scaled up to 8-bit data and investigated for different gray levels. AVHRR data comprising five channels of 10-bit, band-interleaved information covering 10 deg latitude were analyzed and a suitable pixel grid was chosen for comparison with the Landsat data in a supervised classification format, an unsupervised mode, and with ground truth. Landcover delineation was explored by removing snow, water, and cloud features from the cluster analysis, which resulted in less than 10% difference. Low-resolution, large-scale data were determined to be useful for characterizing some landcover features if weekly and/or monthly updates are maintained.
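
    The degradation step can be sketched as follows, assuming a block implementation of the 18-pixel average and simple min-max scaling to 8-bit gray levels; the scan-line data here are fabricated:

    ```python
    # Sketch: average runs of Landsat pixels to approximate the coarser AVHRR
    # footprint, then rescale to 8-bit gray levels. Window size follows the
    # abstract; the scan line itself is fake.
    import numpy as np

    rng = np.random.default_rng(2)
    landsat_row = rng.integers(0, 128, size=1800).astype(float)  # fake scan line

    window = 18
    coarse = landsat_row[: len(landsat_row) // window * window]
    coarse = coarse.reshape(-1, window).mean(axis=1)             # 18-pixel average

    # Scale to the full 8-bit range for gray-level analysis.
    coarse_8bit = np.round(255 * (coarse - coarse.min()) / np.ptp(coarse)).astype(np.uint8)
    print(coarse_8bit[:10])
    ```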

  18. Adaptive algorithm of selecting optimal variant of errors detection system for digital means of automation facility of oil and gas complex

    NASA Astrophysics Data System (ADS)

    Poluyan, A. Y.; Fugarov, D. D.; Purchina, O. A.; Nesterchuk, V. V.; Smirnova, O. V.; Petrenkova, S. B.

    2018-05-01

    To date, the problems associated with the detection of errors in digital equipment (DE) systems for the automation of explosive objects of the oil and gas complex are extremely pressing. This is especially true for facilities where a loss of DE accuracy will inevitably lead to man-made disasters and substantial material damage; at such facilities, diagnostics of the accuracy of DE operation is one of the main elements of the industrial safety management system. This work addresses the problem of selecting the optimal variant of an error detection system according to a validation criterion. Known methods for solving such problems have exponential estimates of labor intensity. Thus, with a view to reducing the time for solving the problem, the validation criterion is implemented as an adaptive bionic algorithm. Bionic algorithms (BA) have proven effective in solving optimization problems; the advantages of bionic search include adaptability, learning ability, parallelism, and the ability to build hybrid systems based on combining them [1].

  19. Image ratio features for facial expression recognition application.

    PubMed

    Song, Mingli; Tao, Dacheng; Liu, Zicheng; Li, Xuelong; Zhou, Mengchu

    2010-06-01

    Video-based facial expression recognition is a challenging problem in computer vision and human-computer interaction. To target this problem, texture features have been extracted and widely used, because they can capture image intensity changes raised by skin deformation. However, existing texture features encounter problems with albedo and lighting variations. To solve both problems, we propose a new texture feature called image ratio features. Compared with previously proposed texture features, e.g., high gradient component features, image ratio features are more robust to albedo and lighting variations. In addition, to further improve facial expression recognition accuracy based on image ratio features, we combine image ratio features with facial animation parameters (FAPs), which describe the geometric motions of facial feature points. The performance evaluation is based on the Carnegie Mellon University Cohn-Kanade database, our own database, and the Japanese Female Facial Expression database. Experimental results show that the proposed image ratio feature is more robust to albedo and lighting variations, and the combination of image ratio features and FAPs outperforms each feature alone. In addition, we study asymmetric facial expressions based on our own facial expression database and demonstrate the superior performance of our combined expression recognition system.
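
    The robustness claim rests on the idea that, under a Lambertian image model I = albedo x shading, a pixelwise ratio between an expression image and a reference image cancels the unchanged albedo. A toy sketch of that intuition follows (synthetic images; the paper's precise feature definition may differ):

    ```python
    # Toy sketch: under I = albedo * shading, dividing an expression frame by
    # a neutral frame cancels the (unchanged) per-pixel albedo, leaving only
    # the deformation-induced shading change. All images are synthetic.
    import numpy as np

    rng = np.random.default_rng(3)
    albedo = rng.uniform(0.2, 1.0, size=(64, 64))      # per-pixel skin albedo
    shade_neutral = np.full((64, 64), 0.8)
    shade_expr = shade_neutral * (1 + 0.1 * rng.standard_normal((64, 64)))

    I_neutral = albedo * shade_neutral
    I_expr = albedo * shade_expr

    eps = 1e-6                                         # avoid division by zero
    ratio = I_expr / (I_neutral + eps)                 # albedo cancels out
    print("ratio depends only on shading change:",
          np.allclose(ratio, shade_expr / shade_neutral, atol=1e-4))
    ```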

  20. Beyond Self-Report: Tools to Compare Estimated and Real-World Smartphone Use

    PubMed Central

    Andrews, Sally; Ellis, David A.; Shaw, Heather; Piwek, Lukasz

    2015-01-01

    Psychologists typically rely on self-report data when quantifying mobile phone usage, despite little evidence of its validity. In this paper we explore the accuracy of using self-reported estimates when compared with actual smartphone use. We also include source code to process and visualise these data. We compared 23 participants’ actual smartphone use over a two-week period with self-reported estimates and the Mobile Phone Problem Use Scale. Our results indicate that estimated time spent using a smartphone may be an adequate measure of use, unless a greater resolution of data is required. Estimates concerning the number of times an individual used their phone across a typical day did not correlate with actual smartphone use. Neither estimated duration nor number of uses correlated with the Mobile Phone Problem Use Scale. We conclude that estimated smartphone use should be interpreted with caution in psychological research. PMID:26509895

  1. Beyond Self-Report: Tools to Compare Estimated and Real-World Smartphone Use.

    PubMed

    Andrews, Sally; Ellis, David A; Shaw, Heather; Piwek, Lukasz

    2015-01-01

    Psychologists typically rely on self-report data when quantifying mobile phone usage, despite little evidence of its validity. In this paper we explore the accuracy of using self-reported estimates when compared with actual smartphone use. We also include source code to process and visualise these data. We compared 23 participants' actual smartphone use over a two-week period with self-reported estimates and the Mobile Phone Problem Use Scale. Our results indicate that estimated time spent using a smartphone may be an adequate measure of use, unless a greater resolution of data is required. Estimates concerning the number of times an individual used their phone across a typical day did not correlate with actual smartphone use. Neither estimated duration nor number of uses correlated with the Mobile Phone Problem Use Scale. We conclude that estimated smartphone use should be interpreted with caution in psychological research.

  2. [ACTUAL PROBLEMS OF HYGIENE SCIENCE AND PRACTICE IN THE PRESERVATION OF PUBLIC HEALTH].

    PubMed

    Onishchenko, G G

    2015-01-01

    The article outlines the current state and pressing tasks of hygiene concerning environmental pollution and its effects on the health of the population. The growing importance of chemical contamination of various environmental media (air, water, soil, and the living environment) is emphasized. An analysis of data on different types of municipal waste treatment in selected countries is presented. The significance of the developed Guidance on risk assessment for public health as a tool for making sound management decisions is shown, along with the prospects of using the methodology of epidemiological mapping based on geoinformational technology (GIS technology). An important role is noted for the younger generation of hygienists and health officers in further work on preserving and improving the health of the population in their countries and on harmonizing scientific and practical solutions to pressing problems of hygiene.

  3. An evaluation of Ada for Al applications

    NASA Technical Reports Server (NTRS)

    Wallace, David R.

    1986-01-01

    Expert system technology seems to be the most promising type of Artificial Intelligence (AI) application for Ada. An expert system implemented with an expert system shell provides a highly structured approach that fits well with the structured approach found in Ada systems. The current commercial expert system shells use Lisp. In this highly structured situation a shell could be built that used Ada just as well. On the other hand, if it is necessary to deal with some AI problems that are not suited to expert systems, the use of Ada becomes more problematic. Ada was not designed as an AI development language and is not well suited to that role. It is possible that an application developed in, say, Common Lisp could be translated to Ada for actual use in a particular application, but this could be difficult. Some standard Ada packages could be developed to make such a translation easier. If the most general AI programs need to be dealt with, a Common Lisp system integrated with the Ada environment is probably necessary. Aside from problems with language features, Ada, by itself, is not well suited to the prototyping and incremental development that is well supported by Lisp.

  4. Vehicle height and posture control of the electronic air suspension system using the hybrid system approach

    NASA Astrophysics Data System (ADS)

    Sun, Xiaoqiang; Cai, Yingfeng; Chen, Long; Liu, Yanling; Wang, Shaohua

    2016-03-01

    The electronic air suspension (EAS) system can improve ride comfort, fuel economy and handling safety of vehicles by adjusting vehicle height. This paper describes the development of a novel controller using the hybrid system approach to adjust the vehicle height (height control) and to regulate the roll and pitch angles of the vehicle body during the height adjustment process (posture control). The vehicle height adjustment system of EAS poses challenging hybrid control problems, since it features different discrete modes of operation, where each mode has an associated linear continuous-time dynamic. In this paper, we propose a novel approach to the modelling and controller design problem for the vehicle height adjustment system of EAS. The system model is described firstly in the hybrid system description language (HYSDEL) to obtain a mixed logical dynamical (MLD) hybrid model. For the resulting model, a hybrid model predictive controller is tuned to improve the vehicle height and posture tracking accuracy and to achieve direct control of the on-off statuses of the solenoid valves. The effectiveness and performance of the proposed approach are demonstrated by simulations and actual vehicle tests.
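
    The "different discrete modes, each with linear dynamics" idea behind the MLD model can be illustrated with a deliberately simplified single-state sketch (invented charge/hold/discharge dynamics, not the paper's EAS model or its predictive controller):

    ```python
    # Toy sketch: mode-switched linear height dynamics. Each discrete mode
    # has its own affine update; a naive rule picks the mode each step.
    A = {"charge": 0.95, "hold": 1.0, "discharge": 0.95}
    B = {"charge": 0.8, "hold": 0.0, "discharge": -0.6}

    def step(h, mode):
        return A[mode] * h + B[mode]      # mode-dependent linear update

    h, target = 0.0, 5.0
    for _ in range(30):
        mode = ("charge" if h < target - 0.1
                else "discharge" if h > target + 0.1
                else "hold")
        h = step(h, mode)
    print(round(h, 2))
    ```

    A hybrid MPC replaces the naive switching rule with an optimization over mode sequences, which is what the MLD formulation makes tractable.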

  5. Numerical Simulations of High-Speed Chemically Reacting Flow

    NASA Technical Reports Server (NTRS)

    Ton, V. T.; Karagozian, A. R.; Marble, F. E.; Osher, S. J.; Engquist, B. E.

    1994-01-01

    The essentially nonoscillatory (ENO) shock-capturing scheme for the solution of hyperbolic equations is extended to solve a system of coupled conservation equations governing two-dimensional, time-dependent, compressible chemically reacting flow with full chemistry. The thermodynamic properties of the mixture are modeled accurately, and stiff kinetic terms are separated from the fluid motion by a fractional step algorithm. The methodology is used to study the concept of shock-induced mixing and combustion, a process by which the interaction of a shock wave with a jet of low-density hydrogen fuel enhances mixing through streamwise vorticity generation. Test cases with and without chemical reaction are explored here. Our results indicate that, in the temperature range examined, vorticity generation as well as the distribution of atomic species do not change significantly with the introduction of a chemical reaction and subsequent heat release. The actual diffusion of hydrogen is also relatively unaffected by the reaction process. This suggests that the fluid mechanics of this problem may be successfully decoupled from the combustion processes, and that computation of the mixing problem (without combustion chemistry) can elucidate many of the important physical features of the flow.

  6. Numerical Simulations of High-Speed Chemically Reacting Flow

    NASA Technical Reports Server (NTRS)

    Ton, V. T.; Karagozian, A. R.; Marble, F. E.; Osher, S. J.; Engquist, B. E.

    1994-01-01

    The Essentially NonOscillatory (ENO) shock-capturing scheme for the solution of hyperbolic equations is extended to solve a system of coupled conservation equations governing two-dimensional, time-dependent, compressible chemically reacting flow with full chemistry. The thermodynamic properties of the mixture are modeled accurately, and stiff kinetic terms are separated from the fluid motion by a fractional step algorithm. The methodology is used to study the concept of shock-induced mixing and combustion, a process by which the interaction of a shock wave with a jet of low-density hydrogen fuel enhances mixing through streamwise vorticity generation. Test cases with and without chemical reaction are explored here. Our results indicate that, in the temperature range examined, vorticity generation as well as the distribution of atomic species do not change significantly with the introduction of a chemical reaction and subsequent heat release. The actual diffusion of hydrogen is also relatively unaffected by the reaction process. This suggests that the fluid mechanics of this problem may be successfully decoupled from the combustion processes, and that computation of the mixing problem (without combustion chemistry) can elucidate many of the important physical features of the flow.

  7. The revolution in psychiatric diagnosis: problems at the foundations.

    PubMed

    Galatzer-Levy, Isaac R; Galatzer-Levy, Robert M

    2007-01-01

    The third edition of the American Psychiatric Association's Diagnostic and Statistical Manual (DSM-III; 1974) not only revolutionized psychiatric diagnosis, it transformed and dominated American psychiatry. The nosology of psychiatry had been conceptually confusing, difficult to apply, and bound to widely questioned theories. Psychiatry and clinical psychology had been struggling with their scientific status. DSM attempted to solve psychiatry's problems by making psychiatry more like its authors' perception of general medicine. It tried to avoid theory, especially psychoanalytic theories, by discussing only observable manifestations of disorders. But DSM is actually highly theory-bound. It implicitly and powerfully includes an exclusively "medical" model of psychological disturbance, while excluding other psychiatric ideas. Its authors tried to meet what they saw as "scientific standards." To a surprising extent, DSM reflects its creators' personal distaste for psychoanalysis. The result is that DSM rests on a narrow philosophical perspective. The consequences of its adoption are widespread: it has profoundly affected drug development and other therapeutic studies, psychiatric education, attitudes toward patients, the public perception of psychiatry, and administrative and legal decisions. This article explores how DSM's most problematic features arise from its history in psychiatric controversies of the 1960s and its underlying positivistic philosophy.

  8. An optimization program based on the method of feasible directions: Theory and users guide

    NASA Technical Reports Server (NTRS)

    Belegundu, Ashok D.; Berke, Laszlo; Patnaik, Surya N.

    1994-01-01

    The theory and user instructions for an optimization code based on the method of feasible directions are presented. The code was written for wide distribution and ease of attachment to other simulation software. Although the theory of the method of feasible directions was developed in the 1960s, many considerations are involved in its actual implementation as a computer code. Included in the code are a number of features to improve robustness in optimization. The search direction is obtained by solving a quadratic program using an interior method based on Karmarkar's algorithm. The theory is discussed focusing on the important and often overlooked role played by the various parameters guiding the iterations within the program. Also discussed is a robust approach for handling infeasible starting points. The code was validated by solving a variety of structural optimization test problems that have known solutions obtained by other optimization codes. It has been observed that this code is robust: it has solved a variety of problems from different starting points. However, the code is inefficient in that it takes considerable CPU time as compared with certain other available codes. Further work is required to improve its efficiency while retaining its robustness.

  9. Using K-Nearest Neighbor Classification to Diagnose Abnormal Lung Sounds

    PubMed Central

    Chen, Chin-Hsing; Huang, Wen-Tzeng; Tan, Tan-Hsu; Chang, Cheng-Chun; Chang, Yuan-Jen

    2015-01-01

    A reported 30% of people worldwide have abnormal lung sounds, including crackles, rhonchi, and wheezes. To date, the traditional stethoscope remains the most popular tool used by physicians to diagnose such abnormal lung sounds; however, many problems arise with its use, including the effects of environmental noise, the inability to record and store lung sounds for follow-up or tracking, and the physician’s subjective diagnostic experience. This study has developed a digital stethoscope to help physicians overcome these problems when diagnosing abnormal lung sounds. In this digital system, mel-frequency cepstral coefficients (MFCCs) were used to extract the features of lung sounds, and then the K-means algorithm was used for feature clustering, to reduce the amount of data for computation. Finally, the K-nearest neighbor method was used to classify the lung sounds. The proposed system can also be used for home care: if the percentage of abnormal lung sound frames is > 30% of the whole test signal, the system can automatically warn the user to visit a physician for diagnosis. We also used bend sensors together with an amplification circuit, Bluetooth, and a microcontroller to implement a respiration detector. The respiratory signal extracted by the bend sensors can be transmitted to the computer via Bluetooth to calculate the respiratory cycle, for real-time assessment. If an abnormal status is detected, the device will warn the user automatically. Experimental results indicated that the error in respiratory cycles between measured and actual values was only 6.8%, illustrating the potential of our detector for home care applications. PMID:26053756
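
    A minimal sketch of the described pipeline (MFCC extraction, K-means compression of frames, K-nearest-neighbor classification, and the 30% warning rule) follows; synthetic signals stand in for real lung-sound recordings, and all parameter values are placeholders:

    ```python
    # Sketch: MFCC features -> K-means cluster centers per recording ->
    # KNN frame classification -> 30% abnormal-frame warning rule.
    import numpy as np
    import librosa
    from sklearn.cluster import KMeans
    from sklearn.neighbors import KNeighborsClassifier

    def mfcc_frames(y, sr=4000):
        return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T   # (frames, 13)

    rng = np.random.default_rng(4)
    sr = 4000
    train_X, train_y = [], []
    for label in (0, 1):                                       # 0=normal, 1=abnormal
        for _ in range(20):
            y = rng.standard_normal(sr * 2) * (1 + label)      # fake 2 s recordings
            frames = mfcc_frames(y, sr)
            centers = KMeans(n_clusters=4, n_init=10,
                             random_state=0).fit(frames).cluster_centers_
            train_X.extend(centers)
            train_y.extend([label] * len(centers))

    knn = KNeighborsClassifier(n_neighbors=5).fit(train_X, train_y)

    test = mfcc_frames(rng.standard_normal(sr * 2) * 2, sr)
    abnormal_fraction = knn.predict(test).mean()
    if abnormal_fraction > 0.30:                               # the 30% warning rule
        print(f"{abnormal_fraction:.0%} abnormal frames - see a physician")
    ```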

  10. Research on the Dynamic Hysteresis Loop Model of the Residence Times Difference (RTD)-Fluxgate

    PubMed Central

    Wang, Yanzhang; Wu, Shujun; Zhou, Zhijian; Cheng, Defu; Pang, Na; Wan, Yunxia

    2013-01-01

    Because of the core's hysteresis, the RTD-fluxgate core is repeatedly driven into saturation by the excitation field while working. When the fluxgate is simulated, an accurate characteristic model of the core can provide a precise simulation result. As the shape of the ideal hysteresis loop model is fixed, it cannot accurately reflect the actual dynamic behaviour of the hysteresis loop. In order to improve the fluxgate simulation accuracy, a dynamic hysteresis loop model containing parameters with actual physical meanings is proposed, based on how the permeability parameter changes while the fluxgate is working. Compared with the ideal hysteresis loop model, this model takes the dynamic features of the hysteresis loop into account, which makes the simulation results closer to the actual output. In addition, hysteresis loops of other magnetic materials can be described using the model; an amorphous magnetic material is used as the example in this manuscript. The model has been validated by comparing the output responses from experiments with the results fitted using the model. PMID:24002230
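
    The abstract does not reproduce the model's equations, so the following is only a generic parametric sketch of a direction-dependent B-H loop (an arctangent branch shifted by the coercive field), to show what a hysteresis loop model looks like in code:

    ```python
    # Generic parametric hysteresis loop, NOT the paper's model: an arctangent
    # B-H curve whose branch shifts by the coercive field Hc depending on the
    # sweep direction. Parameter values are illustrative.
    import numpy as np

    Bs, Hc, k = 0.6, 8.0, 0.05   # saturation flux density (T), coercivity (A/m), steepness

    def B(H, dH_sign):
        """Flux density on the ascending (+1) or descending (-1) branch."""
        return (2 * Bs / np.pi) * np.arctan(k * (H - dH_sign * Hc))

    H = np.linspace(-200, 200, 5)
    print("ascending :", np.round(B(H, +1), 3))
    print("descending:", np.round(B(H, -1), 3))
    ```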

  11. Affective Education: A Teacher's Manual to Promote Student Self-Actualization and Human Relations Skills.

    ERIC Educational Resources Information Center

    Snyder, Thomas R.

    This teacher's manual presents affective education as a program to promote student self-actualization and human relations skills. Abraham Maslow's hierarchy of needs and Erik Erikson's life stages of psychosocial development form the conceptual base for this program. The goals and objectives of this manual are concerned with problem-solving…

  12. Racism in the Classroom: Case Studies.

    ERIC Educational Resources Information Center

    Duhon, Gwendolyn M.

    This book presents 20 cases that address racism in one form or another. Many of the cases are from actual experience. They are intended to bring out actual or possible solutions so that student teachers, novice teachers, and seasoned teachers can find ideas for solving racist problems in their classrooms. The first part focuses on the early years,…

  13. Problems with the rush toward advanced physics in high schools

    NASA Astrophysics Data System (ADS)

    Gollub, Jerry

    2003-04-01

    The Advanced Placement (AP) Program has a major impact on the physics experience of many high school students. It affects admission to college, course choices and performance in college, and subsequent career decisions. A study committee of the National Research Council published a review of these programs in 2002, and concluded that while the program has many positive features, important problems need to be addressed. [1] The programs are not currently consistent with what we have learned about student learning from cognitive research. Students are often poorly prepared for AP courses, because of lack of coordination within schools. The Physics AP-B (non-calculus) program is too broad to allow most high school students to achieve an adequate level of conceptual understanding. Participation by minority students in these programs is far below that of other students. The AP exams need to be re-evaluated to insure that they actually measure conceptual understanding and complex reasoning. The AP exams are sometimes used inappropriately to rate teachers or schools. College and high school courses are poorly coordinated, with the result that students often take an introductory physics survey as many as three times. Policies on college credit for AP courses differ widely. These problems cannot be fixed by the College Board alone. [1] Jerry P. Gollub and Robin Spital, "Advanced Physics in the High Schools", Physics Today, May 2002.

  14. [Starving in childhood and diabetes mellitus in elderly age].

    PubMed

    Khoroshinina, L P; Zhavoronkova, N V

    2008-01-01

    The long-term consequences of protracted starvation or inadequate nutrition in childhood is a problem in which considerable interest has been shown in recent decades. Between June 1941 and January 1944 the civilian population of Leningrad was besieged for two and a half years. The non-combatant population of this large European city lived through lengthy periods of starvation or malnutrition against a background of additional complex stress factors (including cold, bombing, death of relatives and acquaintances, and lack of means of transport and communication). It may be assumed that the health in adulthood of those who were children and young people in Leningrad during the siege differed from that of people of the same age who were spared those extreme conditions. The impact of starvation in childhood on the prevalence of diabetes mellitus in elderly age, its time of onset, and the clinical features of the disease course were studied. The results confirm that insulin-independent diabetes without obesity develops more often and earlier in women who lived through the Siege of Leningrad in their childhood. The health status of elderly people who underwent continuous starvation in their childhood is a pressing problem: the health status of young people in this country who lived through the 1990s, when one in three children under the age of 2 starved, points to impending medical and social problems arising from forthcoming changes in the illness patterns of the population of modern Russia.

  15. Actual Romanian research in post-newtonian dynamics

    NASA Astrophysics Data System (ADS)

    Mioc, V.; Stavinschi, M.

    2007-05-01

    We survey the recent Romanian results in the study of the two-body problem in post-Newtonian fields. Such a field is characterized, in general, by a potential of the form U(q) = |q|^{-1} + additional terms (small, but not necessarily so). We distinguish some classes of post-Newtonian models: relativistic (Schwarzschild, Fock, Einstein PN, Reissner-Nordström, Schwarzschild - de Sitter, etc.) and nonrelativistic (Manev, Mücket-Treder, Seeliger, gravito-elastic, etc.). Generalized models (the zonal-satellite problem, quasihomogeneous fields), as well as special cases (anisotropic Manev-type and Schwarzschild-type models, the Popovici or Popovici-Manev photogravitational problem), were also tackled. The methods used in such studies are various: analytical (using mainly the theory of perturbations, but also other theories: functions of a complex variable, variational calculus, etc.), geometric (a qualitative approach via the theory of dynamical systems), and numerical (especially using the Poincaré-section technique). The areas of interest and the general results obtained focus on: exact or approximate analytical solutions; characteristics of local flows (especially at limit situations: collision and escape); quasiperiodic and periodic orbits; equilibria; symmetries; chaoticity; geometric description of the global flow (and physical interpretation of the phase-space structure). We emphasize some special features, which cannot be met within the Newtonian framework: the black-hole effect, oscillatory collisions, radial librations, bounded orbits for nonnegative energy, the existence of unstable circular motion (or unstable rest), symmetric periodic orbits within anisotropic models, etc.
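
    For concreteness, the Manev-type and Schwarzschild-type potentials named above are conventionally written as a Newtonian term plus a small correction (a standard textbook form, with constants A, B > 0 as usually denoted; this notation is not taken from the survey itself):

    ```latex
    % Manev-type potential: Newtonian term plus an inverse-square correction.
    U_{\mathrm{Manev}}(q) = \frac{A}{|q|} + \frac{B}{|q|^{2}},
    \qquad
    % Schwarzschild-type potential: Newtonian term plus an inverse-cube correction.
    U_{\mathrm{Schw}}(q) = \frac{A}{|q|} + \frac{B}{|q|^{3}},
    \qquad A, B > 0.
    ```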

  16. Geometric correction of satellite data using curvilinear features and virtual control points

    NASA Technical Reports Server (NTRS)

    Algazi, V. R.; Ford, G. E.; Meyer, D. I.

    1979-01-01

    A simple, yet effective procedure for the geometric correction of partial Landsat scenes is described. The procedure is based on the acquisition of actual and virtual control points from the line printer output of enhanced curvilinear features. The accuracy of this method compares favorably with that of the conventional approach in which an interactive image display system is employed.

  17. Features of a SINDA/FLUINT model of a liquid oxygen supply line

    NASA Astrophysics Data System (ADS)

    Simmonds, Boris G.

    1993-11-01

    The modeling features used in a steady-state heat transfer problem using SINDA/FLUINT are described. The problem modeled is a 125-foot-long, 3-inch-diameter pipe, filled with liquid oxygen flow driven by a given pressure gradient. The pipe is fully insulated in five sections: three sections of 1-inch-thick spray-on foam and two sections of vacuum jacket. The model evaluates friction, turn losses, and convection heat transfer between the fluid and the pipe wall. There is conduction through the foam insulation with temperature-dependent thermal conductivity. The vacuum space is modeled with radiation and gas molecular conduction, if present, in the annular gap. Heat is transferred between the outer surface and the surrounding ambient by natural convection and radiation, and by axial conduction along the pipe and through the vacuum jacket spacers and welded seal flanges. The model makes extensive use of SINDA/FLUINT basic capabilities such as the GEN option for nodes and conductors (to generate groups of nodes or conductors), the SIV option (to generate single, temperature-varying conductors), the SIM option (for multiple, temperature-varying conductors) and the M HX macros for fluids (to generate strings of lumps, paths, and ties representing a diabatic duct). It calls subroutine CONTRN (which returns the relative location in the G-array of a network conductor, given an actual conductor number), enabling extensive manipulation of conductors (calculating and assigning their values) with DO loops. Models like this illustrate, to new and even experienced SINDA/FLUINT users, features of the program that are not so obvious or well known, and that are extremely handy when trying both to take advantage of the automation of the DATA headers and to make surgical modifications to specific parameters of the thermal or fluid elements in the OPERATIONS portion of the model.

  18. Features of a SINDA/FLUINT model of a liquid oxygen supply line

    NASA Technical Reports Server (NTRS)

    Simmonds, Boris G.

    1993-01-01

    The modeling features used in a steady-state heat transfer problem using SINDA/FLUINT are described. The problem modeled is a 125-foot-long, 3-inch-diameter pipe, filled with liquid oxygen flow driven by a given pressure gradient. The pipe is fully insulated in five sections: three sections of 1-inch-thick spray-on foam and two sections of vacuum jacket. The model evaluates friction, turn losses, and convection heat transfer between the fluid and the pipe wall. There is conduction through the foam insulation with temperature-dependent thermal conductivity. The vacuum space is modeled with radiation and gas molecular conduction, if present, in the annular gap. Heat is transferred between the outer surface and the surrounding ambient by natural convection and radiation, and by axial conduction along the pipe and through the vacuum jacket spacers and welded seal flanges. The model makes extensive use of SINDA/FLUINT basic capabilities such as the GEN option for nodes and conductors (to generate groups of nodes or conductors), the SIV option (to generate single, temperature-varying conductors), the SIM option (for multiple, temperature-varying conductors) and the M HX macros for fluids (to generate strings of lumps, paths, and ties representing a diabatic duct). It calls subroutine CONTRN (which returns the relative location in the G-array of a network conductor, given an actual conductor number), enabling extensive manipulation of conductors (calculating and assigning their values) with DO loops. Models like this illustrate, to new and even experienced SINDA/FLUINT users, features of the program that are not so obvious or well known, and that are extremely handy when trying both to take advantage of the automation of the DATA headers and to make surgical modifications to specific parameters of the thermal or fluid elements in the OPERATIONS portion of the model.

  19. Too upset to think: the interplay of borderline personality features, negative emotions, and social problem solving in the laboratory.

    PubMed

    Dixon-Gordon, Katherine L; Chapman, Alexander L; Lovasz, Nathalie; Walters, Kris

    2011-10-01

    Borderline personality disorder (BPD) is associated with poor social problem solving and problems with emotion regulation. In this study, the social problem-solving performance of undergraduates with high (n = 26), mid (n = 32), or low (n = 29) levels of BPD features was assessed with the Social Problem-Solving Inventory-Revised and using the means-ends problem-solving procedure before and after a social rejection stressor. The high-BP group, but not the low-BP group, showed a significant reduction in relevant solutions to social problems and more inappropriate solutions following the negative emotion induction. Increases in self-reported negative emotions during the emotion induction mediated the relationship between BP features and reductions in social problem-solving performance. In addition, the high-BP group demonstrated trait deficits in social problem solving on the Social Problem-Solving Inventory-Revised. These findings suggest that future research must examine social problem solving under differing emotional conditions, and that clinical interventions to improve social problem solving among persons with BP features should focus on responses to emotional contexts.

  20. Information Processing Theory of Human Performance and Related Research.

    DTIC Science & Technology

    1979-05-01

    features are analyzed or compared at one or more times. Excellent reviews are available (LaBerge, 1976; Sutherland, 1973). Without belaboring the issue, the... We propose then that it is not absolute values which are "features," but relative values, and more specifically, based on the work of Stevens (Stevens, 1975a, 1975b; Stevens & Galanter, 1957) and his colleagues, that a feature is a ratio of actual stimulation to an identifiable absolute value on a

  1. Music psychopathology. V. Objective features of instrumental performance and psychopathology.

    PubMed

    Steinberg, R; Fani, M; Raith, L

    1992-01-01

    Mental disease systematically impairs musical expression according to nosologic classification. This was demonstrated with a polarity profile of the instrumental performances of 60 inpatients and 14 controls matched for musical aptitude. Objective performance characteristics such as irregularities and playing faults were analyzed too. No meaningful correlation between these features and psychopathology resulted. This indicates that even in severe psychopathologic alterations performance features, which depend mainly on education and actual training, are not altered in a systematic manner, in contrast to expressive qualities.

  2. Comparison of Genetic Algorithm, Particle Swarm Optimization and Biogeography-based Optimization for Feature Selection to Classify Clusters of Microcalcifications

    NASA Astrophysics Data System (ADS)

    Khehra, Baljit Singh; Pharwaha, Amar Partap Singh

    2017-04-01

    Ductal carcinoma in situ (DCIS) is one type of breast cancer. Clusters of microcalcifications (MCCs) are symptoms of DCIS that are recognized by mammography. Selection of a robust feature vector is the process of selecting an optimal subset of features from a large number of available features in a given problem domain, after feature extraction and before any classification scheme. Feature selection reduces the feature space, which improves the performance of the classifier and decreases the computational burden imposed by using many features. Selecting an optimal subset of features from a large number of available features is a difficult search problem: for n features, the total number of possible subsets is 2^n, so the selection of an optimal subset of features belongs to the category of NP-hard problems. In this paper, an attempt is made to find the optimal subset of MCCs features from all possible subsets of features using a genetic algorithm (GA), particle swarm optimization (PSO) and biogeography-based optimization (BBO). For simulation, a total of 380 benign and malignant MCCs samples have been selected from mammogram images of the DDSM database. A total of 50 features extracted from benign and malignant MCCs samples are used in this study. In these algorithms, the fitness function is the correct classification rate of the classifier. A support vector machine is used as the classifier. From the experimental results, it is observed that the performance of the PSO-based and BBO-based algorithms in selecting an optimal subset of features for classifying MCCs as benign or malignant is better than that of the GA-based algorithm.
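
    A compact sketch of one of the three approaches, GA-based feature selection with the classifier's correct classification rate as the fitness, is given below; the data set, population size, and operators are simplified placeholders (synthetic data rather than DDSM features):

    ```python
    # Sketch: binary-mask GA over 50 features; fitness = cross-validated
    # SVM classification rate on the selected subset. All data synthetic.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    rng = np.random.default_rng(5)
    X, y = make_classification(n_samples=200, n_features=50,
                               n_informative=8, random_state=0)

    def fitness(mask):
        if not mask.any():
            return 0.0
        return cross_val_score(SVC(), X[:, mask], y, cv=3).mean()

    pop = rng.random((20, 50)) < 0.5                           # 20 random bit-masks
    for _ in range(10):                                        # a few generations
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(scores)[-10:]]                # truncation selection
        cut = rng.integers(1, 49, size=10)
        children = np.array([np.concatenate((parents[i][:c], parents[(i + 1) % 10][c:]))
                             for i, c in enumerate(cut)])      # one-point crossover
        children ^= rng.random(children.shape) < 0.02          # bit-flip mutation
        pop = np.vstack((parents, children))

    best = pop[np.argmax([fitness(m) for m in pop])]
    print("selected features:", np.flatnonzero(best))
    ```

    PSO and BBO variants differ only in how the candidate masks are updated between generations; the fitness evaluation is identical.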

  3. Long Live Traditional Textbook Problems!?--Constraints on Faculty Use of Research-Based Problems in Introductory Courses

    ERIC Educational Resources Information Center

    Ding, Lin

    2014-01-01

    Though many research-based problem types have been shown effective in promoting students' conceptual understanding and scientific abilities, the extent of their use in actual classrooms remains unclear. We interviewed and surveyed 16 physics and engineering faculty members at a large US Midwest research university to investigate how university…

  4. Honor Code/Code of Conduct in International Institutions of Higher Education

    ERIC Educational Resources Information Center

    Alahmad, Ala'

    2013-01-01

    In today's society, students are faced with many ethical decisions about which they are uncertain. Unfortunately, many of these problems are rooted not only in their academic lives, but also in the workplace. These problems stem from a lack of knowledge concerning decision-making. This problem presents an actual global dilemma. Codifying ethics in…

  5. Dissociable Stages of Problem Solving (II): First Evidence for Process-Contingent Temporal Order of Activation in Dorsolateral Prefrontal Cortex

    ERIC Educational Resources Information Center

    Ruh, Nina; Rahm, Benjamin; Unterrainer, Josef M.; Weiller, Cornelius; Kaller, Christoph P.

    2012-01-01

    In a companion study, eye-movement analyses in the Tower of London task (TOL) revealed independent indicators of functionally separable cognitive processes during problem solving, with processes of building up an internal representation of the problem preceding actual planning processes. These results imply that processes of internalization and…

  6. A survey of current operational problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prince, W.R.; Nielsen, E.K.; McNair, H.D.

    1989-11-01

    This paper is prepared for use in the Working Group on Current Operational Problems (COPS) forums with the goal of focusing attention of the industry on problems faced by those who are involved in actual power system operation. The results of a survey on operational problems are presented in this paper. Statistical information compiled for various categories of operational problems is given with some general observations about the results. A rough comparison is made from the results of this survey and the first COPS problem list of 1976.

  7. [Special features of actual nutrition and nutritional status of children living in the Yamal-Nenets Autonomous District].

    PubMed

    Istomin, A V; Iudina, T V; Mikhaĭlov, I G; Raengulov, B M

    2000-01-01

    The actual nutrition of children living at a boarding school of the Yamal-Nenets Autonomous District, as well as characteristics of their health such as working capacity, vitamin metabolism, physical development and the activity of bio-oxidant systems of the organism, have been studied. The results obtained have become the basis for developing scientifically substantiated principles for creating balanced nutrition rations with antioxidant properties for children living in the Far North.

  8. Nuclear-physical analysis methods in medical geology: Assessment of the impact of environmental factors on human health

    NASA Astrophysics Data System (ADS)

    Gorbunov, A. V.; Lyapunov, S. M.; Okina, O. I.; Frontas'eva, M. V.; Pavlov, S. S.; Il'chenko, I. N.

    2015-05-01

    The procedure for geomedical studies is outlined, and the niche occupied by the nuclear-physical analysis methods in these studies is pointed out. The necessity of construction of an efficient complex of the most modern analytical methods is demonstrated. The metrological parameters of methods applied in the analysis of natural environments and biological materials are evaluated. The current state of pollution of natural environments with heavy and toxic metals is characterized in two specific industrial hubs: the towns of Gus-Khrustalny and Podolsk. The levels of pollution of diagnostic biological materials (hair and blood) from children living in various urban districts are studied in the light of specific features of the manufacturing industry in these towns and the life environment of child population. The results of studies focused on evaluating the effect of environment on the health of child population are detailed. The actual damage to child health, their neuropsychic development and behavior, and the effect of socioeconomic factors are determined. Preventive problems among the child population exposed to lead and other toxic metals are evaluated, and ways to solve them are proposed. A system of early diagnosis and preventive measures for the mitigation of adverse effect of toxic metals (Pb, Cu, Mn, Zn, Cr, Ni, As, etc.) on the neuropsychic development of children is developed based on an actual ecogeochemical estimation of the state of the region under study.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Evan

    There exist hundreds of building energy software tools, both web- and disk-based. These tools exhibit considerable range in approach and creativity, with some being highly specialized and others able to consider the building as a whole. However, users are faced with a dizzying array of choices and, often, conflicting results. The fragmentation of development and deployment efforts has hampered tool quality and market penetration. The purpose of this review is to provide information for defining the desired characteristics of residential energy tools, and to encourage future tool development that improves on current practice. This project entails (1) creating a framework for describing possible technical and functional characteristics of such tools, (2) mapping existing tools onto this framework, (3) exploring issues of tool accuracy, and (4) identifying "best practice" and strategic opportunities for tool design. We evaluated 50 web-based residential calculators, 21 of which we regard as "whole-house" tools (i.e., covering a range of end uses). Of the whole-house tools, 13 provide open-ended energy calculations, 5 normalize the results to actual costs (a.k.a. "bill-disaggregation tools"), and 3 provide both options. Across the whole-house tools, we found a range of 5 to 58 house-descriptive features (out of 68 identified in our framework) and 2 to 41 analytical and decision-support features (55 possible). We also evaluated 15 disk-based residential calculators, six of which are whole-house tools. Of these tools, 11 provide open-ended calculations, 1 normalizes the results to actual costs, and 3 provide both options. These tools offered ranges of 18 to 58 technical features (70 possible) and 10 to 40 user- and decision-support features (56 possible). The comparison shows that such tools can employ many approaches and levels of detail. Some tools require a relatively small number of well-considered inputs while others ask a myriad of questions and still miss key issues. The value of detail has a lot to do with the type of question(s) being asked by the user (e.g., the availability of dozens of miscellaneous appliances is immaterial for a user attempting to evaluate the potential for space-heating savings by installing a new furnace). More detail does not, according to our evaluation, automatically translate into a "better" or "more accurate" tool. Efforts to quantify and compare the "accuracy" of these tools are difficult at best, and prior tool-comparison studies have not undertaken this in a meaningful way. The ability to evaluate accuracy is inherently limited by the availability of measured data. Furthermore, certain tool outputs can only be measured against "actual" values that are themselves calculated (e.g., HVAC sizing), while others are rarely if ever available (e.g., measured energy use or savings for specific measures). Similarly challenging is understanding the sources of inaccuracies. There are many ways in which quantitative errors can occur in tools, ranging from programming errors to problems inherent in a tool's design. Due to hidden assumptions and non-variable "defaults", most tools cannot be fully tested across the desirable range of building configurations, operating conditions, weather locations, etc. Many factors conspire to confound performance comparisons among tools. Differences in inputs can range from weather city, to types of HVAC systems, to appliance characteristics, to occupant-driven effects such as thermostat management. Differences in results would thus no doubt emerge from an extensive comparative exercise, but the sources or implications of these differences for the purposes of accuracy evaluation or tool development would remain largely unidentifiable (especially given the paucity of technical documentation available for most tools). For the tools that we tested, the predicted energy bills for a single test building ranged widely (by nearly a factor of three), and far more so at the end-use level. Most tools over-predicted energy bills and all over-predicted consumption. Variability was lower among disk-based tools, but they more significantly over-predicted actual use. The deviations (over-predictions) we observed from actual bills corresponded to up to $1400 per year (approx. 250 percent of the actual bills). For bill-disaggregation tools, wherein the results are forced to equal actual bills, the accuracy issue shifts to whether or not the total is properly attributed to the various end uses and to whether savings calculations are done accurately (a challenge that demands relatively rare end-use data). Here, too, we observed a number of dubious results. Energy savings estimates automatically generated by the web-based tools varied from $46/year (5 percent of predicted use) to $625/year (52 percent of predicted use).

  10. UPJ obstruction

    MedlinePlus

    ... ureter As a result, urine builds up and damages kidney. In older children and adults, the problem may be due to ... birth may actually improve on its own. Most children do well and have no long-term problems. Serious damage may occur in people who are diagnosed later ...

  11. Adaptable Constrained Genetic Programming: Extensions and Applications

    NASA Technical Reports Server (NTRS)

    Janikow, Cezary Z.

    2005-01-01

    An evolutionary algorithm applies evolution-based principles to problem solving. To solve a problem, the user defines the space of potential solutions, the representation space. Sample solutions are encoded in a chromosome-like structure. The algorithm maintains a population of such samples, which undergo simulated evolution by means of mutation, crossover, and survival-of-the-fittest principles. Genetic Programming (GP) uses tree-like chromosomes, providing a very rich representation suitable for many problems of interest. GP has been successfully applied to a number of practical problems such as learning Boolean functions and designing hardware circuits. To apply GP to a problem, the user needs to define the actual representation space by defining the atomic functions and terminals labeling the actual trees. The sufficiency principle requires that the label set be sufficient to build the desired solution trees. The closure principle allows the labels to mix in any arity-consistent manner. To satisfy both principles, the user is often forced to provide a large label set, with ad hoc interpretations or penalties to deal with undesired local contexts. This unfortunately enlarges the actual representation space, and thus usually slows down the search. In the past few years, three different methodologies have been proposed to allow the user to alleviate the closure principle by providing means to define, and to process, constraints on mixing the labels in the trees. Last summer we proposed a new methodology to further alleviate the problem by discovering local heuristics for building quality solution trees. A pilot system was implemented last summer and tested throughout the year. This summer we have implemented a new revision and produced a User's Manual so that the pilot system can be made available to other practitioners and researchers. We have also designed, and partly implemented, a larger system capable of dealing with much more powerful heuristics.
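
    To illustrate the closure problem and what a mixing constraint looks like in practice, here is a toy sketch (invented label set and rules, not the system described above) that grows random Boolean-expression trees while rejecting label combinations a constraint table forbids:

    ```python
    # Toy sketch: random GP tree growth from a label set, with a constraint
    # table restricting which children each function may take (instead of
    # GP's default "mix anything arity-consistent" closure).
    import random

    random.seed(6)
    FUNCTIONS = {"and": 2, "or": 2, "not": 1}        # label -> arity
    TERMINALS = ["x1", "x2", "true"]
    ALLOWED = {"not": {"and", "or", "x1", "x2"}}     # e.g. forbid not(not(.)), not(true)

    def grow(depth):
        if depth == 0 or random.random() < 0.3:
            return random.choice(TERMINALS)
        label = random.choice(list(FUNCTIONS))
        kids = []
        while len(kids) < FUNCTIONS[label]:
            child = grow(depth - 1)
            head = child if isinstance(child, str) else child[0]
            if label in ALLOWED and head not in ALLOWED[label]:
                continue                             # constraint violated: retry
            kids.append(child)
        return (label, *kids)

    print(grow(3))
    ```

    Without the constraint table, closure forces the user either to accept trees like not(not(x1)) or to penalize them after the fact, which is exactly the representation-space bloat the abstract describes.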

  12. A two-view ultrasound CAD system for spina bifida detection using Zernike features

    NASA Astrophysics Data System (ADS)

    Konur, Umut; Gürgen, Fikret; Varol, Füsun

    2011-03-01

    In this work, we address a very specific CAD (Computer Aided Detection/Diagnosis) problem and try to detect one of the relatively common birth defects, spina bifida, in the prenatal period. To do this, fetal ultrasound images are used as the input imaging modality, which is the most convenient so far. Our approach is to decide using two particular types of views of the fetal neural tube. Transcerebellar head (i.e. brain) and transverse (axial) spine images are processed to extract features, which are then used to classify healthy (normal), suspicious (probably defective) and non-decidable cases. Decisions raised by two independent classifiers may be individually treated, or if desired and data related to both modalities are available, those decisions can be combined to keep matters more secure. Even more security can be attained by using more than two modalities and basing the final decision on all those potential classifiers. Our current system relies on feature extraction from images for individual cases (particular patients). The first step is image preprocessing and segmentation to get rid of useless image pixels and represent the input in a more compact domain, which is hopefully more representative for good classification performance. Next, a particular type of feature extraction, which uses Zernike moments computed on either B/W or gray-scale image segments, is performed. The aim here is to obtain values for indicative markers that signal the presence of spina bifida. Markers differ depending on the image modality being used. Either shape or texture information captured by moments may propose useful features. Finally, SVM is used to train classifiers to be used as decision makers. Our experimental results show that a promising CAD system can be realized for this specific purpose. On the other hand, the performance of such a system would highly depend on the qualities of image preprocessing, segmentation, feature extraction and the comprehensiveness of image data.
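
    A minimal sketch of the feature-plus-classifier step, using mahotas' Zernike moment implementation on binary patches and an SVM, is shown below; the patches and labels are synthetic stand-ins for segmented ultrasound regions:

    ```python
    # Sketch: Zernike moment features on segmented B/W patches -> SVM.
    # Images and labels are synthetic; mahotas is one common implementation
    # of Zernike moments, not necessarily the one used in the paper.
    import numpy as np
    import mahotas
    from sklearn.svm import SVC

    rng = np.random.default_rng(7)

    def zernike(patch, radius=32, degree=8):
        return mahotas.features.zernike_moments(patch, radius, degree=degree)

    # Fake segmented patches: 0=normal, 1=suspicious.
    X, y = [], []
    for label in (0, 1):
        for _ in range(15):
            patch = (rng.random((64, 64)) > (0.5 + 0.2 * label)).astype(np.uint8)
            X.append(zernike(patch))
            y.append(label)

    clf = SVC(kernel="rbf").fit(X, y)
    test = (rng.random((64, 64)) > 0.7).astype(np.uint8)
    print("prediction:", clf.predict([zernike(test)])[0])
    ```

    Zernike moments are rotation-invariant in magnitude, which is one reason they are attractive for fetal views whose orientation varies from scan to scan.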

  13. Factors influencing perceptions of need for and decisions to solicit child mental health services by parents of 9-12 year-old Korean children.

    PubMed

    Cho, Sun-Mi; Kim, Hyun-Chung; Cho, Hyun; Shin, Yun-Mi

    2007-12-01

    As children with emotional or behavioral problems often fail to receive the treatment available to them, this study examined (1) the degree of perceived need (PN) among Korean parents regarding mental health services for their children, (2) the factors associated with such perceptions, (3) the degree to which Korean parents actually engage mental health services for their children, and (4) the factors associated with such use. To determine the degrees of PN and actual use, 1,058 children aged between 9 and 12 years were asked to complete the Children's Depression Inventory, while their parents completed the Child Behavior Checklist. About 11.4% of the parents demonstrated PN, compared to 2.7% who actually engaged child mental health services. While most of the CBCL factors were associated with PN, the child's self-report significantly affected the perception as well. The attention problem score in the CBCL was the only factor that strongly corresponded to the actual use of services in Korea, a country where academic achievement is considered paramount, which suggests that cultural forces may play a powerful role in determining parents' decisions regarding child mental health care.

  14. Overcoming computational uncertainties to reveal chemical sensitivity in single molecule conduction calculations.

    PubMed

    Solomon, Gemma C; Reimers, Jeffrey R; Hush, Noel S

    2005-06-08

    In the calculation of conduction through single molecules, approximations about the geometry and electronic structure of the system are usually made in order to simplify the problem. Previously [G. C. Solomon, J. R. Reimers, and N. S. Hush, J. Chem. Phys. 121, 6615 (2004)], we have shown that, in calculations employing cluster models for the electrodes, proper treatment of the open-shell nature of the clusters is the most important computational feature required to make the results sensitive to variations in the structural and chemical features of the system. Here, we expand this and establish a general hierarchy of requirements involving treatment of geometrical approximations. These approximations are categorized into two classes: those associated with finite-dimensional methods for representing the semi-infinite electrodes, and those associated with the chemisorption topology. We show that ca. 100 unique atoms are required in order to properly characterize each electrode: using fewer atoms leads to nonsystematic variations in conductivity that can overwhelm the subtler changes. The choice of binding site is shown to be the next most important feature, while some effects that are difficult to control experimentally concerning the orientations at each binding site are actually shown to be insignificant. Verification of this result provides a general test for the precision of computational procedures for molecular conductivity. Predictions concerning the dependence of conduction on substituent and other effects on the central molecule are found to be meaningful only when they exceed the uncertainties of the effects associated with binding-site variation.

  15. Overcoming computational uncertainties to reveal chemical sensitivity in single molecule conduction calculations

    NASA Astrophysics Data System (ADS)

    Solomon, Gemma C.; Reimers, Jeffrey R.; Hush, Noel S.

    2005-06-01

    In the calculation of conduction through single molecules, approximations about the geometry and electronic structure of the system are usually made in order to simplify the problem. Previously [G. C. Solomon, J. R. Reimers, and N. S. Hush, J. Chem. Phys. 121, 6615 (2004)], we have shown that, in calculations employing cluster models for the electrodes, proper treatment of the open-shell nature of the clusters is the most important computational feature required to make the results sensitive to variations in the structural and chemical features of the system. Here, we expand this and establish a general hierarchy of requirements involving treatment of geometrical approximations. These approximations are categorized into two classes: those associated with finite-dimensional methods for representing the semi-infinite electrodes, and those associated with the chemisorption topology. We show that ca. 100 unique atoms are required in order to properly characterize each electrode: using fewer atoms leads to nonsystematic variations in conductivity that can overwhelm the subtler changes. The choice of binding site is shown to be the next most important feature, while some effects that are difficult to control experimentally concerning the orientations at each binding site are actually shown to be insignificant. Verification of this result provides a general test for the precision of computational procedures for molecular conductivity. Predictions concerning the dependence of conduction on substituent and other effects on the central molecule are found to be meaningful only when they exceed the uncertainties of the effects associated with binding-site variation.

  16. Students' Environmental Competence Formation as a Pedagogical Problem

    ERIC Educational Resources Information Center

    Ponomarenko, Yelena V.; Yessaliev, Aidarbek A.; Kenzhebekova, Rabiga I.; Moldabek, Kulahmet; Larchekova, Liudmila A.; Dairbekov, Serik S.; Asambaeva, Lazzat

    2016-01-01

    The preparation of environmentally conscious and competent professionals in the higher education system of Kazakhstan is a priority. The need for more effective formation of environmental competence in students makes urgent the problem of developing and scientifically substantiating a theoretical model of students' environmental competence, methods of…

  17. Do word-problem features differentially affect problem difficulty as a function of students' mathematics difficulty with and without reading difficulty?

    PubMed

    Powell, Sarah R; Fuchs, Lynn S; Fuchs, Douglas; Cirino, Paul T; Fletcher, Jack M

    2009-01-01

    This study examined whether and, if so, how word-problem features differentially affect problem difficulty as a function of mathematics difficulty (MD) status: no MD (n = 109), MD only (n = 109), or MD in combination with reading difficulties (MDRD; n = 109). The problem features were problem type (total, difference, or change) and position of missing information in the number sentence representing the word problem (first, second, or third position). Students were assessed on 14 word problems near the beginning of third grade. Consistent with the hypothesis that mathematical cognition differs as a function of MD subtype, problem type affected problem difficulty differentially for MDRD versus MD-only students; however, the position of missing information in word problems did not. Implications for MD subtyping and for instruction are discussed.

  18. Examining End-Of-Chapter Problems Across Editions of an Introductory Calculus-Based Physics Textbook

    NASA Astrophysics Data System (ADS)

    Xiao, Bin

    End-Of-Chapter (EOC) problems have been part of many physics education studies. Typically, only problems "localized" as relevant to a single chapter were used. This work examines how well this type of problem represents all EOC problems and whether EOC problems found in leading textbooks have changed over the past several decades. To investigate whether EOC problems have connections between chapters, I solved all problems of the E&M chapters of the most recent edition of a popular introductory-level calculus-based textbook and coded the equations used to solve each problem. These results were compared to the first edition of the same text. Also, several relevant problem features were coded for those problems, and results were compared for sample chapters across all editions. My findings include two parts. The equation-usage results show that problems in the E&M chapters do use equations from both other E&M chapters and non-E&M chapters, and this out-of-chapter usage increased from the first edition to the last. Information about the knowledge structure of the E&M chapters was also revealed. The problem-feature results show that most EOC problems share common features, but the diversity of some problem features increased across editions.

  19. Science Fairs and Observational Science: A Case History from Earth Orbit

    NASA Technical Reports Server (NTRS)

    Lowman, Paul D., Jr.; Smith, David E. (Technical Monitor)

    2002-01-01

    Having judged dozens of science fairs over the years, I am repeatedly disturbed by the ground rules under which students must prepare their entries. They are almost invariably required to follow the "scientific method," involving formulating a hypothesis, devising a test of the hypothesis, and then carrying out a project in which this test is performed. As a research scientist for over 40 years, I consider this approach to science fairs fundamentally unsound. It is not only too restrictive, but actually avoids the most important (and difficult) part of scientific research: recognizing a scientific problem in the first place. A well-known example is one of the problems that, by his own account, stimulated Einstein's theory of special relativity: the obvious fact that when an electric current is induced in a conductor by a magnetic field, it makes no difference whether the field or the conductor is actually (so to speak) moving. There is, in other words, no such thing as absolute motion. Physics was transformed by Einstein's recognition of a problem. Most competent scientists can solve problems after they have been recognized and a hypothesis properly formulated, but the ability to find problems in the first place is much rarer. Getting down to specifics, the "scientific method" under which almost all students must operate is actually the experimental method, involving controlled variables, one of which, ideally, is changed at a time. However, there is another type of science that can be called observational science. As it happens, almost all the space research I have carried out since 1959 has been of this type, not experimental science.

  20. On the Problems Existed in Chinese Art Education and the Way Out

    ERIC Educational Resources Information Center

    Yue, Youxi

    2009-01-01

    Chinese art education today has four main problems: the first is the reduction of art education to mere skills training; the second, its moralization; the third, its mechanization; the fourth, its marginalization. The root of the problems has two aspects: first, the current state of the education system affects…

  1. Do Children Do What They Say? Responses to Hypothetical and Real-Life Social Problems in Children with Mild Intellectual Disabilities and Behaviour Problems

    ERIC Educational Resources Information Center

    van Nieuwenhuijzen, M.; Bijman, E. R.; Lamberix, I. C. W.; Wijnroks, L.; de Castro, B. Orobio; Vermeer, A.; Matthys, W.

    2005-01-01

    Background: Most research on children's social problem-solving skills is based on responses to hypothetical vignettes. Just how these responses relate to actual behaviour in real-life social situations is, however, unclear, particularly for children with mild intellectual disabilities (MID). Method: In the present study, the spontaneous…

  2. Seeing around a Ball: Complex, Technology-Based Problems in Calculus with Applications in Science and Engineering-Redux

    ERIC Educational Resources Information Center

    Winkel, Brian

    2008-01-01

    A complex technology-based problem in visualization and computation for students in calculus is presented. Strategies are shown for its solution and the opportunities for students to put together sequences of concepts and skills to build for success are highlighted. The problem itself involves placing an object under water in order to actually see…

  3. Do Black Families Value Education? White Teachers, Institutional Cultural Narratives, & Beliefs about African Americans

    ERIC Educational Resources Information Center

    Puchner, Laurel; Markowitz, Linda

    2015-01-01

    In this article Puchner and Markowitz illustrate a major problem in education and in teacher education, the underlying dynamics of which are a national problem. The problem of negative beliefs about African American families in schools is not a new idea but actually stems from unfounded and untested assumptions about the way the world works and…

  4. Children's Problem-Solving in Serious Games: The "Fine-Tuning System (FTS)" Elaborated

    ERIC Educational Resources Information Center

    Obikwelu, Chinedu; Read, Janet; Sim, Gavin

    2013-01-01

    For a child to learn through problem-solving in serious games, the game's scaffolding mechanism has to be effective. Scaffolding is based on the Vygotskian Zone of Proximal Development (ZPD) concept, which refers to the distance between the actual development level as determined by independent problem solving and the level of potential development as…

  5. [The genetics of collagen diseases].

    PubMed

    Kaplan, J; Maroteaux, P; Frezal, J

    1986-01-01

    Heritable disorders of collagen include Ehlers-Danlos syndromes (11 types are currently known), Larsen syndrome and osteogenesis imperfecta. Their clinical, genetic and biochemical features are reviewed. Marfan syndrome is closely related to heritable disorders of collagen.

  6. Tribute to an Astronomer: The Work of Max Ernst on Wilhelm Tempel

    NASA Astrophysics Data System (ADS)

    Nazé, Yaël

    2016-05-01

    Between 1964 and 1974, the German artist Max Ernst created, with the help of two friends, a series of works (books, a movie, and paintings) related to the astronomer Wilhelm Tempel. Mixing actual texts by Tempel with artistic features, this series pays homage to the astronomer by recalling his life and discoveries. Moreover, the core of the project, the book Maximiliana or the Illegal Practice of Astronomy, actually depicts the way science works, making this work of art a most original tribute to a scientist.

  7. Analysis of intracerebral EEG recordings of epileptic spikes: insights from a neural network model

    PubMed Central

    Demont-Guignard, Sophie; Benquet, Pascal; Gerber, Urs; Wendling, Fabrice

    2009-01-01

    The pathophysiological interpretation of EEG signals recorded with depth electrodes (i.e. local field potentials, LFPs) during interictal (between seizures) or ictal (during seizures) periods is fundamental in the pre-surgical evaluation of patients with drug-resistant epilepsy. Our objective was to explain specific shape features of interictal spikes in the hippocampus (observed in LFPs) in terms of cell and network-related parameters of neuronal circuits that generate these events. We developed a neural network model based on “minimal” but biologically-relevant neuron models interconnected through GABAergic and glutamatergic synapses that reproduces the main physiological features of the CA1 subfield. Simulated LFPs were obtained by solving the forward problem (dipole theory) from networks including a large number (~3000) of cells. Insertion of appropriate parameters allowed the model to simulate events that closely resemble actual epileptic spikes. Moreover, the shape of the early fast component (‘spike’) and the late slow component (‘negative wave’) was linked to the relative contribution of glutamatergic and GABAergic synaptic currents in pyramidal cells. In addition, the model provides insights about the sensitivity of electrode localization with respect to recorded tissue volume and about the relationship between the LFP and the intracellular activity of principal cells and interneurons represented in the network. PMID:19651549
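
    As a rough companion to the forward step, the sketch below uses a point-source simplification of the dipole approach named above (Python; the geometry, conductivity and current scales are illustrative assumptions, not the published CA1 model):

        import numpy as np

        # Potential at the electrode as a superposition of synaptic point
        # currents in a homogeneous medium: phi = sum_i I_i / (4*pi*sigma*r_i).
        def lfp_at(electrode, sources, currents, sigma=0.3):
            r = np.linalg.norm(sources - electrode, axis=1)   # distances [m]
            return np.sum(currents / r) / (4.0 * np.pi * sigma)

        rng = np.random.default_rng(0)
        sources = rng.uniform(-5e-4, 5e-4, (3000, 3))   # ~3000 cells in a 1 mm cube
        currents = rng.normal(0.0, 1e-10, 3000)         # mixed E/I currents [A]
        print(lfp_at(np.zeros(3), sources, currents))   # one simulated LFP sample [V]

    Summing such contributions over time, with the glutamatergic and GABAergic currents varied, is what links the early 'spike' and late 'negative wave' shapes to the relative synaptic contributions described above.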

  8. A Transparent Window into Biology: A Primer on Caenorhabditis elegans.

    PubMed

    Corsi, Ann K; Wightman, Bruce; Chalfie, Martin

    2015-06-01

    A little over 50 years ago, Sydney Brenner had the foresight to develop the nematode (round worm) Caenorhabditis elegans as a genetic model for understanding questions of developmental biology and neurobiology. Over time, research on C. elegans has expanded to explore a wealth of diverse areas in modern biology including studies of the basic functions and interactions of eukaryotic cells, host-parasite interactions, and evolution. C. elegans has also become an important organism in which to study processes that go awry in human diseases. This primer introduces the organism and the many features that make it an outstanding experimental system, including its small size, rapid life cycle, transparency, and well-annotated genome. We survey the basic anatomical features, common technical approaches, and important discoveries in C. elegans research. Key to studying C. elegans has been the ability to address biological problems genetically, using both forward and reverse genetics, both at the level of the entire organism and at the level of the single, identified cell. These possibilities make C. elegans useful not only in research laboratories, but also in the classroom where it can be used to excite students who actually can see what is happening inside live cells and tissues. Copyright © 2015 Corsi, Wightman, and Chalfie.

  9. A Cross-Domain Collaborative Filtering Algorithm Based on Feature Construction and Locally Weighted Linear Regression

    PubMed Central

    Jiang, Feng; Han, Ji-zhong

    2018-01-01

    Cross-domain collaborative filtering (CDCF) solves the sparsity problem by transferring rating knowledge from auxiliary domains. Obviously, different auxiliary domains have different importance to the target domain. However, previous works cannot effectively evaluate the significance of different auxiliary domains. To overcome this drawback, we propose a cross-domain collaborative filtering algorithm based on Feature Construction and Locally Weighted Linear Regression (FCLWLR). We first construct features in different domains and use these features to represent different auxiliary domains. Thus the weight computation across different domains can be converted into a weight computation across different features. Then we combine the features in the target domain and in the auxiliary domains and convert the cross-domain recommendation problem into a regression problem. Finally, we employ a Locally Weighted Linear Regression (LWLR) model to solve the regression problem. As LWLR is a nonparametric regression method, it can effectively avoid the underfitting or overfitting problems that occur in parametric regression methods. We conduct extensive experiments to show that the proposed FCLWLR algorithm is effective in addressing the data sparsity problem by transferring useful knowledge from the auxiliary domains, as compared to many state-of-the-art single-domain or cross-domain CF methods. PMID:29623088
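
    As a hedged illustration of the locally weighted linear regression step named above (Python; the random data stands in for the constructed cross-domain features and ratings, and the bandwidth tau is an illustrative assumption):

        import numpy as np

        def lwlr_predict(x_query, X, y, tau=1.0):
            """Fit a linear model whose training points are weighted by their
            closeness to x_query, then predict at x_query."""
            w = np.exp(-np.sum((X - x_query) ** 2, axis=1) / (2 * tau ** 2))
            Xb = np.hstack([np.ones((len(X), 1)), X])   # add intercept column
            A = Xb.T * w                                # weighted design matrix
            theta = np.linalg.solve(A @ Xb, A @ y)      # weighted normal equations
            return np.r_[1.0, x_query] @ theta

        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 4))           # stand-in per-domain features
        y = X @ [0.5, -1.0, 2.0, 0.0] + rng.normal(scale=0.1, size=100)
        print(lwlr_predict(np.zeros(4), X, y))  # local prediction near the origin

    Because the fit is recomputed around each query point, no single global parameter vector is assumed, which is the nonparametric property the abstract credits with avoiding under- and overfitting.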

  10. A Cross-Domain Collaborative Filtering Algorithm Based on Feature Construction and Locally Weighted Linear Regression.

    PubMed

    Yu, Xu; Lin, Jun-Yu; Jiang, Feng; Du, Jun-Wei; Han, Ji-Zhong

    2018-01-01

    Cross-domain collaborative filtering (CDCF) solves the sparsity problem by transferring rating knowledge from auxiliary domains. Obviously, different auxiliary domains have different importance to the target domain. However, previous works cannot effectively evaluate the significance of different auxiliary domains. To overcome this drawback, we propose a cross-domain collaborative filtering algorithm based on Feature Construction and Locally Weighted Linear Regression (FCLWLR). We first construct features in different domains and use these features to represent different auxiliary domains. Thus the weight computation across different domains can be converted into a weight computation across different features. Then we combine the features in the target domain and in the auxiliary domains and convert the cross-domain recommendation problem into a regression problem. Finally, we employ a Locally Weighted Linear Regression (LWLR) model to solve the regression problem. As LWLR is a nonparametric regression method, it can effectively avoid the underfitting or overfitting problems that occur in parametric regression methods. We conduct extensive experiments to show that the proposed FCLWLR algorithm is effective in addressing the data sparsity problem by transferring useful knowledge from the auxiliary domains, as compared to many state-of-the-art single-domain or cross-domain CF methods.

  11. Mastery Multiplied

    ERIC Educational Resources Information Center

    Shumway, Jessica F.; Kyriopoulos, Joan

    2014-01-01

    Being able to find the correct answer to a math problem does not always indicate solid mathematics mastery. A student who knows how to apply the basic algorithms can correctly solve problems without understanding the relationships between numbers or why the algorithms work. The Common Core standards require that students actually understand…

  12. Investigations on the Behavior of HVOF and Cold Sprayed Ni-20Cr Coating on T22 Boiler Steel in Actual Boiler Environment

    NASA Astrophysics Data System (ADS)

    Bala, Niraj; Singh, Harpreet; Prakash, Satya; Karthikeyan, J.

    2012-01-01

    High temperature corrosion accompanied by erosion is a severe problem, which may result in premature failure of the boiler tubes. One countermeasure to overcome this problem is the use of thermal spray protective coatings. In the current investigation high velocity oxy-fuel (HVOF) and cold spray processes have been used to deposit commercial Ni-20Cr powder on T22 boiler steel. To evaluate the performance of the coatings in actual conditions the bare as well as the coated steels were subjected to cyclic exposures, in the superheater zone of a coal fired boiler for 15 cycles. The weight change and thickness loss data were used to establish kinetics of the erosion-corrosion. X-ray diffraction, surface and cross-sectional field emission scanning electron microscope/energy dispersive spectroscopy (FE-SEM/EDS) and x-ray mapping techniques were used to analyse the as-sprayed and corroded specimens. The HVOF sprayed coating performed better than its cold sprayed counterpart in actual boiler environment.

  13. Organizational management practices for achieving software process improvement

    NASA Technical Reports Server (NTRS)

    Kandt, Ronald Kirk

    2004-01-01

    The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected development costs, and actual development time that exceeds expected development time. Several solutions have been offered to overcome our inability to deliver high-quality software on time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.

  14. Tomographic inversion of satellite photometry. II

    NASA Technical Reports Server (NTRS)

    Solomon, S. C.; Hays, P. B.; Abreu, V. J.

    1985-01-01

    A method for combining nadir observations of emission features in the upper atmosphere with the result of a tomographic inversion of limb brightness measurements is presented. Simulated and actual results are provided, and error sensitivity is investigated.

  15. Judging Thieves of Attention: Commentary on "Assessing Cognitive Distraction in the Automobile," by Strayer, Turrill, Cooper, Coleman, Medeiros-Ward, and Biondi (2015).

    PubMed

    Hancock, Peter A; Sawyer, Ben D

    2015-12-01

    The laudable effort by Strayer and his colleagues to derive a systematic method to assess forms of cognitive distraction in the automobile is beset by the problem of nonstationarity in driver response capacity. At the level of the overall goal of driving, this problem conflates actual on-road behavior, characterized by underspecified task satisficing, with our own understandable, scientifically inspired aspiration for measuring deterministic performance optimization. Measures of response conceived under this latter imperative are, at best, only shadowy reflections of the actual phenomenological experience involved in real-world vehicle control. Whether we, as a research community, can resolve this issue remains uncertain. However, we believe we can mount a positive attack on what is arguably another equally important dimension of the collision problem. © 2015, Human Factors and Ergonomics Society.

  16. Comparison of iterative inverse coarse-graining methods

    NASA Astrophysics Data System (ADS)

    Rosenberger, David; Hanke, Martin; van der Vegt, Nico F. A.

    2016-10-01

    Deriving potentials for coarse-grained Molecular Dynamics (MD) simulations is frequently done by solving an inverse problem. Methods like Iterative Boltzmann Inversion (IBI) or Inverse Monte Carlo (IMC) have been widely used to solve this problem. The solution obtained by application of these methods guarantees a match in the radial distribution function (RDF) between the underlying fine-grained system and the derived coarse-grained system. However, these methods often fail in reproducing thermodynamic properties. To overcome this deficiency, additional thermodynamic constraints such as pressure or Kirkwood-Buff integrals (KBI) may be added to these methods. In this communication we test the ability of these methods to converge to a known solution of the inverse problem. With this goal in mind we have studied a binary mixture of two simple Lennard-Jones (LJ) fluids, in which no actual coarse-graining is performed. We further discuss whether full convergence is actually needed to achieve thermodynamic representability.
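
    As a hedged illustration of the IBI update these methods share (Python; the kB*T value assumes 300 K, and the arrays are placeholders rather than the LJ mixture studied here):

        import numpy as np

        kB_T = 2.494  # kJ/mol at 300 K (assumed temperature)

        def ibi_update(V, g_current, g_target, eps=1e-12):
            """One IBI step: V_{n+1}(r) = V_n(r) + kB*T * ln(g_n(r)/g_target(r));
            eps guards against log(0) where the RDFs vanish."""
            return V + kB_T * np.log((g_current + eps) / (g_target + eps))

        # Iterating until g_current matches g_target reproduces the RDF by
        # construction but, as noted above, not necessarily thermodynamic
        # properties such as pressure, unless extra constraints are added.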

  17. A Strategy for Autogeneration of Space Shuttle Ground Processing Simulation Models for Project Makespan Estimations

    NASA Technical Reports Server (NTRS)

    Madden, Michael G.; Wyrick, Roberta; O'Neill, Dale E.

    2005-01-01

    Space Shuttle processing is a complicated and highly variable project. The planning and scheduling problem, categorized as a Resource Constrained - Stochastic Project Scheduling Problem (RC-SPSP), has a great deal of variability in the Orbiter Processing Facility (OPF) process flow from one flight to the next. Simulation modeling is a useful tool for estimating the makespan of the overall process. However, simulation requires a model to be developed, which is itself a labor- and time-consuming effort. With such a dynamic process, the model would often be out of synchronization with the actual process, limiting the applicability of the simulation answers in solving the actual estimation problem. The basis of our solution is the integration of TEAMS model-enabling software with our existing scheduling software. This paper explains the approach used to develop an auto-generated simulation model from planning and scheduling efforts and available data.

  18. [Trial of artifact reduction in body diffusion weighted imaging development and basic examination of "TRacking Only Navigator"(TRON method)].

    PubMed

    Horie, Tomohiko; Takahara, Tarou; Ogino, Tetsuo; Okuaki, Tomoyuki; Honda, Masatoshi; Okumura, Yasuhiro; Kajihara, Nao; Usui, Keisuke; Muro, Isao; Imai, Yutaka

    2008-09-20

    In recent years, the utility of body diffusion weighted imaging, as represented by diffusion weighted whole body imaging with background body signal suppression (DWIBS), has become very high. However, the DWIBS method had a problem with artifacts corresponding to the displacement of the diaphragm. To address this, the respiratory trigger (RT) method and the navigator echo method were used together; both reduced the artifacts, but the compensation extended the scan time, and the extension rate could not be predicted. From the findings obtained with the DWIBS method, we presumed that using only navigator real-time slice tracking (NRST) could reduce the artifacts without extending scan time. Thus, the TRacking Only Navigator (TRON) method was developed, and a basic examination was carried out for the liver. Important features of the TRON method are the absence of the navigator gating window (NGW) and the addition of linear interpolation prior to NRST; the method requires the displacement speed of, and the distance to, the volunteer's diaphragm. The estimated error from the 2D-selective RF pulse (2DSRP) of the TRON method to slice excitation was calculated, and the 2DSRP conditions that did not influence the accuracy of NRST were determined with a moving phantom. A volunteer was scanned, and image quality and actual scan time were compared with the RT and DWIBS methods. Diaphragm displacement speed and displacement in the head-foot direction were 9 mm/s and 15 mm, respectively. The estimated error was within 2.5 mm at a b-factor of 1000 s/mm(2). The flip angle of the 2DSRP was 15 degrees and the navigator echo length was 120 mm, which gave excellent results. In the TRON method, the accuracy of NRST was stable because of the linear interpolation. In volunteer scanning, the TRON method obtained image quality equal to that of the RT method at the same b-factor, with a short actual scan time. The TRON method can therefore obtain image quality equal to that of the RT method in body diffusion weighted imaging within a short time. Moreover, because the scan time set during planning becomes the actual scan time, examinations can be executed efficiently.
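
    As a rough companion to the linear-interpolation step, the sketch below (Python; the sample times and the navigator-to-excitation delay are assumptions, not protocol values) extrapolates the diaphragm position to the moment of slice excitation:

        import numpy as np

        def predicted_position(t_nav, z_nav, t_excite):
            """Linear fit through the last two navigator samples, evaluated
            at the slice-excitation time (positions in mm, times in s)."""
            v = (z_nav[-1] - z_nav[-2]) / (t_nav[-1] - t_nav[-2])
            return z_nav[-1] + v * (t_excite - t_nav[-1])

        # At the reported diaphragm speed of ~9 mm/s, each 0.1 s of residual
        # uncorrected delay would contribute about 0.9 mm of error, in line
        # with the <= 2.5 mm bound reported above.
        print(predicted_position(np.array([0.0, 0.2]), np.array([10.0, 11.8]), 0.3))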

  19. Probabilistic hazard assessment for skin sensitization potency by dose–response modeling using feature elimination instead of quantitative structure–activity relationships

    PubMed Central

    McKim, James M.; Hartung, Thomas; Kleensang, Andre; Sá-Rocha, Vanessa

    2016-01-01

    Supervised learning methods promise to improve integrated testing strategies (ITS), but must be adjusted to handle high dimensionality and dose–response data. ITS approaches are currently fueled by the increasing mechanistic understanding of adverse outcome pathways (AOP) and the development of tests reflecting these mechanisms. Simple approaches to combining skin sensitization data sets, such as weight of evidence, fail due to problems of information redundancy and high dimensionality. The problem is further amplified when potency information (dose/response) of hazards is to be estimated. Skin sensitization currently serves as the foster child for AOP and ITS development, as legislative pressures combined with a very good mechanistic understanding of contact dermatitis have led to test development and relatively large high-quality data sets. We curated such a data set and combined it with a recursive variable selection algorithm to evaluate the information available through in silico, in chemico and in vitro assays. Chemical similarity alone could not cluster chemicals' potency, and in vitro models consistently ranked high in recursive feature elimination. This allows reducing the number of tests included in an ITS. Next, we analyzed the data with a hidden Markov model that takes advantage of an intrinsic inter-relationship among the local lymph node assay classes, i.e. the monotonic relationship between local lymph node assay class and dose. The dose-informed random forest/hidden Markov model was superior to the dose-naive random forest model on all data sets. Although the balanced accuracy improvement may seem small, this obscures the actual improvement in misclassifications, as the dose-informed hidden Markov model strongly reduced "false negatives" (i.e., extreme sensitizers classified as non-sensitizers) on all data sets. PMID:26046447
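
    As a hedged illustration of the recursive feature elimination step named above (Python; the synthetic data and estimator settings are assumptions standing in for the curated assay set), scikit-learn's RFE ranks descriptors by repeatedly dropping the least important ones:

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.feature_selection import RFE

        # Synthetic stand-in for in silico / in chemico / in vitro descriptors.
        X, y = make_classification(n_samples=200, n_features=30,
                                   n_informative=8, random_state=0)

        selector = RFE(RandomForestClassifier(n_estimators=100, random_state=0),
                       n_features_to_select=8)
        selector.fit(X, y)
        print(selector.ranking_)   # rank 1 = kept; high-ranking assays could be
                                   # retained in a leaner testing strategy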

  20. Astronomy Education and Research With Digital Viewing: Forming a New Network of Small Observatories

    NASA Astrophysics Data System (ADS)

    Bogard, Arthur; Hamilton, T. S.

    2011-01-01

    Small observatories face two major hindrances in teaching astronomy to students: weather and getting students to recognize what they're seeing. The normal astronomy class use of a single telescope with an eyepiece is restricted to good skies, and it allows only one viewer at a time. Since astronomy labs meet at regular times, bad weather can mean the loss of an entire week. As for the second problem, students often have difficulties recognizing what they are seeing through an eyepiece, and the instructor cannot point out the target's features. Commercial multimedia resources, although structured and easy to explain to students, do not give students the same level of interactivity. A professor cannot improvise a new target nor can he adjust the image to view different features of an object. Luckily, advancements in technology provide solutions for both of these limitations without breaking the bank. Astronomical video cameras can automatically stack, align, and integrate still frames, providing instructors with the ability to explain things to groups of students in real time under actual seeing conditions. Using Shawnee State University's Mallincam on an 8" Cassegrain, our students are now able to understand and classify both planetary and deep sky objects better than they can through an eyepiece. To address the problems with weather, SSU proposes forming a network among existing small observatories. With inexpensive software and cameras, telescopes can be aligned and operated over the web, and with reciprocal viewing agreements, users who are clouded out could view from another location. By partnering with institutions in the eastern hemisphere, even daytime viewing would be possible. Not only will this network aid in instruction, but the common user interface will make student research projects much easier.

  1. Probabilistic hazard assessment for skin sensitization potency by dose-response modeling using feature elimination instead of quantitative structure-activity relationships.

    PubMed

    Luechtefeld, Thomas; Maertens, Alexandra; McKim, James M; Hartung, Thomas; Kleensang, Andre; Sá-Rocha, Vanessa

    2015-11-01

    Supervised learning methods promise to improve integrated testing strategies (ITS), but must be adjusted to handle high dimensionality and dose-response data. ITS approaches are currently fueled by the increasing mechanistic understanding of adverse outcome pathways (AOP) and the development of tests reflecting these mechanisms. Simple approaches to combining skin sensitization data sets, such as weight of evidence, fail due to problems of information redundancy and high dimensionality. The problem is further amplified when potency information (dose/response) of hazards is to be estimated. Skin sensitization currently serves as the foster child for AOP and ITS development, as legislative pressures combined with a very good mechanistic understanding of contact dermatitis have led to test development and relatively large high-quality data sets. We curated such a data set and combined it with a recursive variable selection algorithm to evaluate the information available through in silico, in chemico and in vitro assays. Chemical similarity alone could not cluster chemicals' potency, and in vitro models consistently ranked high in recursive feature elimination. This allows reducing the number of tests included in an ITS. Next, we analyzed the data with a hidden Markov model that takes advantage of an intrinsic inter-relationship among the local lymph node assay classes, i.e. the monotonic relationship between local lymph node assay class and dose. The dose-informed random forest/hidden Markov model was superior to the dose-naive random forest model on all data sets. Although the balanced accuracy improvement may seem small, this obscures the actual improvement in misclassifications, as the dose-informed hidden Markov model strongly reduced "false negatives" (i.e., extreme sensitizers classified as non-sensitizers) on all data sets. Copyright © 2015 John Wiley & Sons, Ltd.

  2. Meteoroid and Debris Impact Features Documented on the Long Duration Exposure Facility

    DTIC Science & Technology

    1990-08-01

    ...surfaces was very different from the hole production (penetration) mechanism in true thin films; the laminated structure was never actually penetrated... Impacts into laminated polymeric films, such as the Kapton test specimens on experiment A0138... several layers of carbon, glass, and/or Kevlar woven fiber cloth laminated together with resin binders. Impact features in these materials were...

  3. Use of paired management action grids for ease in depicting differences between users' and managers' perceptions of problems

    Treesearch

    R. J. Steele; James E. Fletcher

    1992-01-01

    This research sought to determine whether users and managers differ in their perceptions of actual and perceived problems in parks and, primarily, to present a method of graphically depicting these differing perceptions that can be easily employed by area managers and communicated to the public, upper...

  4. Sports, Race, and Ressentiment.

    ERIC Educational Resources Information Center

    Dowling, William C.

    2000-01-01

    Discusses the problem of college sports corruption and the debate over "the plight of the black athlete," suggesting that this debate is actually not about race or athletics but a code for examining contradictions between education and mass democracy. Calls this the problem of "ressentiment." Examines how athletes have used the "plight of the…

  5. Classroom Crisis Intervention through Contracting: A Moral Development Model.

    ERIC Educational Resources Information Center

    Smaby, Marlowe H.; Tamminen, Armas W.

    1981-01-01

    A counselor can arbitrate problem situations using a systematic approach to classroom intervention which includes meetings with the teacher and students. This crisis intervention model based on moral development can be more effective than reliance on guidance activities disconnected from the actual classroom settings where the problems arise.…

  6. Problem-Solving Practices and Complexity in School Psychology

    ERIC Educational Resources Information Center

    Brady, John; Espinosa, William R.

    2017-01-01

    How do experienced school psychologists solve problems in their practice? What can trainers of school psychologists learn about how to structure training and mentoring of graduate students from what actually happens in schools, and how can this inform our teaching at the university? This qualitative multi-interview study explored the processes…

  7. Learning from patients: Identifying design features of medicines that cause medication use problems.

    PubMed

    Notenboom, Kim; Leufkens, Hubert Gm; Vromans, Herman; Bouvy, Marcel L

    2017-01-30

    Usability is a key factor in ensuring safe and efficacious use of medicines. However, several studies showed that people experience a variety of problems using their medicines. The purpose of this study was to identify design features of oral medicines that cause use problems among older patients in daily practice. A qualitative study with semi-structured interviews on the experiences of older people with the use of their medicines was performed (n=59). Information on practical problems, strategies to overcome these problems and the medicines' design features that caused these problems was collected. The practical problems and management strategies were categorised into 'use difficulties' and 'use errors'. A total of 158 use problems were identified, of which 45 were categorised as use difficulties and 113 as use errors. Design features that contributed the most to the occurrence of use difficulties were the dimensions and surface texture of the dosage form (29.6% and 18.5%, respectively). Design features that contributed the most to the occurrence of use errors were the push-through force of blisters (22.1%) and tamper-evident packaging (12.1%). These findings will help developers of medicinal products to proactively address potential usability issues with their medicines. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  8. Dilemmas in the (un)veiling of the diagnosis of Alzheimer's disease: walking an ethical and professional tight rope.

    PubMed

    Karnieli-Miller, Orit; Werner, Perla; Aharon-Peretz, Judith; Eidelman, Shmuel

    2007-08-01

    To enhance understanding of the effect of physicians' difficulties, attitudes and communication styles on the disclosure of the diagnosis of AD in practice. A qualitative, phenomenological study, combining pre-encounter interviews with physicians, observations of actual encounters of diagnosis disclosure of AD, and post-encounter interviews. There were various ways or tactics to (un)veil the bad news, which may be perceived as different ways of dulling the impact and avoiding full and therefore problematic statements. In the actual encounters this was accomplished by keeping encounters short, avoiding elaboration, confirmation of comprehension and explicit terminology, and using fractured sentences. The present study's findings highlight the difficulties encountered in breaking the news about AD, the way it is actually done, and the problems that may arise from this way of (un)veiling the news. The main problem is that the reluctance to make a candid disclosure of the diagnosis, as demonstrated in this study, may violate basic moral and legal rights and may also deprive patients and caregivers of some of the benefits of early disclosure of diagnosis. There is a need to assist physicians in coping with their personal difficulties, problems and pitfalls in breaking the news.

  9. Young people's mental health first aid intentions and beliefs prospectively predict their actions: findings from an Australian National Survey of Youth.

    PubMed

    Yap, Marie Bee Hui; Jorm, Anthony Francis

    2012-04-30

    Little is known about whether young people's mental health first aid knowledge and beliefs translate into actual behavior. This study examined whether young people's first aid intentions and beliefs predicted the actions they later took to help a close friend or family member with a mental health problem. Participants in a 2006 national survey of Australian youth (aged 12-25 years) reported on their first aid intentions and beliefs based on one of four vignettes: depression, depression with alcohol misuse, psychosis, and social phobia. At a two-year follow-up interview, they reported on actions they had taken since the initial interview to help any family member or close friend with a problem similar to the vignette character's. Of the 2005 participants interviewed at follow-up, 608 reported knowing someone with a similar problem. Overall, young people's first aid intentions and beliefs about the helpfulness of particular first aid actions predicted the actions they actually took to assist a close other. However, the belief in and intention to encourage professional help did not predict subsequent action. Findings suggest that young people's mental health first aid intentions and beliefs may be valid indicators of their subsequent actions. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. 4 Top Healthy Snacks | NIH MedlinePlus the Magazine

    MedlinePlus

    Feature: Reducing Childhood Obesity, 4 Top Healthy Snacks. ...looking at whether or not the risks for childhood obesity could actually start before birth. The subject needs...

  11. System Complexity Reduction via Feature Selection

    ERIC Educational Resources Information Center

    Deng, Houtao

    2011-01-01

    This dissertation transforms a set of system complexity reduction problems to feature selection problems. Three systems are considered: classification based on association rules, network structure learning, and time series classification. Furthermore, two variable importance measures are proposed to reduce the feature selection bias in tree…

  12. Thermal Mechanical Fatigue of Coated Blade Materials

    DTIC Science & Technology

    1989-06-01

    temperature and strain greatly affect TMF life. The temperature-strain phase angle may vary from 180 degrees out of phase, for fast transients at...simplified constitutive technique. The life prediction model was specifically not designed to be a constitutive exercise, and therefore the observed test...the actual test. In one case (S/N 25) the actual tensile stresses were larger than the predicted values. This was caused by intermittent problems with

  13. Comparing Models of Helper Behavior to Actual Practice in Telephone Crisis Intervention: A Silent Monitoring Study of Calls to the U.S. 1-800-SUICIDE Network

    ERIC Educational Resources Information Center

    Mishara, Brian L.; Chagnon, Francois; Daigle, Marc; Balan, Bogdan; Raymond, Sylvaine; Marcoux, Isabelle; Bardon, Cecile; Campbell, Julie K.; Berman, Alan

    2007-01-01

    Models of telephone crisis intervention in suicide prevention and best practices were developed from a literature review and surveys of crisis centers. We monitored 2,611 calls to 14 centers using reliable behavioral ratings to compare actual interventions with the models. Active listening and collaborative problem-solving models describe help…

  14. [Simulation and Design of Infant Incubator Assembly Line].

    PubMed

    Ke, Huqi; Hu, Xiaoyong; Ge, Xia; Hu, Yanhai; Chen, Zaihong

    2015-11-01

    Based on the current assembly situation of infant incubators in company A, basic industrial engineering methods such as time study were used to analyze the actual product assembly process, and an assembly line was designed. The assembly line was modeled and simulated with the software Flexsim. Problems with the assembly line were found by comparing simulation results with actual data, and optimization was then applied to obtain a high-efficiency assembly line.

  15. Approach to the problem of the parameters optimization of the shooting system

    NASA Astrophysics Data System (ADS)

    Demidova, L. A.; Sablina, V. A.; Sokolova, Y. S.

    2018-02-01

    The problem of object identification on the basis of hyperspectral features has been considered. We propose using ensembles of SVM classifiers adapted to the specifics of this identification problem. Results of identifying objects on the basis of their hyperspectral features using the SVM classifiers are presented.
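
    As a hedged illustration (Python; bagging stands in for the authors' unspecified ensemble construction, and the synthetic data stands in for actual hyperspectral signatures), an SVM ensemble over spectral feature vectors might look like:

        from sklearn.datasets import make_classification
        from sklearn.ensemble import BaggingClassifier
        from sklearn.svm import SVC

        # Synthetic "pixels" with ~120 spectral bands and 3 object classes.
        X, y = make_classification(n_samples=300, n_features=120,
                                   n_informative=20, n_classes=3,
                                   n_clusters_per_class=1, random_state=0)

        ensemble = BaggingClassifier(SVC(kernel="rbf", C=10.0, gamma="scale"),
                                     n_estimators=15, random_state=0)
        ensemble.fit(X, y)
        print(ensemble.score(X, y))   # training accuracy of the ensemble vote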

  16. The Methodology of Calculation of Cutting Forces When Machining Composite Materials

    NASA Astrophysics Data System (ADS)

    Rychkov, D. A.; Yanyushkin, A. S.

    2016-08-01

    Cutting of composite materials has specific features and differs from the machining of metals; it is characterized by intense wear of the cutting tool. An important criterion in selecting process parameters for composite machining is the magnitude of the cutting forces, which depends on many factors and is usually determined experimentally, which is not always practical. This study developed a method for determining the cutting forces when machining composite materials and comparatively evaluated the calculated and actual cutting-force values. The calculation methodology takes into account specific features of the cutting tool and its extent of wear, the strength properties of the machined material, and the cutting conditions. Experimental studies were conducted on fiberglass with a milling cutter equipped with VK3M hard-metal inserts. The discrepancy between the estimated and the actual values of the cutting force is not more than 10%.
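
    The abstract does not give the force model itself; as a hedged sketch of the general shape such calculations often take, the Kienzle-style form below (Python; every coefficient is an illustrative assumption, not the authors' methodology) multiplies a specific cutting force by chip geometry and empirical corrections for tool wear and material strength:

        def cutting_force(kc11, mc, b, h, k_wear=1.0, k_material=1.0):
            """Main cutting force F_c [N] from specific cutting force kc11
            [N/mm^2], chip width b [mm], chip thickness h [mm], and empirical
            correction factors for tool wear and workpiece strength."""
            return kc11 * b * h ** (1.0 - mc) * k_wear * k_material

        # Example with hypothetical coefficients for a glass-fiber composite.
        print(cutting_force(kc11=200.0, mc=0.25, b=3.0, h=0.1, k_wear=1.2))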

  17. Public health insurance under a nonbenevolent state.

    PubMed

    Lemieux, Pierre

    2008-10-01

    This paper explores the consequences of the oft-ignored fact that public health insurance must actually be supplied by the state. Depending on how the state is modeled, different health insurance outcomes are expected. The benevolent model of the state does not account for many actual features of public health insurance systems. One alternative is to use a standard public choice model, where state action is determined by interaction between self-interested actors. Another alternative--related to a strand in public choice theory--is to model the state as Leviathan. Interestingly, some proponents of public health insurance use an implicit Leviathan model, but not consistently. The Leviathan model of the state explains many features of public health insurance: its uncontrolled growth, its tendency toward monopoly, its capacity to buy trust and loyalty from the common people, its surveillance ability, its controlling nature, and even the persistence of its inefficiencies and waiting lines.

  18. Application of Monte Carlo techniques to optimization of high-energy beam transport in a stochastic environment

    NASA Technical Reports Server (NTRS)

    Parrish, R. V.; Dieudonne, J. E.; Filippas, T. A.

    1971-01-01

    An algorithm employing a modified sequential random perturbation, or creeping random search, was applied to the problem of optimizing the parameters of a high-energy beam transport system. The stochastic solution of the mathematical model for first-order magnetic-field expansion allows the inclusion of state-variable constraints, and the inclusion of parameter constraints allowed by the method of algorithm application eliminates the possibility of infeasible solutions. The mathematical model and the algorithm were programmed for a real-time simulation facility; thus, two important features are provided to the beam designer: (1) a strong degree of man-machine communication (even to the extent of bypassing the algorithm and applying analog-matching techniques), and (2) extensive graphics for displaying information concerning both algorithm operation and transport-system behavior. Chromatic aberration was also included in the mathematical model and in the optimization process. Results presented show this method yielding better solutions (in terms of resolution) to the particular problem than those of a standard analog program, as well as demonstrating the flexibility, in terms of elements, constraints, and chromatic aberration, allowed by user interaction with both the algorithm and the stochastic model. Examples of slit usage and a limited comparison of predicted results and actual results obtained with a 600 MeV cyclotron are given.
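
    As a hedged illustration of a creeping random search of the kind described above (Python; the quadratic objective and box bounds are placeholders, not the beam-transport model):

        import random

        def creeping_random_search(objective, x0, bounds, step=0.1, iters=2000):
            """Perturb one randomly chosen parameter at a time; keep the move
            only if it stays feasible and improves the objective."""
            x, best = list(x0), objective(x0)
            for _ in range(iters):
                i = random.randrange(len(x))
                trial = list(x)
                trial[i] += random.uniform(-step, step)
                lo, hi = bounds[i]
                if lo <= trial[i] <= hi:          # parameter constraint
                    f = objective(trial)
                    if f < best:                  # creep toward improvement
                        x, best = trial, f
            return x, best

        # Toy usage: minimize a bowl-shaped objective inside a box.
        print(creeping_random_search(lambda p: sum(v * v for v in p),
                                     [0.8, -0.5], [(-1, 1), (-1, 1)]))

    Because every accepted move is checked against the bounds first, infeasible parameter sets are never produced, mirroring the constraint handling the abstract describes.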

  19. Inferring the mesoscale structure of layered, edge-valued, and time-varying networks

    NASA Astrophysics Data System (ADS)

    Peixoto, Tiago P.

    2015-10-01

    Many network systems are composed of interdependent but distinct types of interactions, which cannot be fully understood in isolation. These different types of interactions are often represented as layers, attributes on the edges, or as a time dependence of the network structure. Although they are crucial for a more comprehensive scientific understanding, these representations offer substantial challenges. Namely, it is an open problem how to precisely characterize the large or mesoscale structure of network systems in relation to these additional aspects. Furthermore, the direct incorporation of these features invariably increases the effective dimension of the network description, and hence aggravates the problem of overfitting, i.e., the use of overly complex characterizations that mistake purely random fluctuations for actual structure. In this work, we propose a robust and principled method to tackle these problems, by constructing generative models of modular network structure, incorporating layered, attributed and time-varying properties, as well as a nonparametric Bayesian methodology to infer the parameters from data and select the most appropriate model according to statistical evidence. We show that the method is capable of revealing hidden structure in layered, edge-valued, and time-varying networks, and that the most appropriate level of granularity with respect to the additional dimensions can be reliably identified. We illustrate our approach on a variety of empirical systems, including a social network of physicians, the voting correlations of deputies in the Brazilian national congress, the global airport network, and a proximity network of high-school students.

  20. Neural-network-based system for recognition of partially occluded shapes and patterns

    NASA Astrophysics Data System (ADS)

    Mital, Dinesh P.; Teoh, Eam-Khwang; Amarasinghe, S. K.; Suganthan, P. N.

    1996-10-01

    The purpose of this paper is to demonstrate how a structural matching approach can be used to perform effective rotation-invariant fingerprint identification. In this approach, each of the extracted features is correlated with five of its nearest neighbouring features to form a local feature group for a first-stage matching. After that, the feature with the highest match is used as a central feature, whereby all the other features are correlated to form a global feature group for a second-stage matching. The correlation between the features is in terms of distance and relative angle. This approach makes the matching method rotation invariant. A substantial amount of testing was carried out, showing that this matching technique is capable of matching the four basic fingerprint patterns with an average matching time of 4 seconds on a 66 MHz 486 DX personal computer.
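
    As a hedged illustration of the local feature group idea (Python; the data layout and anchoring choice are assumptions, not the paper's implementation), distances and relative angles to the five nearest neighbours are unchanged when the whole print is rotated:

        import math

        def local_descriptor(points, i, k=5):
            """Rotation-invariant descriptor of point i: its k nearest
            neighbours as (distance, angle relative to the nearest one)."""
            px, py = points[i]
            neigh = sorted(
                (math.hypot(x - px, y - py), math.atan2(y - py, x - px))
                for j, (x, y) in enumerate(points) if j != i
            )[:k]
            ref = neigh[0][1]                 # anchor on the nearest neighbour
            return [(d, (a - ref) % (2 * math.pi)) for d, a in neigh]

        points = [(0, 0), (1, 0), (0, 1), (2, 2), (3, 1), (1, 2), (2, 0)]
        print(local_descriptor(points, 0))

    First-stage matching would compare such descriptors between two prints; the best-matching feature then serves as the central feature for the global, second-stage group.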

  1. Designing attractive gamification features for collaborative storytelling websites.

    PubMed

    Hsu, Shang Hwa; Chang, Jen-Wei; Lee, Chun-Chia

    2013-06-01

    Gamification design is considered a predictor of collaborative storytelling websites' success. Although previous studies have mentioned a broad range of factors that may influence gamification, they depicted neither the actual design features nor the relative attractiveness among them. This study aims to identify attractive gamification features for collaborative storytelling websites. We first constructed a hierarchical system structure of gamification design for collaborative storytelling websites and conducted a focus group interview with eighteen frequent users to identify 35 gamification features. After that, this study determined the relative attractiveness of these gamification features by administering an online survey to 6333 collaborative storytelling website users. The results indicated that the top 10 most attractive gamification features could account for more than 50% of the attractiveness among these 35 gamification features. The feature of unpredictable time pressure is important to website users, yet was not revealed in previous relevant studies. Implications of the findings were discussed.

  2. Public policies and health systems in Sahelian Africa: theoretical context and empirical specificity

    PubMed Central

    2015-01-01

    This research on user fee removal in three African countries is located at the interface of public policy analysis and health systems research. Public policy analysis has gradually become a vast and multifaceted area of research consisting of a number of perspectives. But the context of public policies in Sahelian Africa has some specific characteristics. They are largely shaped by international institutions and development agencies, on the basis of very common 'one-size-fits-all' models; the practical norms that govern the actual behaviour of employees are far removed from official norms; public goods and services are co-delivered by a string of different actors and institutions, with little coordination between them; the State is widely regarded by the majority of citizens as untrustworthy. In such a context, setting up and implementing health user fee exemptions in Burkina Faso, Mali and Niger was beset by major problems, lack of coherence and bottlenecks that affect public policy-making and implementation in these countries. Health systems research, for its part, started to gain momentum less than twenty years ago and is becoming a discipline in its own right. But French-speaking African countries scarcely feature in it, and the social sciences are not yet fully integrated. This special issue aims to fill that gap. In the Sahel, the bad health indicators reflect a combination of converging factors: lack of health centres, skilled staff, and resources; bad quality of care delivery, corruption, mismanagement; absence of any social security or meaningful commitment to the worst-off; growing competition from drug peddlers on one side, from private clinics on the other. Most reforms of the health system have various 'blind spots'. They do not take into account the daily reality of its functioning, its actual governance, the implicit rationales of the actors involved, and the quality of healthcare provision. In order to document the numerous neglected problems of the health system, a combination of quantitative and qualitative methods is needed to produce evidence. PMID:26559118

  3. Understanding Scientific Methodology in the Historical and Experimental Sciences via Language Analysis

    NASA Astrophysics Data System (ADS)

    Dodick, Jeff; Argamon, Shlomo; Chase, Paul

    2009-08-01

    A key focus of current science education reforms involves developing inquiry-based learning materials. However, without an understanding of how working scientists actually do science, such learning materials cannot be properly developed. Until now, research on scientific reasoning has focused on cognitive studies of individual scientific fields. However, the question remains as to whether scientists in different fields fundamentally rely on different methodologies. Although many philosophers and historians of science do indeed assert that there is no single monolithic scientific method, this has never been tested empirically. We therefore approach this problem by analyzing patterns of language used by scientists in their published work. Our results demonstrate systematic variation in language use between types of science that are thought to differ in their characteristic methodologies. The features of language use that were found correspond closely to a proposed distinction between Experimental Sciences (e.g., chemistry) and Historical Sciences (e.g., paleontology); thus, different underlying rhetorical and conceptual mechanisms likely operate for scientific reasoning and communication in different contexts.

  4. Evolving neural networks with genetic algorithms to study the string landscape

    NASA Astrophysics Data System (ADS)

    Ruehle, Fabian

    2017-08-01

    We study possible applications of artificial neural networks to examine the string landscape. Since the field of application is rather versatile, we propose to dynamically evolve these networks via genetic algorithms. This means that we start from basic building blocks and combine them such that the neural network performs best for the application we are interested in. We study three areas in which neural networks can be applied: to classify models according to a fixed set of (physically) appealing features, to find a concrete realization for a computation for which the precise algorithm is known in principle but very tedious to implement in practice, and to predict or approximate the outcome of some involved mathematical computation that is too inefficient to apply directly, e.g. in model scans within the string landscape. We present simple examples that arise in string phenomenology for all three types of problems and discuss how they can be addressed by evolving neural networks with genetic algorithms.
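
    As a hedged illustration of the dynamical evolution proposed here (Python; the bit-string encoding, rates, and toy fitness are illustrative assumptions, not the paper's setup), a genetic algorithm might evolve a string describing which building blocks enter a network:

        import random

        def evolve(fitness, gene_len=8, pop_size=20, generations=50, p_mut=0.1):
            pop = [[random.randint(0, 1) for _ in range(gene_len)]
                   for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                parents = pop[:pop_size // 2]            # truncation selection
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, gene_len)  # one-point crossover
                    child = a[:cut] + b[cut:]
                    children.append([1 - g if random.random() < p_mut else g
                                     for g in child])    # bit-flip mutation
                pop = parents + children
            return max(pop, key=fitness)

        # Toy usage: each bit could toggle a building block (layer type,
        # activation, connection); here the fitness simply counts 1-bits.
        print(evolve(sum))

    In the intended application, the fitness would instead be the evolved network's performance on the classification, realization, or approximation task.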

  5. Research on artistic gymnastics training guidance model

    NASA Astrophysics Data System (ADS)

    Luo, Lin; Sun, Xianzhong

    2017-04-01

    An artistic gymnastics training guidance model, taking into consideration the features of artistic gymnastics training, is put forward to help gymnasts identify their deficiencies and unskilled technical movements and improve their training effects. The model is built on the foundation of both a physical quality indicator model and an artistic gymnastics training indicator model. The physical quality indicator model, composed of a bodily factor, a flexibility-strength factor and a speed-dexterity factor, delivers an objective evaluation with reference to basic sport testing data. The training indicator model, based on the physical quality indicators, helps analyze the technical movements, revealing the impact of each bodily factor on them. The training guidance model, in further combination with actual training data and in comparison with the data shown in the training indicator model, helps identify problems in training and thus improve the training effect. Used in combination and compared with historical model data, these three models can check and verify the improvement in training effect over a given period of time.

  6. Comparison of BrainTool to other UML modeling and model transformation tools

    NASA Astrophysics Data System (ADS)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    Over the last 30 years, numerous model-driven approaches to generating software systems have been offered to address problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims regarding the Unified Modeling Language (UML) models at different levels of abstraction. Software development with CASE tools is said to enable a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and ending, for the most advanced ones, with a code generator (possibly driven by a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor for defining new transformations. The present paper contains the results of comparing CASE tools (mainly UML editors) against the level of automation they offer.

  7. Beamline 10.3.2 at ALS: a hard X-ray microprobe for environmental and materials sciences.

    PubMed

    Marcus, Matthew A; MacDowell, Alastair A; Celestre, Richard; Manceau, Alain; Miller, Tom; Padmore, Howard A; Sublett, Robert E

    2004-05-01

    Beamline 10.3.2 at the ALS is a bend-magnet line designed mostly for work on environmental problems involving heavy-metal speciation and location. It offers a unique combination of X-ray fluorescence mapping, X-ray microspectroscopy and micro-X-ray diffraction. The optics allow the user to trade spot size for flux in a size range of 5-17 µm in an energy range of 3-17 keV. The focusing uses a Kirkpatrick-Baez mirror pair to image a variable-size virtual source onto the sample. Thus, the user can reduce the effective size of the source, thereby reducing the spot size on the sample, at the cost of flux. This decoupling from the actual source also allows for some independence from source motion. The X-ray fluorescence mapping is performed with a continuously scanning stage which avoids the time overhead incurred by step-and-repeat mapping schemes. The special features of this beamline are described, and some scientific results shown.
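
    As a rough illustration of this trade-off (idealized geometric imaging, an assumption rather than measured beamline figures): a KB pair that images a virtual source of size s over object distance p and image distance q produces a focused spot of about s' ≈ s·(q/p), while the accepted flux in that direction scales roughly in proportion to s. Halving the virtual source thus halves the spot, at the cost of roughly half the flux.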

  8. Care management for older people with mental health problems: from evidence to practice.

    PubMed

    Tucker, Sue; Hughes, Jane; Sutcliffe, Caroline; Challis, David

    2008-05-01

    To explore the implications of providing intensive care management in a typical old age mental health service in North West England. The time spent by core groups of specialist mental health and social services staff on a range of activities deemed central to the provision of intensive care management was explored by means of a diary exercise. The difference between what is actually being done and what evidence suggests is needed was examined. More than 1500 hours of activity were appraised. Assessment and care management-related tasks accounted for more than 40% and 30% of social work and nursing staff's time, respectively. However, several fundamental features of intensive care management were lacking, including health staff's adoption of the care manager role, arrangements to facilitate appropriate information sharing and sufficient time for practitioners to provide the necessary careful assessment of needs, liaison with other agencies, and close and regular contact with the elderly person and their care network.

  9. Vega-Constellation Tools to Analyze Hyperspectral Images

    NASA Astrophysics Data System (ADS)

    Savorskiy, V.; Loupian, E.; Balashov, I.; Kashnitskii, A.; Konstantinova, A.; Tolpin, V.; Uvarov, I.; Kuznetsov, O.; Maklakov, S.; Panova, O.; Savchenko, E.

    2016-06-01

    Creating high-performance means of managing massive hyperspectral data (HSD) arrays is a pressing challenge when disparate information resources are involved. Aiming to solve this problem, the present work develops tools for working with HSD in a distributed information infrastructure, i.e. primarily for use in remote access mode. The main feature of the presented approach is the development of remotely accessible services, which both allow users to conduct search and retrieval procedures on HSD sets and provide them with tools to analyze and process HSD in remote mode. These services were implemented within the VEGA-Constellation family of information systems, which were extended with tools oriented to support the study of certain classes of natural objects through their HSD. The developed tools provide capabilities for analyzing such objects as vegetation canopies (forest and agriculture), open soils, forest fires, and areas of thermal anomalies. The software tools were successfully tested on Hyperion data sets.

  10. Dose masking feature for BNCT radiotherapy planning

    DOEpatents

    Cook, Jeremy L.; Wessol, Daniel E.; Wheeler, Floyd J.

    2000-01-01

    A system for displaying an accurate model of isodoses to be used in radiotherapy so that appropriate planning can be performed prior to actual treatment on a patient. The nature of the simulation of the radiotherapy planning for BNCT and Fast Neutron Therapy, etc., requires that the doses be computed in the entire volume. The "entire volume" includes the patient and beam geometries as well as the air spaces in between. Isodoses derived from the computed doses will therefore extend into the air regions between the patient and beam geometries and thus depict the unrealistic possibility that radiation deposition occurs in regions containing no physical media. This problem is solved by computing the doses for the entire geometry and then masking the physical and air regions along with the isodose contours superimposed over the patient image at the corresponding plane. The user is thus able to mask out (remove) the contour lines from the unwanted areas of the image by selecting the appropriate contour masking region from the raster image.
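
    A minimal sketch of the masking idea, assuming the dose has already been computed over the full volume; the array names, shapes and contour threshold are illustrative, not the patent's implementation:

        import numpy as np

        # Dose computed over the entire plane (patient and surrounding air).
        dose_plane = np.random.rand(256, 256)
        # Boolean mask of where physical media actually exist (toy patient region).
        is_tissue = np.zeros((256, 256), dtype=bool)
        is_tissue[64:192, 64:192] = True

        # Pixels lying on the 50%-of-maximum isodose contour.
        iso_level = 0.5 * dose_plane.max()
        on_contour = np.isclose(dose_plane, iso_level, atol=0.01)

        # Mask out contour pixels in air so no isodose is drawn where no medium exists.
        visible_contour = on_contour & is_tissue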

  11. FSMRank: feature selection algorithm for learning to rank.

    PubMed

    Lai, Han-Jiang; Pan, Yan; Tang, Yong; Yu, Rong

    2013-06-01

    In recent years, there has been growing interest in learning to rank. The introduction of feature selection into different learning problems has been proven effective. These facts motivate us to investigate the problem of feature selection for learning to rank. We propose a joint convex optimization formulation which minimizes ranking errors while simultaneously conducting feature selection. This optimization formulation provides a flexible framework in which we can easily incorporate various importance measures and similarity measures of the features. To solve this optimization problem, we use Nesterov's approach to derive an accelerated gradient algorithm with a fast convergence rate O(1/T^2). We further develop a generalization bound for the proposed optimization problem using Rademacher complexities. Extensive experimental evaluations are conducted on the public LETOR benchmark datasets. The results demonstrate that the proposed method shows: 1) significant ranking performance gain compared to several feature selection baselines for ranking, and 2) very competitive performance compared to several state-of-the-art learning-to-rank algorithms.
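
    The accelerated gradient scheme cited here follows the general Nesterov template; below is a generic sketch for a smooth convex objective (a least-squares stand-in, not the paper's ranking loss) achieving the quoted O(1/T^2) rate:

        import numpy as np

        def nesterov_agd(grad, x0, lr=1e-3, iters=200):
            """Generic Nesterov accelerated gradient descent sketch."""
            x, y, t = x0.copy(), x0.copy(), 1.0
            for _ in range(iters):
                x_next = y - lr * grad(y)                        # step at lookahead point
                t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2        # momentum schedule
                y = x_next + ((t - 1) / t_next) * (x_next - x)   # extrapolation
                x, t = x_next, t_next
            return x

        # Example: minimize ||Ax - b||^2 as a stand-in objective.
        A, b = np.random.randn(50, 10), np.random.randn(50)
        x_star = nesterov_agd(lambda x: 2 * A.T @ (A @ x - b), np.zeros(10))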

  12. Biased Dropout and Crossmap Dropout: Learning towards effective Dropout regularization in convolutional neural network.

    PubMed

    Poernomo, Alvin; Kang, Dae-Ki

    2018-08-01

    Training a deep neural network with a large number of parameters often leads to overfitting. Recently, Dropout has been introduced as a simple yet effective regularization approach to combat overfitting in such models. Although Dropout has shown remarkable results in many deep neural network cases, its actual effect on CNNs has not been thoroughly explored. Moreover, training a Dropout model significantly increases training time, as it takes longer to converge than a non-Dropout model with the same architecture. To deal with these issues, we introduce Biased Dropout and Crossmap Dropout, two novel extensions of Dropout based on the behavior of hidden units in a CNN model. Biased Dropout divides the hidden units in a given layer into two groups based on their magnitude and applies a different Dropout rate to each group. Hidden units with higher activation values, which contribute more to the network's final performance, are retained with a lower Dropout rate, while units with lower activation values are exposed to a higher Dropout rate to compensate. The second approach, Crossmap Dropout, is an extension of regular Dropout in the convolution layer. Feature maps in a convolution layer are strongly correlated with one another, particularly at identical pixel locations across maps. Crossmap Dropout maintains this important correlation while breaking the correlation between adjacent pixels with respect to all feature maps, by applying the same Dropout mask to every feature map, so that units at equivalent positions in each feature map are either all dropped or all active during training. Our experiments with various benchmark datasets show that our approaches provide better generalization than regular Dropout. Moreover, Biased Dropout converges faster during the training phase, suggesting that assigning noise appropriately to hidden units can lead to effective regularization. Copyright © 2018 Elsevier Ltd. All rights reserved.
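
    A NumPy sketch of the Crossmap Dropout mechanism as described (one spatial mask shared across all feature maps); this illustrates the idea and is not the authors' code:

        import numpy as np

        def crossmap_dropout(x, p=0.5, training=True, rng=None):
            """x: activations shaped (batch, channels, height, width)."""
            if not training or p == 0.0:
                return x
            rng = rng or np.random.default_rng()
            b, c, h, w = x.shape
            # One mask per spatial location, broadcast over all feature maps,
            # so units at the same (h, w) in every map drop together.
            mask = (rng.random((b, 1, h, w)) >= p).astype(x.dtype)
            return x * mask / (1.0 - p)   # inverted-dropout scaling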

  13. Inductive Approaches to Improving Diagnosis and Design for Diagnosability

    NASA Technical Reports Server (NTRS)

    Fisher, Douglas H. (Principal Investigator)

    1995-01-01

    The first research area under this grant addresses the problem of classifying time series according to their morphological features in the time domain. A supervised learning system called CALCHAS, which induces a classification procedure for signatures from preclassified examples, was developed. For each of several signature classes, the system infers a model that captures the class's morphological features using Bayesian model induction and the minimum message length approach to assign priors. After induction, a time series (signature) is classified in one of the classes when there is enough evidence to support that decision. Time series with sufficiently novel features, belonging to classes not present in the training set, are recognized as such. A second area of research assumes two sources of information about a system: a model or domain theory that encodes aspects of the system under study and data from actual system operations over time. A model, when it exists, represents strong prior expectations about how a system will perform. Our work with a diagnostic model of the RCS (Reaction Control System) of the Space Shuttle motivated the development of SIG, a system which combines information from a model (or domain theory) and data. As it tracks RCS behavior, the model computes quantitative and qualitative values. Induction is then performed over the data represented by both the 'raw' features and the model-computed high-level features. Finally, work on clustering for operating mode discovery motivated some important extensions to the clustering strategy we had used. One modification appends an iterative optimization technique onto the clustering system; this optimization strategy appears to be novel in the clustering literature. A second modification improves the noise tolerance of the clustering system. In particular, we adapt resampling-based pruning strategies used by supervised learning systems to the task of simplifying hierarchical clusterings, thus making post-clustering analysis easier.

  14. Designation of Soap Molder Machine and Procedure for Transparent Soap

    NASA Astrophysics Data System (ADS)

    Mat Sharif, Zainon Binti; Taib, Norhasnina Binti Mohd; Yusof, Mohd Sallehuddin Bin; Rahim, Mohammad Zulafif Bin; Tobi, Abdul Latif Bin Mohd; Othman, Mohd Syafiq Bin

    2017-05-01

    Transparent soap is actually the combination of regular soap and a solvent; the solvent is added into the soap solution to produce the transparent characteristic. The problem with the previous production process was that tiny air bubbles were observed inside the soap, resulting in a less attractive appearance. The current method of producing the soap bar took more than 8 hours, with difficulties in removing the soap bar from the plastic mold and a low production rate. It is expected that the air bubble problem can be solved using the new soap molder machine, and the soap production rate is believed to increase with its invention. By reducing the production time from 8 hours to 2 hours, it improves the production rate significantly.

  15. French Regulatory practice and experience feedback on steam generator tube integrity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandon, G.

    1997-02-01

    This paper summarizes the way the French Safety Authority applies regulatory rules and practices to the problem of steam generator tube cracking in French PWR reactors. There are 54 reactors providing 80% of French electrical consumption. The Safety Authority closely monitors the performance of tubes in steam generators, and requires application of a program which deals with problems prior to the actual development of leakage. The actual rules regarding such performance are flexible, responding to the overall performance of operating steam generators. In addition there is an inservice inspection service to examine tubes during shutdown, and to monitor steam generators for leakage during operation, with guidelines for when generators must be pulled off line.

  16. Playing the Ponies: A $5 Million Embezzlement Case

    ERIC Educational Resources Information Center

    Howe, Martha A.; Malgwi, Charles A.

    2006-01-01

    Fraud is a pervasive problem, and educating future business leaders, managers, and auditors about fraud is one way to attack the problem. This instructional fraud case chronicles the actual details surrounding a major embezzlement at a regional high school (RHS) that culminated in long federal and state prison sentences for the school's treasurer.…

  17. The formaldehyde problem in wood-based products : an annotated bibliography

    Treesearch

    F. H. Max Nestler

    1977-01-01

    Urea-formaldehyde-type adhesives have the inherent characteristic of giving off free formaldehyde under some conditions of use. The vapor can build up to concentrations which can be a nuisance, uncomfortable, or an actual health hazard. The "formaldehyde problem" is reviewed, from literature sources, in five respects: origins, analytical, control and removal...

  18. Interparent Childrearing Disagreement, but Not Dissimilarity, Predicts Child Problems after Controlling for Parenting Effectiveness

    ERIC Educational Resources Information Center

    Chen, Mandy; Johnston, Charlotte

    2012-01-01

    Parental differences regarding childrearing may be operationalized as actual dissimilarity in the parenting actions or goals of the parents, or as perceived conflict or disagreement related to these dissimilarities. This study tested whether these two types of parental differences are each associated with child problems, independent of the…

  19. Elementary Students' Metacognitive Processes and Post-Performance Calibration on Mathematical Problem-Solving Tasks

    ERIC Educational Resources Information Center

    García, Trinidad; Rodríguez, Celestino; González-Castro, Paloma; González-Pienda, Julio Antonio; Torrance, Mark

    2016-01-01

    Calibration, or the correspondence between perceived performance and actual performance, is linked to students' metacognitive and self-regulatory skills. Making students more aware of the quality of their performance is important in elementary school settings, and more so when math problems are involved. However, many students seem to be poorly…

  20. The Problem of Bullying in Schools and the Promise of Positive Behaviour Supports

    ERIC Educational Resources Information Center

    Pugh, Roger; Chitiyo, Morgan

    2012-01-01

    Bullying in schools is recognised as a global problem. In the USA, school shootings and increasing school aggression focused research on the causes of bullying and interventions that could reduce or eliminate bullying behaviours. A variety of bullying programs have generated mixed results with some actually increasing bullying behaviours. There…

  1. Quickfire Challenges to Inspire Problem Solving

    ERIC Educational Resources Information Center

    Harper, Suzanne R.; Cox, Dana C.

    2017-01-01

    In the authors' attempts to incorporate problem solving into their mathematics courses, they have found that student ambition and creativity are often hampered by feelings of risk, as many students are conditioned to value a produced solution over the actual process of building one. Eliminating risk is neither possible nor desired. The challenge,…

  2. Appalachian Bridges to the Baccalaureate: How Community Colleges Affect Transfer Success

    ERIC Educational Resources Information Center

    Decker, Amber K.

    2011-01-01

    Statement of the problem. Too few community college students who intend to transfer and earn a baccalaureate degree actually do. This is a problem because postsecondary education is a key factor in economic mobility, and community colleges enroll a disproportionate number of nontraditional, part-time and low-income students. Although individual…

  3. The Impact of Federal Funds on Higher Education.

    ERIC Educational Resources Information Center

    Petersen, Phillip L.

    The various sources of federal funds and the subsequent problems in conforming to federal rules and regulations are considered. The actual U.S. funds for higher education for fiscal year 1978 and current federal programs dealing with financial aid to students are listed. One of the major problems in the administration of federal student aid…

  4. The ideal subject distance for passport pictures.

    PubMed

    Verhoff, Marcel A; Witzel, Carsten; Kreutz, Kerstin; Ramsthaler, Frank

    2008-07-04

    In an age of global combat against terrorism, the recognition and identification of people on document images is of increasing significance. Experiments and calculations have shown that the camera-to-subject distance - not the focal length of the lens - can have a significant effect on facial proportions. Modern passport pictures should be able to function as a reference image for automatic and manual picture comparisons. This requires a defined subject distance. It is completely unclear which subject distance is ideal for the recognition of the actual person in passport photographs. We show here that the camera-to-subject distance perceived as ideal depends on the face being photographed, although the distance of 2 m was most frequently preferred. So far the problem of the ideal camera-to-subject distance for faces has only been approached through technical calculations. We have, for the first time, answered this question experimentally with a double-blind experiment. Even if there is apparently no ideal camera-to-subject distance valid for every face, 2 m can be proposed as ideal for taking passport pictures. An actual first step would be to fix a standard camera-to-subject distance for the taking of passport pictures. From an anthropological point of view it would be interesting to find out which facial features lead to the preference of a shorter camera-to-subject distance and which to the preference of a longer one.
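
    The geometric reason distance matters can be seen with a simple pinhole-camera estimate (illustrative numbers, not the study's data): a feature lying a given depth behind the closest part of the face is magnified less, by the factor d/(d+depth), so facial proportions flatten as the distance d grows:

        # Apparent size ratio of a near facial feature (e.g. nose tip) to a far
        # one (e.g. ears), assuming a pinhole camera and ~0.12 m facial depth.
        def near_to_far_ratio(d_m, depth_m=0.12):
            return (d_m + depth_m) / d_m

        for d in (0.5, 1.0, 2.0, 4.0):
            print(f"{d} m: near features appear {near_to_far_ratio(d):.2f}x larger")
        # 0.5 m -> 1.24x, 2.0 m -> 1.06x: at 2 m the distortion largely vanishes.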

  5. Suppression of contrast-related artefacts in phase-measuring structured light techniques

    NASA Astrophysics Data System (ADS)

    Burke, Jan; Zhong, Liang

    2017-06-01

    Optical metrology using phase measurements has benefited significantly from the introduction of phase-shifting methods, first in interferometry, then also in fringe projection and fringe reflection. As opposed to interferometry, the latter two techniques generally use a spatiotemporal phase-shifting approach: A sequence of fringe patterns with varying spacing is used, and a phase map of each is generated by temporal phase shifting, to allow unique assignments of projector or screen pixels to camera pixels. One ubiquitous problem with phase-shifting structured-light techniques is that phase artefacts appear near regions of the image where the modulation amplitude of the projected or reflected fringes changes abruptly, e.g. near dirt/dust particles on the surface in deflectometry or bright-dark object colour transitions in fringe projection. Near the bright-dark boundaries, responses in the phase maps appear that are not plausible as actual surface features. The phenomenon has been known for a long time but is usually ignored because it does not compromise the overall reliability of results. In deflectometry, however, often the objective is to find and classify small defects, and of course it is then important to distinguish between bogus phase responses caused by fringe modulation changes, and actual surface defects. We present, for what we believe is the first time, an analytical derivation of the error terms, study the parameters influencing the phase artefacts (in particular the fringe period), and suggest some simple algorithms to minimise them.
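
    For reference, the temporal phase-shifting step mentioned above is commonly implemented with the textbook four-step formula; this generic version, not the authors' specific algorithm, recovers both the wrapped phase and the modulation amplitude B whose abrupt changes cause the artefacts discussed:

        import numpy as np

        def phase_and_modulation(I0, I1, I2, I3):
            """Four-step phase shifting: I_k = A + B*cos(phi + k*pi/2), k = 0..3."""
            phase = np.arctan2(I3 - I1, I0 - I2)           # wrapped phase map
            modulation = 0.5 * np.hypot(I3 - I1, I0 - I2)  # fringe amplitude B
            return phase, modulation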

  6. Effects of Shielding on Gamma Rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpius, Peter Joseph

    2017-03-13

    The interaction of gamma rays with matter results in an effect we call attenuation (i.e. 'shielding'). Attenuation can dramatically alter the appearance of a spectrum. Attenuating materials may actually create features in a spectrum via x-ray fluorescence.
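
    The attenuation itself follows the standard Beer-Lambert law (general physics, not specific to this presentation): the transmitted intensity through a shield of thickness x with linear attenuation coefficient mu is I = I0*exp(-mu*x). For example:

        import math

        def transmitted_fraction(mu_per_cm, thickness_cm):
            """Beer-Lambert: fraction of gammas passing a shield unscattered."""
            return math.exp(-mu_per_cm * thickness_cm)

        # ~662 keV gammas in lead: mu is roughly 1.2 per cm (approximate value).
        print(transmitted_fraction(1.2, 2.0))   # ~0.09, i.e. ~9% transmitted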

  7. Comparison of thawing and freezing dark energy parametrizations

    NASA Astrophysics Data System (ADS)

    Pantazis, G.; Nesseris, S.; Perivolaropoulos, L.

    2016-05-01

    Dark energy equation of state w(z) parametrizations with two parameters and given monotonicity are generically either convex or concave functions. This makes them suitable for fitting either freezing or thawing quintessence models but not both simultaneously. Fitting a data set based on a freezing model with an unsuitable (concave when increasing) w(z) parametrization [like Chevallier-Polarski-Linder (CPL)] can lead to significant misleading features like crossing of the phantom divide line, incorrect w(z=0), incorrect slope, etc., that are not present in the underlying cosmological model. To demonstrate this fact we generate scattered cosmological data at both the level of w(z) and the luminosity distance D_L(z) based on either thawing or freezing quintessence models and fit them using parametrizations of convex and of concave type. We then compare statistically significant features of the best fit w(z) with actual features of the underlying model. We thus verify that the use of unsuitable parametrizations can lead to misleading conclusions. In order to avoid these problems it is important to either use both convex and concave parametrizations and select the one with the best χ², or use principal component analysis, thus splitting the redshift range into independent bins. In the latter case, however, significant information about the slope of w(z) at high redshifts is lost. Finally, we propose a new family of parametrizations w(z) = w0 + wa*(z/(1+z))^n which generalizes the CPL and interpolates between thawing and freezing parametrizations as the parameter n increases to values larger than 1.
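
    The proposed family is simple to evaluate; a one-line sketch taken directly from the formula quoted above (parameter values are arbitrary examples):

        def w_of_z(z, w0=-1.0, wa=0.5, n=1.0):
            """w(z) = w0 + wa * (z / (1 + z))**n; n = 1 recovers CPL."""
            return w0 + wa * (z / (1.0 + z)) ** n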

  8. Centre-based restricted nearest feature plane with angle classifier for face recognition

    NASA Astrophysics Data System (ADS)

    Tang, Linlin; Lu, Huifen; Zhao, Liang; Li, Zuohua

    2017-10-01

    An improved classifier based on the nearest feature plane (NFP), called the centre-based restricted nearest feature plane with angle (RNFPA) classifier, is proposed here for face recognition problems. The well-known NFP uses the geometrical information of samples to increase the number of training samples, but it increases the computational complexity and also suffers from an inaccuracy problem caused by the extended feature plane. To solve these problems, RNFPA exploits a centre-based feature plane and utilizes an angle threshold to restrict the extended feature space. By choosing the appropriate angle threshold, RNFPA can improve performance and decrease computational complexity. Experiments on the AT&T face database, AR face database and FERET face database are used to evaluate the proposed classifier. Compared with the original NFP classifier, the nearest feature line (NFL) classifier, the nearest neighbour (NN) classifier and some other improved NFP classifiers, the proposed one achieves competitive performance.
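
    For orientation, the underlying NFP distance that RNFPA restricts can be sketched as a point-to-plane distance (a generic construction from the NFP literature, not the paper's exact RNFPA code):

        import numpy as np

        def nfp_distance(x, p1, p2, p3):
            """Distance from query x to the feature plane through p1, p2, p3."""
            basis = np.stack([p2 - p1, p3 - p1], axis=1)       # plane directions
            coef, *_ = np.linalg.lstsq(basis, x - p1, rcond=None)
            projection = p1 + basis @ coef                      # foot of perpendicular
            return np.linalg.norm(x - projection)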

  9. Perceptions and characteristics of registered nurses' involvement in decision making.

    PubMed

    Mangold, Kara L; Pearson, Kristina K; Schmitz, Julie R; Scherb, Cindy A; Specht, Janet P; Loes, Jean L

    2006-01-01

    This study aimed to determine the level of actual and preferred decisional involvement and ascertain whether there is decisional dissonance among registered nurses (RNs). A convenience sample of 196 RNs completed a demographic form and the Decisional Involvement Scale, a tool that measures actual and preferred decisional involvement for RNs in 6 categories: unit staffing, quality of professional practice, professional recruitment, unit governance and leadership, quality of support staff practice, and collaboration/liaison activities. From these data, the level of and difference between RNs' actual and preferred decisional involvement was analyzed. In addition, the impact of level of education, years of experience, hours worked per pay period, and work setting on actual and preferred decisional involvement was measured. A statistically significant difference was found between RNs' actual and preferred decisional involvement, with RNs preferring more decisional involvement than they actually experienced. Work setting was the only variable to which a difference could be attributed. Further study is warranted to find causes of decisional dissonance and interventions that could help alleviate the problem and potentially increase job satisfaction.

  10. Adolescent psychological and academic adjustment as a function of discrepancies between actual and ideal self-perceptions.

    PubMed

    Ferguson, Gail M; Hafen, Christopher A; Laursen, Brett

    2010-12-01

    Actual-ideal discrepancies are associated with adolescent emotional distress and there is evidence that the size of discrepancies matters. However, the direction of discrepancies has not been examined, perhaps due to limitations of widely used self-discrepancy measures. Two hundred and twelve 7th, 9th and 11th grade students (59% female) in a public school in Jamaica described their actual and ideal selves in several different domains--friendship, dating, schoolwork, family, sports, and religion/spirituality--using a Pie measure. Students also completed measures of depressive symptoms, self-esteem, and academic achievement. Discrepancies favoring the ideal self and those favoring the actual self were linked to depressive symptoms, low self-esteem, and poor school grades in the domains of friendship, dating, and schoolwork. Effects were stronger among older adolescents than among younger adolescents. Theories of actual/ideal self-discrepancies have focused on problems arising when the ideal self overshadows the actual self; however, the present study finds that self-discrepancies, regardless of their direction, are a liability. Implications for self-discrepancy measurement, adolescent development, and clinical practice are discussed.

  11. Pharmacist care plans and documentation of follow-up before the Iowa Pharmaceutical Case Management program.

    PubMed

    Becker, CoraLynn; Bjornson, Darrel C; Kuhle, Julie W

    2004-01-01

    To document drug therapy problems and their causes and assess pharmacist follow-up of patients with identified drug therapy problems. Cross-sectional analysis. Iowa. 160 pharmacists who submitted 754 pharmaceutical care plans in an effort to qualify for participation in the Iowa Pharmaceutical Case Management program. Care plans were assessed for drug therapy problems and causes and for documentation of pharmacist follow-up (actual, none, or intent to follow up). Pharmacists documented a wide variety of drug therapy problems and causes, including adverse drug reactions (20.1% of care plans), need for additional drug therapy (18.9%), lack of patient adherence to therapy (16.3%), incorrect medication being prescribed (14.1%), and drug dose too high (10.0%). Pharmacist follow-up with patients was not optimal, with 31% of care plans providing documentation of actual follow-up. Another 42.2% of plans indicated that the pharmacist intended to contact the patient for follow-up but either did not do so or did not record the intervention. No actual follow-up or intent to follow up was recorded in 26.8% of care plans. Pharmacists practicing in independent pharmacies followed up with patients more frequently than those in other settings (36.4% of care plans, compared with 22.7%, 23.2%, and 28.4% for chain, clinic, and franchise pharmacies). Pharmacists were more likely to follow up when the identified problem involved drug safety rather than effectiveness (36.2% versus 28.3% of care plans). Documentation of pharmacist follow-up with patients was less than optimal. In addition to identifying drug therapy problems and causes, pharmacists must complete the care continuum through documentation of patient monitoring and follow-up to transform the philosophy and vision of the pharmaceutical care concept into a practice of pharmacy recognized and rewarded by patients and payers.

  12. Scattering features for lung cancer detection in fibered confocal fluorescence microscopy images.

    PubMed

    Rakotomamonjy, Alain; Petitjean, Caroline; Salaün, Mathieu; Thiberville, Luc

    2014-06-01

    To assess the feasibility of lung cancer diagnosis using the fibered confocal fluorescence microscopy (FCFM) imaging technique and scattering features for pattern recognition. FCFM is a new medical imaging technique whose value for diagnosis has yet to be established. This paper addresses the problem of lung cancer detection using FCFM images and, as a first contribution, assesses the feasibility of computer-aided diagnosis through these images. Towards this aim, we have built a pattern recognition scheme which involves a feature extraction stage and a classification stage. The second contribution relies on the features used for discrimination. Indeed, we have employed the so-called scattering transform for extracting discriminative features, which are robust to small deformations in the images. We have also compared and combined these features with classical yet powerful features like local binary patterns (LBP) and their variants denoted as local quinary patterns (LQP). We show that scattering features yielded better recognition performance than classical features like LBP and their LQP variants for the FCFM image classification problem. Another finding is that LBP-based and scattering-based features provide complementary discriminative information and, in some situations, we empirically establish that performance can be improved when jointly using LBP, LQP and scattering features. In this work we analyze the joint capability of FCFM images and scattering features for lung cancer diagnosis. The proposed method achieves a good recognition rate for such a diagnosis problem. It also performs well when used in conjunction with other features for other classical medical imaging classification problems. Copyright © 2014 Elsevier B.V. All rights reserved.
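
    Of the baseline descriptors mentioned, local binary patterns are compact enough to sketch; below is a minimal 8-neighbour LBP in its classical formulation, not the authors' pipeline:

        import numpy as np

        def lbp_codes(img):
            """Per-pixel 8-bit LBP codes; histogram them for a texture feature."""
            h, w = img.shape
            center = img[1:-1, 1:-1]
            out = np.zeros((h - 2, w - 2), dtype=np.uint8)
            offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                       (1, 1), (1, 0), (1, -1), (0, -1)]
            for bit, (dy, dx) in enumerate(offsets):
                neighbor = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
                out |= (neighbor >= center).astype(np.uint8) << bit
            return out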

  13. THE EXPERIENCE IN THE UNITED STATES WITH REACTOR OPERATION AND REACTOR SAFEGUARDS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCullough, C.R.

    1958-10-31

    Reactors are operating or planned at locations in the United States in cities, near cities, and at remote locations. There is a general pattern that the higher power reactors are not in, but fairly near, cities, and the testing reactors for more hazardous experiments are at remote locations. A great deal has been done on the theoretical and experimental study of important features of reactor design. The metal-water reaction is still a theoretical possibility but tests of fuel element burnout under conditions approaching reactor operation gave no reaction. It appears that nucleate boiling does not necessarily result in steam blanketing and fuel melting. Much attention is being given to the calculation of core kinetics but it is being found that temperature, power, and void coefficients cannot be calculated with accuracy and experiments are required. Some surprises are found giving positive localized void coefficients. Possible oscillatory behavior of reactors is being given careful study. No dangerous oscillations have been found in operating reactors but oscillations have appeared in experiments. The design of control and safety systems varies with different constructors. The relation of control to the kinetic behavior of the reactor is being studied. The importance of sensing element locations in order to know actual local reactor power level is being recognized. The time constants of instrumentation as related to reactor kinetics are being studied. Pressure vessels for reactors are being designed and manufactured. Many of these are beyond any previous experience. The stress problem is being given careful study. The effect of radiation is being studied experimentally. The stress problems of piping and pressure vessels are a difficult design problem being met successfully in reactor plants. The proper organization and procedure for operation of reactors is being evolved for research, testing, and power reactors. The importance of written standards and instructions for both normal and abnormal operating conditions is recognized. Confinement of radioactive materials either by tight steel shells, tight buildings, or semi-tight structures vented through filters is considered necessary in the United States. A discussion will be given of specifications, construction, and testing of these structures. The need for emergency plans has been stressed by recent experiences in radioactive releases. The problems of such plans to cover all grades of accidents will be discussed. The theoretical consequences of releases of radioactive materials have been studied and these results will be compared with actual experience. The problem of exposures from normal and abnormal operation of reactors is a problem of design and operation on one hand and the amount of damage to be expected on the other. The safeguard problem is closely related to the acceptable doses of radioactivity which the ICRP recommends. The future of atomic energy depends upon adequate safeguards and economical design and operation. Accepted criteria are required to guide designers as to the proper balance of caution and boldness. (auth)

  14. Assessing the impact of representational and contextual problem features on student use of right-hand rules

    NASA Astrophysics Data System (ADS)

    Kustusch, Mary Bridget

    2016-06-01

    Students in introductory physics struggle with vector algebra and these challenges are often associated with contextual and representational features of the problems. Performance on problems about cross product direction is particularly poor and some research suggests that this may be primarily due to misapplied right-hand rules. However, few studies have had the resolution to explore student use of right-hand rules in detail. This study reviews literature in several disciplines, including spatial cognition, to identify ten contextual and representational problem features that are most likely to influence performance on problems requiring a right-hand rule. Two quantitative measures of performance (correctness and response time) and two qualitative measures (methods used and type of errors made) were used to explore the impact of these problem features on student performance. Quantitative results are consistent with expectations from the literature, but reveal that some features (such as the type of reasoning required and the physical awkwardness of using a right-hand rule) have a greater impact than others (such as whether the vectors are placed together or separate). Additional insight is gained by the qualitative analysis, including identifying sources of difficulty not previously discussed in the literature and revealing that the use of supplemental methods, such as physically rotating the paper, can mitigate errors associated with certain features.

  15. Stabilizing l1-norm prediction models by supervised feature grouping.

    PubMed

    Kamkar, Iman; Gupta, Sunil Kumar; Phung, Dinh; Venkatesh, Svetha

    2016-02-01

    Emerging Electronic Medical Records (EMRs) have reformed modern healthcare. These records have great potential to be used for building clinical prediction models. However, a problem in using them is their high dimensionality. Since a lot of information may not be relevant for prediction, the underlying complexity of the prediction models may not be high. A popular way to deal with this problem is to employ feature selection. Lasso and l1-norm based feature selection methods have shown promising results. But, in the presence of correlated features, these methods select features that change considerably with small changes in data. This prevents clinicians from obtaining a stable feature set, which is crucial for clinical decision making. Grouping correlated variables together can improve the stability of feature selection; however, such grouping is usually not known and needs to be estimated for optimal performance. Addressing this problem, we propose a new model that can simultaneously learn the grouping of correlated features and perform stable feature selection. We formulate the model as a constrained optimization problem and provide an efficient solution with guaranteed convergence. Our experiments with both synthetic and real-world datasets show that the proposed model is significantly more stable than Lasso and many existing state-of-the-art shrinkage and classification methods. We further show that in terms of prediction performance, the proposed method consistently outperforms Lasso and other baselines. Our model can be used for selecting stable risk factors for a variety of healthcare problems, so it can assist clinicians toward accurate decision making. Copyright © 2015 Elsevier Inc. All rights reserved.
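
    The instability being addressed is easy to reproduce; a small sketch with invented data and scikit-learn's standard Lasso (not the proposed method), where two nearly identical predictors can swap in and out of the selected set across bootstrap resamples:

        import numpy as np
        from sklearn.linear_model import Lasso

        rng = np.random.default_rng(0)
        n = 200
        x1 = rng.normal(size=n)
        x2 = x1 + 0.01 * rng.normal(size=n)        # highly correlated twin
        X = np.column_stack([x1, x2, rng.normal(size=(n, 3))])
        y = x1 + 0.1 * rng.normal(size=n)

        for seed in range(3):
            idx = np.random.default_rng(seed).integers(0, n, size=n)  # bootstrap
            coef = Lasso(alpha=0.1).fit(X[idx], y[idx]).coef_
            print(np.round(coef, 2))    # the selected support can differ per run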

  16. Quantifying site-specific physical heterogeneity within an estuarine seascape

    USGS Publications Warehouse

    Kennedy, Cristina G.; Mather, Martha E.; Smith, Joseph M.

    2017-01-01

    Quantifying physical heterogeneity is essential for meaningful ecological research and effective resource management. Spatial patterns of multiple, co-occurring physical features are rarely quantified across a seascape because of methodological challenges. Here, we identified approaches that measured total site-specific heterogeneity, an often overlooked aspect of estuarine ecosystems. Specifically, we examined 23 metrics that quantified four types of common physical features: (1) river and creek confluences, (2) bathymetric variation including underwater drop-offs, (3) land features such as islands/sandbars, and (4) major underwater channel networks. Our research at 40 sites throughout Plum Island Estuary (PIE) provided solutions to two problems. The first problem was that individual metrics that measured heterogeneity of a single physical feature showed different regional patterns. We solved this first problem by combining multiple metrics for a single feature using a within-physical feature cluster analysis. With this approach, we identified sites with four different types of confluences and three different types of underwater drop-offs. The second problem was that when multiple physical features co-occurred, new patterns of total site-specific heterogeneity were created across the seascape. This pattern of total heterogeneity has potential ecological relevance to structure-oriented predators. To address this second problem, we identified sites with similar types of total physical heterogeneity using an across-physical feature cluster analysis. Then, we calculated an additive heterogeneity index, which integrated all physical features at a site. Finally, we tested if site-specific additive heterogeneity index values differed for across-physical feature clusters. In PIE, the sites with the highest additive heterogeneity index values were clustered together and corresponded to sites where a fish predator, adult striped bass (Morone saxatilis), aggregated in a related acoustic tracking study. In summary, we have shown general approaches to quantifying site-specific heterogeneity.
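
    The abstract does not spell out the index formula, but one plausible construction of an additive heterogeneity index is to z-score each site-level metric so all feature types share a common scale and then sum; a sketch with invented values:

        import numpy as np

        def additive_heterogeneity_index(metrics):
            """metrics: (n_sites, n_metrics) array of physical-feature metrics."""
            z = (metrics - metrics.mean(axis=0)) / metrics.std(axis=0)
            return z.sum(axis=1)          # one integrated index value per site

        sites = np.random.rand(40, 23)    # 40 sites x 23 metrics, as in the study
        print(additive_heterogeneity_index(sites).round(2))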

  17. Awareness campaign. Orthopedic Hospital of Oklahoma launches awareness campaign.

    PubMed

    2007-01-01

    The Orthopedic Hospital of Oklahoma is a 25-bed inpatient and outpatient center with one focus: Orthopedics. To acquaint people with its services and build brand awareness to drive market share, the hospital launched a print campaign featuring actual patients.

  18. Prospect of EUV mask repair technology using e-beam tool

    NASA Astrophysics Data System (ADS)

    Kanamitsu, Shingo; Hirano, Takashi; Suga, Osamu

    2010-09-01

    Currently, repair machines used for advanced photomasks utilize principal methods such as FIB, AFM, and EB. Each has specific characteristics and thus an opportunity to be used in suitable situations. But when it comes to the EUV generation, pattern sizes are expected to be as small as under 80 nm, so higher image resolution and repair accuracy are needed from these machines. Because FIB machines have an intrinsic damage problem induced by Ga ions and AFM machines have a critical tip size issue, those machines are basically difficult to apply to the EUV generation. Consequently, we focused on the EB repair tool for this research work. The EB repair tool has passed practical milestones for MoSi-based masks. We applied the same process used for MoSi to EUV blanks and confirmed its reaction. We then found some severe problems showing uncontrollable features due to the enormously strong reaction between the etching gas and the absorber material. Though we could etch an opaque defect with the conventional method and get the edge shaped straight in top-down SEM viewing, there were problems such as sidewall undercut or local erosion depending on the defect shape. To cope with these problems, the tool vendor has developed a new process and reported it at an international conference [1]. We have evaluated the new process mentioned above in detail. In this paper, we report the results of those evaluations. Several experiments on repair accuracy, process stability, and other items have been done under estimated practical conditions assuming defects of diversified size and shape. A series of actual printability tests is also included. On the basis of these experiments, we consider the possibility of EB-repair application for 20 nm patterns.

  19. Comparison of Naive Bayes and Decision Tree on Feature Selection Using Genetic Algorithm for Classification Problem

    NASA Astrophysics Data System (ADS)

    Rahmadani, S.; Dongoran, A.; Zarlis, M.; Zakarias

    2018-03-01

    This paper discusses the problem of feature selection using genetic algorithms (GA) on a dataset for classification problems. The classification models used are the decision tree (DT) and Naive Bayes. We discuss how the Naive Bayes and Decision Tree models overcome the classification problem on the dataset, where the dataset features are selectively chosen using GA. The performance of the two models is then compared to determine whether there is an increase in accuracy. The results obtained show an increase in accuracy when feature selection uses GA. The proposed models are referred to as GADT (GA-Decision Tree) and GANB (GA-Naive Bayes). The data sets tested in this paper are taken from the UCI Machine Learning repository.
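
    A compact sketch of the GADT idea: a genetic algorithm searching over feature-subset bitmasks with a decision tree's cross-validated accuracy as fitness. The operators and parameters below are generic assumptions, since the abstract does not list them:

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        def ga_select(X, y, pop=20, gens=30, p_mut=0.1, seed=0):
            rng = np.random.default_rng(seed)
            n = X.shape[1]
            masks = rng.random((pop, n)) < 0.5                   # initial population

            def fitness(m):
                if not m.any():
                    return 0.0
                return cross_val_score(DecisionTreeClassifier(), X[:, m], y, cv=3).mean()

            for _ in range(gens):
                scores = np.array([fitness(m) for m in masks])
                parents = masks[np.argsort(scores)[-pop // 2:]]  # keep the fittest half
                children = []
                while len(parents) + len(children) < pop:
                    a, b = parents[rng.integers(len(parents), size=2)]
                    cut = rng.integers(1, n)
                    child = np.concatenate([a[:cut], b[cut:]])   # one-point crossover
                    child ^= rng.random(n) < p_mut               # bit-flip mutation
                    children.append(child)
                masks = np.vstack([parents] + children)
            return max(masks, key=fitness)                       # best feature bitmask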

  20. Voyager Outreach Compilation

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This NASA JPL (Jet Propulsion Laboratory) video presents a collection of the best videos that have been published of the Voyager mission. Computer animation/simulations comprise the largest portion of the video and include outer planetary magnetic fields, outer planetary lunar surfaces, and the Voyager spacecraft trajectory. Voyager visited the four outer planets: Jupiter, Saturn, Uranus, and Neptune. The video contains some live shots of Jupiter (actual), the Earth's moon (from orbit), Saturn (actual), Neptune (actual) and Uranus (actual), but is mainly comprised of computer animations of these planets and their moons. Some of the individual short videos that are compiled are entitled: The Solar System; Voyage to the Outer Planets; A Tour of the Solar System; and the Neptune Encounter. Computerized simulations of Viewing Neptune from Triton, Diving over Neptune to Meet Triton, and Catching Triton in its Retrograde Orbit are included. Several animations of Neptune's atmosphere, rotation and weather features as well as significant discussion of the planet's natural satellites are also presented.

  1. Design and analysis of a worm gear turntable off-axis assembly method in a three-grating monochromator.

    PubMed

    Chen, Jianjun; Cui, Jicheng; Yao, Xuefeng; Liu, Jianan; Sun, Ci

    2018-04-01

    To solve the problem where the actual grating aperture decreases with an increasing scanning angle during the scanning of a three-grating monochromator, we propose an off-axis assembly method for the worm gear turntable that makes it possible to suppress this aperture reduction. We simulated and compared the traditional assembly method with the off-axis assembly method in the three-grating monochromator. Results show that the actual grating aperture can be improved by the off-axis assembly method. In fact, for any one of the three gratings, when the monochromator outputs the longest wavelength in the corresponding wavelength band, the actual grating aperture increases by 45.93%. Over the entire monochromator output band, the actual grating aperture increased by an average of 32.56% and can thus improve the monochromator's output energy. Improvement of the actual grating aperture can also reduce the stray light intensity in the monochromator and improve its output signal-to-noise ratio.

  2. [Features of neurologic semiotics at chronic obstructive pulmonary disease].

    PubMed

    Litvinenko, I V; Baranov, V L; Kolcheva, Iu A

    2011-01-01

    Chronic obstructive pulmonary disease (COPD) is a topical pathology in which mixed hypoxemia develops. Under conditions of chronic hypoxemia, body structures with a high level of metabolic processes, namely brain tissues, suffer. The character of central nervous system involvement in this pathology is insufficiently studied. In this article we studied and analysed the presence of such changes as depression, anxiety and cognitive impairment, and the features of neurologic semiotics, in 50 patients with COPD.

  3. "Righting" the Writing Problem.

    ERIC Educational Resources Information Center

    Shaughnessy, Michael F.; Eastham, Nicholas

    The problem of college students' writing skills or lack thereof is generally agreed upon in academia. One cause is the inordinate amount of multiple choice/true false/fill in the blank type of tests that students take in high school and college. Not only is there a dearth of actual classes in writing available, few students recognize the need…

  4. Forest processes from stands to landscapes: exploring model forecast uncertainties using cross-scale model comparison

    Treesearch

    Michael J. Papaik; Andrew Fall; Brian Sturtevant; Daniel Kneeshaw; Christian Messier; Marie-Josee Fortin; Neal Simon

    2010-01-01

    Forest management practices conducted primarily at the stand scale result in simplified forests with regeneration problems and low structural and biological diversity. Landscape models have been used to help design management strategies to address these problems. However, there remains a great deal of uncertainty that the actual management practices result in the...

  5. Biological system interactions.

    PubMed Central

    Adomian, G; Adomian, G E; Bellman, R E

    1984-01-01

    Mathematical modeling of cellular population growth, interconnected subsystems of the body, blood flow, and numerous other complex biological systems problems involves nonlinearities and generally randomness as well. Such problems have been dealt with by mathematical methods often changing the actual model to make it tractable. The method presented in this paper (and referenced works) allows much more physically realistic solutions. PMID:6585837

  6. Using Study Guides To Help Students Focus Their Reading in the Basic Course.

    ERIC Educational Resources Information Center

    Blakeman, David A.; Young, Raymond W.

    One problem that surfaced with the speech communication basic course (COM 105) at Valdosta State University (Georgia) was that the actual content covered by individual instructors varied widely, so widely that two given sections taught by different instructors may bear little resemblance to one another. This problem was addressed first through a…

  7. Do They "Really" Get It? Evaluating Evidence of Student Understanding of Power Series

    ERIC Educational Resources Information Center

    Kung, David; Speer, Natasha

    2013-01-01

    Most teachers agree that if a student understands a particular mathematical topic well, he/she will probably be able to do problems correctly. The converse, however, frequently fails: students who do problems correctly sometimes do not actually have robust understandings of the topic in question. In this paper we explore this phenomenon in the…

  8. Parsing Protocols Using Problem Solving Grammars. AI Memo 385.

    ERIC Educational Resources Information Center

    Miller, Mark L.; Goldstein, Ira P.

    A theory of the planning and debugging of computer programs is formalized as a context free grammar, which is used to reveal the constituent structure of problem solving episodes by parsing protocols in which programs are written, tested, and debugged. This is illustrated by the detailed analysis of an actual session with a beginning student…

  9. The Problem of Correspondence of Educational and Professional Standards (Results of Empirical Research)

    ERIC Educational Resources Information Center

    Piskunova, Elena; Sokolova, Irina; Kalimullin, Aydar

    2016-01-01

    In the article, the problem of correspondence between educational standards of higher pedagogical education and teacher professional standards in Russia is brought into focus. Modern understanding of the quality of vocational education suggests that in the process of education the student develops a set of competencies that will enable him or her to carry out…

  10. Motivational Influences of Using Peer Evaluation in Problem-Based Learning in Medical Education

    ERIC Educational Resources Information Center

    Abercrombie, Sara; Parkes, Jay; McCarty, Teresita

    2015-01-01

    This study investigates the ways in which medical students' achievement goal orientations (AGO) affect their perceptions of learning and actual learning from an online problem-based learning environment, Calibrated Peer Review™. First, the tenability of a four-factor model (Elliot & McGregor, 2001) of AGO was tested with data collected from…

  11. Problems and Procedures in Planning a Situation Based Video Test on Teaching.

    ERIC Educational Resources Information Center

    Masonis, Edward J.

    This paper briefly outlines some problems one must solve when developing a video-based test to evaluate what a teacher knows about learning and instruction. Consideration is given to the effect the use of videotapes of actual classroom behavior has on test planning. Two methods of incorporating such situational material into the test…

  12. A Problem-Based Learning Approach to Teacher Training: Findings after Program Redesign

    ERIC Educational Resources Information Center

    Caukin, Nancy; Dillard, Heather; Goodin, Terry

    2016-01-01

    This study reports on Residency I, the first semester of a yearlong residency that utilizes problem-based learning scenarios, combined with field work, that covers both content and context and is meant to positively impact teacher candidates' self-efficacy as well as their actual efficacy as measured by scores on the edTPA. This quantitative…

  13. Implementing Problem-Based Learning in the Counseling Session.

    ERIC Educational Resources Information Center

    Hall, Kimberly R.

    This study examined the use of problem-based learning (PBL) in an actual counseling session and the effects on student assertiveness skills. A group of seventh-grade students, who were all victims of bullies, participated in the study. The students, two boys and one girl, were 13 and 14 years old. Teachers rated the level of assertiveness skills…

  14. The Effects of Spacing and Birth Order on Problem-Solving Competence of Preschool Children. Progress Report.

    ERIC Educational Resources Information Center

    McGillicuddy-DeLisi, Ann V.; Sigel, Irving

    This study investigates the impact of family configuration and parent education-income level on parental beliefs, the relationship between these beliefs and actual parental practices, and the effect of parental practices on children's problem-solving abilities. One hundred twenty intact families participated in the study. Forty families consisted…

  15. Evaluation of Voice Acoustics as Predictors of Clinical Depression Scores.

    PubMed

    Hashim, Nik Wahidah; Wilkes, Mitch; Salomon, Ronald; Meggs, Jared; France, Daniel J

    2017-03-01

    The aim of the present study was to determine if acoustic measures of voice, characterizing specific spectral and timing properties, predict clinical ratings of depression severity measured in a sample of patients using the Hamilton Depression Rating Scale (HAMD) and Beck Depression Inventory (BDI-II). This is a prospective study. Voice samples and clinical depression scores were collected prospectively from consenting adult patients who were referred to psychiatry from the adult emergency department or primary care clinics. The patients were audio-recorded as they read a standardized passage in a nearly closed-room environment. Mean Absolute Error (MAE) between actual and predicted depression scores was used as the primary outcome measure. The average MAE between predicted and actual HAMD scores was approximately two scores for both men and women, and the MAE for the BDI-II scores was approximately one score for men and eight scores for women. Timing features were predictive of HAMD scores in female patients while a combination of timing features and spectral features was predictive of scores in male patients. Timing features were predictive of BDI-II scores in male patients. Voice acoustic features extracted from read speech demonstrated variable effectiveness in predicting clinical depression scores in men and women. Voice features were highly predictive of HAMD scores in men and women, and BDI-II scores in men, respectively. The methodology is feasible for diagnostic applications in diverse clinical settings as it can be implemented during a standard clinical interview in a normal closed room and without strict control on the recording environment. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
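
    The primary outcome measure reduces to a one-line computation; with illustrative values, not study data:

        import numpy as np

        def mean_absolute_error(actual, predicted):
            """MAE between clinician-rated and model-predicted scores."""
            return float(np.mean(np.abs(np.asarray(actual) - np.asarray(predicted))))

        # e.g. clinician HAMD ratings vs. model predictions (made-up values):
        print(mean_absolute_error([18, 22, 9], [16, 25, 10]))   # -> 2.0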

  16. Experimental issues related to frequency response function measurements for frequency-based substructuring

    NASA Astrophysics Data System (ADS)

    Nicgorski, Dana; Avitabile, Peter

    2010-07-01

    Frequency-based substructuring is a very popular approach for the generation of system models from component measured data. Analytically the approach has been shown to produce accurate results. However, implementation with actual test data can cause difficulties and lead to problems with the system response prediction. In order to produce good results, extreme care is needed in the measurement of the drive point and transfer impedances of the structure, as well as in observing all the conditions for a linear time invariant system. Several studies have been conducted to show the sensitivity of the technique to small variations that often occur during typical testing of structures. These variations have been observed in actual tested configurations and have been substantiated with analytical models to replicate the problems typically encountered. The use of analytically simulated issues helps to clearly see the effects of typical measurement difficulties often observed in test data. This paper presents some of these common problems observed and provides guidance and recommendations for data to be used for this modeling approach.

  17. Optical potential approach to the electron-atom impact ionization threshold problem

    NASA Technical Reports Server (NTRS)

    Temkin, A.; Hahn, Y.

    1973-01-01

    The problem of the threshold law for electron-atom impact ionization is reconsidered as an extrapolation of inelastic cross sections through the ionization threshold. The cross sections are evaluated from a distorted wave matrix element, the final state of which describes the scattering from the Nth excited state of the target atom. The actual calculation is carried out for the e-H system, and a model is introduced which is shown to preserve the essential properties of the problem while at the same time reducing the dimensionality of the Schrodinger equation. Nevertheless, the scattering equation is still very complex. It is dominated by the optical potential, which is expanded in terms of the eigen-spectrum of QHQ. It is shown by actual calculation that the lower eigenvalues of this spectrum descend below the relevant inelastic thresholds; it follows rigorously that the optical potential contains repulsive terms. Analytical solutions of the final state wave function are obtained with several approximations of the optical potential.

  18. Intellectual technologies in the problems of thermal power engineering control: formalization of fuzzy information processing results using the artificial intelligence methodology

    NASA Astrophysics Data System (ADS)

    Krokhin, G.; Pestunov, A.

    2017-11-01

    The operation of power stations in variable modes, and the related changes in their technical state, has made it urgent to create models for decision-making and state recognition based on diagnostics, using fuzzy logic to identify equipment state and to manage recovery processes. There is no unified methodological approach for obtaining the relevant information when the raw information about the equipment state is fuzzy and inhomogeneous. The existing methods for extracting knowledge are usually unable to ensure correspondence between the model parameters of the aggregates and the actual object state. The switchover of power engineering from preventive repair to repair scheduled according to the actual technical state has increased the responsibility of those who estimate the volume and duration of the work. This may lead to inadequate diagnostics and decision-making models if the corresponding methodological preparations do not take fuzziness into account, because the state information is inherently of this kind. In this paper, we introduce a new model which formalizes the equipment state using not only exact information but fuzzy information as well. This model reflects the actual state more adequately than traditional analogs and may be used to increase the efficiency and the service period of power installations.

  19. Manchester visual query language

    NASA Astrophysics Data System (ADS)

    Oakley, John P.; Davis, Darryl N.; Shann, Richard T.

    1993-04-01

    We report a database language for visual retrieval which allows queries on image feature information that has been computed and stored along with images. The language is novel in that it provides facilities for dealing with feature data actually obtained from image analysis. Each line in the Manchester Visual Query Language (MVQL) takes a set of objects as input and produces another, usually smaller, set as output. The MVQL constructs are mainly based on proven operators from the field of digital image analysis. An example is the Hough-group operator, which takes as input a specification for the objects to be grouped, a specification for the relevant Hough space, and a definition of the voting rule. The output is a ranked list of high-scoring bins. The query could be directed towards one particular image or an entire image database; in the latter case the bins in the output list would in general be associated with different images. We have implemented MVQL in two layers. The command interpreter is a Lisp program which maps each MVQL line to a sequence of commands used to control a specialized database engine. The latter is a hybrid graph/relational system which provides low-level support for inheritance and schema evolution. In the paper we outline the language and provide examples of useful queries. We also describe our solution to the engineering problems associated with the implementation of MVQL.
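
    MVQL syntax itself is not shown in the abstract; as a purely hypothetical Python analogue of its pipeline style, each function below maps a set of objects to a usually smaller set, with a toy stand-in for the Hough-group operator (the operator names, object representation, and voting rule are all invented).

    ```python
    def select_edges(objects):
        """One pipeline step: keep only edge objects (smaller output set)."""
        return [o for o in objects if o["kind"] == "edge"]

    def hough_group(objects, min_votes=3):
        """Toy stand-in for a Hough-group operator: bucket edges by rounded
        angle (the 'Hough space') and return high-scoring bins, ranked."""
        bins = {}
        for o in objects:
            bins.setdefault(round(o["angle"], -1), []).append(o)  # voting rule
        ranked = sorted(bins.items(), key=lambda kv: len(kv[1]), reverse=True)
        return [(angle, members) for angle, members in ranked
                if len(members) >= min_votes]

    image_objects = [{"kind": "edge", "angle": a} for a in (0, 2, 4, 90, 88, 1, 3)]
    print([(a, len(m)) for a, m in hough_group(select_edges(image_objects))])
    # -> [(0, 5)]  (one high-scoring bin of near-horizontal edges)
    ```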

  20. Pinkeye

    ERIC Educational Resources Information Center

    Delaney, Patricia A.

    1972-01-01

    There are actually four major causes of conjunctivitis (pinkeye): infection, allergy, injury, and systemic diseases such as measles. The most common cause, and the one which causes the greatest problem in schools, is infection. (Author)

  1. Experiences and results multitasking a hydrodynamics code on global and local memory machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandell, D.

    1987-01-01

    A one-dimensional, time-dependent Lagrangian hydrodynamics code using a Godunov solution method has been multitasked for the Cray X-MP/48, the Intel iPSC hypercube, the Alliant FX series, and the IBM RP3 computers. Actual multitasking results have been obtained for the Cray, Intel, and Alliant computers, and simulated results were obtained for the Cray and RP3 machines. The differences in the methods required to multitask on each of the machines are discussed. Results are presented for a sample problem involving a shock wave moving down a channel. Comparisons are made between theoretical speedups, predicted by Amdahl's law, and the actual speedups obtained. The problems of debugging on the different machines are also described.
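
    A small sketch of the comparison described above, with an invented parallel fraction and invented wall-clock times: Amdahl's law gives the theoretical speedup, which is then set against a measured one.

    ```python
    def amdahl_speedup(parallel_fraction, n_processors):
        """Theoretical speedup from Amdahl's law: S = 1 / ((1 - p) + p / N)."""
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_processors)

    p = 0.95                      # assumed parallelizable share of the code
    for n in (2, 4, 8):
        print(n, "processors, theoretical speedup:", round(amdahl_speedup(p, n), 2))

    t1, t4 = 120.0, 34.5          # hypothetical wall-clock seconds on 1 and 4 CPUs
    print("actual speedup:", round(t1 / t4, 2),
          "vs theoretical:", round(amdahl_speedup(p, 4), 2))
    ```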

  2. Multiple response to sound in dysfunctional children.

    PubMed

    Condon, W S

    1975-03-01

    Methods and findings derived from over a decade of linguistic-kinesic microanalysis of sound films of human behavior were applied to the analysis of sound films of 25 dysfunctional children. Of the children, 17 were markedly dysfunctional (autistic-like) while 8 had milder reading problems. All of these children appeared to respond to sound more than once: when it actually occurred and again after a delay ranging from a fraction of a second up to a full second, depending on the child. Most of the children did not seem to actually hear the sound more than once; however, there is some indication that a few children may have done so. Evidence was also found suggesting a continuum from the longer delay of autistic-like children to the briefer delay of children with reading problems.

  3. Mental sets in conduct problem youth with psychopathic features: entity versus incremental theories of intelligence.

    PubMed

    Salekin, Randall T; Lester, Whitney S; Sellers, Mary-Kate

    2012-08-01

    The purpose of the current study was to examine the effect of a motivational intervention on conduct problem youth with psychopathic features. Specifically, the current study examined conduct problem youths' mental set (or theory) regarding intelligence (entity vs. incremental) and its effect upon task performance. We assessed 36 juvenile offenders with psychopathic features and tested whether providing them with two different messages regarding intelligence would affect their functioning on a task related to academic performance. The study employed a MANOVA design with two motivational conditions and three outcomes: fluency, flexibility, and originality. Results showed that youth with psychopathic features who were given a message that intelligence grows over time were more fluent and flexible than youth who were informed that intelligence is static. There were no significant differences between the groups in terms of originality. The implications of these findings are discussed, including the possible benefits of interventions for adolescent offenders with conduct problems and psychopathic features. (PsycINFO Database Record (c) 2012 APA, all rights reserved).

  4. Students' Problem-Solving in Mechanics: Preference of a Process Based Model.

    ERIC Educational Resources Information Center

    Stavy, Ruth; And Others

    Research in science and mathematics education has indicated that students often use inappropriate models for solving problems because they tend to mentally represent a problem according to surface features instead of referring to scientific concepts and features. The objective of the study reported in this paper was to determine whether 34 Israeli…

  5. Intervention into a turbulent urban situation: A case study. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Caldwell, G. M., Jr.

    1973-01-01

    The application of NASA management philosophy and techniques within New Castle County, Delaware, to meet actual problems of community violence is reported. It resulted in a restructuring of the county approach to problems of this nature and the development of a comprehensive system for planning based on the NASA planning process. The method involved federal, state, and local resources, together with community representatives, in solving the problems. The concept of a turbulent environment is presented, with parallels drawn between NASA management experience and problems of management within an urban arena.

  6. Invariant-feature-based adaptive automatic target recognition in obscured 3D point clouds

    NASA Astrophysics Data System (ADS)

    Khuon, Timothy; Kershner, Charles; Mattei, Enrico; Alverio, Arnel; Rand, Robert

    2014-06-01

    Target recognition and classification in a 3D point cloud is a non-trivial process due to the nature of the data collected from a sensor system. The signal can be corrupted by noise from the environment, electronic system, A/D converter, etc. Therefore, an adaptive system with a desired tolerance is required to perform classification and recognition optimally. The feature-based pattern recognition algorithm architecture described below is particularly devised for solving a single-sensor classification problem non-parametrically. A feature set is extracted from an input point cloud, normalized, and classified by a neural network classifier. For instance, automatic target recognition in an urban area would require different feature sets from one in a dense foliage area. The figure above (see manuscript) illustrates the architecture of the feature-based adaptive signature extraction of 3D point clouds, including LIDAR, RADAR, and electro-optical data. This network takes a 3D cluster and classifies it into a specific class. The algorithm is a supervised and adaptive classifier with two modes: the training mode and the performing mode. For the training mode, a number of novel patterns are selected from actual or artificial data. A particular 3D cluster is input to the network, as shown above, for the decision class output. The network consists of three sequential functional modules. The first module is for feature extraction: it reduces the input cluster to a set of singular-value features, or feature vector. The feature vector is then input into the feature normalization module to normalize and balance it before being fed to the neural net classifier for classification. The neural net can be trained on actual or artificial novel data until each trained output reaches the declared output within the defined tolerance. If new novel data are added after the neural net has been trained, training is resumed until the neural net has incrementally learned the new data. The associative memory capability of the neural net enables this incremental learning. A back-propagation algorithm or support vector machine can be utilized for the classification and recognition.
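
    The following is a minimal sketch of the three-module pipeline described above (feature extraction, normalization, neural-net classification), assuming singular values of a centered cluster as the feature vector; the two synthetic classes and all parameters are invented, and this is not the authors' implementation.

    ```python
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.neural_network import MLPClassifier

    def singular_value_features(cluster):
        """Module 1: cluster is an (n_points, 3) array; return its 3 singular
        values, a compact shape descriptor of the point distribution."""
        centered = cluster - cluster.mean(axis=0)
        return np.linalg.svd(centered, compute_uv=False)

    rng = np.random.default_rng(0)
    # Two synthetic classes: flat, plate-like clusters vs elongated ones.
    flat = [rng.normal(scale=[5, 5, 0.2], size=(200, 3)) for _ in range(50)]
    elongated = [rng.normal(scale=[8, 1, 1], size=(200, 3)) for _ in range(50)]
    X = np.array([singular_value_features(c) for c in flat + elongated])
    y = np.array([0] * 50 + [1] * 50)

    X_norm = StandardScaler().fit_transform(X)   # module 2: normalization
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(X_norm, y)                           # module 3, training mode
    print(clf.score(X_norm, y))                  # performing mode on training data
    ```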

  7. [Perspective applications of multi-species probiotics in the prevention of antibiotic-associated diarrhea].

    PubMed

    Uspenskiĭ, Iu P; Zakharenko, S M; Fominykh, Iu A

    2013-01-01

    The problem of antibiotic-associated conditions is one of the most pressing problems in clinical practice. Antibiotic-associated diarrhea is a multidisciplinary problem. Investigations of the microecological status of the small intestine and assessment of the microflora of patients receiving antibiotics testify to the existence of dysbiosis. This article presents the results of an open-label investigation of the multispecies probiotic RioFlora Balance used for prophylaxis of antibiotic-associated diarrhea in patients receiving antibacterial therapy.

  8. Theory of wide-angle photometry from standard stars

    NASA Technical Reports Server (NTRS)

    Usher, Peter D.

    1989-01-01

    Wide-angle celestial structures, such as bright comet tails and nearby galaxies and clusters of galaxies, rely on photographic methods for quantified morphology and photometry, primarily because electronic devices with comparable resolution and sky coverage are beyond current technological capability. The problem of the photometry of extended structures, and of how this problem may be overcome through calibration by photometric standard stars, is examined. The perfect properties of the ideal field of view are stated in the guise of a radiometric paraxial approximation, in the hope that fields of view of actual telescopes will conform. Fundamental radiometric concepts are worked through before the issue of atmospheric attenuation is addressed. The independence of observed atmospheric extinction and surface brightness leads off the quest for formal solutions to the problem of surface photometry. Methods and problems of solution are discussed. The spectre is confronted in the spirit of standard stars and shown to be chimerical in that light, provided certain rituals are adopted. After a brief discussion of Baker-Sampson polynomials and the vexing issue of saturation, a pursuit is made of actual numbers to be expected in real cases. While the numbers crunched are gathered ex nihilo, they demonstrate the feasibility of Newton's method in the solution of this overdetermined, nonlinear, least-squares, multiparametric photometric problem.

  9. Depression (For Teens)

    MedlinePlus

    ... good things in life. Depression drains the energy, motivation, and concentration a person needs for normal activities. ... a problem is actually temporary. Low energy and motivation. People with depression may feel tired, drained, or ...

  10. Myths about Cloning

    MedlinePlus

    ... to have health problems all their lives. Myth: Cow clones make human pharmaceuticals in their milk. Myth: ... actual animal. For example, if they’re Holstein cows, the pattern of their spots, or the shape ...

  11. Positioning the actual interference fringe pattern on the tooth flank in measuring gear tooth flanks by laser interferometry

    NASA Astrophysics Data System (ADS)

    Fang, Suping; Wang, Leijie; Liu, Shiqiao; Komori, Masaharu; Kubo, Aizoh

    2011-05-01

    In measuring the form deviation of gear tooth flanks by laser interferometry, the collected interference fringe pattern (IFP) is badly distorted in shape relative to the actual tooth flank. Meanwhile, a clear and definite mapping relationship between the collected IFP and the actual tooth flank is indispensable both for transforming phase differences into deviation values and for positioning the measurement result on the actual tooth flank. In order to solve these problems, this paper proposes a method using the simulation tooth image as a bridge connecting the actual tooth flank and the collected IFP. The mapping relationship between the simulation tooth image and the actual tooth flank has been obtained by ray tracing methods [Fang et al., Appl. Opt. 49(33), 6409-6415 (2010)]. This paper mainly discusses how to build the relationship between the simulation tooth image and the collected IFP by using a matching algorithm on two characteristic point sets. By combining the two above-mentioned assistant mapping relationships, the mapping relationship between the collected IFP and the actual tooth flank can be built, and the collected IFP can be positioned on the actual tooth flank. Finally, the proposed method is employed in a measurement of the form deviation of a gear tooth flank, and the result proves the feasibility of the proposed method.

  12. Detecting Service Chains and Feature Interactions in Sensor-Driven Home Network Services

    PubMed Central

    Inada, Takuya; Igaki, Hiroshi; Ikegami, Kosuke; Matsumoto, Shinsuke; Nakamura, Masahide; Kusumoto, Shinji

    2012-01-01

    Sensor-driven services often cause chain reactions, since one service may generate an environmental impact that automatically triggers another service. We first propose a framework that can formalize and detect such service chains based on ECA (event, condition, action) rules. Although the service chain can be a major source of feature interactions, not all service chains lead to harmful interactions. Therefore, we then propose a method that identifies feature interactions within the service chains. Specifically, we characterize the degree of deviation of every service chain by evaluating the gap between expected and actual service states. An experimental evaluation demonstrates that the proposed method successfully detects 11 service chains and 6 feature interactions within 7 practical sensor-driven services. PMID:23012499
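
    A toy sketch of the chain-detection idea, assuming each service is reduced to an ECA-style record whose action has a single environmental impact; the rule names and impacts are invented. A chain is flagged when one service's impact matches another's triggering event.

    ```python
    # Toy ECA rules: each service has a triggering event and the environmental
    # impact of its action. Names and impacts are invented for illustration.
    rules = {
        "AirConditioner": {"event": "temp_high", "impact": "window_closed"},
        "AutoWindow":     {"event": "rain_detected", "impact": "window_closed"},
        "Ventilator":     {"event": "window_closed", "impact": "air_flow"},
    }

    def detect_chains(rules):
        """Return (trigger, triggered) pairs where one service's environmental
        impact matches another service's triggering event."""
        return [(a, b)
                for a, ra in rules.items()
                for b, rb in rules.items()
                if a != b and ra["impact"] == rb["event"]]

    print(detect_chains(rules))
    # -> [('AirConditioner', 'Ventilator'), ('AutoWindow', 'Ventilator')]
    ```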

  13. Proceedings of the Annual Conference on Technology and Innovations in Training and Education (9th)

    DTIC Science & Technology

    1991-01-01

    …training features. Figure 15 illustrates the training system features analysis process. The push and pull of interactivity serves the community well. Note that … offers pull-down menus that enable a user to easily create Merlin lessons using a series of "preformatted templates," which negates the need to actually use Merlin code. It offers pull-down menus … a restructuring of its staff to balance the number of technical staff members with the …

  14. Evaluation and Improvement of Spectral Features for the Detection of Buried Explosive Hazards Using Forward-Looking Ground-Penetrating Radar

    DTIC Science & Technology

    2012-07-01

    …cross-track direction is calculated. This is accomplished by taking a 101-point horizontal slice of pixels centered on the alarm. Then, a 101-point … Hamming window, is the 101-length row vector of FLGPR image pixels surrounding alarm A. We then store the first 50 frequency values (excluding the … [Figure 3: Illustration of spectral features in the cross-track direction and the difference between actual targets and FAs.] Eleven rows of 101 …

  15. Multiclass Bayes error estimation by a feature space sampling technique

    NASA Technical Reports Server (NTRS)

    Mobasseri, B. G.; Mcgillem, C. D.

    1979-01-01

    A general Gaussian M-class, N-feature classification problem is defined. An algorithm is developed that requires the class statistics as its only input and computes the minimum probability of error through a combined analytical and numerical integration over a sequence of simplifying transformations of the feature space. The results are compared with those obtained by conventional techniques applied to a 2-class, 4-feature discrimination problem with previously reported results, and to 4-class, 4-feature multispectral scanner Landsat data classified by training and testing on the available data.
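
    The paper's algorithm is analytical; as a hedged stand-in, this sketch estimates the same quantity (the Bayes error) by Monte Carlo for a 2-class, 2-feature Gaussian problem with invented class statistics and equal priors.

    ```python
    import numpy as np
    from scipy.stats import multivariate_normal

    mean0, cov0 = np.array([0.0, 0.0]), np.eye(2)
    mean1, cov1 = np.array([2.0, 1.0]), np.array([[1.0, 0.3], [0.3, 1.5]])
    p0, p1 = 0.5, 0.5                     # equal priors

    rng = np.random.default_rng(1)
    n = 100_000
    x = np.vstack([rng.multivariate_normal(mean0, cov0, n // 2),
                   rng.multivariate_normal(mean1, cov1, n // 2)])
    y = np.repeat([0, 1], n // 2)

    # Bayes rule: pick the class with the larger posterior (prior * density).
    post0 = p0 * multivariate_normal.pdf(x, mean0, cov0)
    post1 = p1 * multivariate_normal.pdf(x, mean1, cov1)
    y_hat = (post1 > post0).astype(int)
    print("estimated Bayes error:", float(np.mean(y_hat != y)))
    ```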

  16. How do gamblers end gambling: longitudinal analysis of Internet gambling behaviors prior to account closure due to gambling related problems.

    PubMed

    Xuan, Ziming; Shaffer, Howard

    2009-06-01

    To examine behavioral patterns of actual Internet gamblers who experienced gambling-related problems and voluntarily closed their accounts. A nested case-control design was used to compare gamblers who closed their accounts because of gambling problems to those who maintained open accounts. Actual play patterns of in vivo Internet gamblers who subscribed to an Internet gambling site. 226 gamblers who closed accounts due to gambling problems were selected from a cohort of 47,603 Internet gamblers who subscribed to an Internet gambling site during February 2005; 226 matched-case controls were selected from the group of gamblers who did not close their accounts. Daily aggregates of behavioral data were collected during an 18-month study period. Main outcomes of interest were daily aggregates of stake, odds, and net loss, which were standardized by the daily aggregate number of bets. We also examined the number of bets to measure trajectory of gambling frequency. Account closers due to gambling problems experienced increasing monetary loss as the time to closure approached; they also increased their stake per bet. Yet they did not chase longer odds; their choices of wagers were more probabilistically conservative (i.e., short odds) compared with the controls. The changes of monetary involvement and risk preference occurred concurrently during the last few days prior to voluntary closing. Our finding of an involvement-seeking yet risk-averse tendency among self-identified problem gamblers challenges the notion that problem gamblers seek "long odds" during "chasing."

  17. A case study of the use of GPR for rehabilitation of a classified Art Deco building: The InovaDomus house

    NASA Astrophysics Data System (ADS)

    Barraca, Nuno; Almeida, Miguel; Varum, Humberto; Almeida, Fernando; Matias, Manuel Senos

    2016-04-01

    Ancient buildings in historical town centers can be protected by Cultural Heritage legislation, thus implying that any rehabilitation must respect their main architectural features. These concerns also apply to Modern and Contemporary buildings, in particular if they are important examples of architectural styles from those periods. These extra problems, or motivations, add to the inherent structural delicacy of ancient building restoration, which requires detailed knowledge of the building foundations, characteristics and materials, modification history, infrastructure mapping, current pathologies, etc., all relevant information for an informed rehabilitation project. Such knowledge is seldom available before the actual rehabilitation works begin, and the usual invasive preliminary surveys are frequently expensive, time-consuming, and likely to significantly alter or damage the building's main features or structural integrity. Hence the current demand for indirect, non-invasive, reliable, and high-resolution imagery techniques able to produce relevant information at the early stages of a rehabilitation project. The present work demonstrates that Ground Penetrating Radar (GPR or Georadar) surveys can provide a priori knowledge of the structure, construction techniques, materials, history, and pathologies in a classified Modern Age building. It is also shown that the use of GPR on these projects requires carefully designed surveys, taking into account the known information, spatial constraints, environmental noise, nature and dimensions of the expected targets, and suitable data processing sequences. Thus, if properly applied, GPR produces high-resolution results crucial for sound engineering/architectural interventions aiming to restore and renovate Modern and Contemporary buildings, with (1) focus on the overall quality of the end result, (2) no damage inflicted on the existing structure, and (3) respect for the building's historical coherence and architectural elements and characteristics, that is, its Cultural Heritage value. Most of the findings and applications discussed in this work can be seen as an approximation to model studies, so that relevant information can be drawn from the different investigated situations. Therefore, owing to the nature and range of the problems encountered in this case study, it is also expected that the presented GPR data and interpretation will provide important clues and guidance in the planning and investigation of similar projects and problems.

  18. Prevalence and Correlates of Sleep Problems in Adult Israeli Jews Exposed to Actual or Threatened Terrorist or Rocket Attacks

    PubMed Central

    Palmieri, Patrick A.; Chipman, Katie J.; Canetti, Daphna; Johnson, Robert J.; Hobfoll, Stevan E.

    2010-01-01

    Study Objectives: To estimate the prevalence of, and to identify correlates of clinically significant sleep problems in adult Israeli citizens exposed to chronic terrorism and war trauma or threat thereof. Methods: A population-based, cross-sectional study of 1001 adult Israeli citizens interviewed by phone between July 15 and August 26, 2008. The phone survey was conducted in Hebrew and assessed demographics, trauma/stressor exposure, probable posttraumatic stress disorder (PTSD), probable depression, and sleep problems. Probable PTSD and depression were assessed with the PTSD Symptom Scale (PSS) and Patient Health Questionnaire (PHQ-9), respectively, following DSM-IV diagnostic criteria. Sleep problems in the past month were assessed with the Pittsburgh Sleep Quality Index (PSQI), on which a global composite score ≥ 6 indicates a clinical-level sleep problem. Results: Prevalence of probable PTSD and depression was 5.5% and 5.8%, respectively. Prevalence of clinically significant sleep problems was 37.4% overall, but was significantly higher for probable PTSD (81.8%) and probable depression (79.3%) subgroups. Independent correlates of poor sleep included being female, older, less educated, experiencing major life stressors, and experiencing psychosocial resource loss. Psychosocial resource loss due to terrorist attacks emerged as the strongest potentially modifiable risk factor for sleep problems. Conclusions: Sleep problems are common among Israeli adults living under chronic traumatic threat and trauma exposure. Given the continuing threat of war, interventions that bolster psychosocial resources may play an important role in preventing or alleviating sleep problems in this population. Citation: Palmieri PA; Chipman KJ; Canetti D; Johnson RJ; Hobfoll SE. Prevalence and correlates of sleep problems in adult Israeli Jews exposed to actual or threatened terrorist or rocket attacks. J Clin Sleep Med 2010;6(6):557-564. PMID:21206544

  19. Hardware Development for a Mobile Educational Robot.

    ERIC Educational Resources Information Center

    Mannaa, A. M.; And Others

    1987-01-01

    Describes the development of a robot whose mainframe is essentially transparent and walks on four legs. Discusses various gaits in four-legged motion. Reports on initial trials of a full-sized model without computer-control, including smoothness of motion and actual obstacle crossing features. (CW)

  20. Actual vs perceived performance debriefing in surgery: practice far from perfect.

    PubMed

    Ahmed, Maria; Sevdalis, Nick; Vincent, Charles; Arora, Sonal

    2013-04-01

    Performance feedback or debriefing in surgery is increasingly recognized as an essential means to optimize learning in the operating room (OR). However, there is a lack of evidence regarding the current practice and barriers to debriefing in the OR. Phase 1 consisted of semistructured interviews with surgical trainers and trainees to identify features of an effective debriefing and perceived barriers to debriefing. Phase 2 consisted of ethnographic observations of surgical cases to identify current practice and observed barriers to debriefing. Surgical trainers and trainees identified key features of effective debriefing with regard to the approach and content; however, these were not commonly identified in practice. Culture was recognized as a significant barrier to debriefing across both phases of the study. There is a disparity between what the surgical community views as effective debriefing and actual debriefing practices in the OR. Improvements to the current debriefing culture and practice within the field of surgery should be considered to facilitate learning from clinical practice. Copyright © 2013. Published by Elsevier Inc.

  1. Evolving phenotypic networks in silico.

    PubMed

    François, Paul

    2014-11-01

    Evolved gene networks are constrained by natural selection. Their structures and functions are consequently far from random, as exemplified by the multiple instances of parallel/convergent evolution. One can thus ask if features of actual gene networks can be recovered from evolutionary first principles. I review a method for in silico evolution of small models of gene networks aiming at performing predefined biological functions. I summarize the current implementation of the algorithm, insisting on the construction of a proper "fitness" function. I illustrate the approach on three examples: biochemical adaptation, ligand discrimination, and vertebrate segmentation (somitogenesis). While the structure of the evolved networks is variable, their dynamics are usually constrained and present many features similar to actual gene networks, including properties that were not explicitly selected for. In silico evolution can thus be used to predict biological behaviours without a detailed knowledge of the mapping between genotype and phenotype. Copyright © 2014 The Author. Published by Elsevier Ltd. All rights reserved.

  2. Team Problem Solving Strategies with a Survey of These Methods Used by Faculty Members in Engineering Technology

    ERIC Educational Resources Information Center

    Marcus, Michael L.; Winters, Dixie L.

    2004-01-01

    Students from science, engineering, and technology programs should be able to work together as members of project teams to find solutions to technical problems. The exercise in this paper describes the methods actually used by a project team from a Biomedical Instrumentation Corporation in which scientists, technicians, and engineers from various…

  3. Primary School Children's Strategies in Solving Contingency Table Problems: The Role of Intuition and Inhibition

    ERIC Educational Resources Information Center

    Obersteiner, Andreas; Bernhard, Matthias; Reiss, Kristina

    2015-01-01

    Understanding contingency table analysis is a facet of mathematical competence in the domain of data and probability. Previous studies have shown that even young children are able to solve specific contingency table problems, but apply a variety of strategies that are actually invalid. The purpose of this paper is to describe primary school…

  4. Teachers' Conceptualization and Actual Practice in the Student Evaluation Process at the Upper Secondary School Level in Japan, Focusing on Problem Solving Skills.

    ERIC Educational Resources Information Center

    Wai, Nu Nu; Hirakawa, Yukiko

    2001-01-01

    Studied the participation and performance of upper secondary school teachers in Japan through surveys completed by 360 Geography teachers. Findings suggest that the importance of developing problem-solving skills is widely recognized among these teachers. Implementing training in such skills is much more difficult. Developing effective teaching…

  5. The Soda Can Optimization Problem: Getting Close to the Real Thing

    ERIC Educational Resources Information Center

    Premadasa, Kirthi; Martin, Paul; Sprecher, Bryce; Yang, Lai; Dodge, Noah-Helen

    2016-01-01

    Optimizing the dimensions of a soda can is a classic problem that is frequently posed to freshman calculus students. However, if we only minimize the surface area subject to a fixed volume, the result is a can with a square edge-on profile, and this differs significantly from actual cans. By considering a more realistic model for the can that…

  6. Improving Productivity in the Work Force: Implications for Research and Development in Vocational Education. Occasional Paper No. 72.

    ERIC Educational Resources Information Center

    Sullivan, Dennis J.

    Declining productivity is a major problem in the American economy. Gains in productivity, and finally, actual rates of productivity, have been declining since the late 1960s. Specific problems arising as a result of this decline in productivity are the inflationary pressures that we face as a nation, the increased regulatory environment under…

  7. Applications of a Time Sequence Mechanism in the Simulation Cases of a Web-Based Medical Problem-Based Learning System

    ERIC Educational Resources Information Center

    Chen, Lih-Shyang; Cheng, Yuh-Ming; Weng, Sheng-Feng; Chen, Yong-Guo; Lin, Chyi-Her

    2009-01-01

    The prevalence of Internet applications nowadays has led many medical schools and centers to incorporate computerized Problem-Based Learning (PBL) methods into their training curricula. However, many of these PBL systems do not truly reflect the situations which practitioners may actually encounter in a real medical environment, and hence their…

  8. Predicting Nonresponse Bias from Teacher Ratings of Mental Health Problems in Primary School Children

    ERIC Educational Resources Information Center

    Stormark, Kjell Morten; Heiervang, Einar; Heimann, Mikael; Lundervold, Astri; Gillberg, Christopher

    2008-01-01

    The impact of nonresponse on estimates of mental health problems was examined in a prospective teacher screen in a community survey of 9,155 7-9 year olds. For 6,611 of the children, parents consented to participation in the actual study (Responders), while for 2,544 children parental consent was not obtained (Nonresponders). The teacher screen…

  9. The long-run effects of economic, demographic, and political indices on actual and potential CO2 emissions.

    PubMed

    Adom, Philip Kofi; Kwakwa, Paul Adjei; Amankwaa, Afua

    2018-07-15

    This study examines the long-run drivers of potential and actual CO2 emissions in Ghana, a sub-Saharan African country. The use of the former helps address the reverse causality problem and capture the true long-run effects. The Stock-Watson dynamic OLS is used with data from 1970 to 2014. The result shows that potential CO2 emissions improve model efficiency. Income (except in the "other" sector) and financial development (except in the manufacturing and construction sector) have compelling positive and negative effects on actual and potential CO2 emissions, respectively. A higher price (oil and electricity) reduces actual and potential CO2 emissions, but the electricity price is more vital in the residential, buildings, and commercial and public services sectors, while the oil price is crucial in the transport sector. Democracy lowers actual and potential CO2 emissions in the aggregate (insignificant) and transport sectors but raises them in the manufacturing and construction sector. The effect is, however, inconsistent for the remaining sectors. Urbanization raises aggregate actual and potential CO2 emissions, but the effect is inconsistent for the transport sector. The findings have important implications for policy formulation. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Cognitive Models for Integrating Testing and Instruction, Phase II. Methodology Program.

    ERIC Educational Resources Information Center

    Quellmalz, Edys S.; Shaha, Steven

    The potential of a cognitive model task analysis scheme (CMS) that specifies features of test problems shown by research to affect performance is explored. CMS describes the general skill area and the generic task or problem type. It elaborates features of the problem situation and required responses found by research to influence performance.…

  11. Choosing the Right Solution Approach: The Crucial Role of Situational Knowledge in Electricity and Magnetism

    ERIC Educational Resources Information Center

    Savelsbergh, Elwin R.; de Jong, Ton; Ferguson-Hessler, Monica G. M.

    2011-01-01

    Novice problem solvers are rather sensitive to surface problem features, and they often resort to trial and error formula matching rather than identifying an appropriate solution approach. These observations have been interpreted to imply that novices structure their knowledge according to surface features rather than according to problem type…

  12. Assessing the Impact of Representational and Contextual Problem Features on Student Use of Right-Hand Rules

    ERIC Educational Resources Information Center

    Kustusch, Mary Bridget

    2016-01-01

    Students in introductory physics struggle with vector algebra and these challenges are often associated with contextual and representational features of the problems. Performance on problems about cross product direction is particularly poor and some research suggests that this may be primarily due to misapplied right-hand rules. However, few…

  13. Convolutional neural network features based change detection in satellite images

    NASA Astrophysics Data System (ADS)

    Mohammed El Amin, Arabi; Liu, Qingjie; Wang, Yunhong

    2016-07-01

    With the popular use of high resolution remote sensing (HRRS) satellite images, a huge research effort has been placed on the change detection (CD) problem. An effective feature selection method can significantly boost the final result. While it has proven difficult to hand-design features that effectively capture high- and mid-level representations, recent developments in machine learning (Deep Learning) sidestep this problem by learning hierarchical representations in an unsupervised manner directly from data, without human intervention. In this letter, we propose approaching the change detection problem from a feature learning perspective. A novel deep Convolutional Neural Network (CNN) features based HR satellite image change detection method is proposed. The main guideline is to produce a change detection map directly from two images using a pretrained CNN. This method avoids the limited performance of hand-crafted features. Firstly, CNN features are extracted through different convolutional layers. Then, a concatenation step is evaluated after a normalization step, resulting in a unique higher-dimensional feature map. Finally, a change map is computed using pixel-wise Euclidean distance. Our method has been validated on real bitemporal HRRS satellite images according to qualitative and quantitative analyses. The results obtained confirm the interest of the proposed method.
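
    A minimal sketch of the final step described above: a change map computed as the pixel-wise Euclidean distance between two per-pixel-normalized feature maps. Random arrays stand in for the CNN activations of the two images, and the threshold rule is invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    H, W, C = 64, 64, 256                      # spatial size and feature depth
    feat_t1 = rng.standard_normal((H, W, C))   # stand-in features, image at t1
    feat_t2 = rng.standard_normal((H, W, C))   # stand-in features, image at t2

    def l2_normalize(f, eps=1e-8):
        """Normalize each pixel's feature vector to unit length."""
        return f / (np.linalg.norm(f, axis=-1, keepdims=True) + eps)

    change_map = np.linalg.norm(l2_normalize(feat_t1) - l2_normalize(feat_t2),
                                axis=-1)
    # Crude illustrative threshold: flag pixels two standard deviations above
    # the mean distance as "changed".
    changed = change_map > change_map.mean() + 2 * change_map.std()
    print(change_map.shape, int(changed.sum()))
    ```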

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kogalovskii, M.R.

    This paper presents a review of problems related to statistical database systems, which are widespread in various fields of activity. Databases that are used for statistical analysis are referred to as statistical databases (SDB). Topics under consideration are: SDB peculiarities, properties of data models adequate for SDB requirements, metadata functions, null-value problems, SDB compromise protection problems, stored data compression techniques, and statistical data representation means. Also examined is whether present Database Management Systems (DBMS) satisfy the SDB requirements. Some current research directions in SDB systems are considered.

  15. Command and Control in a Complex World

    DTIC Science & Technology

    2012-05-22

    …definition of command and control does not adequately address changes introduced through technology trends, our understanding of the global operating … processes. … the problem is actually solved. There are no definitive, objective solutions to wicked problems. For a complete definition of wicked problems, see …

  16. Analysis of Navy radome failure problems

    NASA Technical Reports Server (NTRS)

    Tatnall, G. J.; Foulke, K.

    1974-01-01

    A survey of radome failure problems in military aircraft under actual operating conditions was conducted. The aircraft were operating from aircraft carriers in the Pacific Ocean. Critical problem areas were identified and a plan was developed for failure prevention. The development and application of repair kits for correcting the erosion damage are reported. It is stated that the rain erosion damage survey established a strong justification for qualification testing of all materials and designs which may have questionable life expectancy on the aircraft.

  17. Problems in the estimation of human exposure to components of acid precipitation precursors.

    PubMed Central

    Ferris, B G; Spengler, J D

    1985-01-01

    Problems associated with estimation of human exposure to ambient air pollutants are discussed. Ideally, we would prefer to have some indication of actual dose. For most pollutants this is not presently feasible. Specific problems discussed are the adequacy of outdoor monitors; the need to correct for exposures and time spent indoors; and the need to have particle size distributions described and the chemistry of the particles presented. These indicate the need to develop lightweight, accurate, and reliable personal monitors. PMID:4076094

  18. Interval versions of statistical techniques with applications to environmental analysis, bioinformatics, and privacy in statistical databases

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik; Longpre, Luc; Starks, Scott A.; Xiang, Gang; Beck, Jan; Kandathi, Raj; Nayak, Asis; Ferson, Scott; Hajagos, Janos

    2007-02-01

    In many areas of science and engineering, it is desirable to estimate statistical characteristics (mean, variance, covariance, etc.) under interval uncertainty. For example, we may want to use the measured values x(t) of a pollution level in a lake at different moments of time to estimate the average pollution level; however, we do not know the exact values x(t)--e.g., if one of the measurement results is 0, this simply means that the actual (unknown) value of x(t) can be anywhere between 0 and the detection limit (DL). We must, therefore, modify the existing statistical algorithms to process such interval data. Such a modification is also necessary to process data from statistical databases, where, in order to maintain privacy, we only keep interval ranges instead of the actual numeric data (e.g., a salary range instead of the actual salary). Most resulting computational problems are NP-hard--which means, crudely speaking, that in general, no computationally efficient algorithm can solve all particular cases of the corresponding problem. In this paper, we overview practical situations in which computationally efficient algorithms exist: e.g., situations when measurements are very accurate, or when all the measurements are done with one (or few) instruments. As a case study, we consider a practical problem from bioinformatics: to discover the genetic difference between the cancer cells and the healthy cells, we must process the measurements results and find the concentrations c and h of a given gene in cancer and in healthy cells. This is a particular case of a general situation in which, to estimate states or parameters which are not directly accessible by measurements, we must solve a system of equations in which coefficients are only known with interval uncertainty. We show that in general, this problem is NP-hard, and we describe new efficient algorithms for solving this problem in practically important situations.
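
    As a small illustration of the interval-data setting described above, below-detection-limit readings are carried as intervals [0, DL], and the sample mean becomes an interval computed from the endpoint means (values invented). Bounding the variance the same way is NP-hard in general, which is the difficulty the paper addresses.

    ```python
    DL = 0.5  # detection limit
    # Each measurement is an interval [lo, hi]; exact readings have lo == hi,
    # and a non-detect is only known to lie somewhere in [0, DL].
    samples = [(1.2, 1.2), (0.0, DL), (0.8, 0.8), (0.0, DL), (2.1, 2.1)]

    lo_mean = sum(lo for lo, _ in samples) / len(samples)
    hi_mean = sum(hi for _, hi in samples) / len(samples)
    print(f"mean pollution level lies in [{lo_mean:.3f}, {hi_mean:.3f}]")
    ```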

  19. The Ground Flash Fraction Retrieval Algorithm Employing Differential Evolution: Simulations and Applications

    NASA Technical Reports Server (NTRS)

    Koshak, William; Solakiewicz, Richard

    2012-01-01

    The ability to estimate the fraction of ground flashes in a set of flashes observed by a satellite lightning imager, such as the future GOES-R Geostationary Lightning Mapper (GLM), would likely improve operational and scientific applications (e.g., severe weather warnings, lightning nitrogen oxides studies, and global electric circuit analyses). A Bayesian inversion method, called the Ground Flash Fraction Retrieval Algorithm (GoFFRA), was recently developed for estimating the ground flash fraction. The method uses a constrained mixed exponential distribution model to describe a particular lightning optical measurement called the Maximum Group Area (MGA). To obtain the optimum model parameters (one of which is the desired ground flash fraction), a scalar function must be minimized. This minimization is difficult because of two problems: (1) Label Switching (LS), and (2) Parameter Identity Theft (PIT). The LS problem is well known in the literature on mixed exponential distributions, and the PIT problem was discovered in this study. Each problem occurs when one allows the numerical minimizer to freely roam through the parameter search space; this allows certain solution parameters to interchange roles which leads to fundamental ambiguities, and solution error. A major accomplishment of this study is that we have employed a state-of-the-art genetic-based global optimization algorithm called Differential Evolution (DE) that constrains the parameter search in such a way as to remove both the LS and PIT problems. To test the performance of the GoFFRA when DE is employed, we applied it to analyze simulated MGA datasets that we generated from known mixed exponential distributions. Moreover, we evaluated the GoFFRA/DE method by applying it to analyze actual MGAs derived from low-Earth orbiting lightning imaging sensor data; the actual MGA data were classified as either ground or cloud flash MGAs using National Lightning Detection Network[TM] (NLDN) data. Solution error plots are provided for both the simulations and actual data analyses.
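
    A hedged sketch of the optimization pattern described above: fitting a two-component mixed exponential model, whose mixing weight plays the role of the ground flash fraction, by minimizing the negative log-likelihood with SciPy's differential_evolution. The MGA data are simulated, and this is not the GoFFRA implementation; ordering the scale bounds is one simple way to suppress the label-switching ambiguity noted in the abstract.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    rng = np.random.default_rng(2)
    alpha_true, s1, s2 = 0.3, 50.0, 400.0   # invented fraction and mean MGAs
    n = 5000
    from_ground = rng.random(n) < alpha_true
    mga = np.where(from_ground, rng.exponential(s1, n), rng.exponential(s2, n))

    def neg_log_like(params):
        """Negative log-likelihood of a two-component exponential mixture."""
        alpha, m1, m2 = params
        pdf = (alpha / m1 * np.exp(-mga / m1)
               + (1 - alpha) / m2 * np.exp(-mga / m2))
        return -np.sum(np.log(pdf + 1e-300))

    # Disjoint bounds force m1 < m2, removing the label-switching ambiguity.
    bounds = [(0.01, 0.99), (1.0, 200.0), (200.0, 2000.0)]
    result = differential_evolution(neg_log_like, bounds, seed=0)
    print("estimated ground flash fraction:", round(result.x[0], 3))
    ```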

  20. Support vector machines for nuclear reactor state estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zavaljevski, N.; Gross, K. C.

    2000-02-14

    Validation of nuclear power reactor signals is often performed by comparing signal prototypes with the actual reactor signals. The signal prototypes are often computed based on empirical data. The implementation of an estimation algorithm which can make predictions on limited data is an important issue. A new machine learning algorithm called support vector machines (SVMs), recently developed by Vladimir Vapnik and his coworkers, enables a high level of generalization with finite high-dimensional data. The improved generalization in comparison with standard methods like neural networks is due mainly to the following characteristics of the method. The input data space is transformed into a high-dimensional feature space using a kernel function, and the learning problem is formulated as a convex quadratic programming problem with a unique solution. In this paper the authors have applied the SVM method for data-based state estimation in nuclear power reactors. In particular, they implemented and tested kernels developed at Argonne National Laboratory for the Multivariate State Estimation Technique (MSET), a nonlinear, nonparametric estimation technique with a wide range of applications in nuclear reactors. The methodology has been applied to three data sets from experimental and commercial nuclear power reactor applications. The results are promising. The combination of MSET kernels with the SVM method has better noise reduction and generalization properties than the standard MSET algorithm.
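
    An illustrative sketch in the spirit of the approach above: support vector regression predicting one signal from correlated ones, with an RBF kernel standing in for the MSET kernels. All signals are synthetic; large held-out residuals would flag a sensor whose actual signal departs from its prototype.

    ```python
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(3)
    n = 500
    s1 = np.sin(np.linspace(0, 20, n)) + 0.05 * rng.standard_normal(n)
    s2 = 0.5 * s1 + 0.05 * rng.standard_normal(n)           # correlated sensor
    target = 2.0 * s1 - s2 + 0.05 * rng.standard_normal(n)  # signal to validate

    X = np.column_stack([s1, s2])
    model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X[:400], target[:400])
    residual = target[400:] - model.predict(X[400:])        # prototype vs actual
    print("max residual on held-out data:", float(np.max(np.abs(residual))))
    ```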

  1. Spectral methods to detect surface mines

    NASA Astrophysics Data System (ADS)

    Winter, Edwin M.; Schatten Silvious, Miranda

    2008-04-01

    Over the past five years, advances have been made in the spectral detection of surface mines under minefield detection programs at the U. S. Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD). The problem of detecting surface land mines ranges from the relatively simple, the detection of large anti-vehicle mines on bare soil, to the very difficult, the detection of anti-personnel mines in thick vegetation. While spatial and spectral approaches can be applied to the detection of surface mines, spatial-only detection requires many pixels-on-target such that the mine is actually imaged and shape-based features can be exploited. This method is unreliable in vegetated areas because only part of the mine may be exposed, while spectral detection is possible without the mine being resolved. At NVESD, hyperspectral and multi-spectral sensors throughout the reflection and thermal spectral regimes have been applied to the mine detection problem. Data has been collected on mines in forest and desert regions and algorithms have been developed both to detect the mines as anomalies and to detect the mines based on their spectral signature. In addition to the detection of individual mines, algorithms have been developed to exploit the similarities of mines in a minefield to improve their detection probability. In this paper, the types of spectral data collected over the past five years will be summarized along with the advances in algorithm development.
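
    The abstract mentions detecting mines as spectral anomalies; a standard baseline for that, used here only as a hedged stand-in rather than NVESD's algorithm, is the Reed-Xiaoli (RX) detector: the Mahalanobis distance of each pixel's spectrum from the background statistics. The hyperspectral cube below is random with one implanted anomaly.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    H, W, B = 50, 50, 30                  # scene size and spectral bands
    cube = rng.standard_normal((H, W, B))
    cube[25, 25] += 4.0                   # implant one anomalous pixel

    pixels = cube.reshape(-1, B)
    mu = pixels.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(pixels, rowvar=False))
    diff = pixels - mu
    # Squared Mahalanobis distance of every pixel from the background.
    rx = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
    print("most anomalous pixel:", np.unravel_index(rx.argmax(), (H, W)))
    ```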

  2. Information-educational environment with adaptive control of learning process

    NASA Astrophysics Data System (ADS)

    Modjaev, A. D.; Leonova, N. M.

    2017-01-01

    In recent years, a new scientific branch connected with activities in social sphere management has been developing intensively; it is called "Social Cybernetics". Within this branch, the theory and methods of managing the social sphere are formed. Considerable attention is paid to management directly in real time. However, the solution of such management tasks is largely constrained by the lack of, or insufficiently deep study of, the relevant sections of the theory and methods of management. The article discusses the use of cybernetic principles in solving control problems in social systems. Applied to educational activities, a model of composite interrelated objects representing the behaviour of students at various stages of the educational process is introduced. Statistical processing of experimental data obtained during the actual learning process is performed. When the number of features used is increased, additionally taking into account the degree and nature of variability in students' current progress during various types of studies, new properties of students' groupings are discovered. L-clusters were identified, reflecting the behaviour of learners with similar characteristics during lectures. It was established that the characteristics of the clusters contain information about the dynamics of learners' behaviour, allowing them to be used in additional lessons. Ways of solving the problem of adaptive control based on the identified dynamic characteristics of the learners are outlined.

  3. Mathematical analysis of the impact mechanism of information platform on agro-product supply chain and agro-product competitiveness

    NASA Astrophysics Data System (ADS)

    Jiang, Qi-Jie; Jin, Mao-Zhu; Ren, Pei-Yu

    2017-04-01

    How to optimize the agro-product supply chain to promote its operating efficiency, and so enhance the competitiveness of regional agricultural products, has posed a problem to academic circles, business circles, and governments at various levels. One way to solve this problem is to introduce an information platform into the supply chain, which is the focus of this essay. Firstly, a review of existing research findings concerning agro-product competitiveness, the agro-product supply chain (ASC), and information platforms is given. Secondly, we construct a mathematical model to analyze the impact of an information platform on the bullwhip effect in the ASC. Thirdly, another mathematical model is constructed to compare and analyze the impact of the information platform on information acquisition by members of the ASC. The research results show that the introduction of an information platform can mitigate the bullwhip effect in the ASC, and members can set order amounts or production levels closer to actual market demand. The information platform can also reduce the time needed for members of the ASC to get information from other members. Besides, the information platform can help the ASC alleviate information asymmetry among upstream and downstream members. Furthermore, research on the operating mechanism and pattern, technical features, and running structure of the information platform, along with their impacts on the agro-product supply chain and the competitiveness of agricultural products, needs to be advanced.
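
    A toy illustration of the bullwhip effect discussed above: order variance amplifies upstream when each tier forecasts from the orders of the tier below, and is held down when every tier sees actual market demand, as an information platform would allow. The order-up-to policy and all parameters are invented for illustration, not taken from the paper's model.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    demand = 100 + 10 * rng.standard_normal(2000)   # market demand at retailer

    def orders_from(signal, lead_time=2, window=5):
        """Order-up-to policy with a moving-average forecast: each period's
        order is current demand plus lead_time times the forecast change."""
        ma = np.convolve(signal, np.ones(window) / window, mode="valid")
        return signal[window:] + lead_time * (ma[1:] - ma[:-1])

    tier1 = orders_from(demand)   # wholesaler reacts to retailer orders
    tier2 = orders_from(tier1)    # manufacturer reacts to wholesaler orders
    print("two tiers, no sharing:", round(float(tier2.var() / demand.var()), 2))
    # With an information platform, the manufacturer forecasts from actual
    # market demand instead of from distorted downstream orders:
    shared = orders_from(demand)
    print("with demand sharing:  ", round(float(shared.var() / demand.var()), 2))
    ```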

  4. The Music of Mathematics: Toward a New Problem Typology

    NASA Astrophysics Data System (ADS)

    Quarfoot, David

    Halmos (1980) once described problems and their solutions as "the heart of mathematics". Following this line of thinking, one might naturally ask: "What, then, is the heart of problems?". In this work, I attempt to answer this question using techniques from statistics, information visualization, and machine learning. I begin the journey by cataloging the features of problems delineated by the mathematics and mathematics education communities. These dimensions are explored in a large data set of students working thousands of problems at the Art of Problem Solving, an online company that provides adaptive mathematical training for students around the world. This analysis is able to concretely show how the fabric of mathematical problems changes across different subjects, difficulty levels, and students. Furthermore, it locates problems that stand out in the crowd -- those that synergize cognitive engagement, learning, and difficulty. This quantitatively-heavy side of the dissertation is partnered with a qualitatively-inspired portion that involves human scoring of 105 problems and their solutions. In this setting, I am able to capture elusive features of mathematical problems and derive a fuller picture of the space of mathematical problems. Using correlation matrices, principal components analysis, and clustering techniques, I explore the relationships among those features frequently discussed in mathematics problems (e.g., difficulty, creativity, novelty, affective engagement, authenticity). Along the way, I define a new set of uncorrelated features in problems and use these as the basis for a New Mathematical Problem Typology (NMPT). Grounded in the terminology of classical music, the NMPT works to quickly convey the essence and value of a problem, just as terms like "etude" and "mazurka" do for musicians. Taken together, these quantitative and qualitative analyses seek to terraform the landscape of mathematical problems and, concomitantly, the current thinking about that world. Most importantly, this work highlights and names the panoply of problems that exist, expanding the myopic vision of contemporary mathematical problem solving.
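
    A small sketch of the quantitative pipeline mentioned above: correlating scored problem features, deriving uncorrelated axes with PCA, and clustering problems in the reduced space. The 105-by-5 score matrix is random stand-in data, not the dissertation's.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(5)
    # 105 problems scored on 5 hypothetical features (difficulty, creativity,
    # novelty, affective engagement, authenticity).
    scores = rng.random((105, 5))
    print(np.round(np.corrcoef(scores, rowvar=False), 2))  # correlation matrix

    components = PCA(n_components=2).fit_transform(scores)  # uncorrelated axes
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(components)
    print(labels[:10])   # cluster assignment of the first ten problems
    ```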

  5. Semantic data association for planar features in outdoor 6D-SLAM using lidar

    NASA Astrophysics Data System (ADS)

    Ulas, C.; Temeltas, H.

    2013-05-01

    Simultaneous Localization and Mapping (SLAM) is a fundamental problem for autonomous systems in GPS (Global Positioning System) denied environments. Traditional probabilistic SLAM methods use point features as landmarks and hold all the feature positions in their state vector in addition to the robot pose. The bottleneck of point-feature based SLAM methods is the data association problem, which is mostly solved with a statistical measure. Data association performance is critical for a robust SLAM method, since all the filtering strategies are applied after a correspondence is assumed known. For point features, two different but very close landmarks in the same scene might be confused when the correspondence decision is based solely on their positions and error covariance matrices. Instead of point features, planar features can be considered as an alternative landmark model in the SLAM problem, able to provide a more consistent data association. Planes contain rich information for the solution of the data association problem and can be distinguished easily with respect to point features. In addition, planar maps are very compact, since an environment has only a very limited number of planar structures. The planar features do not have to be large structures like building walls or roofs; small plane segments can also be used as landmarks, such as billboards, traffic posts, and some parts of bridges in urban areas. In this paper, a probabilistic plane-feature extraction method from 3D LiDAR data and a data association scheme based on the extracted semantic information of the planar features are introduced. The experimental results show that the semantic data association provides very satisfactory results in outdoor 6D-SLAM.
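
    A compact sketch of the planar-landmark idea, assuming a plane is summarized by a unit normal and offset fitted via SVD, and that association accepts planes whose normals and offsets agree within tolerances; the point sets and thresholds are synthetic, and this is not the paper's semantic association method.

    ```python
    import numpy as np

    def fit_plane(points):
        """Least-squares plane through an (n, 3) point set: returns a unit
        normal n and offset d, with the plane defined by n . x = d."""
        centroid = points.mean(axis=0)
        _, _, vt = np.linalg.svd(points - centroid)
        normal = vt[-1]                    # direction of least variance
        return normal, float(normal @ centroid)

    def associate(p1, p2, max_angle_deg=5.0, max_offset=0.2):
        """Accept a match when normals and offsets agree within tolerances."""
        (n1, d1), (n2, d2) = p1, p2
        cosang = abs(float(n1 @ n2))       # ignore the sign of the normal
        angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
        return angle < max_angle_deg and abs(abs(d1) - abs(d2)) < max_offset

    rng = np.random.default_rng(7)
    xy = rng.uniform(-1, 1, (100, 2))
    wall = np.column_stack([xy[:, 0], np.full(100, 2.0), xy[:, 1]])  # plane y = 2
    observed = wall + 0.01 * rng.standard_normal(wall.shape)
    print(associate(fit_plane(wall), fit_plane(observed)))  # -> True
    ```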

  6. Better road congestion measures are needed

    DOT National Transportation Integrated Search

    2003-05-01

    Road congestion is growing worse as demand outstrips new roadway construction and other efforts to increase traffic flows. Better ways to measure congestion are needed to effectively address the problem. Actual measures of speeds an...

  7. School Avoidance: Tips for Concerned Parents

    MedlinePlus

    ... Threats of physical harm (as from a school bully) Actual physical harm Tips for Concerned Parents: As ... the classroom. If a problem like a school bully or an unreasonable teacher is the cause of ...

  8. Adolescent Psychology around the World

    ERIC Educational Resources Information Center

    Arnett, Jeffrey Jensen, Ed.

    2011-01-01

    This book paints a portrait of adolescent psychology in 4 major regions: Africa/the Middle East, Asia, the Americas, and Europe. Featuring 24 revised and updated chapters from the "International Encyclopedia of Adolescence" (2007), readers are introduced to the way the majority of the world's adolescents actually live. Most contributors…

  9. 24 CFR 1006.205 - Development.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... homebuyers through: (i) Down payment assistance; (ii) Closing costs assistance; (iii) Direct lending; and (iv... assisted and unassisted units are not comparable, the actual costs may be determined based upon a method of cost allocation. If the assisted and unassisted units are comparable in terms of size, features, and...

  10. Rites of Passage and Teacher Training Processes.

    ERIC Educational Resources Information Center

    Katz, Fred E.

    The student teaching process may have features which actually interfere with the processes of learning. Many student teachers revealed in interviews that they went through humiliation, trauma, and disenchantment with teaching in their interactions with cooperating teachers, with other school personnel, and with children in the student teaching…

  11. Proposal of Heuristic Algorithm for Scheduling of Print Process in Auto Parts Supplier

    NASA Astrophysics Data System (ADS)

    Matsumoto, Shimpei; Okuhara, Koji; Ueno, Nobuyuki; Ishii, Hiroaki

    We consider the print process in the manufacturing operations of an auto parts supplier as an actual industrial problem. The purpose of this research is to apply our scheduling technique, developed in the university, to the actual print process in a mass-customization environment. Rationalization of the print process depends on lot sizing. The manufacturing lead time of the print process is long, and in the present method production is planned according to workers' experience and intuition, so the construction of an efficient production system is an urgent problem. Therefore, in this paper, in order to shorten the overall manufacturing lead time and reduce stock, we reexamine the usual heuristic lot-sizing rule and propose an improved method that can plan a more efficient schedule.
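
    The proposed rule itself is not given in the abstract; as a toy illustration of what lot sizing in a print process involves, the sketch below merges consecutive orders of the same print type into lots under a capacity cap, trading setup time against stock produced ahead of demand (all names and numbers are invented):

        def group_into_lots(orders, max_lot):
            """Merge consecutive same-type print orders into lots."""
            lots = []
            for item, qty in orders:
                if lots and lots[-1][0] == item and lots[-1][1] + qty <= max_lot:
                    lots[-1] = (item, lots[-1][1] + qty)
                else:
                    lots.append((item, qty))
            return lots

        print(group_into_lots([("A", 30), ("A", 40), ("B", 20), ("A", 10)],
                              max_lot=100))   # -> [('A', 70), ('B', 20), ('A', 10)]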

  12. Using Sequence Diagrams to Detect Communication Problems Between Systems

    NASA Technical Reports Server (NTRS)

    Lindvall, Mikael; Ackermann, Chris; Stratton, William C.; Sibol, Deane E.; Ray, Arnab; Yonkwa, Lyly; Kresser, Jan; Godfrey, Sally H.; Knodel, Jens

    2008-01-01

    Many software systems are evolving into complex systems of systems (SoS) for which inter-system communication is both mission-critical and error-prone. Such communication problems ideally would be detected before deployment. In a NASA-supported Software Assurance Research Program (SARP) project, we are researching a new approach to addressing such problems. In this paper, we show that problems in the communication between two systems can be detected by using sequence diagrams to model the planned communication and by comparing the planned sequence to the actual sequence. We identify different kinds of problems that can be addressed by modeling the planned sequence at different levels of abstraction.
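
    The comparison step can be pictured with a standard sequence diff; the sketch below (message strings invented) reports where an observed message sequence departs from the modeled one:

        import difflib

        planned = ["A->B: connect", "B->A: ack", "A->B: data", "B->A: done"]
        actual  = ["A->B: connect", "A->B: data", "B->A: done"]

        matcher = difflib.SequenceMatcher(None, planned, actual)
        for op, i1, i2, j1, j2 in matcher.get_opcodes():
            if op != "equal":   # e.g. the missing "B->A: ack"
                print(op, "planned:", planned[i1:i2], "actual:", actual[j1:j2])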

  13. Trajectories of thermospheric air parcels flowing over Alaska, reconstructed from ground-based wind measurements

    NASA Astrophysics Data System (ADS)

    Dhadly, Manbharat; Conde, Mark

    2017-06-01

    It is widely presumed that the convective stability and enormous kinematic viscosity of Earth's upper thermosphere hinder the development of both horizontal and vertical wind shears and other gradients. Any strong local structure (over scale sizes of several hundreds of kilometers) that might somehow form would be expected to dissipate rapidly. Air flow in such an atmosphere should be relatively simple, and transport effects should only slowly disperse and mix air masses. However, our observations show that wind fields in Earth's thermosphere have much more local-scale structure than usually predicted by current modeling techniques, at least at auroral latitudes; they complicate air parcel trajectories enormously, relative to typical expectations. For tracing air parcels, we used wind measurements from an all-sky scanning Doppler Fabry-Perot interferometer and reconstructed time-resolved two-dimensional maps of the horizontal vector wind field to infer forward and backward air parcel trajectories over time. This is the first comprehensive study to visualize the complex motions of thermospheric air parcels carried through the actual observed local-scale structures in the high-latitude winds. Results show that thermospheric air parcel transport is a very difficult observational problem, because the trajectories followed are very sensitive to the detailed features of the driving wind field. Reconstructing the actual motion of a given air parcel requires wind measurements everywhere along the trajectory followed, with spatial resolutions of 100 km or less and temporal resolutions of a few minutes or better. Understanding such transport is important, for example, in predicting the global-scale impacts of aurorally generated composition perturbations.
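
    A minimal sketch of the trajectory reconstruction idea, assuming time-resolved wind maps u(t, y, x) and v(t, y, x) are already available (here filled with synthetic values; the paper's maps come from the interferometer):

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        t = np.linspace(0.0, 3600.0, 13)            # one hour, 5-min cadence
        y = x = np.linspace(0.0, 1000e3, 21)        # 1000 km domain (m)
        rng = np.random.default_rng(1)
        u = rng.normal(100.0, 30.0, (13, 21, 21))   # eastward wind (m/s)
        v = rng.normal(0.0, 30.0, (13, 21, 21))     # northward wind (m/s)
        ui = RegularGridInterpolator((t, y, x), u, bounds_error=False, fill_value=0.0)
        vi = RegularGridInterpolator((t, y, x), v, bounds_error=False, fill_value=0.0)

        pos = np.array([500e3, 500e3])              # parcel start (y, x)
        dt = 60.0
        for tk in np.arange(t[0], t[-1], dt):       # forward Euler advection
            wind = np.array([vi((tk, *pos)), ui((tk, *pos))])
            pos = pos + wind * dt

    Running the same loop with dt negated recovers the backward trajectory; the sensitivity noted above shows up as large trajectory changes under small perturbations of u and v.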

  14. The correct estimate of the probability of false detection of the matched filter in weak-signal detection problems

    NASA Astrophysics Data System (ADS)

    Vio, R.; Andreani, P.

    2016-05-01

    The reliable detection of weak signals is a critical issue in many astronomical contexts and may have severe consequences not only for determining number counts and luminosity functions, but also for optimizing the use of telescope time in follow-up observations. Because of its optimal properties, one of the most popular and widely used detection techniques is the matched filter (MF). This is a linear filter designed to maximise the detectability of a signal of known structure that is buried in additive Gaussian random noise. In this work we show that in the very common situation where the number and position of the searched signals within a data sequence (e.g. an emission line in a spectrum) or an image (e.g. a point source in an interferometric map) are unknown, this technique, when applied in its standard form, may severely underestimate the probability of false detection. This is because the correct use of the MF relies upon a priori knowledge of the position of the signal of interest. In the absence of this information, the statistical significance of features that are actually noise is overestimated and detections are claimed that are actually spurious. For this reason, we present an alternative method of computing the probability of false detection that is based on the probability density function (PDF) of the peaks of a random field. It is able to provide a correct estimate of the probability of false detection in the one-, two- and three-dimensional cases. We apply this technique to a real two-dimensional interferometric map obtained with ALMA.
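
    The scale of the effect is easy to see with a back-of-the-envelope calculation, assuming (unlike the paper, which uses the actual PDF of the peaks of a random field) that candidate peaks are independent:

        from scipy.stats import norm

        tau = 4.0                      # detection threshold in sigma units
        p_single = norm.sf(tau)        # false-alarm probability at one fixed position
        n_peaks = 10_000               # candidate peaks searched in a map
        p_global = 1.0 - (1.0 - p_single) ** n_peaks
        print(f"{p_single:.2e} per position, {p_global:.2f} over the whole search")

    A "4-sigma" feature is thus essentially insignificant when its position was not specified a priori, which is the underestimation the paper corrects.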

  15. Analyzing gene expression time-courses based on multi-resolution shape mixture model.

    PubMed

    Li, Ying; He, Ye; Zhang, Yu

    2016-11-01

    Biological processes are actually dynamic molecular processes unfolding over time. Time-course gene expression experiments provide opportunities to explore patterns of gene expression change over time and to understand the dynamic behavior of gene expression, which is crucial for studying the development and progression of biological processes and disease. Analysis of gene expression time-course profiles has not been fully exploited so far and remains a challenging problem. We propose a novel shape-based mixture model clustering method for gene expression time-course profiles to discover significant gene groups. Based on multi-resolution fractal features and a mixture clustering model, we propose a multi-resolution shape mixture model algorithm. The multi-resolution fractal features are computed by wavelet decomposition, which explores patterns of change over time in gene expression at different resolutions. Our multi-resolution shape mixture model is a probabilistic framework that offers a more natural and robust way of clustering time-course gene expression. We assessed the performance of our algorithm on yeast time-course gene expression profiles, compared with several popular clustering methods for gene expression profiles. The gene groups identified by the different methods were evaluated by enrichment analysis of biological pathways and of known protein-protein interactions from experimental evidence. The gene groups identified by our algorithm have stronger biological significance. In summary, a novel multi-resolution shape mixture model algorithm based on multi-resolution fractal features is proposed. It provides new horizons and an alternative tool for the visualization and analysis of time-course gene expression profiles. The R and Matlab programs are available upon request. Copyright © 2016 Elsevier Inc. All rights reserved.
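
    A simplified stand-in for the feature step (assuming the PyWavelets package; the paper's multi-resolution fractal features are richer than the raw wavelet coefficients used here):

        import numpy as np
        import pywt
        from sklearn.mixture import GaussianMixture

        X = np.random.default_rng(0).normal(size=(200, 16))   # 200 genes x 16 times

        # Concatenated wavelet coefficients describe each profile's shape
        # at several temporal resolutions.
        feats = np.array([np.concatenate(pywt.wavedec(x, "haar", level=3))
                          for x in X])

        gm = GaussianMixture(n_components=4, random_state=0).fit(feats)
        labels = gm.predict(feats)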

  16. Evaluating skin care problems in people with stomas.

    PubMed

    Williams, Julia; Gwillam, Brandon; Sutherland, Norma; Matten, Jane; Hemmingway, Julie; Ilsey, Helen; Somerville, Mary; Vujnovich, Angela; Day, Stephanie; Redmond, Caroline; Cowin, Caroline; Fox, Kathy; Parker, Theresa

    This study aimed to identify actual and potential peristomal skin problems in relation to the use of different types of stoma appliances and accessories. It also compared ostomists' perceptions of their peristomal skin condition with those of stoma care nurse specialists. Maintaining skin integrity is a basic skill that ensures good stoma management. It is widely accepted that from time to time a patient with a stoma will seek clinical advice about a peristomal skin problem. Little is known about how often patients present with these problems, the clinical course of peristomal skin problems, and how patients manage them. A multi-centred descriptive study was conducted among 80 ostomists. Fieldwork took place over 13 months. The sample was drawn from a UK home care delivery database. Ostomists were interviewed by a stoma care nurse specialist using structured questionnaires. A digital photograph was taken of their peristomal skin, and their answers were compared with the nurse assessment using the Stoma Care Ostomy Research index scoring system. Of the interviewees, 32% had healthy peristomal skin both by questionnaire and on observation. On observation, 68% had peristomal skin problems, of whom 44% had irritated skin, 12% had ulcerated skin, 9% had an apparent allergy and 3% had macerated/eroded skin. In addition, 21% had an ill-fitting appliance at observation. Half (50%) were observed to have a parastomal hernia, although only 24% reported having one. These findings demonstrate significant differences between ostomists' perceptions of skin problems and the actual skin problems observed by stoma care nurse specialists. Peristomal skin problems are common among ostomists. The difference between ostomists' and nurses' perceptions of peristomal skin condition led to the identification of educational needs for the new ostomist. Education and regular follow-up by the stoma care nurse specialist are imperative.

  17. The quest for better quality-of-life - learning from large-scale shaking table tests

    NASA Astrophysics Data System (ADS)

    Nakashima, M.; Sato, E.; Nagae, T.; Kunio, F.; Takahito, I.

    2010-12-01

    Earthquake engineering has its origins in the practice of “learning from actual earthquakes and earthquake damages.” That is, we recognize serious problems by witnessing the actual damage to our structures, and then we develop and apply engineering solutions to solve these problems. This tradition in earthquake engineering, i.e., “learning from actual damage,” was an obvious engineering response to earthquakes and arose naturally as a practice in a civil and building engineering discipline that traditionally places more emphasis on experience than do other engineering disciplines. But with the rapid progress of urbanization, as society becomes denser, and as the many components that form our society interact with increasing complexity, the potential damage with which earthquakes threaten society also increases. In such an era, the approach of “learning from actual earthquake damages” becomes unacceptably dangerous and expensive. Among the practical alternatives to the old practice is to “learn from quasi-actual earthquake damages.” One tool for experiencing earthquake damages without attendant catastrophe is the large shaking table. E-Defense, the largest one we have, was developed in Japan after the 1995 Hyogoken-Nanbu (Kobe) earthquake. Since its inauguration in 2005, E-Defense has conducted over forty full-scale or large-scale shaking table tests applied to a variety of structural systems. The tests supply detailed data on the actual behavior and collapse of the tested structures, offering the earthquake engineering community opportunities to experience and assess the actual seismic performance of structures and to help society prepare for earthquakes. Notably, the data were obtained without having to wait for the aftermaths of actual earthquakes. Earthquake engineering has always been about life safety, but in recent years maintaining the quality of life has also become a critical issue. Quality-of-life concerns include nonstructural damage, business continuity, public health, quickness of damage assessment, infrastructure, data and communication networks, and other issues, and not enough useful empirical data have emerged about these issues from the experiences of actual earthquakes. To provide quantitative data that can be used to reduce earthquake risk to our quality of life, E-Defense has recently been implementing two comprehensive research projects in which a base-isolated hospital and a steel high-rise building were tested on the E-Defense shaking table and their seismic performance was examined, particularly in terms of nonstructural damage, damage to building contents and furniture, and operability, functionality, and business-continuity capability. The paper presents an overview of the two projects, together with major findings obtained from them.

  18. Improving the nowcasting of precipitation in an Alpine region with an enhanced radar echo tracking algorithm

    NASA Astrophysics Data System (ADS)

    Mecklenburg, S.; Joss, J.; Schmid, W.

    2000-12-01

    Nowcasting for hydrological applications is discussed. The tracking algorithm extrapolates radar images in space and time. It originates from the pattern recognition techniques TREC (Tracking Radar Echoes by Correlation; Rinehart and Garvey, Nature, 273 (1978) 287) and COTREC (Continuity of TREC vectors; Li et al., J. Appl. Meteor., 34 (1995) 1286). To evaluate the quality of the extrapolation, a parameter scheme is introduced that distinguishes between errors in the position and in the intensity of the predicted precipitation. The parameters for the position are the absolute error, the relative error and the error of the forecasted direction. The parameters for the intensity are the ratio of the medians and the variation of the rain rate (ratio of two quantiles) between the actual and the forecasted image. To judge the overall quality of the forecast, the correlation coefficient between the forecasted and the actual radar image has been used. To improve the forecast, three aspects have been investigated. (a) Common meteorological attributes of convective cells, derived from hail statistics, have been determined to optimize the parameters of the tracking algorithm. Using (a), the forecast procedure modifications (b) and (c) have been applied. (b) Small-scale features have been removed by using larger tracking areas and by applying spatial and temporal smoothing, since problems with the tracking algorithm are mainly caused by small-scale/short-term variations of the echo pattern or by limitations of the radar technique itself (erroneous vectors caused by clutter or shielding). (c) The search area and the number of searched boxes have been restricted. This limits false detections, which is especially useful in stratiform precipitation and for stationary echoes. Whereas a larger scale and the removal of small-scale features improve the forecasted position for convective precipitation, the forecast of the stratiform event is not influenced; limiting the search area, however, leads to a slightly better forecast. The forecast of the intensity is successful for both precipitation events. Forecasting the variation of the rain rate calls for further investigation. Applying COTREC improves the forecast of the convective precipitation, especially for extrapolation times exceeding 30 min.
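
    The core of TREC can be stated in a few lines: the motion vector of an echo box is the displacement that maximizes the correlation between consecutive images. A sketch (box and search sizes invented; borders and flat boxes ignored):

        import numpy as np

        def trec_vector(prev, curr, y, x, box=8, search=4):
            """Displacement (dy, dx) maximizing the correlation between a
            box in the previous radar image and boxes in the current one."""
            ref = prev[y:y + box, x:x + box].ravel()
            best, best_r = (0, 0), -np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    cand = curr[y + dy:y + dy + box, x + dx:x + dx + box].ravel()
                    r = np.corrcoef(ref, cand)[0, 1]
                    if r > best_r:
                        best, best_r = (dy, dx), r
            return best

    COTREC then adjusts the resulting vector field to satisfy the continuity equation, suppressing the erroneous vectors mentioned above.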

  19. [Effect of the alimentary factor on the immunobiologic reactivity of children's bodies].

    PubMed

    Voznesenskaia, F M; Panshinskaia, N M

    1976-01-01

    Observations covered 66 healthy six-year-old children of a children's home. The actual alimentation of the children was studied according to tabulated values over one year and 112 apportionments. In the rations of actual nutrition, a disturbed ratio of proteins, fats and carbohydrates was noted. Seasonal variations of salivary lysozyme activity were revealed against the background of the actual alimentation. The lowest antimicrobial activity of the lysozyme was recorded in the winter and spring seasons of the year. The low lysozyme activity of the saliva in spring may be explained by a deficiency of animal protein in the ration. In winter, the insufficient content of animal protein was compounded by features of the daily routine typical of this season. The addition of animal protein to the children's actual nutritional rations, in the form of eggs and nonfat dry milk, and a correction of the proportions of proteins, fats and carbohydrates in the rations led to a statistically significant rise in the lysozyme activity in the saliva of the children during all the months of observation.

  20. Microbial Biotreatment of Actual Textile Wastewater in a Continuous Sequential Rice Husk Biofilter and the Microbial Community Involved

    PubMed Central

    Lindh, Markus V.; Pinhassi, Jarone; Welander, Ulrika

    2017-01-01

    Textile dyeing processes often pollute wastewater with recalcitrant azo and anthraquinone dyes. Yet, there has been little development of effective and affordable degradation systems for textile wastewater applicable in countries where water technologies remain poor. We determined the biodegradation of actual textile wastewater in biofilters containing rice husks by spectrophotometry and liquid chromatography-mass spectrometry. The indigenous microflora from the rice husks consistently performed >90% decolorization at a hydraulic retention time of 67 h. Analysis of microbial community composition of bacterial 16S rRNA genes and fungal internal transcribed spacer (ITS) gene fragments in the biofilters revealed a bacterial consortium known to carry azoreductase genes, such as Dysgonomonas and Pseudomonas, and the presence of fungal phylotypes such as Gibberella and Fusarium. Our findings emphasize that rice husk biofilters support a microbial community of both bacteria and fungi with key features for the biodegradation of actual textile wastewater. These results suggest that microbial processes can substantially contribute to efficient and reliable degradation of actual textile wastewater. Thus, the development of biodegradation systems holds promise for affordable wastewater treatment in polluted environments. PMID:28114377

  1. Examining the design features of a communication-rich, problem-centred mathematics professional development

    NASA Astrophysics Data System (ADS)

    de Araujo, Zandra; Orrill, Chandra Hawley; Jacobson, Erik

    2018-04-01

    While there is considerable scholarship describing principles for effective professional development, there have been few attempts to examine these principles in practice. In this paper, we identify and examine the particular design features of a mathematics professional development experience provided for middle grades teachers over 14 weeks. The professional development was grounded in a set of mathematical tasks that each had one right answer, but multiple solution paths. The facilitator engaged participants in problem solving and encouraged participants to work collaboratively to explore different solution paths. Through analysis of this collaborative learning environment, we identified five design features for supporting teacher learning of important mathematics and pedagogy in a problem-solving setting. We discuss these design features in depth and illustrate them by presenting an elaborated example from the professional development. This study extends the existing guidance for the design of professional development by examining and operationalizing the relationships among research-based features of effective professional development and the enacted features of a particular design.

  2. Enhancing insight in scientific problem solving by highlighting the functional features of prototypes: an fMRI study.

    PubMed

    Hao, Xin; Cui, Shuai; Li, Wenfu; Yang, Wenjing; Qiu, Jiang; Zhang, Qinglin

    2013-10-09

    Insight can be the first step toward creating a groundbreaking product. As evident in anecdotes and major inventions in history, heuristic events (heuristic prototypes) prompted inventors to acquire insight when solving problems. Bionic imitation in scientific innovation is an example of this kind of problem solving. In particular, heuristic prototypes (e.g., the lotus effect, the very high water repellence exhibited by lotus leaves) help solve insight problems (e.g., non-stick surfaces). We speculated that the biological functional feature of prototypes is a critical factor in inducing insightful scientific problem solving. In this functional magnetic resonance imaging (fMRI) study, we selected scientific innovation problems and utilized a "learning prototypes - solving problems" two-phase paradigm to test this supposition. We also explored its neural mechanisms. Functional MRI data showed that activation of the middle temporal gyrus (MTG, BA 37) and the middle occipital gyrus (MOG, BA 19) was associated with the highlighted functional feature condition. The fMRI data also indicated that the MTG (BA 37) could be responsible for the semantic processing of functional features and for the formation of novel associations based on related functions. In addition, the MOG (BA 19) could be involved in the visual imagery underlying the formation and application of functional associations between the heuristic prototype and the problem. Our findings suggest that both semantic processing and visual imagery could be crucial components underlying scientific problem solving. © 2013 Elsevier B.V. All rights reserved.

  3. Family Maltreatment, Substance Problems, and Suicidality: Randomized Prevention Effectiveness Trial

    DTIC Science & Technology

    2007-02-01

    yelling, or spanking. The 1-2-3 Magic videotape demonstrates positive and negative parenting strategies, as well as methods of self-control and... Participation also decreases parents' use of spanking and reduces child conduct problems (for reviews, see Webster-Stratton, 2001; Webster-Stratton... child from becoming spoiled, and less likely to report actually spanking or slapping their babies (Riley, 1997). Self-reported parenting behavior

  4. Preface to a Theory of Human Symbolic Interchange: An Essay on Communication. Curriculum Praxis, Occasional Paper Series No. 15.

    ERIC Educational Resources Information Center

    Martin, R. Glenn

    The subjectivistic problem for knowledge and communication associated with philosopher Immanuel Kant is that everything we know may in fact be determined by the structure of our own minds and not by the actual nature of external reality. Ernst Cassirer's contribution to the solution of this problem is the notion of "symbols" produced by…

  5. Using the WHO-5 Well-Being Index to Identify College Students at Risk for Mental Health Problems

    ERIC Educational Resources Information Center

    Downs, Andrew; Boucher, Laura A.; Campbell, Duncan G.; Polyakov, Anita

    2017-01-01

    There is a clear need for colleges to do a better job of identifying students who may benefit from treatment and encouraging those students to actually seek help (Hunt & Eisenberg, 2010). Indeed, research suggests that population-based screening can encourage college students who are at risk for mental health problems to seek treatment (Kim,…

  6. 3D Texture Features Mining for MRI Brain Tumor Identification

    NASA Astrophysics Data System (ADS)

    Rahim, Mohd Shafry Mohd; Saba, Tanzila; Nayer, Fatima; Syed, Afraz Zahra

    2014-03-01

    Medical image segmentation is a process to extract regions of interest and to divide an image into its individual meaningful, homogeneous components. Actually, these components have a strong relationship with the objects of interest in an image. For computer-aided diagnosis and therapy, medical image segmentation is a mandatory initial step. It is a sophisticated and challenging task because of the complex nature of medical images. Indeed, successful medical image analysis depends heavily on segmentation accuracy. Texture is one of the major features used to identify regions of interest in an image or to classify an object, but 2D texture features yield poor classification results. Hence, this paper presents 3D feature extraction using texture analysis, with an SVM as the segmentation technique in the testing methodology.
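
    As an illustration only (the paper's 3D texture features are not specified in the abstract, so simple per-voxel neighborhood statistics stand in for them here):

        import numpy as np
        from scipy.ndimage import uniform_filter
        from sklearn.svm import SVC

        def texture_features(vol, size=5):
            """Per-voxel 3D texture statistics: local mean, local variance,
            and gradient magnitude."""
            mean = uniform_filter(vol, size)
            var = uniform_filter(vol ** 2, size) - mean ** 2
            grad = np.linalg.norm(np.gradient(vol), axis=0)
            return np.stack([mean, var, grad], axis=-1).reshape(-1, 3)

        vol = np.random.default_rng(0).random((16, 16, 16))    # toy volume
        labels = (vol > 0.5).ravel().astype(int)               # toy voxel labels
        clf = SVC(kernel="rbf").fit(texture_features(vol), labels)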

  7. The management of nonunion and malunion of the distal humerus--a 30-year experience.

    PubMed

    Jupiter, Jesse B

    2008-01-01

    This personal series of nonunions of the distal humerus reviews unique features of this problem, categorizes them according to unique anatomic features, and presents pitfalls and pearls in the management of these complex reconstructive problems.

  8. Tool Wear Monitoring Using Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Song, Dong Yeul; Ohara, Yasuhiro; Tamaki, Haruo; Suga, Masanobu

    A tool wear monitoring approach that considers the nonlinear behavior of the cutting mechanism caused by tool wear and/or localized chipping is proposed, and its effectiveness is verified through cutting experiments and actual turning operations. The variation in the surface roughness of the machined workpiece is also discussed using this approach. In this approach, the residual error between the actually measured vibration signal and the signal estimated from a time series model corresponding to the dynamic model of cutting is introduced as the diagnostic feature. It is found that the early tool wear state (i.e., flank wear under 40 µm) can be monitored, and that the optimal tool exchange time and the tool wear state in actual turning can be judged from the change in this residual error. Moreover, the variation of surface roughness Pz in the range of 3 to 8 µm can be estimated by monitoring the residual error.
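
    The residual-error idea can be sketched as follows, assuming an autoregressive time-series model fitted to a healthy-state vibration signal (the model order and the plain AR form are illustrative choices, not the paper's exact model):

        import numpy as np

        def fit_ar(sig, p=8):
            """Least-squares AR(p) fit: sig[t] ~ sum_k coef[k] * sig[t-k-1]."""
            X = np.column_stack([sig[p - k - 1:len(sig) - k - 1] for k in range(p)])
            return np.linalg.lstsq(X, sig[p:], rcond=None)[0]

        def residual_rms(sig, coef):
            p = len(coef)
            X = np.column_stack([sig[p - k - 1:len(sig) - k - 1] for k in range(p)])
            return np.sqrt(np.mean((sig[p:] - X @ coef) ** 2))

    Coefficients are fitted on vibration from a sharp tool; during machining, residual_rms evaluated with those fixed coefficients rises as wear or chipping changes the cutting dynamics, which is the diagnostic feature described above.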

  9. Pivot tables for mortality analysis, or who needs life tables anyway?

    PubMed

    Wesley, David; Cox, Hugh F

    2007-01-01

    Actuarial life-table analysis has long been used by life insurance medical directors for mortality abstraction from clinical studies. Ironically, today's life actuary instead uses pivot tables to analyze mortality. Pivot tables (a feature/function in MS Excel) collapse various dimensions of data that were previously arranged in an "experience study" format. Summary statistics such as actual deaths, actual and expected mortality (usually measured in dollars), and calculated results such as actual to expected ratios, are then displayed in a 2-dimensional grid. The same analytic process, excluding the dollar focus, can be used for clinical mortality studies. For raw survival data, especially large datasets, this combination of experience study data and pivot tables has clear advantages over life-table analysis in both accuracy and flexibility. Using the SEER breast cancer data, we compare the results of life-table analysis and pivot-table analysis.
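
    In pandas, the same pivot-table analysis takes a few lines (column names and numbers invented):

        import pandas as pd

        # Experience-study records with expected deaths from a reference table.
        df = pd.DataFrame({
            "age_band": ["40-49", "40-49", "50-59", "50-59"],
            "sex":      ["F", "M", "F", "M"],
            "expected": [0.8, 1.1, 2.0, 2.9],
            "actual":   [1, 1, 3, 2],
        })
        pivot = df.pivot_table(index="age_band", columns="sex",
                               values=["actual", "expected"], aggfunc="sum")
        ae = pivot["actual"] / pivot["expected"]   # actual-to-expected grid
        print(ae)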

  10. Boosting instance prototypes to detect local dermoscopic features.

    PubMed

    Situ, Ning; Yuan, Xiaojing; Zouridakis, George

    2010-01-01

    Local dermoscopic features are useful in many dermoscopic criteria for skin cancer detection. We address the problem of detecting local dermoscopic features from epiluminescence (ELM) microscopy skin lesion images. We formulate the recognition of local dermoscopic features as a multi-instance learning (MIL) problem. We employ the method of diverse density (DD) and evidence confidence (EC) function to convert MIL to a single-instance learning (SIL) problem. We apply Adaboost to improve the classification performance with support vector machines (SVMs) as the base classifier. We also propose to boost the selection of instance prototypes through changing the data weights in the DD function. We validate the methods on detecting ten local dermoscopic features from a dataset with 360 images. We compare the performance of the MIL approach, its boosting version, and a baseline method without using MIL. Our results show that boosting can provide performance improvement compared to the other two methods.
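
    Once the DD/EC step has converted each bag of instances into a single feature vector, the boosting stage is ordinary supervised learning. A stand-in sketch on random data (requires scikit-learn >= 1.2 for the estimator keyword):

        import numpy as np
        from sklearn.ensemble import AdaBoostClassifier
        from sklearn.svm import SVC

        X = np.random.default_rng(0).random((360, 20))    # one vector per image
        y = np.random.default_rng(1).integers(0, 2, 360)  # feature present/absent
        clf = AdaBoostClassifier(estimator=SVC(kernel="linear"),
                                 algorithm="SAMME", n_estimators=25).fit(X, y)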

  11. Credit for Learning Gained in Life and Work Experience.

    ERIC Educational Resources Information Center

    Strange, John

    1980-01-01

    Prime features of sound college programs that assess for credit the prior experiential learning of adults are outlined. Faculty judgment underlies all evaluation methods, which include oral exams, written reports, actual performance, or appraisals of advanced professional knowledge. Work of the Council for the Advancement of Experiential Learning…

  12. Examining U.S. and Spanish Annual Reports: Crisis Communication

    ERIC Educational Resources Information Center

    Palmer-Silveira, Juan C.; Ruiz-Garrido, Miguel F.

    2014-01-01

    Crisis has affected businesses worldwide. Many international corporations must cope with this turmoil, which affects their economic liability. Firms express their actual financial situation in the annual reports they issue every year. The annual report is a document that combines both promotional and informative features. Our study tries to find…

  13. Performing Like an Asylum Seeker: Paradoxes of Hyper-Authenticity

    ERIC Educational Resources Information Center

    Jestrovic, Silvija

    2008-01-01

    This essay investigates performance events that feature actual refugees, asylum seekers and immigrants, but in instances where presence and embodiment are mediated and made ambiguous. My focus is a fashion show by Catalan designer Antonio Miro, who uses refugees from Senegal as models, and Christoph Schlingensief's public art project…

  14. 75 FR 39262 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-08

    ... related risk factors. The efficacy of green building design features in reducing allergens and toxic... the extent to which green-built, low-income housing actually reduces exposures to these compounds when... specific green building practices (e.g., use of low chemical-emitting paints and carpets) may influence...

  15. How Course Portfolios Can Advance the Scholarship and Practice of Management Teaching

    ERIC Educational Resources Information Center

    New, J. Randolph; Clawson, James G.; Coughlan, Richard S.; Hoyle, Joe Ben

    2008-01-01

    The authors believe the development, peer review, and sharing of course portfolios can significantly improve the scholarship and teaching of management. To make this case, they provide background information about course portfolios, including origins, defining features, purposes, and potential benefits. They then identify actual portfolio projects…

  16. Image Manipulation: Then and Now.

    ERIC Educational Resources Information Center

    Sutton, Ronald E.

    The images of photography have been manipulated almost from the moment of their discovery. The blending together in the studio and darkroom of images not found in actual scenes from life has been a regular feature of modern photography in both art and advertising. Techniques of photograph manipulation include retouching; blocking out figures or…

  17. Alaska national hydrography dataset positional accuracy assessment study

    USGS Publications Warehouse

    Arundel, Samantha; Yamamoto, Kristina H.; Constance, Eric; Mantey, Kim; Vinyard-Houx, Jeremy

    2013-01-01

    Initial visual assessments: wide range in the quality of fit between features in NHD and these new image sources. No statistical analysis has been performed to actually quantify accuracy. Determining absolute accuracy is cost-prohibitive (must collect independent, well-defined test points). Quantitative analysis of relative positional error is feasible.

  18. An Investigation on Instructors' Knowledge, Belief and Practices towards Distance Education

    ERIC Educational Resources Information Center

    Yildiz, Merve; Erdem, Mukaddes

    2018-01-01

    Distance education systems have emerged as increasingly accessible and indispensable features in education owing to the development and spread of communication technologies and the transformation of individual characteristics, needs and demands. With the growing popularity of distance education programs, detailed analysis of their actual success…

  19. Prosodic Encoding in Silent Reading.

    ERIC Educational Resources Information Center

    Wilkenfeld, Deborah

    In silent reading, short-memory tasks, such as semantic and syntactic processing, require a stage of phonetic encoding between visual representation and the actual extraction of meaning, and this encoding includes prosodic as well as segmental features. To test for this suprasegmental coding, an experiment was conducted in which subjects were…

  20. Mosaic Messages

    ERIC Educational Resources Information Center

    Baldauf, Annemarie

    2012-01-01

    Through the generosity of a Lowes Toolbox for Education Grant and a grant from the Bill Graham Foundation, an interdisciplinary mosaic mural was created and installed at Riverview Middle School in Bay Point, California. The actual mural, which featured a theme of nurturing students through music, art, sports, science, and math, took about three…

  1. Structural health monitoring feature design by genetic programming

    NASA Astrophysics Data System (ADS)

    Harvey, Dustin Y.; Todd, Michael D.

    2014-09-01

    Structural health monitoring (SHM) systems provide real-time damage and performance information for civil, aerospace, and other high-capital or life-safety critical structures. Conventional data processing involves pre-processing and extraction of low-dimensional features from in situ time series measurements. The features are then input to a statistical pattern recognition algorithm to perform the relevant classification or regression task necessary to facilitate decisions by the SHM system. Traditional design of signal processing and feature extraction algorithms can be an expensive and time-consuming process requiring extensive system knowledge and domain expertise. Genetic programming, a heuristic program search method from evolutionary computation, was recently adapted by the authors to perform automated, data-driven design of signal processing and feature extraction algorithms for statistical pattern recognition applications. The proposed method, called Autofead, is particularly suitable to handle the challenges inherent in algorithm design for SHM problems where the manifestation of damage in structural response measurements is often unclear or unknown. Autofead mines a training database of response measurements to discover information-rich features specific to the problem at hand. This study provides experimental validation on three SHM applications including ultrasonic damage detection, bearing damage classification for rotating machinery, and vibration-based structural health monitoring. Performance comparisons with common feature choices for each problem area are provided demonstrating the versatility of Autofead to produce significant algorithm improvements on a wide range of problems.
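
    Autofead's evolutionary search is not reproduced here, but the flavor of automated feature design can be shown with a random-search stand-in: compose simple signal-processing operations into candidate extractors and keep the one whose scalar feature best separates two labeled classes (operations, fitness measure, and data are all invented):

        import numpy as np

        OPS = {"abs": np.abs, "square": np.square,
               "diff": lambda s: np.diff(s, append=s[-1])}

        def random_program(rng, length=3):
            return rng.choice(list(OPS), size=length).tolist()

        def extract(program, sig):
            for name in program:
                sig = OPS[name](sig)
            return sig.mean()                         # scalar feature

        def fitness(program, signals, labels):
            f = np.array([extract(program, s) for s in signals])
            a, b = f[labels == 0], f[labels == 1]
            return abs(a.mean() - b.mean()) / (a.std() + b.std() + 1e-9)

        rng = np.random.default_rng(0)
        signals = rng.normal(size=(40, 256))
        labels = np.r_[np.zeros(20), np.ones(20)].astype(int)
        best = max((random_program(rng) for _ in range(200)),
                   key=lambda p: fitness(p, signals, labels))

    Genetic programming replaces this blind sampling with selection, crossover, and mutation over such programs, which is what makes the search tractable on realistic problems.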

  2. PROBLEM ON IRREGULAR 3-DIMENSIONAL DENSITY DISCONTINUITY SURFACES.

    DTIC Science & Technology

    method. The result is used to analyze the relation of linear correlation between the crustal thickness and the local Bouguer gravity anomaly, and good agreement is obtained in comparison with the actual statistical data. (Author)

  3. Actually, What Is an Actuary?

    ERIC Educational Resources Information Center

    Oudshoorn, Susan; Finkelstein, Gary

    1991-01-01

    The actuarial profession is described to provide secondary school mathematics teachers insights into how actuaries use mathematics in solving real life problems. Examples are provided involving compound interest, the probability of dying, and inflation with computer modeling. (MDH)

  4. Martian North Polar Impacts and Volcanoes: Feature Discrimination and Comparisons to Global Trends

    NASA Technical Reports Server (NTRS)

    Sakimoto, E. H.; Weren, S. L.

    2003-01-01

    The recent Mars Global Surveyor and Mars Odyssey missions have greatly improved the available data for the north polar region of Mars. Pre-MGS and pre-MO studies proposed possible volcanic features, and the new data have revealed numerous volcanoes and impact craters in a range of weathering states that were poorly visible or not visible in prior data sets. These new data have helped in the reassessment of the polar deposits. From images or shaded Mars Orbiter Laser Altimeter (MOLA) topography grids alone, it has proved difficult to differentiate cratered cones of probable volcanic origin from impact craters that appear to have been filled. It is important that the distinction be made if possible, as the relative ages of the polar deposits hinge on small numbers of craters, and the local volcanic regime was originally proposed to include only small numbers of volcanoes. Therefore, we have expanded prior work on detailed topographic parameter measurements and modeling for the polar volcanic landforms, and have mapped and measured all of the probable volcanic and impact features for the north polar region as well as other midlatitude fields. We suggest that: 1) the polar volcanic edifices are significantly different topographically from midlatitude edifices, with steeper slopes and larger craters as a group; 2) the impact craters are actually distinct from the volcanoes in terms of the fraction of feature volume that is cavity compared to the fraction that is positive relief; 3) several distinct types of volcanic edifices are actually present; 4) these types tend to be spatially grouped. This contrasts with many of the other small volcanic fields around Mars, where small edifices tend to be of mixed types within a field.

  5. Preserving sparseness in multivariate polynominal factorization

    NASA Technical Reports Server (NTRS)

    Wang, P. S.

    1977-01-01

    Attempts were made to factor these ten polynomials on MACSYMA. However it did not get very far with any of the larger polynomials. At that time, MACSYMA used an algorithm created by Wang and Rothschild. This factoring algorithm was also implemented for the symbolic manipulation system, SCRATCHPAD of IBM. A closer look at this old factoring algorithm revealed three problem areas, each of which contribute to losing sparseness and intermediate expression growth. This study led to effective ways of avoiding these problems and actually to a new factoring algorithm. The three problems are known as the extraneous factor problem, the leading coefficient problem, and the bad zero problem. These problems are examined separately. Their causes and effects are set forth in detail; the ways to avoid or lessen these problems are described.

  6. Cortex Inspired Model for Inverse Kinematics Computation for a Humanoid Robotic Finger

    PubMed Central

    Gentili, Rodolphe J.; Oh, Hyuk; Molina, Javier; Reggia, James A.; Contreras-Vidal, José L.

    2013-01-01

    In order to approach human hand performance levels, artificial anthropomorphic hands/fingers have increasingly incorporated human biomechanical features. However, the performance of finger reaching movements to visual targets involving the complex kinematics of multi-jointed, anthropomorphic actuators is a difficult problem. This is because the relationship between sensory and motor coordinates is highly nonlinear, and also often includes mechanical coupling of the two last joints. Recently, we developed a cortical model that learns the inverse kinematics of a simulated anthropomorphic finger. Here, we expand this previous work by assessing if this cortical model is able to learn the inverse kinematics for an actual anthropomorphic humanoid finger having its two last joints coupled and controlled by pneumatic muscles. The findings revealed that single 3D reaching movements, as well as more complex patterns of motion of the humanoid finger, were accurately and robustly performed by this cortical model while producing kinematics comparable to those of humans. This work contributes to the development of a bioinspired controller providing adaptive, robust and flexible control of dexterous robotic and prosthetic hands. PMID:23366569
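
    Stripped of its cortical machinery, the learning problem can be illustrated on a planar two-joint "finger": sample the forward kinematics, then regress joint angles from fingertip positions (link lengths and the regressor are invented; restricting both joints to [0, pi/2] keeps the inverse unique, unlike the coupled 3D case studied in the paper):

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        L1, L2 = 0.05, 0.03                                # link lengths (m)
        rng = np.random.default_rng(0)
        q = rng.uniform(0.0, np.pi / 2, (5000, 2))         # joint angles
        tip = np.column_stack([L1 * np.cos(q[:, 0]) + L2 * np.cos(q.sum(1)),
                               L1 * np.sin(q[:, 0]) + L2 * np.sin(q.sum(1))])
        ik = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000,
                          random_state=0).fit(tip, q)      # learned inverse map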

  7. Efficient dynamic modeling of manipulators containing closed kinematic loops

    NASA Astrophysics Data System (ADS)

    Ferretti, Gianni; Rocco, Paolo

    An approach to efficiently solve the forward dynamics problem for manipulators containing closed chains is proposed. The two main distinctive features of this approach are: the dynamics of the equivalent open-loop tree structures (any closed loop can in general be modeled by imposing additional kinematic constraints on a suitable tree structure) are computed through an efficient Newton-Euler formulation; and the constraint equations relative to the closed chains most commonly adopted in industrial manipulators are solved explicitly, thus overcoming the redundancy of the Lagrange multiplier method while avoiding the inefficiency of a numerical solution of the implicit constraint equations. The constraint equations considered for an explicit solution are those imposed by articulated gear mechanisms and planar closed chains (pantograph-type structures). Articulated gear mechanisms are actually used in all industrial robots to transmit motion from actuators to links, while planar closed chains are usefully employed to increase the stiffness of manipulators and their load capacity, as well as to reduce the kinematic coupling of joint axes. The accuracy and the efficiency of the proposed approach are shown through a simulation test.

  8. Four types of ensemble coding in data visualizations.

    PubMed

    Szafir, Danielle Albers; Haroz, Steve; Gleicher, Michael; Franconeri, Steven

    2016-01-01

    Ensemble coding supports rapid extraction of visual statistics about distributed visual information. Researchers typically study this ability with the goal of drawing conclusions about how such coding extracts information from natural scenes. Here we argue that a second domain can serve as another strong inspiration for understanding ensemble coding: graphs, maps, and other visual presentations of data. Data visualizations allow observers to leverage their ability to perform visual ensemble statistics on distributions of spatial or featural visual information to estimate actual statistics on data. We survey the types of visual statistical tasks that occur within data visualizations across everyday examples, such as scatterplots, and more specialized images, such as weather maps or depictions of patterns in text. We divide these tasks into four categories: identification of sets of values, summarization across those values, segmentation of collections, and estimation of structure. We point to unanswered questions for each category and give examples of such cross-pollination in the current literature. Increased collaboration between the data visualization and perceptual psychology research communities can inspire new solutions to challenges in visualization while simultaneously exposing unsolved problems in perception research.

  9. Two-Stage Approach to Image Classification by Deep Neural Networks

    NASA Astrophysics Data System (ADS)

    Ososkov, Gennady; Goncharov, Pavel

    2018-02-01

    The paper demonstrates the advantages of deep learning networks over ordinary neural networks through their comparative application to image classification. An autoassociative neural network is used as a standalone autoencoder for prior extraction of the most informative features of the input data for the networks that are then compared as classifiers. The main effort in working with deep learning networks goes into the painstaking optimization of their structures and components, such as activation functions and weights, as well as of the procedures for minimizing the loss function, in order to improve performance and speed up learning. It is also shown that deep autoencoders, after special training, develop a remarkable ability to denoise images. Convolutional neural networks are also used to solve a topical problem of protein genetics, using durum wheat classification as an example. The results of our comparative study demonstrate the clear advantage of the deep networks, as well as the denoising power of the autoencoders. In our work we use both GPUs and cloud services to speed up the calculations.
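
    A minimal sketch of the two-stage idea in PyTorch (layer sizes and data are placeholders, not the paper's configuration): first train an autoencoder for reconstruction, then train a classifier on the frozen codes:

        import torch
        import torch.nn as nn

        enc = nn.Sequential(nn.Linear(784, 64), nn.ReLU())   # encoder
        dec = nn.Linear(64, 784)                             # decoder
        clf = nn.Linear(64, 10)                              # classifier head

        x = torch.rand(256, 784)                             # stand-in images
        opt = torch.optim.Adam([*enc.parameters(), *dec.parameters()], lr=1e-3)
        for _ in range(100):                                 # stage 1: autoencode
            opt.zero_grad()
            nn.functional.mse_loss(dec(enc(x)), x).backward()
            opt.step()

        y = torch.randint(0, 10, (256,))                     # stand-in labels
        opt2 = torch.optim.Adam(clf.parameters(), lr=1e-3)
        for _ in range(100):                                 # stage 2: classify codes
            opt2.zero_grad()
            nn.functional.cross_entropy(clf(enc(x).detach()), y).backward()
            opt2.step()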

  10. Compensation for loads during arm movements using equilibrium-point control.

    PubMed

    Gribble, P L; Ostry, D J

    2000-12-01

    A significant problem in motor control is how information about movement error is used to modify control signals to achieve desired performance. A potential source of movement error and one that is readily controllable experimentally relates to limb dynamics and associated movement-dependent loads. In this paper, we have used a position control model to examine changes to control signals for arm movements in the context of movement-dependent loads. In the model, based on the equilibrium-point hypothesis, equilibrium shifts are adjusted directly in proportion to the positional error between desired and actual movements. The model is used to simulate multi-joint movements in the presence of both "internal" loads due to joint interaction torques, and externally applied loads resulting from velocity-dependent force fields. In both cases it is shown that the model can achieve close correspondence to empirical data using a simple linear adaptation procedure. An important feature of the model is that it achieves compensation for loads during movement without the need for either coordinate transformations between positional error and associated corrective forces, or inverse dynamics calculations.
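
    The adaptation rule can be illustrated in one dimension (all constants invented): a spring-like controller pulls the limb toward a shifted equilibrium, and the shift is adjusted across trials in proportion to the final positional error left by an unmodeled load:

        import numpy as np

        k, b, m = 40.0, 8.0, 1.0        # stiffness, damping, mass
        load = 6.0                      # constant external load (N)
        target, gain, dt = 0.2, 0.8, 0.001
        eq_shift = 0.0

        for trial in range(20):
            x, v = 0.0, 0.0
            for _ in range(2000):       # simulate one 2-s movement
                a = (k * (target + eq_shift - x) - b * v - load) / m
                v += a * dt
                x += v * dt
            eq_shift += gain * (target - x)   # proportional error correction

        print(f"residual error after adaptation: {target - x:.4f} m")

    The shift converges to load/k, compensating the load without any inverse dynamics calculation, which is the point made above.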

  11. Urban climate and energy demand interaction in Northern Eurasia

    NASA Astrophysics Data System (ADS)

    Kasilova, E. V.; Ginzburg, A. S.; Demchenko, P. F.

    2017-11-01

    Regional and urban climate change in Northern Eurasia is one of the main challenges for the sustainable development of human habitats situated in boreal and temperate areas. Half of all primary energy is spent on space heating, even in the quite mild European climate. Implementation of district heating in urban areas is currently seen as one of the key conditions of sustainable development. A clear understanding of the main problems of the urban climate-energy demand interaction is crucial for both small towns and megacities. The specific features of urban energy systems in Finland, Russia and China under changing climate conditions were studied. Regional manifestations of climate change were examined, and climate projections were established for urban regions of Northern Eurasia. It was shown that climate warming is likely to continue intensively there. The history and current development trends of urban district heating systems in Russia, China and Finland were discussed. Common challenges linked with climate change were identified for the considered areas, and adaptation possibilities were discussed taking climate-energy interactions into account.

  12. A priori and a posteriori analysis of the flow around a rectangular cylinder

    NASA Astrophysics Data System (ADS)

    Cimarelli, A.; Leonforte, A.; Franciolini, M.; De Angelis, E.; Angeli, D.; Crivellini, A.

    2017-11-01

    The definition of a correct mesh resolution and modelling approach for the Large Eddy Simulation (LES) of the flow around a rectangular cylinder is recognized to be a rather elusive problem, as shown by the large scatter of LES results present in the literature. In the present work, we aim to assess this issue by performing an a priori analysis of Direct Numerical Simulation (DNS) data of the flow. This approach allows us to measure the ability of the LES field to reproduce the main flow features as a function of the resolution employed. Based on these results, we define a mesh resolution that balances the competing needs of reducing the computational cost and of adequately resolving the flow dynamics. The effectiveness of the proposed resolution method is then verified by means of an a posteriori analysis of actual LES data obtained with the implicit LES approach given by the numerical properties of the Discontinuous Galerkin spatial discretization technique. The present work represents a first step towards a best practice for LES of separating and reattaching flows.

  13. Preliminary simulation study on regional climate change in Xinjiang, China

    NASA Astrophysics Data System (ADS)

    Yuan, Chunqiong; Guo, Qingyu; Xie, Hongbing; Pan, Xiaoling; Anabiek, Subai

    2004-01-01

    Under conditions of global warming, a degraded ecological environment threatens human survival, so people are paying increasing attention to the environmental problem of climate change. This paper analyzes the distribution of air temperature, relative humidity (precipitation), and the horizontal stream field in Xinjiang for July 1997. To perform a one-month integration (July 1997), we ran the NCAR/Penn State MM5V3 model. Data from the WLCCD (the latest World Land Cover Characteristics Database), which reflects the actual land surface characteristics, were used to relate the two research domains of the model and to replace the vegetation in MM5. By means of the model, we obtained the general behavior of air temperature, relative humidity (precipitation), and the 1000 hPa horizontal stream field in Xinjiang in July 1997. The mean regional data for July helped perfect the theory that humans can manage the ecological environment in order to prevent and control desertification.

  14. A cognitive perspective on medical expertise: theory and implication.

    PubMed

    Schmidt, H G; Norman, G R; Boshuizen, H P

    1990-10-01

    A new theory of the development of expertise in medicine is outlined. Contrary to existing views, this theory assumes that expertise is not so much a matter of superior reasoning skills or in-depth knowledge of pathophysiological states as it is based on cognitive structures that describe the features of prototypical or even actual patients. These cognitive structures, referred to as "illness scripts," contain relatively little knowledge about pathophysiological causes of symptoms and complaints but a wealth of clinically relevant information about disease, its consequences, and the context under which illness develops. By contrast, intermediate-level students without clinical experience typically use pathophysiological, causal models of disease when solving problems. The authors review evidence supporting the theory and discuss its implications for the understanding of five phenomena extensively documented in the clinical-reasoning literature: (1) content specificity in diagnostic performance; (2) typical differences in data-gathering techniques between medical students and physicians; (3) difficulties involved in setting standards; (4) a decline in performance on certain measures of clinical reasoning with increasing expertise; and (5) a paradoxical association between errors and longer response times in visual diagnosis.

  15. Unsupervised and self-mapping category formation and semantic object recognition for mobile robot vision used in an actual environment

    NASA Astrophysics Data System (ADS)

    Madokoro, H.; Tsukada, M.; Sato, K.

    2013-07-01

    This paper presents an unsupervised learning-based object category formation and recognition method for mobile robot vision. Our method has the following features: detection of feature points and description of features using the scale-invariant feature transform (SIFT), selection of target feature points using one-class support vector machines (OC-SVMs), generation of visual words using self-organizing maps (SOMs), formation of labels using adaptive resonance theory 2 (ART-2), and creation and classification of categories on a category map of counter-propagation networks (CPNs) for visualizing spatial relations between categories. Classification results for dynamic images, using time-series images obtained from two robots of different sizes and for different movements, demonstrate that our method can visualize spatial relations between categories while maintaining time-series characteristics. Moreover, we emphasize the effectiveness of our method for category formation under appearance changes of objects.
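
    The front half of such a pipeline can be sketched with OpenCV and scikit-learn, with k-means standing in for the SOM codebook step (image paths are placeholders, the images are assumed to yield at least 64 descriptors, and the OC-SVM filtering and ART-2/CPN stages are omitted):

        import cv2
        import numpy as np
        from sklearn.cluster import KMeans

        sift = cv2.SIFT_create()
        paths = ["frame_a.png", "frame_b.png"]            # placeholder images
        imgs = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in paths]
        desc = np.vstack([sift.detectAndCompute(im, None)[1] for im in imgs])

        codebook = KMeans(n_clusters=64, n_init=10).fit(desc)   # visual words
        words = codebook.predict(desc)
        hist = np.bincount(words, minlength=64)           # bag-of-words feature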

  16. A feature-based inference model of numerical estimation: the split-seed effect.

    PubMed

    Murray, Kyle B; Brown, Norman R

    2009-07-01

    Prior research has identified two modes of quantitative estimation: numerical retrieval and ordinal conversion. In this paper we introduce a third mode, which operates by a feature-based inference process. In contrast to prior research, the results of three experiments demonstrate that people estimate automobile prices by combining metric information associated with two critical features: product class and brand status. In addition, Experiments 2 and 3 demonstrated that when participants are seeded with the actual current base price of one of the to-be-estimated vehicles, they respond by revising the general metric and splitting the information carried by the seed between the two critical features. As a result, the degree of post-seeding revision is directly related to the number of these features that the seed and the transfer items have in common. The paper concludes with a general discussion of the practical and theoretical implications of our findings.
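
    Toy arithmetic for the split-seed idea (all numbers invented): the estimate combines metric knowledge tied to the two critical features, and the surprise carried by a seed is split between them:

        class_base = {"sedan": 25_000, "suv": 35_000}
        brand_premium = {"luxury": 15_000, "economy": 0}

        def estimate(cls, brand):
            return class_base[cls] + brand_premium[brand]

        seed_cls, seed_brand, seed_price = "sedan", "luxury", 48_000
        error = seed_price - estimate(seed_cls, seed_brand)   # +8,000 surprise
        class_base[seed_cls] += error / 2                     # split the seed
        brand_premium[seed_brand] += error / 2

    After the split, an item sharing only the brand inherits half the revision, an item sharing only the class inherits the other half, and an item sharing both inherits the full revision, matching the transfer pattern reported above.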

  17. Effects on Text Simplification: Evaluation of Splitting up Noun Phrases

    PubMed Central

    Leroy, Gondy; Kauchak, David; Hogue, Alan

    2016-01-01

    To help increase health literacy, we are developing a text simplification tool that creates more accessible patient education materials. Tool development is guided by data-driven feature analysis comparing simple and difficult text. In the present study, we focus on the common advice to split long noun phrases. Our previous corpus analysis showed that easier texts contained shorter noun phrases. Subsequently, we conducted a user study to measure the difficulty of sentences containing noun phrases of different lengths (2-gram, 3-gram and 4-gram), conditions (split or not) and, to simulate unknown terms, use of pseudowords (present or not). We gathered 35 evaluations for 30 sentences in each condition (3×2×2 conditions) on Amazon's Mechanical Turk (N=12,600). We conducted a three-way ANOVA for perceived and actual difficulty. Splitting noun phrases had a positive effect on perceived difficulty but a negative effect on actual difficulty. The presence of pseudowords increased perceived and actual difficulty. Without pseudowords, longer noun phrases led to increased perceived and actual difficulty. A follow-up study using the phrases (N = 1,350) showed that measuring awkwardness may indicate when to split noun phrases. We conclude that splitting noun phrases benefits perceived difficulty, but hurts actual difficulty when the phrasing becomes less natural. PMID:27043754

  18. Operator for object recognition and scene analysis by estimation of set occupancy with noisy and incomplete data sets

    NASA Astrophysics Data System (ADS)

    Rees, S. J.; Jones, Bryan F.

    1992-11-01

    Once feature extraction has occurred in a processed image, the recognition problem becomes one of defining a set of features which maps sufficiently well onto one of the defined shape/object models to permit a claimed recognition. This process is usually handled by aggregating features until a large enough weighting is obtained to claim membership, or an adequate number of located features are matched to the reference set. A requirement has existed for an operator or measure capable of a more direct assessment of membership/occupancy between feature sets, particularly where the feature sets may be defective representations. Such feature set errors may be caused by noise, by overlapping of objects, and by partial obscuration of features. These problems occur at the point of acquisition: repairing the data would then assume a priori knowledge of the solution. The technique described in this paper offers a set theoretical measure for partial occupancy defined in terms of the set of minimum additions to permit full occupancy and the set of locations of occupancy if such additions are made. As is shown, this technique permits recognition of partial feature sets with quantifiable degrees of uncertainty. A solution to the problems of obscuration and overlapping is therefore available.
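
    In set-theoretic terms the measure can be sketched directly (feature names invented):

        def occupancy(observed, model):
            """Degree of occupancy of a model feature set by an observed one,
            with the minimal additions needed for full occupancy."""
            observed, model = set(observed), set(model)
            missing = model - observed            # minimal additions
            located = model & observed            # locations of occupancy
            return len(located) / len(model), missing

        score, missing = occupancy({"edge1", "corner2"},
                                   {"edge1", "corner2", "hole3"})
        print(score, missing)                     # ~0.67, {'hole3'}

    A claimed recognition with a score below 1 carries a quantifiable uncertainty given by the missing features, which is how obscured or overlapped objects remain recognizable.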

  19. Robust extrema features for time-series data analysis.

    PubMed

    Vemulapalli, Pramod K; Monga, Vishal; Brennan, Sean N

    2013-06-01

    The extraction of robust features for comparing and analyzing time series is a fundamentally important problem. Research efforts in this area encompass dimensionality reduction using popular signal analysis tools such as the discrete Fourier and wavelet transforms, various distance metrics, and the extraction of interest points from time series. Recently, extrema features for analysis of time-series data have assumed increasing significance because of their natural robustness under a variety of practical distortions, their economy of representation, and their computational benefits. Invariably, the process of encoding extrema features is preceded by filtering of the time series with an intuitively motivated filter (e.g., for smoothing), and subsequent thresholding to identify robust extrema. We define the properties of robustness, uniqueness, and cardinality as a means to identify the design choices available in each step of the feature generation process. Unlike existing methods, which utilize filters "inspired" from either domain knowledge or intuition, we explicitly optimize the filter on training time series to maximize the robustness of the extracted extrema features. We demonstrate further that the underlying filter optimization problem reduces to an eigenvalue problem and has a tractable solution. An encoding technique that enhances control over cardinality and uniqueness is also presented. Experimental results obtained for the problem of time series subsequence matching establish the merits of the proposed algorithm.
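
    A minimal sketch of the filter-threshold-extrema pipeline described above, with a moving-average filter standing in for the paper's optimized, eigenvalue-derived filter; the width and threshold are arbitrary choices.

      import numpy as np

      def extrema_features(x, width=5, thresh=0.1):
          """Smooth the series, then keep local extrema whose magnitude
          exceeds a threshold (a simple robust-extrema encoding)."""
          kernel = np.ones(width) / width      # stand-in for the optimized filter
          y = np.convolve(x, kernel, mode="same")
          return [(i, y[i]) for i in range(1, len(y) - 1)
                  if (y[i] - y[i - 1]) * (y[i] - y[i + 1]) > 0   # local max or min
                  and abs(y[i]) > thresh]

      t = np.linspace(0, 4 * np.pi, 400)
      noisy = np.sin(t) + 0.05 * np.random.default_rng(1).normal(size=400)
      print(extrema_features(noisy)[:3])       # first few (index, value) features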

  20. On Laboratory Work

    NASA Astrophysics Data System (ADS)

    Olney, Dave

    1997-11-01

    This paper offers some suggestions on making lab work for high school chemistry students more productive, with students taking an active role. They include (1) rewriting labs from manuals to better suit one's purpose, (2) questioning the use of canned data tables, (3) designing microscale labs that utilize their unique features, such as safety and ease of repetition, (4) having students actually carry out experimental design on occasion, using a model from PRACTICE IN THINKING, and (5) using computers/calculators in the lab in meaningful ways. Many examples feature discovery-type labs the author has developed over the years.

  1. Elevated-Temperature Tensile-Testing of Foil-Gage Metals

    NASA Technical Reports Server (NTRS)

    Blackburn, L. B.; Ellingsworth, J. R.

    1986-01-01

    Automated system for measuring strain in metal foils at temperatures above 500 degrees F (260 degrees C) uses mechanical extensometer and displacement transducer. System includes counterbalance feature, which eliminates weight contribution of extensometer and reduces grip pressure required for attachment to specimen. Counterbalancing feature overcomes two major difficulties in using extensometers with foil-gage specimens: (1) Weight of extensometer and transducer represents significant fraction of total load applied to specimen and may actually damage it; and (2) grip pressure required for attachment of extensometer to specimens may induce bending stresses in foil-gage materials.

  2. Exaggerated Claims for Interactive Stories

    NASA Astrophysics Data System (ADS)

    Thue, David; Bulitko, Vadim; Spetch, Marcia; Webb, Michael

    As advertising becomes more crucial to video games' success, developers risk promoting their products beyond the features that they can actually include. For features of interactive storytelling, the effects of making such exaggerations are not well known, as reports from industry have been anecdotal at best. In this paper, we explore the effects of making exaggerated claims for interactive stories, in the context of the theory of advertising. Results from a human user study show that female players find linear and branching stories to be significantly less enjoyable when they are advertised with exaggerated claims.

  3. A robust data fusion scheme for integrated navigation systems employing fault detection methodology augmented with fuzzy adaptive filtering

    NASA Astrophysics Data System (ADS)

    Ushaq, Muhammad; Fang, Jiancheng

    2013-10-01

    Integrated navigation systems for various applications generally employ the centralized Kalman filter (CKF), wherein all measured sensor data are communicated to a single central Kalman filter. The advantage of the CKF is minimal loss of information and high precision under benign conditions, but it may suffer from computational overloading and poor fault tolerance. The alternative is the federated Kalman filter (FKF), wherein local estimates can deliver an optimal or suboptimal state estimate according to a chosen information fusion criterion. The FKF has enhanced throughput and multiple-level fault detection capability. The standard CKF or FKF requires that the system noise and the measurement noise be zero-mean and Gaussian; moreover, the covariances of the system and measurement noises are assumed to remain constant. If the theoretical and actual statistical features employed in the Kalman filter are not compatible, the filter does not render satisfactory solutions and divergence problems can occur. To resolve such problems, in this paper an adaptive Kalman filter scheme strengthened with a fuzzy inference system (FIS) is employed to adapt the statistical features of the contributing sensors online, in the light of the real system dynamics and varying measurement noises. Excessive faults are detected and isolated by the Chi-square test method. As a case study, the presented scheme has been implemented, using an FKF, on a Strapdown Inertial Navigation System (SINS) integrated with a Celestial Navigation System (CNS), GPS and Doppler radar; collectively, the overall system can be termed a SINS/CNS/GPS/Doppler integrated navigation system. The simulation results have validated the effectiveness of the presented scheme, with significantly enhanced precision, reliability and fault tolerance. Effectiveness was tested against simulated abnormal errors/noises during different time segments of flight. It is believed that the presented scheme can be applied to the navigation system of an aircraft or unmanned aerial vehicle (UAV).
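
    The Chi-square fault detection step can be sketched as the standard normalized-innovation-squared test on a Kalman filter; the matrices and significance level below are hypothetical, and the fuzzy adaptation of the noise statistics is not shown.

      import numpy as np
      from scipy.stats import chi2

      def innovation_fault_test(z, z_pred, H, P, R, alpha=0.01):
          """Flag a measurement as faulty when the normalized innovation
          squared exceeds the chi-square threshold for its dimension."""
          nu = z - z_pred                      # innovation
          S = H @ P @ H.T + R                  # innovation covariance
          nis = float(nu @ np.linalg.inv(S) @ nu)
          return nis > chi2.ppf(1 - alpha, df=len(z)), nis

      # Hypothetical 2-D position fix checked against the filter's prediction.
      H, P, R = np.eye(2), 0.5 * np.eye(2), 0.1 * np.eye(2)
      print(innovation_fault_test(np.array([10.0, 4.0]),
                                  np.array([10.1, 3.9]), H, P, R))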

  4. Intelligent Fault Diagnosis of HVCB with Feature Space Optimization-Based Random Forest

    PubMed Central

    Ma, Suliang; Wu, Jianwen; Wang, Yuhao; Jia, Bowen; Jiang, Yuan

    2018-01-01

    Mechanical faults of high-voltage circuit breakers (HVCBs) inevitably occur over long-term operation, so extracting fault features and identifying the fault type have become key issues for ensuring the security and reliability of the power supply. Based on wavelet packet decomposition technology and the random forest algorithm, an effective identification system was developed in this paper. First, in view of the incomplete description given by Shannon entropy, the wavelet packet time-frequency energy rate (WTFER) was adopted as the input vector for the classifier model in the feature selection procedure. Then, a random forest classifier was used to diagnose the HVCB fault, assess the importance of the feature variables and optimize the feature space. Finally, the approach was verified on actual HVCB vibration signals covering six typical fault classes. The comparative experimental results show that the classification accuracy of the proposed method reached 93.33% with the original feature space and rose to 95.56% with the optimized input feature vector. This indicates that the feature optimization procedure is successful and that the proposed diagnosis algorithm has higher efficiency and robustness than traditional methods. PMID:29659548
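
    A minimal sketch of a WTFER-style feature vector with PyWavelets, assuming the energy rate is each terminal node's share of the total wavelet packet energy; the paper's exact definition, wavelet and decomposition level may differ.

      import numpy as np
      import pywt

      def wp_energy_rate(signal, wavelet="db4", level=3):
          """Share of signal energy in each terminal node of the wavelet
          packet tree; one such vector per vibration record would feed the
          random forest, whose feature importances prune the feature space."""
          wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
          energies = np.array([np.sum(node.data ** 2)
                               for node in wp.get_level(level, order="natural")])
          return energies / energies.sum()

      x = np.random.default_rng(0).normal(size=1024)   # stand-in vibration signal
      print(wp_energy_rate(x).round(3))                # 8 energy-rate features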

  5. High-resolution Self-Organizing Maps for advanced visualization and dimension reduction.

    PubMed

    Saraswati, Ayu; Nguyen, Van Tuc; Hagenbuchner, Markus; Tsoi, Ah Chung

    2018-05-04

    Kohonen's Self-Organizing feature Map (SOM) provides an effective way to project high-dimensional input features onto a low-dimensional display space while preserving the topological relationships among the input features. Recent advances in algorithms that take advantage of modern computing hardware introduced the concept of high-resolution SOMs (HRSOMs). This paper investigates the capabilities and applicability of the HRSOM as a visualization tool for cluster analysis and its suitability to serve as a pre-processor in ensemble learning models. The evaluation is conducted on a number of established benchmarks and real-world learning problems, namely, the policeman benchmark, two web spam detection problems, a network intrusion detection problem, and a malware detection problem. It is found that the visualization resulting from an HRSOM provides new insights into these learning problems. It is furthermore shown empirically that broad benefits can be expected from the use of HRSOMs in both clustering and classification problems. Copyright © 2018 Elsevier Ltd. All rights reserved.
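
    For context, a minimal SOM training loop is sketched below; enlarging the map grid gives the "high resolution" flavor, but this is not the HRSOM algorithm or its hardware-accelerated implementation.

      import numpy as np

      def train_som(data, rows=20, cols=20, epochs=10, lr0=0.5, sigma0=5.0):
          """Pull the best-matching unit (BMU) and its grid neighborhood
          toward each sample; the learning rate and neighborhood shrink
          over time so the map settles into a topology-preserving layout."""
          rng = np.random.default_rng(0)
          w = rng.normal(size=(rows, cols, data.shape[1]))
          grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                      indexing="ij"), axis=-1)
          n_steps = epochs * len(data)
          for step in range(n_steps):
              x = data[rng.integers(len(data))]
              frac = step / n_steps
              lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 1e-3
              bmu = np.unravel_index(np.argmin(((w - x) ** 2).sum(-1)),
                                     (rows, cols))
              h = np.exp(-((grid - np.array(bmu)) ** 2).sum(-1) / (2 * sigma ** 2))
              w += lr * h[..., None] * (x - w)
          return w  # each sample's BMU coordinates give its map position

      weights = train_som(np.random.default_rng(1).normal(size=(200, 16)))
      print(weights.shape)  # (20, 20, 16)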

  6. Information on actual medication use and drug-related problems in older patients: questionnaire or interview?

    PubMed

    Willeboordse, Floor; Grundeken, Lucienne H; van den Eijkel, Lisanne P; Schellevis, François G; Elders, Petra J M; Hugtenburg, Jacqueline G

    2016-04-01

    Information on medication use and drug-related problems is important in the preparation of clinical medication reviews. Critical information can only be provided by patients themselves, but interviewing patients is time-consuming. Alternatively, patient information could be obtained with a questionnaire. In this study, the agreement between patient information on medication use and drug-related problems in older patients obtained with a questionnaire was compared with information obtained during an interview. The setting was general practice in The Netherlands. A questionnaire was developed to obtain information on actual medication use and drug-related problems. Two patient groups aged ≥65 years were selected based on general practitioner electronic medical records in nine practices: (I) polypharmacy and (II) ≥1 predefined general geriatric problem. Eligible patients were asked to complete the questionnaire and were interviewed afterwards. Agreement between the information on medication use and drug-related problems collected with the questionnaire and in the interview was calculated. Ninety-seven patients participated. Of all medications used, 87.6 % (95 % CI 84.7-90.5) were reported identically in the questionnaire and interview. Agreement on the complete medication list was found for 45.4 % (95 % CI 35.8-55.3) of the patients. At the drug-related problem level, agreement between questionnaire and interview was 75 %. Agreement tended to be lower in vulnerable patients characterized by ≥4 chronic diseases, ≥10 medications used and low health literacy. Information from a questionnaire showed reasonable agreement compared with interviewing. The patients reported more medications and drug-related problems in the interview than in the questionnaire. Taking the limitations into account, a questionnaire seems a suitable tool for medication reviews and may replace an interview for most patients.

  7. Is Trait Rumination Associated with the Ability to Generate Effective Problem Solving Strategies? Utilizing Two Versions of the Means-Ends Problem-Solving Test.

    PubMed

    Hasegawa, Akira; Nishimura, Haruki; Mastuda, Yuko; Kunisato, Yoshihiko; Morimoto, Hiroshi; Adachi, Masaki

    This study examined the relationship between trait rumination and the effectiveness of problem solving strategies as assessed by the Means-Ends Problem-Solving Test (MEPS) in a nonclinical population. The present study extended previous studies by using two instructions in the MEPS: the second-person, actual-strategy instructions, which have been utilized in previous studies on rumination, and the third-person, ideal-strategy instructions, which are considered more suitable for assessing the effectiveness of problem solving strategies. We also replicated the association between rumination and each dimension of the Social Problem-Solving Inventory-Revised Short Version (SPSI-R:S). Japanese undergraduate students (N = 223) completed the Beck Depression Inventory-Second Edition, Ruminative Responses Scale (RRS), MEPS, and SPSI-R:S. One half of the sample completed the MEPS with the second-person, actual-strategy instructions; the other participants completed the MEPS with the third-person, ideal-strategy instructions. The results showed that neither the total RRS score nor its subscale scores were significantly correlated with MEPS scores under either of the two instructions. These findings, taken together with previous findings, indicate that in nonclinical populations trait rumination is not related to the effectiveness of problem solving strategies, but that state rumination while responding to the MEPS deteriorates the quality of strategies. The correlations between RRS and SPSI-R:S scores indicated that trait rumination in general, and its brooding subcomponent in particular, are parts of cognitive and behavioral responses that attempt to avoid negative environmental and negative private events. Results also showed that reflection is a part of active problem solving.

  8. Persuasive user experiences of a health Behavior Change Support System: A 12-month study for prevention of metabolic syndrome.

    PubMed

    Karppinen, Pasi; Oinas-Kukkonen, Harri; Alahäivälä, Tuomas; Jokelainen, Terhi; Keränen, Anna-Maria; Salonurmi, Tuire; Savolainen, Markku

    2016-12-01

    Obesity has become a severe health problem in the world. Even a moderate 5% weight loss can significantly reduce the prevalence of metabolic syndrome, which can be vital for preventing comorbidities caused by obesity. Health Behavior Change Support Systems (hBCSS) emphasize an autogenous approach, where an individual uses the system to influence one's own attitude or behavior to achieve his or her own goal. Despite promising results, such health intervention technology has often been considered merely a tool for delivering content, as if it had no effect or value of its own. More research on actual system features is required. The objective of this study is to describe how users perceive persuasive software features designed and implemented in a support system. The research medium in this study is a web-based information system designed as a lifestyle intervention for participants who are at risk of developing metabolic syndrome or who are already suffering from it. The system was designed closely following the principles of the Persuasive Systems Design (PSD) model and the Behavior Change Support Systems (BCSS) framework. A total of 43 system users were interviewed for this study during and after a 52-week intervention period. In addition, the system's login data and subjects' Body Mass Index (BMI) measures were used to interpret the results. This study explains in detail how the users perceived the system and its persuasive features. Self-monitoring, reminders, and tunneling were perceived as especially beneficial persuasive features. The need for social support appeared to grow over the duration of the intervention. Unobtrusiveness was found to be very important in all stages of the intervention rather than only at the beginning. Persuasive software features have the power to affect individuals' health behaviors. Through their systematicity, the PSD model and the BCSS framework provide effective support for the design and development of technological health interventions. Designers of such systems may choose, for instance, to implement more self-monitoring tools to help individuals better align their personal goals with the system's offerings. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  9. Morphologic interpretation of fertile structures in glossopterid gymnosperms

    USGS Publications Warehouse

    Schopf, J.M.

    1976-01-01

    The problem of determining affinity among glossopterid gymnosperms is beset by deficiencies in preservation, natural dissociation of parts, and scarcity of features assuredly critical for morphologic comparison. The glossopterids probably are not a very heterogeneous group of plants, but this is difficult to prove. The Gondwana glacial "hiatus" has resulted in the omission of a critical chapter of glossopterid evolution. As a consequence, morphologic features and phyletic probabilities must be evaluated on a much more hypothetical basis than would otherwise be justified. Confusion has arisen from the lack of morphologic terms that permit clear discussion of a newly evolved type of reproductive structure in glossopterids. The structure, here designated a "fertiliger", consists of a leafy bract, a partially adnate stalk, and a fertile head or capitulum. Seven types of fertile structures are discussed, all of which are bilaterally symmetrical and have different features on dorsiventral surfaces. I regard all fertiligers as ovulate, but this interpretation may not be acceptable to some workers; others may not accept dorsiventral organization of the capitulum as being fundamental. Among glossopterids, however, in spite of differences in preservation that may seem to support a variant interpretation, these ovulate fertiligers are the distinctive features that show general consistency. A single fertile bract bearing several capitula, as exemplified by Lidgettonia, is called a compound fertiliger. Staminate structures (microsporophylls) of glossopterids are separately classified as Eretmonia, Glossotheca, and possibly as other taxa. Only the manner of sporangial attachment is not entirely clear. It seems likely that the staminate parts have previously been confused with scale leaves and are actually coextensive in distribution with the glossopterids. A tentative phyletic model suggests the distant derivation of glossopterids from middle Carboniferous cordaiteans. Many details must remain speculative due to the lack of a pertinent fossil record, but this interpretation accounts for some features that have no counterpart in pteridosperms. Permineralized ovules from Antarctica provide general support for this working hypothesis, but specific evidence is lacking. Furthermore, it seems unlikely that angiosperms originated from glossopterids; it is more reasonable to consider the glossopterids as possible distant ancestors of the Gnetales. © 1976.

  10. Nyala and Bushbuck II: A Harvesting Model.

    ERIC Educational Resources Information Center

    Fay, Temple H.; Greeff, Johanna C.

    1999-01-01

    Adds a cropping or harvesting term to the animal overpopulation model developed in Part I of this article. Investigates various harvesting strategies that might suggest a solution to the overpopulation problem without actually culling any animals. (ASK)

  11. PR Students, Teachers Welcome Corporate Case Study Packages.

    ERIC Educational Resources Information Center

    Broom, Glen M.; Ferguson-De Thorne, Mary Ann

    1978-01-01

    Reports on an actual public relations problem that was used to introduce students to the practice of corporate public relations. Shows both positive and negative reaction to use of the case study in college classes. (RL)

  12. Batch Scheduling for Hybrid Assembly Differentiation Flow Shop to Minimize Total Actual Flow Time

    NASA Astrophysics Data System (ADS)

    Maulidya, R.; Suprayogi; Wangsaputra, R.; Halim, A. H.

    2018-03-01

    A hybrid assembly differentiation flow shop is a three-stage flow shop consisting of machining, assembly and differentiation stages and producing different types of products. In the machining stage, parts are processed in batches on different (unrelated) machines. In the assembly stage, the different parts are assembled into an assembly product. Finally, the assembled products are further processed into different types of final products in the differentiation stage. In this paper, we develop a batch scheduling model for a hybrid assembly differentiation flow shop to minimize the total actual flow time, defined as the total time parts spend on the shop floor from their arrival times until their due dates. We also propose a heuristic algorithm for solving the problem. The proposed algorithm is tested using a set of hypothetical data. The solution shows that the algorithm can solve the problem effectively.
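
    Under the definition above, a part's actual flow time is its due date minus its arrival (material release) time. The sketch below illustrates the objective on a single machine by backward-scheduling from the due dates so each part arrives as late as is feasible; the three-stage shop and batching are omitted.

      def total_actual_flow_time(sequence, proc, due):
          """Schedule from the back: each job completes at its due date or
          when the next job must start, whichever is earlier, and 'arrives'
          when it starts; its actual flow time is due date minus arrival."""
          finish, total = float("inf"), 0.0
          for job in reversed(sequence):
              completion = min(due[job], finish)   # deliver by the due date
              start = completion - proc[job]       # part arrives when it starts
              total += due[job] - start
              finish = start
          return total

      proc = {"A": 3, "B": 2, "C": 4}
      due = {"A": 10, "B": 9, "C": 12}
      print(total_actual_flow_time(["B", "A", "C"], proc, due))  # 15.0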

  13. Heterogeneity of road traffic accident rate in the Russian cities and the need of usage various methods of transport safety management

    NASA Astrophysics Data System (ADS)

    Petrov, A. I.; Petrova, D. A.

    2017-10-01

    The article considers one of the topical problems of road safety management at the federal level - the problem of the heterogeneity of road traffic accident rates in Russian cities. The article analyzes actual statistical data on road traffic accident rates in the administrative centers of Russia. Histograms of the distributions of the two most important road accident characteristics - Social Risk HR and Severity Rate of Road Accidents - recorded in 2016 in the administrative centers of Russia are presented. On the basis of a regression model of the statistical connection between Severity Rate of Road Accidents and Social Risk HR, a classification of the Russian cities based on the level of actual road traffic accident rate was developed. On the basis of this classification, a differentiated system of priority methods for organizing the safe functioning of transport systems in the cities of Russia is proposed.

  14. Corrosion performance of Cr3C2-NiCr+0.2%Zr coated super alloys under actual medical waste incinerator environment

    NASA Astrophysics Data System (ADS)

    Ahuja, Lalit; Mudgal, Deepa; Singh, Surendra; Prakash, Satya

    2018-03-01

    Incineration techniques are widely used to dispose of various types of waste, which leads to the formation of a very corrosive environment. Such a corrosive environment degrades the alloys used in these areas. To obviate this problem, a zirconium-modified Cr3C2-(NiCr) coating powder has been deposited on three superalloys, namely Superni 718, Superni 600 and Superco 605, using the detonation gun technique. A corrosion test was conducted in an actual medical waste incinerator environment: the samples were hung inside the secondary chamber, operated at 1050°C, for 1000 h under cyclic conditions. Corrosion kinetics was monitored using weight gain measurements and thickness loss. Corrosion products were characterized using scanning electron microscopy, energy dispersive spectroscopy and X-ray diffraction. The coating was found to be successful in impeding corrosion of the superalloys.

  15. A STUDY ON A COOPERATIVE RELATIONSHIP TO THE IMPROVEMENT OF THE REGIONAL FIRE FIGHTING VALIDITY -Case Study in Bangkok, Thailand-

    NASA Astrophysics Data System (ADS)

    Sripramai, Keerati; Oikawa, Yasushi; Watanabe, Hiroshi; Katada, Toshitaka

    Generally, in order to improve regional fire fighting validity, the indispensable strategies are not only a reinforcement of the governmental fire fighting ability, but also a strengthening of the cooperative relationship between governmental and non-governmental fire fighting abilities. For practical purposes, however, the effective strategy differs depending on the actual situation in the subject area. In this study, we grasp the actual state and background of the problems that need to be solved to improve regional fire fighting validity in Bangkok as a case study, and examine an appropriate solution focusing on the relationship between official and voluntary fire fighting. Through practicable activities such as interviews, investigations, and making a regional fire fighting validity map, it became clear that the problems of an uncooperative relationship and a lack of trust between stakeholders should be solved first and foremost.

  16. Does Anxiety Modify the Risk for, or Severity of, Conduct Problems Among Children With Co-Occurring ADHD: Categorical and Dimensional Analyses.

    PubMed

    Danforth, Jeffrey S; Doerfler, Leonard A; Connor, Daniel F

    2017-08-01

    The goal was to examine whether anxiety modifies the risk for, or severity of, conduct problems in children with ADHD. Assessment included both categorical and dimensional measures of ADHD, anxiety, and conduct problems. Analyses compared conduct problems between children with ADHD features alone and children with co-occurring ADHD and anxiety features. When assessed by dimensional rating scales, results showed that compared with children with ADHD alone, children with ADHD co-occurring with anxiety were at risk for more intense conduct problems. When assessment included a Diagnostic and Statistical Manual of Mental Disorders (4th ed.; DSM-IV) diagnosis via the Schedule for Affective Disorders and Schizophrenia for School Age Children-Epidemiologic Version (K-SADS), results showed that compared with children with ADHD alone, children with ADHD co-occurring with anxiety neither had more intense conduct problems nor were more likely to be diagnosed with oppositional defiant disorder or conduct disorder. The different methodological measures of ADHD, anxiety, and conduct problem features thus influenced the outcome of the analyses.

  17. On fundamentally new sources of energy for rockets in the early works of the pioneers of astronautics

    NASA Technical Reports Server (NTRS)

    Melkumov, T. M.

    1977-01-01

    The search for methods of propelling a spacecraft more efficiently than can be achieved with chemical energy was studied. At a time when rockets for space flight had not actually been built, pioneers in rocket technology were already concerned with this problem. The alternative sources proposed at that time were nuclear and solar energy. The basic engineering problems of each source were investigated.

  18. Control theory and splines, applied to signature storage

    NASA Technical Reports Server (NTRS)

    Enqvist, Per

    1994-01-01

    In this report the problem we are going to study is the interpolation of a set of points in the plane with the use of control theory. We will discover how different systems generate different kinds of splines, cubic and exponential, and investigate the effect that the different systems have on the tracking problems. Actually we will see that the important parameters will be the two eigenvalues of the control matrix.

  19. Development of polyvinylether refrigeration oil for hydrofluorocarbon air-conditioning systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tozaki, Toshinori; Konishi, Tsuneo; Nagamatsu, Noritoshi

    1998-10-01

    Polyolester (POE) poses capillary tube blockage problems when it is used as an air-conditioner refrigeration oil. A polyvinylether (PVE) oil has been developed to resolve these problems. The causes of blockage were determined by analyzing capillary tubes after testing them with PVE and POE in the laboratory and in actual equipment. PVE was confirmed to have superior performance over POE with respect to resistance to capillary tube blockage.

  20. Improved mapping of the travelling salesman problem for quantum annealing

    NASA Astrophysics Data System (ADS)

    Troyer, Matthias; Heim, Bettina; Brown, Ethan; Wecker, David

    2015-03-01

    We consider the quantum adiabatic algorithm as applied to the travelling salesman problem (TSP). We introduce a novel mapping of TSP to an Ising spin glass Hamiltonian and compare it to previous known mappings. Through direct perturbative analysis, unitary evolution, and simulated quantum annealing, we show this new mapping to be significantly superior. We discuss how this advantage can translate to actual physical implementations of TSP on quantum annealers.
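
    The improved mapping itself is not given in the abstract; for context, the sketch below builds the standard permutation-matrix QUBO encoding of TSP that such work starts from, with illustrative penalty weights A and B.

      import numpy as np

      def tsp_qubo(dist, A=10.0, B=1.0):
          """Binary x[v, t] = 1 iff city v is visited at tour position t;
          energy(x) = x @ Q @ x (+ a constant 2*n*A). Penalty A enforces a
          valid permutation, B weighs the tour length."""
          dist = np.asarray(dist, dtype=float)
          n = len(dist)
          Q = np.zeros((n * n, n * n))
          X = np.arange(n * n).reshape(n, n)   # X[v, t] -> flat variable index

          def one_hot(idx):                    # (sum(x) - 1)^2 without the constant
              for a, i in enumerate(idx):
                  Q[i, i] -= A                 # linear part folded onto the diagonal
                  for j in idx[a + 1:]:
                      Q[i, j] += 2 * A         # penalize double assignment

          for v in range(n):
              one_hot(list(X[v, :]))           # city v visited exactly once
          for t in range(n):
              one_hot(list(X[:, t]))           # exactly one city per position
          for t in range(n):                   # distance between consecutive stops
              for u in range(n):
                  for v in range(n):
                      if u != v:
                          Q[X[u, t], X[v, (t + 1) % n]] += B * dist[u, v]
          return Q

      d = [[0, 1, 2], [1, 0, 3], [2, 3, 0]]
      Q = tsp_qubo(d)
      x = np.zeros(9); x[[0, 4, 8]] = 1        # valid tour 0 -> 1 -> 2 -> 0
      print(x @ Q @ x)                          # -2*n*A + B * tour length = -54.0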

  1. An analysis-by-synthesis approach to the estimation of vocal cord polyp features.

    PubMed

    Koizumi, T; Taniguchi, S; Itakura, F

    1993-09-01

    This paper deals with a new noninvasive method of estimating vocal cord polyp features through hoarse-voice analysis. A noteworthy feature of this method is that it enables us not only to discriminate hoarse voices caused by pathological vocal cords with a single golf-ball-like polyp from normal voices, but also to estimate polyp features such as the mass and dimensions of the polyp through the use of a novel model of pathological vocal cords which has been devised to simulate the subtle movement of the vocal cords. A synthetic hoarse voice produced with a hoarse-voice synthesizer is compared with a natural hoarse voice caused by the vocal cord polyp in terms of a distance measure, and the polyp features are estimated by minimizing the distance measure. Some estimates of polyp dimensions obtained by applying this procedure to hoarse voices are found to compare favorably with actual polyp dimensions, demonstrating that the procedure is effective for estimating the features of golf-ball-like vocal cord polyps.

  2. Study of accuracy of precipitation measurements using simulation method

    NASA Astrophysics Data System (ADS)

    Nagy, Zoltán; Lajos, Tamás; Morvai, Krisztián

    2013-04-01

    Precipitation is one of the most important meteorological parameters describing the state of the climate, and accurate measurement of precipitation is essential for extracting correct trend information. The problem is that precipitation measurements are affected by systematic errors leading to an underestimation of actual precipitation; these errors vary by precipitation type and gauge type. It is well known that wind speed is the most important environmental factor contributing to the underestimation of actual precipitation, especially for solid precipitation. To study and correct the errors of precipitation measurements there are two basic possibilities: to use the results and conclusions of international precipitation measurement intercomparisons, or to build standard reference gauges (DFIR, pit gauge) and carry out one's own investigation. In 1999 the Hungarian Meteorological Service attempted its own investigation and built standard reference gauges, but the cost-benefit ratio in the case of snow (use of the DFIR) was very poor: there were several winters without a significant amount of snow, while the state of the DFIR was continuously deteriorating. Because of this, a new approach was needed: modelling carried out by the Budapest University of Technology and Economics, Department of Fluid Mechanics, using the FLUENT 6.2 model. The ANSYS Fluent package is a computational fluid dynamics solution for modelling flow and other related physical phenomena; it provides the tools needed to describe atmospheric processes and to design and optimize new equipment, with solvers that accurately simulate a broad range of flows from single-phase to multi-phase. The questions we wanted to answer were as follows: How do the different types of gauges deform the airflow around themselves? Can a quantitative estimate of the wind-induced error be given? How does the use of a wind shield improve the accuracy of precipitation measurements? And what is the source of the error detected at tipping-bucket raingauges in winter due to the use of heating power? On our poster we present the answers to these questions.

  3. A practical deconvolution algorithm in multi-fiber spectra extraction

    NASA Astrophysics Data System (ADS)

    Zhang, Haotong; Li, Guangwei; Bai, Zhongrui

    2015-08-01

    Deconvolution is a very promising method in multi-fiber spectroscopy data reduction: it can extract spectra to the photon noise level as well as improve the spectral resolution. But as mentioned in Bolton & Schlegel (2010), it is limited by its huge computational requirements and thus cannot be implemented directly in actual data reduction. We develop a practical algorithm to solve the computation problem. The new algorithm can deconvolve a 2D fiber spectral image of any size with actual PSFs, which may vary with position. We further consider the influence of noise, which makes deconvolution an intrinsically ill-posed problem. We modify our method with a Tikhonov regularization term to suppress the method-induced noise. A series of simulations based on LAMOST data are carried out to test our method under realistic conditions with Poisson noise and extreme cross talk, i.e., where the fiber-to-fiber distance is comparable to the FWHM of the fiber profile. Compared with the results of traditional extraction methods, i.e., the aperture extraction method and the profile fitting method, our method shows both higher S/N and higher spectral resolution. The computation time for a noise-added image with 250 fibers and 4k pixels in the wavelength direction is about 2 hours when the fiber cross talk is not extreme, and 3.5 hours for extreme fiber cross talk. We finally apply our method to real LAMOST data. We find that the 1D spectrum extracted by our method has both higher SNR and resolution than the traditional methods, but there are still some suspicious weak features, possibly caused by the noise sensitivity of the method, around the strong emission lines. How to further attenuate the noise influence will be the topic of our future work. As we have demonstrated, multi-fiber spectra extracted by our method will have higher resolution and signal-to-noise ratio and thus will provide more accurate information (such as higher radial velocity and metallicity measurement accuracy in stellar physics) to astronomers than traditional methods.
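
    A 1-D sketch of the Tikhonov-regularized deconvolution named above: solve a damped least squares problem so the ill-posedness does not amplify noise. The PSF, sizes and lambda are toy values; the paper works on 2-D fiber images with position-dependent PSFs.

      import numpy as np

      def tikhonov_deconvolve(y, psf, lam=1e-2):
          """Minimize ||A x - y||^2 + lam * ||x||^2, where A applies the PSF;
          the lam term suppresses the noise amplification of plain inversion."""
          n = len(y)
          A = np.zeros((n, n))
          half = len(psf) // 2
          for i in range(n):
              for k, p in enumerate(psf):
                  j = i + k - half
                  if 0 <= j < n:
                      A[i, j] = p              # row i: PSF centered at pixel i
          return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

      rng = np.random.default_rng(0)
      truth = np.zeros(64); truth[[20, 23, 45]] = [5.0, 3.0, 4.0]  # blended "fibers"
      psf = np.array([0.05, 0.25, 0.4, 0.25, 0.05])
      y = np.convolve(truth, psf, mode="same") + 0.01 * rng.normal(size=64)
      print(tikhonov_deconvolve(y, psf)[[20, 23, 45]].round(2))    # ~ [5.0, 3.0, 4.0]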

  4. Zero-Energy Optical Logic: Can It Be Practical?

    NASA Astrophysics Data System (ADS)

    Caulfield, H. John

    The thermodynamic “permission” to build a device that can evaluate a sequence of logic operations that operate at zero energy has existed for about 40 years. That is, physics allows it in principle. Conceptual solutions have been explored ever since then. A great number of important concepts were developed in so doing. Over the last four years, my colleagues and I have explored the possibility of a constructive proof. And we finally succeeded. Somewhat unexpectedly, we found such a proof and found that lossless logic systems could actually be built. And, as we had anticipated, it can only be implemented by optics. That raises a new question: Might an optical zero-energy logic system actually be good enough to displace electronic versions in some cases? In this paper, I do not even try to answer that question, but I do lay out some problems now blocking practical applications and show some promising approaches to solving them. The problems addressed are speed, size, and error rate. The anticipated speed problem simply vanishes, as it was an inference from the implicit assumption that the logic would be electronic. But the other two problems are real and must be addressed if energy-free logic is to have any significant applications. Initial steps in solving the size and error rate are addressed in more detail.

  5. Parallel Preconditioning for CFD Problems on the CM-5

    NASA Technical Reports Server (NTRS)

    Simon, Horst D.; Kremenetsky, Mark D.; Richardson, John; Lasinski, T. A. (Technical Monitor)

    1994-01-01

    To date, preconditioning methods on massively parallel systems have faced a major difficulty. The preconditioning methods most successful at accelerating the convergence of the iterative solver, such as incomplete LU factorizations, are notoriously difficult to implement on parallel machines for two reasons: (1) the actual computation of the preconditioner is not very floating-point intensive, but requires a large amount of unstructured communication, and (2) the application of the preconditioning matrix in the iteration phase (i.e., triangular solves) is difficult to parallelize because of the recursive nature of the computation. Here we present a new approach to preconditioning for very large, sparse, unsymmetric linear systems, which avoids both difficulties. We explicitly compute an approximate inverse to our original matrix. This new preconditioning matrix can be applied most efficiently for iterative methods on massively parallel machines, since the preconditioning phase involves only a matrix-vector multiplication, with possibly a dense matrix. Furthermore, the actual computation of the preconditioning matrix has natural parallelism: for a problem of size n, the preconditioning matrix can be computed by solving n independent small least squares problems. The algorithm and its implementation on the Connection Machine CM-5 are discussed in detail and supported by extensive timings obtained from real problem data.
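
    A dense toy version of the idea: each column of the approximate inverse M solves an independent small least squares problem, so constructing the preconditioner parallelizes naturally and applying it is a single matrix-vector product. The matrix and sparsity pattern below are hypothetical.

      import numpy as np

      def approx_inverse(A, pattern):
          """For each column j, minimize ||A m_j - e_j|| over the entries
          allowed by the sparsity pattern; the n problems are independent."""
          n = A.shape[0]
          M = np.zeros((n, n))
          for j in range(n):
              nz = pattern[j]                  # allowed nonzero entries of column j
              e = np.zeros(n); e[j] = 1.0
              m, *_ = np.linalg.lstsq(A[:, nz], e, rcond=None)
              M[nz, j] = m
          return M

      rng = np.random.default_rng(0)
      A = np.diag(np.linspace(1.0, 100.0, 50)) + 0.1 * rng.normal(size=(50, 50))
      pattern = [[k for k in (j - 1, j, j + 1) if 0 <= k < 50] for j in range(50)]
      M = approx_inverse(A, pattern)
      print(np.linalg.cond(A), np.linalg.cond(M @ A))  # M @ A is better conditioned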

  6. 'Actual neurosis' and psychosomatic medicine: the vicissitudes of an enigmatic concept.

    PubMed

    Hartocollis, Peter

    2002-12-01

    Out of the concept of neurasthenia, the main non-psychotic diagnosis of nineteenth-century psychiatry besides hysteria, and on the basis of psychophysiological problems of his own, self-diagnosed as neurasthenia, Freud developed the notion of 'actual neurosis', a 'contentless psychic state' manifested by various somatic symptoms and a depressive mood, which he attributed to a chemical factor associated with aberrant sexual practices and in particular masturbation. Rejected by post-Freudian analysts as such along with the diagnosis of neurasthenia, the concept of 'actual neurosis' has survived under various theoretical schemes that seek to explain psychosomatic illness and somatisation, in general, with its concomitant poverty of affects and dearth of fantasy life. In more recent years, the concept of 'actual neurosis' has resurfaced under the label of chronic fatigue syndrome, a medical entity thought to be an immunological deficiency, while in psychoanalysis Freud's idea of a contentless mental state has been replaced by that of unconscious fantasy and symbolisation at a pre-genital or pre-verbal level.

  7. A universal deep learning approach for modeling the flow of patients under different severities.

    PubMed

    Jiang, Shancheng; Chin, Kwai-Sang; Tsui, Kwok L

    2018-02-01

    The Accident and Emergency Department (A&ED) is the frontline for providing emergency care in hospitals. Unfortunately, A&ED resources have failed to keep up with continuously increasing demand in recent years, which leads to overcrowding. Knowing the fluctuation of patient arrival volume in advance is a significant premise for relieving this pressure. Based on this motivation, the objective of this study is to explore an integrated framework with high accuracy for predicting A&ED patient flow under different triage levels, by combining a novel feature selection process with deep neural networks. Administrative data were collected from an actual A&ED and categorized into five groups based on triage level. A genetic algorithm (GA)-based feature selection algorithm is improved and implemented as a pre-processing step for this time-series prediction problem, in order to explore the key features affecting patient flow. In our improved GA, a fitness-based crossover is proposed to maintain the joint information of multiple features during the iterative process, instead of the traditional point-based crossover. Deep neural networks (DNNs) are employed as the prediction model owing to their universal adaptability and high flexibility. In the model-training process, the learning algorithm is configured around a parallel stochastic gradient descent algorithm. Two effective regularization strategies are integrated in one DNN framework to avoid overfitting. All introduced hyper-parameters are optimized efficiently by grid search in one pass. As for feature selection, our improved GA-based feature selection algorithm outperformed a typical GA and four state-of-the-art feature selection algorithms (mRMR, SAFS, VIFR, and CFR). As for the prediction accuracy of the proposed integrated framework, compared with frequently used statistical models (GLM, seasonal ARIMA, ARIMAX, and ANN) and modern machine learning models (SVM-RBF, SVM-linear, RF, and R-LASSO), the proposed integrated "DNN-I-GA" framework achieves higher prediction accuracy on both MAPE and RMSE metrics in pairwise comparisons. The contribution of our study is two-fold. Theoretically, the traditional GA-based feature selection process is improved to have fewer hyper-parameters and higher efficiency, the joint information of multiple features is maintained by the fitness-based crossover operator, and the universal property of DNNs is further enhanced by merging different regularization strategies. Practically, the features selected by our improved GA can be used to uncover the underlying relationship between patient flows and input features. Predicted values are significant indicators of patient demand and can be used by A&ED managers for resource planning and allocation. The high accuracy achieved by the present framework in different cases enhances the reliability of downstream decision making. Copyright © 2017 Elsevier B.V. All rights reserved.
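
    One plausible reading of the fitness-based crossover, sketched on binary feature masks: each child gene is drawn from a parent with probability proportional to that parent's fitness, so feature subsets carried by the fitter parent tend to be inherited together. The fitness function here is a toy stand-in for model-based validation accuracy.

      import numpy as np

      rng = np.random.default_rng(0)

      def fitness_based_crossover(parent_a, fit_a, parent_b, fit_b):
          """Bias each child gene toward the fitter parent instead of
          cutting at a single random point, preserving joint information
          spread across many positions of the mask."""
          p_take_a = fit_a / (fit_a + fit_b)   # fitter parent contributes more genes
          take_a = rng.random(parent_a.shape) < p_take_a
          return np.where(take_a, parent_a, parent_b)

      def fitness(mask, weights):              # toy proxy for validation accuracy
          return float(mask @ weights)

      weights = rng.random(20)                 # hypothetical per-feature usefulness
      a, b = rng.integers(0, 2, 20), rng.integers(0, 2, 20)
      print(fitness_based_crossover(a, fitness(a, weights), b, fitness(b, weights)))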

  8. The pre-image problem in kernel methods.

    PubMed

    Kwok, James Tin-yau; Tsang, Ivor Wai-hung

    2004-11-01

    In this paper, we address the problem of finding the pre-image of a feature vector in the feature space induced by a kernel. This is of central importance in some kernel applications, such as the use of kernel principal component analysis (PCA) for image denoising. Unlike the traditional method, which relies on nonlinear optimization, our proposed method directly finds the location of the pre-image based on distance constraints in the feature space. It is noniterative, involves only linear algebra and does not suffer from numerical instability or local minimum problems. Evaluations on performing kernel PCA and kernel clustering on the USPS data set show much improved performance.
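
    A sketch of the distance-constraint idea for a Gaussian kernel: feature-space distances to a set of anchor points convert analytically to input-space distances, and the pre-image location then follows from a linear least squares solve. The anchor choice and weighting details of the paper are not reproduced here.

      import numpy as np

      def preimage_from_distances(anchors, feat_dist2, sigma=1.0):
          """For an RBF kernel, ||phi(x) - phi(z)||^2 = 2 - 2 k(x, z), so
          feature distances invert to input distances; differencing the
          constraints ||x - z_i||^2 = d_i^2 makes them linear in x."""
          k = np.clip(1.0 - feat_dist2 / 2.0, 1e-12, None)
          in_dist2 = -2.0 * sigma ** 2 * np.log(k)
          z0, d0 = anchors[0], in_dist2[0]
          A = 2.0 * (anchors[1:] - z0)
          b = (anchors[1:] ** 2).sum(1) - (z0 ** 2).sum() - in_dist2[1:] + d0
          x, *_ = np.linalg.lstsq(A, b, rcond=None)
          return x

      rng = np.random.default_rng(0)
      anchors = rng.normal(size=(6, 2))
      target = np.array([0.3, -0.2])
      d2 = ((anchors - target) ** 2).sum(1)
      feat_d2 = 2 - 2 * np.exp(-d2 / 2.0)      # simulated feature-space distances
      print(preimage_from_distances(anchors, feat_d2))  # recovers ~ [0.3, -0.2]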

  9. Once more with feeling: Normative data for the aha experience in insight and noninsight problems.

    PubMed

    Webb, Margaret E; Little, Daniel R; Cropper, Simon J

    2017-10-19

    Despite the presumed ability of insight problems to elicit the subjective feeling of insight, as well as the use of so-called insight problems to investigate this phenomenon for over 100 years, no research has collected normative data regarding the ability of insight problems to actually elicit the feeling of insight in a given individual. The work described in this article provides an overview of both classic and contemporary problems used to examine the construct of insight and presents normative data on the success rate, mean time to solution, and mean rating of aha experience for each problem and task type. We suggest using these data in future work as a reference for selecting problems on the basis of their ability to elicit an aha experience.

  10. Review and Evaluation of Hand-Arm Coordinate Systems for Measuring Vibration Exposure, Biodynamic Responses, and Hand Forces.

    PubMed

    Dong, Ren G; Sinsel, Erik W; Welcome, Daniel E; Warren, Christopher; Xu, Xueyan S; McDowell, Thomas W; Wu, John Z

    2015-09-01

    The hand coordinate systems for measuring vibration exposures and biodynamic responses have been standardized, but they are not actually used in many studies. This contradicts the purpose of the standardization. The objectives of this study were to identify the major sources of this problem, and to help define or identify better coordinate systems for the standardization. This study systematically reviewed the principles and definition methods, and evaluated typical hand coordinate systems. This study confirms that, as accelerometers remain the major technology for vibration measurement, it is reasonable to standardize two types of coordinate systems: a tool-based basicentric (BC) system and an anatomically based biodynamic (BD) system. However, these coordinate systems are not well defined in the current standard. Definition of the standard BC system is confusing, and it can be interpreted differently; as a result, it has been inconsistently applied in various standards and studies. The standard hand BD system is defined using the orientation of the third metacarpal bone. It is neither convenient nor defined based on important biological or biodynamic features. This explains why it is rarely used in practice. To resolve these inconsistencies and deficiencies, we proposed a revised method for defining the realistic handle BC system and an alternative method for defining the hand BD system. A fingertip-based BD system for measuring the principal grip force is also proposed based on an important feature of the grip force confirmed in this study.

  11. Review and Evaluation of Hand–Arm Coordinate Systems for Measuring Vibration Exposure, Biodynamic Responses, and Hand Forces

    PubMed Central

    Dong, Ren G.; Sinsel, Erik W.; Welcome, Daniel E.; Warren, Christopher; Xu, Xueyan S.; McDowell, Thomas W.; Wu, John Z.

    2015-01-01

    The hand coordinate systems for measuring vibration exposures and biodynamic responses have been standardized, but they are not actually used in many studies. This contradicts the purpose of the standardization. The objectives of this study were to identify the major sources of this problem, and to help define or identify better coordinate systems for the standardization. This study systematically reviewed the principles and definition methods, and evaluated typical hand coordinate systems. This study confirms that, as accelerometers remain the major technology for vibration measurement, it is reasonable to standardize two types of coordinate systems: a tool-based basicentric (BC) system and an anatomically based biodynamic (BD) system. However, these coordinate systems are not well defined in the current standard. Definition of the standard BC system is confusing, and it can be interpreted differently; as a result, it has been inconsistently applied in various standards and studies. The standard hand BD system is defined using the orientation of the third metacarpal bone. It is neither convenient nor defined based on important biological or biodynamic features. This explains why it is rarely used in practice. To resolve these inconsistencies and deficiencies, we proposed a revised method for defining the realistic handle BC system and an alternative method for defining the hand BD system. A fingertip-based BD system for measuring the principal grip force is also proposed based on an important feature of the grip force confirmed in this study. PMID:26929824

  12. Next-generation technologies and data analytical approaches for epigenomics.

    PubMed

    Mensaert, Klaas; Denil, Simon; Trooskens, Geert; Van Criekinge, Wim; Thas, Olivier; De Meyer, Tim

    2014-04-01

    Epigenetics refers to the collection of heritable features that modulate the genome-environment interaction without being encoded in the actual DNA sequence. While being mitotically and sometimes even meiotically transmitted, epigenetic traits often demonstrate extensive flexibility. This allows cells to acquire diverse gene expression patterns during differentiation, but also to adapt to a changing environment. However, epigenetic alterations are not always beneficial to the organism, as they are, for example, frequently identified in human diseases such as cancer. Accurate and cost-efficient genome-scale profiling of epigenetic features is thus of major importance to pinpoint these "epimutations," for example, to monitor the epigenetic impact of environmental exposure. Over the last decade, the field of epigenetics has been revolutionized by several innovative "epigenomics" technologies exactly addressing this need. In this review, we discuss and compare widely used next-generation methods to assess DNA methylation and hydroxymethylation, noncoding RNA expression, histone modifications, and nucleosome positioning. Although recent methods are typically based on "second-generation" sequencing, we also pay attention to still commonly used array- and PCR-based methods, and look forward to the additional advantages of single-molecule sequencing. As the current bottleneck in epigenomics research is the analysis rather than generation of data, the basic difficulties and problem-solving strategies regarding data preprocessing and statistical analysis are introduced for the different technologies. Finally, we also consider the complications associated with epigenomic studies of species with yet unsequenced genomes and possible solutions. Copyright © 2013 Wiley Periodicals, Inc.

  13. Salt, hypertension and renal disease: comparative medicine, models and real diseases.

    PubMed Central

    Michell, A. R.

    1994-01-01

    Dogs are well established as experimental animals for the study of both renal disease and hypertension, but most work is based on surgical or pharmacological models and relatively little on spontaneous diseases. This review argues for the latter as an underexploited aspect of comparative medicine. The most important feature of canine hypertension may not be the ease with which models can be produced but the fact that dogs are actually rather resistant to hypertension, and perhaps to its effects, even when they have chronic renal failure. The importance of natural models of chronic renal failure is strengthened by the evidence that self-sustaining progression is a consequence of extreme nephron loss, that is, a late event, rather than the dominant feature of the course of the disease. The role of salt in hypertension is discussed and emphasis given to the importance of understanding the physiological basis of nutritional requirement and recognizing that it is unlikely to exceed 0.6 mmol/kg/day for most healthy adult mammals except during pregnancy or lactation. Such a perspective is essential to the evaluation of experiments, whether in animals or humans, in order to avoid arbitrary definitions of 'high' or 'low' sodium intake, and the serious misinterpretations of data which result. An age-related rise in arterial pressure may well be a warning of excess salt intake, rather than a normal occurrence. Problems of defining hypertension in the face of variability of arterial pressure are also discussed. PMID:7831161

  14. Auctionable fixed transmission rights for congestion management

    NASA Astrophysics Data System (ADS)

    Alomoush, Muwaffaq Irsheid

    Electric power deregulation has brought a major change to the regulated utility monopoly. The change embodies engineers' efforts to reshape the three components of today's regulated monopoly: generation, distribution and transmission. In this open-access deregulated power market, the transmission network plays a major role, and transmission congestion is a major problem that requires further consideration, especially when an inter-zonal/intra-zonal scheme is implemented. Declaring that engineering studies and experience are the criteria for defining zonal boundaries, or defining a zone as a densely interconnected area (a "lake") whose connecting paths are the inter-zonal lines, yields insufficient and fuzzy definitions. Moreover, a congestion problem formulation should take into consideration the interactions between intra-zonal and inter-zonal flows and their effects on power systems. In this thesis, we introduce a procedure for minimizing the number of adjustments of preferred schedules needed to alleviate congestion, and apply control schemes to minimize interactions between zones. In addition, we give the zone definition a concrete criterion based on the Locational Marginal Price (LMP). This concept is used to define congestion zonal boundaries and to decide whether any zone should be merged with another zone or split into new zones. The thesis presents a unified scheme that combines the zonal and FTR schemes to manage congestion. This combined scheme is utilized with LMPs to define zonal boundaries more appropriately. The presented scheme retains the best features of the FTR scheme: providing financial certainty, maximizing the efficient use of the system and making users pay for the actual use of congested paths. LMPs may give an indication of the impact of wheeling transactions, and calculations and comparisons of LMPs with and without wheeling transactions should give the ISO adequate criteria to approve a transaction, decide to expand the existing system, or retain the original structure of the system. The thesis also investigates the impact of wheeling transactions on congestion management, presenting a generalized mathematical model for the Fixed Transmission Right (FTR) auction. The auction guarantees FTR availability to all participants on a non-discriminatory basis, in which system users are permitted to buy, sell and trade FTRs. When FTRs are utilized with LMPs, they increase the efficient use of the transmission system and give a transmission customer advantageous features such as a mechanism to offset the extra cost of congestion charges, financial and operational certainty, and payment for the actual use of congested paths. Finally, the thesis highlights FTR trading in secondary markets to self-arrange access across different paths, create long-term transmission rights and provide more commercial certainty.

  15. DYNA3D/ParaDyn Regression Test Suite Inventory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Jerry I.

    2016-09-01

    The following table constitutes an initial assessment of feature coverage across the regression test suite used for DYNA3D and ParaDyn. It documents the regression test suite at the time of preliminary release 16.1 in September 2016. The columns of the table represent groupings of functionalities, e.g., material models. Each problem in the test suite is represented by a row in the table. All features exercised by the problem are denoted by a check mark (√) in the corresponding column. The definition of “feature” has not been subdivided to its smallest unit of user input, e.g., algorithmic parameters specific to a particular type of contact surface. This represents a judgment to provide code developers and users a reasonable impression of feature coverage without expanding the width of the table by several multiples. All regression testing is run in parallel, typically with eight processors, except problems involving features only available in serial mode. Many are strictly regression tests acting as a check that the codes continue to produce adequately repeatable results as development unfolds; compilers change and platforms are replaced. A subset of the tests represents true verification problems that have been checked against analytical or other benchmark solutions. Users are welcomed to submit documented problems for inclusion in the test suite, especially if they are heavily exercising, and dependent upon, features that are currently underrepresented.

  16. Patient factors related to the presentation of fatigue complaints: results from a women's general health care practice.

    PubMed

    de Rijk, A E; Schreurs, K M; Bensing, J M

    2000-01-01

    The aim of this study was to examine which patient-related factors predicted: (1) fatigue, (2) the intention to discuss fatigue and (3) the actual discussion of fatigue during consultation with a GP in a women's general health care practice. Patients were asked to complete two questionnaires: one before and one after consultation. The patient-related factors included: social-demographic characteristics; fatigue characteristics; absence of cognitive representations of fatigue; nature of the requests for consultation; and other complaints. Some 74% of the 155 respondents reported fatigue. Compared to the patients that were not fatigued, the fatigued patients were more frequently employed outside the home, had higher levels of general fatigue, and a higher need for emotional support from their doctor. A minority (12%) intended to discuss fatigue during consultation. Of the respondents returning the second questionnaire (n = 107), 22% reported actually discussing their fatigue with the GP while only 11% had intended to do so. In addition to the intention to discuss fatigue during consultation, the following variables related to actually discussing fatigue: living alone, caring for young children, higher levels of general fatigue, absence of cognitions with regard to the duration of the fatigue, and greater psychological, neurological, digestive, and/or musculoskeletal problems as the reason for consultation. Fatigue was found to be the single reason for consultation in only one case. It is concluded that fatigue does not constitute a serious problem for most patients and that discussion of fatigue with the GP tends to depend on the occurrence of other psychological or physical problems and the patient's social context.

  17. Single-machine common/slack due window assignment problems with linear decreasing processing times

    NASA Astrophysics Data System (ADS)

    Zhang, Xingong; Lin, Win-Chin; Wu, Wen-Hsiang; Wu, Chin-Chia

    2017-08-01

    This paper studies linear non-increasing processing times and the common/slack due window assignment problems on a single machine, where the actual processing time of a job is a linear non-increasing function of its starting time. The aim is to minimize the sum of the earliness cost, tardiness cost, due window location and due window size. Some optimality results are discussed for the common/slack due window assignment problems and two O(n log n) time algorithms are presented to solve the two problems. Finally, two examples are provided to illustrate the correctness of the corresponding algorithms.
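
    To make the stated model concrete, here is a minimal sketch (not the paper's O(n log n) algorithms): each job's actual processing time is linear and non-increasing in its start time, p_j(t) = a_j - b_j*t, and a sequence is scored by summing earliness, tardiness, window-location and window-size penalties. All names, weights and numbers below are illustrative assumptions.

    ```python
    # Illustrative sketch (hypothetical names, not the paper's algorithm):
    # evaluate one job sequence under a common due window [d1, d2] when the
    # actual processing time decreases linearly with the job's start time.

    def schedule_cost(seq, a, b, d1, d2, alpha=1.0, beta=1.0, gamma=1.0, delta=1.0):
        # gamma weights the window location d1, delta weights the window size
        t = 0.0
        cost = gamma * d1 + delta * (d2 - d1)
        for j in seq:
            p = max(a[j] - b[j] * t, 0.0)   # linear non-increasing processing time
            t += p                          # completion time of job j
            if t < d1:
                cost += alpha * (d1 - t)    # earliness penalty
            elif t > d2:
                cost += beta * (t - d2)     # tardiness penalty
        return cost

    a = [5.0, 4.0, 6.0]    # base processing times a_j
    b = [0.1, 0.2, 0.05]   # decrease rates b_j
    print(schedule_cost([0, 1, 2], a, b, d1=6.0, d2=9.0))
    ```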

  18. A Statistical Method of Evaluating the Pronunciation Proficiency/Intelligibility of English Presentations by Japanese Speakers

    ERIC Educational Resources Information Center

    Kibishi, Hiroshi; Hirabayashi, Kuniaki; Nakagawa, Seiichi

    2015-01-01

    In this paper, we propose a statistical evaluation method of pronunciation proficiency and intelligibility for presentations made in English by native Japanese speakers. We statistically analyzed the actual utterances of speakers to find combinations of acoustic and linguistic features with high correlation between the scores estimated by the…

  19. An Investigation about Actualization Levels of Learning Outcomes in Early Childhood Curriculum

    ERIC Educational Resources Information Center

    Kazu, Ibrahim Yasar; Is, Abdulgafur

    2018-01-01

    Understanding the characteristics of preschool-age children is an important first step for supporting children's healthy development and school readiness. Children may show different developmental features and come from different social and socio-cultural backgrounds even though they are the same age. Reaching a desired level of education will be…

  20. Natural and Artificial Playing Fields: Characteristics and Safety Features.

    ERIC Educational Resources Information Center

    Schmidt, Roger C., Ed.; Hoerner, Earl F., Ed.; Milner, Edward M., Ed.; Morehouse, C. A., Ed.

    These papers are on the subjects of playing field standards, surface traction, testing and correlation to actual field experience, and state-of-the-art natural and artificial surfaces. The papers, presented at the Symposium on the Characteristics and Safety of Playing Surfaces (Artificial and Natural) for Field Sports in 1998, cover the…

  1. Occupy and Escalate

    ERIC Educational Resources Information Center

    Bousquet, Marc

    2010-01-01

    The academic year began with a bang last fall at the University of California (UC). A series of bangs, actually, featuring a united front of students, staff, and faculty in a coordinated series of walkouts and strikes across the system's ten campuses. The target of their outrage was a series of draconian layoffs, wage cuts, and drastic tuition…

  2. The Mentoring Relationship in Action.

    ERIC Educational Resources Information Center

    IUME Briefs, 1992

    1992-01-01

    Mentoring is now a very popular, but loosely defined, feature of many programs for youth. The heart of mentoring is the relationship between the youth and the mentor, but little is actually known about this relationship. Mentoring should not be limited to at-risk youth, since many average students or underachievers from stable backgrounds may…

  3. ExperimentaLab: A Virtual Platform to Enhance Entrepreneurial Education through Training

    ERIC Educational Resources Information Center

    Iscaro, Valentina; Castaldi, Laura; Sepe, Enrica

    2017-01-01

    With a view to enhancing the entrepreneurial activity of universities, the authors explore the concepts and features of the "experimental lab", presenting it as an effective means of supporting entrepreneurial training programmes and helping students to turn ideas into actual start-ups. In this context, the term experimental lab refers…

  4. The Effects of Videoconferenced Distance-Learning Instruction in a Taiwanese Company

    ERIC Educational Resources Information Center

    Lin, Chin-Hung; Yang, Shu-Ching

    2011-01-01

    Distance learning, where instruction is given to students despite wide separations of students and teachers, is increasingly popular. Videoconferencing, which is examined in this study, is a distance learning mode featuring real-time interaction of students and teachers, providing sequence, real-time vision, and actual interaction. This…

  5. Direct and rapid determination of cotton maturity by FT-Mid-IR technique

    USDA-ARS?s Scientific Manuscript database

    FT-mid-IR (FT-MIR) spectra of seed and lint cottons were collected to explore the potential for the discrimination of immature cottons from mature ones and also for the determination of actual cotton maturity. Spectral features of immature and mature cottons revealed large differences in the 1200-90...

  6. A Validation of Parafoveal Semantic Information Extraction in Reading Chinese

    ERIC Educational Resources Information Center

    Zhou, Wei; Kliegl, Reinhold; Yan, Ming

    2013-01-01

    Parafoveal semantic processing has recently been well documented in reading Chinese sentences, presumably because of language-specific features. However, because of a large variation of fixation landing positions on pretarget words, some preview words actually were located in foveal vision when readers' eyes landed close to the end of the…

  7. Forms of Fighting: A Micro-Social Analysis of Bullying and In-School Violence

    ERIC Educational Resources Information Center

    Malette, Nicole

    2017-01-01

    Current empirical research on youth bullying rarely asks students to describe their violent encounters. This practice conflates incidents of aggression that may actually have different forms and features. In this article I provide the results of a qualitative analysis of retrospective interviews with high school youth about their experiences of…

  8. Common Characteristics of Models in Present-Day Scientific Practice

    ERIC Educational Resources Information Center

    Van Der Valk, Ton; Van Driel, Jan H.; De Vos, Wobbe

    2007-01-01

    Teaching the use of models in scientific research requires a description, in general terms, of how scientists actually use models in their research activities. This paper aims to arrive at defining common characteristics of models that are used in present-day scientific research. Initially, a list of common features of models and modelling, based…

  9. Cadastral Database Positional Accuracy Improvement

    NASA Astrophysics Data System (ADS)

    Hashim, N. M.; Omar, A. H.; Ramli, S. N. M.; Omar, K. M.; Din, N.

    2017-10-01

    Positional Accuracy Improvement (PAI) is the refining process of the geometry features in a geospatial dataset to improve their actual positions. This actual position relates to the absolute position in a specific coordinate system and to the relation with neighbouring features. With the growth of spatial-based technology, especially Geographical Information Systems (GIS) and Global Navigation Satellite Systems (GNSS), PAI campaigns are inevitable, especially for legacy cadastral databases. Integrating a legacy dataset with a higher-accuracy dataset such as GNSS observations is a potential solution for improving the legacy dataset. However, merely integrating both datasets will distort the relative geometry. The improved dataset should be further treated to minimize inherent errors and to fit the new, more accurate dataset. The main focus of this study is to describe a method of angular-based Least Square Adjustment (LSA) for the PAI process of a legacy dataset. The existing high-accuracy dataset, known as the National Digital Cadastral Database (NDCDB), is then used as a benchmark to validate the results. It was found that the proposed technique is highly feasible for positional accuracy improvement of legacy spatial datasets.
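
    The angular-based LSA itself is not reproduced in the abstract, so the sketch below only illustrates the general adjustment idea: a 4-parameter (Helmert) similarity transformation fitted by least squares to align legacy cadastral points onto higher-accuracy GNSS control points. The coordinates are invented and the formulation is a stand-in, not the paper's method.

    ```python
    import numpy as np

    # Made-up legacy coordinates and corresponding GNSS control points
    legacy = np.array([[100.0, 200.0], [150.0, 260.0], [220.0, 210.0], [180.0, 300.0]])
    gnss   = np.array([[101.2, 201.5], [151.1, 261.4], [221.3, 211.6], [181.0, 301.3]])

    # Model: x' = a*x - b*y + tx,  y' = b*x + a*y + ty  (a = s*cos k, b = s*sin k)
    n = len(legacy)
    A = np.zeros((2 * n, 4))
    L = gnss.reshape(-1)
    A[0::2, 0] = legacy[:, 0]; A[0::2, 1] = -legacy[:, 1]; A[0::2, 2] = 1.0
    A[1::2, 0] = legacy[:, 1]; A[1::2, 1] =  legacy[:, 0]; A[1::2, 3] = 1.0

    params, *_ = np.linalg.lstsq(A, L, rcond=None)   # least squares adjustment
    a_, b_, tx, ty = params
    residuals = A @ params - L
    print("scale:", np.hypot(a_, b_), "rotation (rad):", np.arctan2(b_, a_))
    print("RMS residual:", np.sqrt(np.mean(residuals ** 2)))
    ```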

  10. Suspect/foil identification in actual crimes and in the laboratory: a reality monitoring analysis.

    PubMed

    Behrman, Bruce W; Richards, Regina E

    2005-06-01

    Four reality monitoring variables were used to discriminate suspect from foil identifications in 183 actual criminal cases. Four hundred sixty-one identification attempts based on five and six-person lineups were analyzed. These identification attempts resulted in 238 suspect identifications and 68 foil identifications. Confidence, automatic processing, eliminative processing and feature use comprised the set of reality monitoring variables. Thirty-five verbal confidence phrases taken from police reports were assigned numerical values on a 10-point confidence scale. Automatic processing identifications were those that occurred "immediately" or "without hesitation." Eliminative processing identifications occurred when witnesses compared or eliminated persons in the lineups. Confidence, automatic processing and eliminative processing were significant predictors, but feature use was not. Confidence was the most effective discriminator. In cases that involved substantial evidence extrinsic to the identification 43% of the suspect identifications were made with high confidence, whereas only 10% of the foil identifications were made with high confidence. The results of a laboratory study using the same predictors generally paralleled the archival results. Forensic implications are discussed.

  11. An investigation to improve the Menhaden fishery prediction and detection model through the application of ERTS-A data

    NASA Technical Reports Server (NTRS)

    Maughan, P. M. (Principal Investigator)

    1972-01-01

    The author has identified the following significant results. Preliminary analyses indicate that several important relationships have been observed utilizing ERTS-1 imagery. Of most significance is that in the Mississippi Sound, as elsewhere, considerable detail exists as to turbidity patterns in the water column. Simple analysis is complicated by the apparent interaction between actual turbidity, turbidity induced by shoal water, and actual imaging of the bottom in extreme shoal water. A statistical approach is being explored which shows promise of at least partially separating these effects so that partitioning of true turbid plumes can be accomplished. This partitioning is of great importance to this program in that supportive data seem to indicate that menhaden occur more frequently in turbid areas. In this connection four individual captures have been associated with a major turbid feature imaged on 6 August. If a significant relationship between imaged turbid features and catch distribution can be established, for example by graphic and/or numeric analysis, it will represent a major advancement for short term prediction of commercially accessible menhaden.

  12. Of Publishers and Pirates: License Agreements Promote Unethical Behavior, But That's Only the Beginning.

    ERIC Educational Resources Information Center

    Pournelle, Jerry

    1984-01-01

    Discussion of software license agreements implies that they actually contribute to software piracy because of their stringency and indicates that competition in the software publishing field will eventually eliminate the piracy problem. (MBR)

  13. Pipe and Solids Analysis: What Can I Learn?

    EPA Science Inventory

    This presentation gives a brief overview of techniques that regulators, utilities and consultants might want to request from laboratories to anticipate or solve water treatment and distribution system water quality problems. Actual examples will be given from EPA collaborations,...

  14. Training for Aviation Decision Making: The Naturalistic Decision Making Perspective

    NASA Technical Reports Server (NTRS)

    Orasanu, Judith; Shafto, Michael G. (Technical Monitor)

    1995-01-01

    This paper describes the implications of a naturalistic decision making (NDM) perspective for training air crews to make flight-related decisions. The implications are based on two types of analyses: (a) identification of distinctive features that serve as a basis for classifying a diverse set of decision events actually encountered by flight crews, and (b) performance strategies that distinguished more from less effective crews flying full-mission simulators, as well as performance analyses from NTSB accident investigations. Six training recommendations are offered: (1) Because of the diversity of decision situations, crews need to be aware that different strategies may be appropriate for different problems; (2) Given that situation assessment is essential to making a good decision, it is important to train specific content knowledge needed to recognize critical conditions, to assess risks and available time, and to develop strategies to verify or diagnose the problem; (3) Tendencies to oversimplify problems may be overcome by training to evaluate options in terms of goals, constraints, consequences, and prevailing conditions; (4) In order to provide the time to gather information and consider options, it is essential to manage the situation, which includes managing crew workload, prioritizing tasks, contingency planning, buying time (e.g., requesting holding or vectors), and using low workload periods to prepare for high workload; (5) Evaluating resource requirements ("What do I need?") and capabilities ("What do I have?") are essential to making good decisions. Using resources to meet requirements may involve the cabin crew, ATC, dispatchers, and maintenance personnel; (6) Given that decisions must often be made under high risk, time pressure, and workload, train under realistic flight conditions to promote the development of robust decision skills.

  15. The Development of Smart Home System for Controlling and Monitoring Energy Consumption using WebSocket Protocol

    NASA Astrophysics Data System (ADS)

    Witthayawiroj, Niti; Nilaphruek, Pongpon

    2017-03-01

    Energy consumption, especially of electricity, is considered one of the most serious problems in households these days, because the amount of electricity consumed exceeds the amount that people actually need. This overuse results partly from the inconvenience of moving to a switch to turn off the light or other appliances, so that turning the light off is often forgotten, and partly from the absence of tools for monitoring how much energy a residence consumes. People therefore have a problem in monitoring and controlling their energy usage. This study has two main objectives: 1) creating the communication framework among server, clients and devices, and 2) developing a prototype system that addresses the problems mentioned above, giving users the opportunity to know the amount of electricity they have used in their houses as well as the ability to turn appliances on and off through the Internet on smart devices, such as smartphones and tablets supporting the Android platform or any web browser. A Raspberry Pi is used as the microcontroller, and the data are transferred to the smart device with the WebSocket protocol, which is strongly recommended for real-time communication. Example features on the device's screen are user management and the controlling and monitoring of appliances. User satisfaction results indicate that the system is effective and easy to use. In future work, current sensors may be added for more accurate electricity measurement, together with Wi-Fi modules so that more appliances can report their power.
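
    As a hedged sketch of the communication pattern described (not the authors' code), the following minimal server uses the third-party Python `websockets` package to push simulated power readings to connected clients and to accept on/off commands; the appliance name and message format are invented for illustration.

    ```python
    import asyncio
    import json
    import random

    import websockets  # pip install websockets

    appliance_state = {"lamp": False}   # hypothetical single appliance

    async def handler(ws):
        # push a simulated power reading every second while listening for commands
        async def push_readings():
            while True:
                watts = random.uniform(40, 60) if appliance_state["lamp"] else 0.0
                await ws.send(json.dumps({"appliance": "lamp", "watts": round(watts, 1)}))
                await asyncio.sleep(1)

        pusher = asyncio.create_task(push_readings())
        try:
            async for message in ws:            # e.g. client sends {"lamp": true}
                appliance_state.update(json.loads(message))
        finally:
            pusher.cancel()

    async def main():
        async with websockets.serve(handler, "0.0.0.0", 8765):
            await asyncio.Future()              # serve forever

    if __name__ == "__main__":
        asyncio.run(main())
    ```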

  16. Classification of Medical Datasets Using SVMs with Hybrid Evolutionary Algorithms Based on Endocrine-Based Particle Swarm Optimization and Artificial Bee Colony Algorithms.

    PubMed

    Lin, Kuan-Cheng; Hsieh, Yi-Hsiu

    2015-10-01

    The classification and analysis of data is an important issue in today's research. Selecting a suitable set of features makes it possible to classify an enormous quantity of data quickly and efficiently. Feature selection is generally viewed as a problem of feature subset selection, i.e., a combinatorial optimization problem. Evolutionary algorithms using random search methods have proven highly effective in obtaining solutions to optimization problems in a diversity of applications. In this study, we developed a hybrid evolutionary algorithm based on endocrine-based particle swarm optimization (EPSO) and artificial bee colony (ABC) algorithms in conjunction with a support vector machine (SVM) for the selection of optimal feature subsets for the classification of datasets. The results of experiments using specific UCI medical datasets demonstrate that the accuracy of the proposed hybrid evolutionary algorithm is superior to that of basic PSO, EPSO and ABC algorithms with regard to classification accuracy using subsets with a reduced number of features.
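
    As a rough illustration of this wrapper approach (a plain binary PSO stands in for the endocrine-based EPSO/ABC hybrid, which the abstract does not specify), the sketch below scores each candidate feature subset by the cross-validated accuracy of an SVM; the dataset and all hyperparameters are illustrative choices.

    ```python
    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X, y = load_breast_cancer(return_X_y=True)
    n_particles, n_features, n_iters = 10, X.shape[1], 10

    def fitness(mask):
        # cross-validated SVM accuracy on the selected feature subset
        return cross_val_score(SVC(), X[:, mask], y, cv=3).mean() if mask.any() else 0.0

    pos = rng.random((n_particles, n_features)) > 0.5            # bit-vector positions
    vel = rng.normal(0.0, 1.0, (n_particles, n_features))
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()

    for _ in range(n_iters):
        r1, r2 = rng.random(vel.shape), rng.random(vel.shape)
        vel = (0.7 * vel + 1.5 * r1 * (pbest.astype(float) - pos)
                         + 1.5 * r2 * (gbest.astype(float) - pos))
        pos = rng.random(vel.shape) < 1.0 / (1.0 + np.exp(-vel))  # sigmoid bit update
        fit = np.array([fitness(p) for p in pos])
        better = fit > pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        gbest = pbest[pbest_fit.argmax()].copy()

    print(f"{gbest.sum()} of {n_features} features kept, CV accuracy {pbest_fit.max():.3f}")
    ```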

  17. Optimal and robust control of a class of nonlinear systems using dynamically re-optimised single network adaptive critic design

    NASA Astrophysics Data System (ADS)

    Tiwari, Shivendra N.; Padhi, Radhakant

    2018-01-01

    Following the philosophy of adaptive optimal control, a neural network-based state feedback optimal control synthesis approach is presented in this paper. First, accounting for a nominal system model, a single network adaptive critic (SNAC) based multi-layered neural network (called NN1) is synthesised offline. Next, another linear-in-weight neural network (called NN2) is trained online and augmented to NN1 in such a manner that their combined output represents the desired optimal costate for the actual plant. To do this, the nominal model needs to be updated online to adapt to the actual plant, which is done by synthesising yet another linear-in-weight neural network (called NN3) online. Training of NN3 is done by utilising the error information between the nominal and actual states and carrying out the necessary Lyapunov stability analysis using a Sobolev-norm based Lyapunov function. This helps in training NN2 successfully to capture the required optimal relationship. The overall architecture is named 'Dynamically Re-optimised Single Network Adaptive Critic (DR-SNAC)'. Numerical results for two motivating illustrative problems are presented, including a comparison study with the closed-form solution for one problem, which clearly demonstrates the effectiveness and benefit of the proposed approach.

  18. Temporality of Features in Near-Death Experience Narratives

    PubMed Central

    Martial, Charlotte; Cassol, Héléna; Antonopoulos, Georgios; Charlier, Thomas; Heros, Julien; Donneau, Anne-Françoise; Charland-Verville, Vanessa; Laureys, Steven

    2017-01-01

    Background: After an occurrence of a Near-Death Experience (NDE), Near-Death Experiencers (NDErs) usually report extremely rich and detailed narratives. Phenomenologically, a NDE can be described as a set of distinguishable features. Some authors have proposed regular patterns of NDEs; however, the actual temporal sequencing of NDE core features remains a little-explored area. Objectives: The aim of the present study was to investigate the frequency distribution of these features (globally and according to the position of features in narratives) as well as the most frequently reported temporal sequences of features. Methods: We collected 154 French freely expressed written NDE narratives (i.e., Greyson NDE scale total score ≥ 7/32). A text analysis was conducted on all narratives in order to infer the temporal ordering and frequency distribution of NDE features. Results: Our analyses highlighted the following most frequently reported sequence of consecutive NDE features: Out-of-Body Experience, Experiencing a tunnel, Seeing a bright light, Feeling of peace. Yet, this sequence was encountered in a very limited number of NDErs. Conclusion: These findings may suggest that the temporal sequence of NDE features can vary across NDErs. Exploring associations and relationships among features encountered during NDEs may complete the rigorous definition and scientific comprehension of the phenomenon. PMID:28659779

  19. Neural network-based feature point descriptors for registration of optical and SAR images

    NASA Astrophysics Data System (ADS)

    Abulkhanov, Dmitry; Konovalenko, Ivan; Nikolaev, Dmitry; Savchik, Alexey; Shvets, Evgeny; Sidorchuk, Dmitry

    2018-04-01

    Registration of images of different nature is an important technique used in image fusion, change detection, efficient information representation and other problems of computer vision. Solving this task using feature-based approaches is usually more complex than registration of several optical images because traditional feature descriptors (SIFT, SURF, etc.) perform poorly when images have different nature. In this paper we consider the problem of registration of SAR and optical images. We train neural network to build feature point descriptors and use RANSAC algorithm to align found matches. Experimental results are presented that confirm the method's effectiveness.
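
    Since the trained descriptors are the paper's contribution and are not specified in the abstract, the sketch below substitutes off-the-shelf ORB descriptors to illustrate only the matching and RANSAC alignment stage, on a synthetic image pair rather than real SAR/optical data.

    ```python
    import cv2
    import numpy as np

    rng = np.random.default_rng(1)

    # synthetic corner-rich "optical" image and a warped copy standing in for
    # the second modality
    img1 = np.zeros((400, 400), np.uint8)
    for _ in range(60):
        x, y = int(rng.integers(10, 340)), int(rng.integers(10, 340))
        w, h = int(rng.integers(10, 50)), int(rng.integers(10, 50))
        cv2.rectangle(img1, (x, y), (x + w, y + h), int(rng.integers(60, 255)), -1)
    H_true = np.array([[1.0, 0.05, 12.0], [-0.03, 1.0, -8.0], [0.0, 0.0, 1.0]])
    img2 = cv2.warpPerspective(img1, H_true, (400, 400))

    orb = cv2.ORB_create(nfeatures=1000)          # stand-in for learned descriptors
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    H_est, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    print(f"{int(inlier_mask.sum())} RANSAC inliers out of {len(matches)} matches")
    print(np.round(H_est, 3))
    ```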

  1. Understanding and utilization of Thematic Mapper and other remotely sensed data for vegetation monitoring

    NASA Technical Reports Server (NTRS)

    Crist, E. P.; Cicone, R. C.; Metzler, M. D.; Parris, T. M.; Rice, D. P.; Sampson, R. E.

    1983-01-01

    The TM Tasseled Cap transformation, which provides both a 50% reduction in data volume with little or no loss of important information and spectral features with direct physical association, is presented and discussed. Using both simulated and actual TM data, some important characteristics of vegetation and soils in this feature space are described, as are the effects of solar elevation angle and atmospheric haze. A preliminary spectral haze diagnostic feature, based on only simulated data, is also examined. The characteristics of the TM thermal band are discussed, as is a demonstration of the use of TM data in energy balance studies. Some characteristics of AVHRR data are described, as are the sensitivities to scene content of several LANDSAT-MSS preprocessing techniques.
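
    The Tasseled Cap transformation is a fixed linear map from the six reflective TM bands to a few physically interpretable axes. The sketch below applies the brightness/greenness/wetness coefficients as they are commonly reproduced in the literature (attributed to Crist and Cicone); the pixel values are made up, and the coefficients should be verified against the original source before serious use.

    ```python
    import numpy as np

    # Tasseled Cap coefficients for TM bands 1-5 and 7, as commonly quoted;
    # verify against Crist & Cicone before relying on them.
    TC = np.array([
        [ 0.3037,  0.2793,  0.4743,  0.5585,  0.5082,  0.1863],  # brightness
        [-0.2848, -0.2435, -0.5436,  0.7243,  0.0840, -0.1800],  # greenness
        [ 0.1509,  0.1973,  0.3279,  0.3406, -0.7112, -0.4572],  # wetness
    ])

    # columns are example pixels, rows are the six reflective bands (made-up data)
    pixels = np.array([[30, 25, 20, 60, 35, 18],
                       [40, 38, 42, 45, 50, 40],
                       [20, 18, 15, 70, 25, 10]], dtype=float).T

    tc_features = TC @ pixels   # 3 x n_pixels: brightness, greenness, wetness
    print(np.round(tc_features.T, 1))
    ```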

  2. Study on Conversion Between Momentum and Contrarian Based on Fractal Game

    NASA Astrophysics Data System (ADS)

    Wu, Xu; Song, Guanghui; Deng, Yan; Xu, Lin

    2015-06-01

    Based on the fractal game played between the majority and the minority, fractal market theory (FMT) is employed to describe the features of investors' decision-making. Accordingly, a process of fractal games is formed in order to analyze the statistical features of the conversion between momentum and contrarian strategies. The result shows that, among three fractal game mechanisms, the statistical features of the simulated return series are much more similar to the log returns of actual series. In addition, the conversion between momentum and contrarian is also extremely similar to the real situation, which reflects the effectiveness of using fractal games to analyze the conversion between momentum and contrarian. Moreover, it also provides a decision-making reference that helps investors develop effective investment strategies.

  3. Student Understanding of pH: "I Don't Know What the Log Actually Is, I Only Know Where the Button Is on My Calculator"

    ERIC Educational Resources Information Center

    Watters, Dianne J.; Watters, James J.

    2006-01-01

    In foundation biochemistry and biological chemistry courses, a major problem area that has been identified is students' lack of understanding of pH, acids, bases, and buffers and their inability to apply their knowledge in solving acid/base problems. The aim of this study was to explore students' conceptions of pH and their ability to solve…

  4. Software Reliability Study

    DTIC Science & Technology

    1976-08-01

    indicates the routine's parent subsystem and function as well. [Table fragment: Subsystem, Function, Routine, PROBS = number of actual problems encountered in the...] ...liable to be shelved in the first schedule or manpower pinch. The next problem is one of education. Here the situation is similar to that experienced in...assurance organization, and each is feasible in the Project 5 environment. Of particular help in the data collection process was the involvement of the

  5. Exact solution of three-dimensional transport problems using one-dimensional models. [in semiconductor devices

    NASA Technical Reports Server (NTRS)

    Misiakos, K.; Lindholm, F. A.

    1986-01-01

    Several parameters of certain three-dimensional semiconductor devices including diodes, transistors, and solar cells can be determined without solving the actual boundary-value problem. The recombination current, transit time, and open-circuit voltage of planar diodes are emphasized here. The resulting analytical expressions enable determination of the surface recombination velocity of shallow planar diodes. The method involves introducing corresponding one-dimensional models having the same values of these parameters.

  6. Problems of Modern Higher Education in the Sphere of Russian Philology and the Ways of Solving Them (on the Example of the Situation in Kazan Federal University)

    ERIC Educational Resources Information Center

    Bushkanets, Leah E.; Mahinina, Natalia G.; Nasrutdinova, Lilia H.; Sidorova, Marina M.

    2016-01-01

    The article is devoted to pressing problems of modern higher education in the sphere of Russian Philology, problems which stem from the world crisis situation that continues to persist despite efforts at reform. This article aims to mark some important problematic items necessary for realizing the reform of higher philological education and…

  7. Machinery running state identification based on discriminant semi-supervised local tangent space alignment for feature fusion and extraction

    NASA Astrophysics Data System (ADS)

    Su, Zuqiang; Xiao, Hong; Zhang, Yi; Tang, Baoping; Jiang, Yonghua

    2017-04-01

    Extraction of sensitive features is a challenging but key task in data-driven machinery running state identification. Aimed at solving this problem, a method for machinery running state identification that applies discriminant semi-supervised local tangent space alignment (DSS-LTSA) for feature fusion and extraction is proposed. Firstly, in order to extract more distinct features, the vibration signals are decomposed by wavelet packet decomposition (WPD), and a mixed-domain feature set consisting of statistical features, autoregressive (AR) model coefficients, instantaneous amplitude Shannon entropy and the WPD energy spectrum is extracted to comprehensively characterize the properties of the machinery running states. Then, the mixed-domain feature set is inputted into DSS-LTSA for feature fusion and extraction to eliminate redundant information and interference noise. The proposed DSS-LTSA can extract intrinsic structure information of both labeled and unlabeled state samples, and as a result the over-fitting problem of supervised manifold learning and the blindness problem of unsupervised manifold learning are overcome. Simultaneously, class discrimination information is integrated within the dimension reduction process in a semi-supervised manner to improve the sensitivity of the extracted fusion features. Lastly, the extracted fusion features are inputted into a pattern recognition algorithm to achieve the running state identification. The effectiveness of the proposed method is verified by a running state identification case in a gearbox, and the results confirm the improved accuracy of the running state identification.
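
    As a hedged stand-in for DSS-LTSA (whose discriminant semi-supervised variant is not specified in the abstract), the sketch below runs scikit-learn's unsupervised LTSA on a synthetic mixed-domain feature set and feeds the fused low-dimensional features to a simple classifier; every dataset and parameter choice is illustrative.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.manifold import LocallyLinearEmbedding
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    # stand-in for a mixed-domain vibration feature set (statistics, AR
    # coefficients, entropy, WPD energies): 40 features, 3 machine states
    X, y = make_classification(n_samples=400, n_features=40, n_informative=8,
                               n_classes=3, n_clusters_per_class=1, random_state=0)

    # unsupervised LTSA fuses the mixed features into a low-dimensional embedding;
    # the paper's DSS-LTSA would additionally inject label information here
    ltsa = LocallyLinearEmbedding(n_neighbors=12, n_components=5, method="ltsa",
                                  random_state=0)
    X_low = ltsa.fit_transform(X)

    X_tr, X_te, y_tr, y_te = train_test_split(X_low, y, random_state=0)
    clf = KNeighborsClassifier().fit(X_tr, y_tr)   # running-state identification
    print("state identification accuracy:", round(clf.score(X_te, y_te), 3))
    ```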

  8. Developing, implementing and evaluating OSH interventions in SMEs: a pilot, exploratory study.

    PubMed

    Masi, Donato; Cagno, Enrico; Micheli, Guido J L

    2014-01-01

    The literature on occupational safety and health (OSH) interventions contains many debates on how interventions should work, but far less attention has been paid to how they actually do work, and to the contextual factors that influence their implementation, development and effect. The need to improve the understanding of OSH interventions is particularly relevant for small and medium-sized enterprises (SMEs), since they experience worse OSH conditions and have fewer physical, economic and organizational resources than larger enterprises; thus, SMEs strongly need to focus their few resources in the decision-making process so as to select and put in place only the most appropriate interventions. This exploratory study is based on interviews with safety officers of 5 SMEs, and it gives an overview of the key features of the actual intervention process in SMEs and of the contextual factors making this actual intervention process similar or dissimilar to the ideal case. The results show how qualitative and experience-driven the actual intervention process is; they should be used to direct future research towards an increasingly applicable form, enabling practitioners from SMEs to develop, implement and evaluate their OSH interventions in an "ideal" way.

  9. In the year 2525, if x ray is still alive, if lithography can survive, they may find...

    NASA Astrophysics Data System (ADS)

    Nistler, John L.; Michael, Mark; Hause, Fred N.; Edwards, Richard D.

    1998-12-01

    Data and discussion will be presented on the NTRM (National Technology Roadmap) for reticles from a process-integration perspective. Specifically, two layers are considered in this paper: the gate layer, which is primarily a chrome-geometry mask with a lot of open glass, and the local interconnect layer, which is primarily a chrome plate using clear geometries. Information from other sources is used where appropriate, and actual in-house data are used to illustrate specific points. Demands from different customers for specific types of features tend to drive individual mask makers and their decisions on equipment purchases and processes; we attempt to help predict where integration approaches have either caused a lag in technology pushes or have actually sped up certain requirements. Integration requirements that tend to push mask makers will be discussed, along with typical design approaches in OPC and PSM that will either push technology or actually slow the trend towards smaller geometries. In addition, data will be presented showing how specific stepper characteristics may actually drive the customer's criteria, thus changing the requirements from customer to customer.

  10. Trophic hierarchies illuminated via amino acid isotopic analysis

    USDA-ARS?s Scientific Manuscript database

    This research addresses the problem of discerning whether natural enemies in agricultural systems are actually suppressing pest populations (or simply eating other predators). Knowing the ecological function of natural enemies (particularly arthropods) is an integral part of biological control progr...

  11. [Current perspectives on endodontic treatment of teeth with chronic periapical lesions].

    PubMed

    Canalda Sahli, C

    1990-01-01

    In this article the author studies the histopathological aspects of periapical lesions, the phenomenon of intra-granulomatous epithelial proliferation as a pathogenic mechanism in the formation of microscopic cystic cavities, the diagnostic problems they all pose, as well as the most current therapeutic perspectives.

  12. Exact solution for the optimal neuronal layout problem.

    PubMed

    Chklovskii, Dmitri B

    2004-10-01

    Evolution perfected brain design by maximizing its functionality while minimizing costs associated with building and maintaining it. The assumption that brain functionality is specified by neuronal connectivity, implemented by costly biological wiring, leads to the following optimal design problem. For a given neuronal connectivity, find a spatial layout of neurons that minimizes the wiring cost. Unfortunately, this problem is difficult to solve because the number of possible layouts is often astronomically large. We argue that the wiring cost may scale as wire length squared, reducing the optimal layout problem to a constrained minimization of a quadratic form. For biologically plausible constraints, this problem has exact analytical solutions, which give reasonable approximations to actual layouts in the brain. These solutions make the inverse problem of inferring neuronal connectivity from neuronal layout more tractable.
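
    The quadratic reduction has a neat computational reading: with wiring cost Σ w_ij (x_i - x_j)², the cost equals xᵀLx for the graph Laplacian L, and the constrained minimum is obtained from the low eigenvectors of L. The toy sketch below (random connectivity, not brain data) reads a one-dimensional layout off the Fiedler eigenvector.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 8
    W = rng.random((n, n))
    W = (W + W.T) / 2.0             # symmetric connection weights (toy data)
    np.fill_diagonal(W, 0.0)

    # graph Laplacian: x^T L x = sum over i<j of w_ij * (x_i - x_j)^2
    L = np.diag(W.sum(axis=1)) - W
    eigvals, eigvecs = np.linalg.eigh(L)

    # eigvecs[:, 0] is the trivial constant mode; eigvecs[:, 1] (Fiedler vector)
    # minimizes x^T L x subject to sum(x) = 0 and ||x|| = 1 (fixed spread)
    layout = eigvecs[:, 1]
    print("neuron positions along the line:", np.round(layout, 3))
    print("wiring cost:", round(float(layout @ L @ layout), 4),
          "= second eigenvalue:", round(float(eigvals[1]), 4))
    ```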

  13. Tool use and mechanical problem solving in apraxia.

    PubMed

    Goldenberg, G; Hagmann, S

    1998-07-01

    Moorlaas (1928) proposed that apraxic patients can identify objects and can remember the purpose they have been made for but do not know the way in which they must be used to achieve that purpose. Knowledge about the use of objects and tools can have two sources: It can be based on retrieval of instructions of use from semantic memory or on a direct inference of function from structure. The ability to infer function from structure enables subjects to use unfamiliar tools and to detect alternative uses of familiar tools. It is the basis of mechanical problem solving. The purpose of the present study was to analyze retrieval of instruction of use, mechanical problem solving, and actual tool use in patients with apraxia due to circumscribed lesions of the left hemisphere. For assessing mechanical problem solving we developed a test of selection and application of novel tools. Access to instruction of use was tested by pantomime of tool use. Actual tool use was examined for the same familiar tools. Forty two patients with left brain damage (LBD) and aphasia, 22 patients with right brain damage (RBD) and 22 controls were examined. Only LBD patients differed from controls on all tests. RBD patients had difficulties with the use but not with the selection of novel tools. In LBD patients there was a significant correlation between pantomime of tool use and novel tool selection but there were single cases who scored in the defective range on one of these tests and normally on the other. Analysis of LBD patients' lesions suggested that frontal lobe damage does not disturb novel tool selection. Only LBD patients who failed on pantomime of object use and on novel tool selection committed errors in actual use of familiar tools. The finding that mechanical problem solving is invariably defective in apraxic patients who commit errors with familiar tools is in good accord with clinical observations, as the gravity of their errors goes beyond what one would expect as a mere sequel of loss of access to instruction of use.

  14. A taxonomy of inductive problems.

    PubMed

    Kemp, Charles; Jern, Alan

    2014-02-01

    Inductive inferences about objects, features, categories, and relations have been studied for many years, but there are few attempts to chart the range of inductive problems that humans are able to solve. We present a taxonomy of inductive problems that helps to clarify the relationships between familiar inductive problems such as generalization, categorization, and identification, and that introduces new inductive problems for psychological investigation. Our taxonomy is founded on the idea that semantic knowledge is organized into systems of objects, features, categories, and relations, and we attempt to characterize all of the inductive problems that can arise when these systems are partially observed. Recent studies have begun to address some of the new problems in our taxonomy, and future work should aim to develop unified theories of inductive reasoning that explain how people solve all of the problems in the taxonomy.

  15. A Self-Organizing State-Space-Model Approach for Parameter Estimation in Hodgkin-Huxley-Type Models of Single Neurons

    PubMed Central

    Vavoulis, Dimitrios V.; Straub, Volko A.; Aston, John A. D.; Feng, Jianfeng

    2012-01-01

    Traditional approaches to the problem of parameter estimation in biophysical models of neurons and neural networks usually adopt a global search algorithm (for example, an evolutionary algorithm), often in combination with a local search method (such as gradient descent) in order to minimize the value of a cost function, which measures the discrepancy between various features of the available experimental data and model output. In this study, we approach the problem of parameter estimation in conductance-based models of single neurons from a different perspective. By adopting a hidden-dynamical-systems formalism, we expressed parameter estimation as an inference problem in these systems, which can then be tackled using a range of well-established statistical inference methods. The particular method we used was Kitagawa's self-organizing state-space model, which was applied on a number of Hodgkin-Huxley-type models using simulated or actual electrophysiological data. We showed that the algorithm can be used to estimate a large number of parameters, including maximal conductances, reversal potentials, kinetics of ionic currents, measurement and intrinsic noise, based on low-dimensional experimental data and sufficiently informative priors in the form of pre-defined constraints imposed on model parameters. The algorithm remained operational even when very noisy experimental data were used. Importantly, by combining the self-organizing state-space model with an adaptive sampling algorithm akin to the Covariance Matrix Adaptation Evolution Strategy, we achieved a significant reduction in the variance of parameter estimates. The algorithm did not require the explicit formulation of a cost function and it was straightforward to apply on compartmental models and multiple data sets. Overall, the proposed methodology is particularly suitable for resolving high-dimensional inference problems based on noisy electrophysiological data and, therefore, a potentially useful tool in the construction of biophysical neuron models. PMID:22396632
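
    A toy version of the underlying idea (far simpler than Kitagawa's method or a Hodgkin-Huxley model) augments the hidden state with the unknown parameter and runs a plain particle filter, so the parameter distribution "self-organizes" toward values consistent with the data; the one-parameter linear system and all noise levels below are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # simulate data from x_{t+1} = a*x_t + process noise, y_t = x_t + obs noise
    a_true, T = 0.9, 200
    x, ys = 1.0, []
    for _ in range(T):
        x = a_true * x + rng.normal(0, 0.05)
        ys.append(x + rng.normal(0, 0.1))

    # particles carry (state, parameter); the parameter gets a small jitter
    # ("roughening") so it can keep exploring, as in self-organizing filters
    N = 2000
    part_x = rng.normal(1.0, 0.5, N)
    part_a = rng.uniform(0.5, 1.2, N)

    for y in ys:
        part_a += rng.normal(0, 0.002, N)                  # parameter roughening
        part_x = part_a * part_x + rng.normal(0, 0.05, N)  # propagate state
        w = np.exp(-0.5 * ((y - part_x) / 0.1) ** 2)       # observation likelihood
        w /= w.sum()
        idx = rng.choice(N, N, p=w)                        # resampling
        part_x, part_a = part_x[idx], part_a[idx]

    print(f"true a = {a_true}, posterior mean a = {part_a.mean():.3f}")
    ```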

  16. TH-E-BRF-05: Comparison of Survival-Time Prediction Models After Radiotherapy for High-Grade Glioma Patients Based On Clinical and DVH Features

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magome, T; Haga, A; Igaki, H

    Purpose: Although many outcome prediction models based on dose-volume information have been proposed, it is well known that the prognosis may also be affected by multiple clinical factors. The purpose of this study is to predict the survival time after radiotherapy for high-grade glioma patients based on features including clinical and dose-volume histogram (DVH) information. Methods: A total of 35 patients with high-grade glioma (oligodendroglioma: 2, anaplastic astrocytoma: 3, glioblastoma: 30) were selected in this study. All patients were treated with a prescribed dose of 30–80 Gy after surgical resection or biopsy from 2006 to 2013 at The University of Tokyo Hospital. All cases were randomly separated into a training dataset (30 cases) and a test dataset (5 cases). The survival time after radiotherapy was predicted based on a multiple linear regression analysis and an artificial neural network (ANN) using 204 candidate features. The candidate features included 12 clinical features (tumor location, extent of surgical resection, treatment duration of radiotherapy, etc.) and 192 DVH features (maximum dose, minimum dose, D95, V60, etc.). The effective features for the prediction were selected according to a step-wise method using the 30 training cases. The prediction accuracy was evaluated by the coefficient of determination (R²) between the predicted and actual survival times for the training and test datasets. Results: In the multiple regression analysis, the value of R² between the predicted and actual survival time was 0.460 for the training dataset and 0.375 for the test dataset. In contrast, in the ANN analysis, the value of R² was 0.806 for the training dataset and 0.811 for the test dataset. Conclusion: Although a larger number of patients would be needed for more accurate and robust prediction, our preliminary results showed the potential to predict the outcome in patients with high-grade glioma. This work was partly supported by the JSPS Core-to-Core Program (No. 23003) and a Grant-in-Aid for JSPS Fellows.
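
    As a purely illustrative sketch (synthetic data, not the study's 35 patients or its exact step-wise procedure), the following compares multiple linear regression with a small neural network after greedy feature selection, scoring both by R² on held-out cases.

    ```python
    from sklearn.datasets import make_regression
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    # stand-in for 204 clinical + DVH candidate features (synthetic data)
    X, y = make_regression(n_samples=120, n_features=204, n_informative=10,
                           noise=20.0, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

    # greedy forward selection as a rough analogue of a step-wise method
    sfs = SequentialFeatureSelector(LinearRegression(), n_features_to_select=8, cv=3)
    sfs.fit(X_tr, y_tr)
    X_tr_s, X_te_s = sfs.transform(X_tr), sfs.transform(X_te)

    lin = LinearRegression().fit(X_tr_s, y_tr)
    ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                       random_state=0).fit(X_tr_s, y_tr)

    print("linear R^2:", round(r2_score(y_te, lin.predict(X_te_s)), 3))
    print("ANN    R^2:", round(r2_score(y_te, ann.predict(X_te_s)), 3))
    ```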

  17. PhyloGibbs-MP: Module Prediction and Discriminative Motif-Finding by Gibbs Sampling

    PubMed Central

    Siddharthan, Rahul

    2008-01-01

    PhyloGibbs, our recent Gibbs-sampling motif-finder, takes phylogeny into account in detecting binding sites for transcription factors in DNA and assigns posterior probabilities to its predictions obtained by sampling the entire configuration space. Here, in an extension called PhyloGibbs-MP, we widen the scope of the program, addressing two major problems in computational regulatory genomics. First, PhyloGibbs-MP can localise predictions to small, undetermined regions of a large input sequence, thus effectively predicting cis-regulatory modules (CRMs) ab initio while simultaneously predicting binding sites in those modules—tasks that are usually done by two separate programs. PhyloGibbs-MP's performance at such ab initio CRM prediction is comparable with or superior to dedicated module-prediction software that use prior knowledge of previously characterised transcription factors. Second, PhyloGibbs-MP can predict motifs that differentiate between two (or more) different groups of regulatory regions, that is, motifs that occur preferentially in one group over the others. While other “discriminative motif-finders” have been published in the literature, PhyloGibbs-MP's implementation has some unique features and flexibility. Benchmarks on synthetic and actual genomic data show that this algorithm is successful at enhancing predictions of differentiating sites and suppressing predictions of common sites and compares with or outperforms other discriminative motif-finders on actual genomic data. Additional enhancements include significant performance and speed improvements, the ability to use “informative priors” on known transcription factors, and the ability to output annotations in a format that can be visualised with the Generic Genome Browser. In stand-alone motif-finding, PhyloGibbs-MP remains competitive, outperforming PhyloGibbs-1.0 and other programs on benchmark data. PMID:18769735
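
    A miniature of the underlying Gibbs-sampling idea (no phylogeny, no module prediction, no discriminative term) is sketched below: a Lawrence-style site sampler that repeatedly resamples one sequence's motif position from the weight matrix built on all the other sequences. The sequences and the planted motif are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    BASES = "ACGT"
    W = 6                                           # motif width

    def embed(motif, length=60):
        # random background sequence with one planted motif instance
        s = "".join(rng.choice(list(BASES), length))
        i = int(rng.integers(0, length - W))
        return s[:i] + motif + s[i + W:]

    seqs = [embed("TATAAT") for _ in range(20)]
    pos = [int(rng.integers(0, len(s) - W)) for s in seqs]

    def pwm(exclude):
        counts = np.ones((W, 4))                    # +1 pseudocounts
        for k, s in enumerate(seqs):
            if k != exclude:
                for j, c in enumerate(s[pos[k]:pos[k] + W]):
                    counts[j, BASES.index(c)] += 1
        return counts / counts.sum(axis=1, keepdims=True)

    for _ in range(300):                            # sampling sweeps
        k = int(rng.integers(len(seqs)))            # leave one sequence out
        P = pwm(exclude=k)
        s = seqs[k]
        scores = np.array([np.prod([P[j, BASES.index(s[i + j])] for j in range(W)])
                           for i in range(len(s) - W + 1)])
        pos[k] = int(rng.choice(len(scores), p=scores / scores.sum()))

    print("sampled motif instances:", {seqs[k][pos[k]:pos[k] + W] for k in range(5)})
    ```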

  18. The correct estimate of the probability of false detection of the matched filter in weak-signal detection problems . II. Further results with application to a set of ALMA and ATCA data

    NASA Astrophysics Data System (ADS)

    Vio, R.; Vergès, C.; Andreani, P.

    2017-08-01

    The matched filter (MF) is one of the most popular and reliable techniques for detecting signals of known structure whose amplitude is smaller than the level of the contaminating noise. Under the assumption of stationary Gaussian noise, the MF maximizes the probability of detection subject to a constant probability of false detection or false alarm (PFA). This property relies upon a priori knowledge of the position of the searched signals, which is usually not available. Recently, it has been shown that when applied in its standard form, the MF may severely underestimate the PFA. As a consequence, the statistical significance of features that belong to noise is overestimated and the resulting detections are actually spurious. For this reason, an alternative method of computing the PFA has been proposed that is based on the probability density function (PDF) of the peaks of an isotropic Gaussian random field. In this paper we further develop this method. In particular, we discuss the statistical meaning of the PFA and show that, although useful as a preliminary step in a detection procedure, it is not able to quantify the actual reliability of a specific detection. For this reason, a new quantity is introduced, called the specific probability of false alarm (SPFA), which is able to carry out this computation. We show how this method works in targeted simulations and apply it to a few interferometric maps taken with the Atacama Large Millimeter/submillimeter Array (ALMA) and the Australia Telescope Compact Array (ATCA). We select a few potential new point sources and assign an accurate detection reliability to these sources.
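
    The paper's core caution can be checked numerically: for a matched filter applied to pure noise, the fraction of local peaks exceeding a threshold is considerably larger than the per-sample Gaussian tail probability that a naive PFA calculation would use. The sketch below is an illustrative 1-D simulation (requiring NumPy and SciPy), not the authors' procedure.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)

    # unit-norm Gaussian template: the matched filter for a known source profile
    template = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)
    template /= np.linalg.norm(template)

    n_maps, n_pix, thresh = 500, 4096, 3.0
    n_peaks, n_false = 0, 0
    for _ in range(n_maps):
        noise = rng.normal(0.0, 1.0, n_pix)
        mf = np.convolve(noise, template, mode="same")   # unit-variance MF output
        interior = mf[1:-1]                              # local maxima of the map
        peaks = interior[(interior > mf[:-2]) & (interior > mf[2:])]
        n_peaks += peaks.size
        n_false += int((peaks > thresh).sum())

    print(f"per-sample Gaussian tail PFA at 3 sigma: {norm.sf(thresh):.5f}")
    print(f"empirical PFA of noise peaks above 3 sigma: {n_false / n_peaks:.5f}")
    ```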

  19. State of volcanic ash dispersion prediction

    NASA Astrophysics Data System (ADS)

    Eliasson, Jonas; Palsson, Thorgeir; Weber, Konradin

    2017-04-01

    The Eyjafjallajokull 2010 and Grimsvotn 2011 eruptions created great problems for commercial aviation in Western Europe and in the North Atlantic region. Comparison of satellite images of the visible and predicted ash clouds showed the VAAC predictions to be much larger than the actual ash clouds. No official explanation of this discrepancy exists apart from the definition of the ash cloud boundary. Papers on simulation of the Eyjafjallajökull ash cloud in peer-reviewed journals typically attempted to reproduce the VAAC predictions rather than focusing on the satellite pictures. Sporadic in-situ measurements showed much lower ash concentrations over Europe than the predicted values. Two of the weak points in ash cloud prediction have been studied in airborne measurements of volcanic ash from eruptions of Sakurajima, Japan, carried out by the universities in Kyoto (Japan), Iceland and Düsseldorf (Germany). It turns out that gravitational deformation of the plume and a streak fallout process make the estimated ash content of clouds larger than the actual content; both features are missing from the simulation models. Tropospheric plumes tend to ride in stable inversions; this causes gravitational flattening (pancaking) of the volcanic plume, while diffusion in the mixing layer is insignificant. New rules from ICAO, effective from November 2014, reiterate that jetliners should avoid visible ash, which makes information on visible ash important. A procedure developed by JMA's Tokyo VAAC uses satellite images of visible ash to correct the prediction. Moreover, the meteorological data necessary to model gravitational dispersion and streak fallout do not exist in the international database available to the VAACs. All this shows that close monitoring by airborne measurements and by satellite and other photographic surveillance is necessary.

  20. Energy density functional on a microscopic basis

    NASA Astrophysics Data System (ADS)

    Baldo, M.; Robledo, L.; Schuck, P.; Viñas, X.

    2010-06-01

    In recent years impressive progress has been made in the development of highly accurate energy density functionals, which allow us to treat medium-heavy nuclei. In this approach one tries to describe not only the ground state but also the first relevant excited states. In general, higher accuracy requires a larger set of parameters, which must be carefully chosen to avoid redundancy. Following this line of development, it is unavoidable that the connection of the functional with the bare nucleon-nucleon interaction becomes more and more elusive. In principle, the construction of a density functional from a density matrix expansion based on the effective nucleon-nucleon interaction is possible, and indeed the approach has been followed by few authors. However, to what extent a density functional based on such a microscopic approach can reach the accuracy of the fully phenomenological ones remains an open question. A related question is to establish which part of a functional can be actually derived by a microscopic approach and which part, in contrast, must be left as purely phenomenological. In this paper we discuss the main problems that are encountered when the microscopic approach is followed. To this purpose we will use the method we have recently introduced to illustrate the different aspects of these problems. In particular we will discuss the possible connection of the density functional with the nuclear matter equation of state and the distinct features of finite-size effect typical of nuclei.
