Working set selection using functional gain for LS-SVM.
Bo, Liefeng; Jiao, Licheng; Wang, Ling
2007-09-01
The efficiency of sequential minimal optimization (SMO) depends strongly on the working set selection. This letter shows how the improvement of SMO in each iteration, named the functional gain (FG), is used to select the working set for least squares support vector machine (LS-SVM). We prove the convergence of the proposed method and give some theoretical support for its performance. Empirical comparisons demonstrate that our method is superior to the maximum violating pair (MVP) working set selection.
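To make the contrast concrete, here is a small, hypothetical sketch of the two selection rules on a quadratic dual with kernel matrix K and gradient g. The gain formula is the generic second-order decrease along an unconstrained pair update, a stand-in for the paper's exact LS-SVM functional gain; the function names and the brute-force pair search are illustrative only.

```python
# Hypothetical sketch of maximum-violating-pair (MVP) vs functional-gain (FG)
# working-set selection for an SMO-style solver. K is the kernel matrix and
# g the gradient of the dual objective; both rules return a pair of indices.
import numpy as np

def select_mvp(g):
    """MVP: pick the most positive and most negative gradient entries."""
    return int(np.argmax(g)), int(np.argmin(g))

def select_fg(g, K):
    """FG: pick the pair whose single-step update yields the largest analytic
    decrease of the dual objective (brute force over all ordered pairs)."""
    n = len(g)
    best, best_gain = (0, 1), -np.inf
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            eta = K[i, i] + K[j, j] - 2.0 * K[i, j]   # curvature along the update
            if eta <= 0:
                continue
            gain = (g[i] - g[j]) ** 2 / (2.0 * eta)   # gain of an unconstrained step
            if gain > best_gain:
                best_gain, best = gain, (i, j)
    return best
```

MVP looks only at first-order violation, while FG also accounts for curvature, which is why it can make more progress per iteration at a higher per-iteration selection cost.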
Eimer, Martin; Kiss, Monika; Nicholas, Susan
2011-12-01
When target-defining features are specified in advance, attentional target selection in visual search is controlled by preparatory top-down task sets. We used ERP measures to study voluntary target selection in the absence of such feature-specific task sets, and to compare it to selection that is guided by advance knowledge about target features. Visual search arrays contained two different color singleton digits, and participants had to select one of these as target and report its parity. Target color was either known in advance (fixed color task) or had to be selected anew on each trial (free color-choice task). ERP correlates of spatially selective attentional target selection (N2pc) and working memory processing (SPCN) demonstrated rapid target selection and efficient exclusion of color singleton distractors from focal attention and working memory in the fixed color task. In the free color-choice task, spatially selective processing also emerged rapidly, but selection efficiency was reduced, with nontarget singleton digits capturing attention and gaining access to working memory. Results demonstrate the benefits of top-down task sets: Feature-specific advance preparation accelerates target selection, rapidly resolves attentional competition, and prevents irrelevant events from attracting attention and entering working memory.
Where do we store the memory representations that guide attention?
Woodman, Geoffrey F.; Carlisle, Nancy B.; Reinhart, Robert M. G.
2013-01-01
During the last decade one of the most contentious and heavily studied topics in the attention literature has been the role that working memory representations play in controlling perceptual selection. The hypothesis has been advanced that to have attention select a certain perceptual input from the environment, we only need to represent that item in working memory. Here we summarize the work indicating that the relationship between what representations are maintained in working memory and what perceptual inputs are selected is not so simple. First, it appears that attentional selection is also determined by high-level task goals that mediate the relationship between working memory storage and attentional selection. Second, much of the recent work from our laboratory has focused on the role of long-term memory in controlling attentional selection. We review recent evidence supporting the proposal that working memory representations are critical during the initial configuration of attentional control settings, but that after those settings are established long-term memory representations play an important role in controlling which perceptual inputs are selected by mechanisms of attention. PMID:23444390
Group Selection Methods and Contribution to the West Point Leadership Development System (WPLDS)
2015-08-01
Group work in an academic setting can consist of projects or problems students can work on collaboratively. Although pedagogical studies... helping students develop intangibles like communication, time management, organization, leadership, interpersonal, and relationship skills.
SASS Applied to Optimum Work Roll Profile Selection in the Hot Rolling of Wide Steel
NASA Astrophysics Data System (ADS)
Nolle, Lars
The quality of steel strip produced in a wide strip rolling mill depends heavily on the careful selection of initial ground work roll profiles for each of the mill stands in the finishing train. In the past, these profiles were determined by human experts, based on their knowledge and experience. In previous work, the profiles were successfully optimised using a self-organising migration algorithm (SOMA). In this research, SASS, a novel heuristic optimisation algorithm that has only one control parameter, has been used to find the optimum profiles for a simulated rolling mill. The resulting strip quality produced using the profiles found by SASS is compared with results from previous work and the quality produced using the original profile specifications. The best set of profiles found by SASS clearly outperformed the original set and performed as well as SOMA, without the need to find a suitable set of control parameters.
Kreitz, Carina; Furley, Philip; Memmert, Daniel; Simons, Daniel J
2016-04-01
The probability of inattentional blindness, the failure to notice an unexpected object when attention is engaged on some primary task, is influenced by contextual factors like task demands, features of the unexpected object, and the observer's attention set. However, predicting who will notice an unexpected object and who will remain inattentionally blind has proven difficult, and the evidence that individual differences in cognition affect noticing remains ambiguous. We hypothesized that greater working memory capacity might modulate the effect of attention sets on noticing because working memory is associated with the ability to focus attention selectively. People with greater working memory capacity might be better able to attend selectively to target items, thereby increasing the chances of noticing unexpected objects that were similar to the attended items while decreasing the odds of noticing unexpected objects that differed from the attended items. Our study (N = 120 participants) replicated evidence that task-induced attention sets modulate noticing but found no link between noticing and working memory capacity. Our results are largely consistent with the idea that individual differences in working memory capacity do not predict noticing of unexpected objects in an inattentional blindness task.
Directional Microphone Hearing Aids in School Environments: Working toward Optimization
ERIC Educational Resources Information Center
Ricketts, Todd A.; Picou, Erin M.; Galster, Jason
2017-01-01
Purpose: The hearing aid microphone setting (omnidirectional or directional) can be selected manually or automatically. This study examined the percentage of time the microphone setting selected using each method was judged to provide the best signal-to-noise ratio (SNR) for the talkers of interest in school environments. Method: A total of 26…
Supporting awareness through collaborative brushing and linking of tabular data.
Hajizadeh, Amir Hossein; Tory, Melanie; Leung, Rock
2013-12-01
Maintaining an awareness of collaborators' actions is critical during collaborative work, including during collaborative visualization activities. Particularly when collaborators are located at a distance, it is important to know what everyone is working on in order to avoid duplication of effort, share relevant results in a timely manner and build upon each other's results. Can a person's brushing actions provide an indication of their queries and interests in a data set? Can these actions be revealed to a collaborator without substantially disrupting their own independent work? We designed a study to answer these questions in the context of distributed collaborative visualization of tabular data. Participants in our study worked independently to answer questions about a tabular data set, while simultaneously viewing brushing actions of a fictitious collaborator, shown directly within a shared workspace. We compared three methods of presenting the collaborator's actions: brushing & linking (i.e. highlighting exactly what the collaborator would see), selection (i.e. showing only a selected item), and persistent selection (i.e. showing only selected items but having them persist for some time). Our results demonstrated that persistent selection enabled some awareness of the collaborator's activities while causing minimal interference with independent work. Other techniques were less effective at providing awareness, and brushing & linking caused substantial interference. These findings suggest promise for the idea of exploiting natural brushing actions to provide awareness in collaborative work.
A Model-Based Analysis of Semi-Automated Data Discovery and Entry Using Automated Content Extraction
2011-02-01
Accomplish Goal) to (a) visually search the contents of a file folder until the icon corresponding to the desired file is located (Choose...Item_from_set), and (b) move the mouse to that icon and double click to open it (Double_select Object). Note that Choose Item_from_set and Double_select...argument, which Open File fills with <found_item>, a working memory pointer to the file icon that Choose_item_from Set finds. Look_at, Point_to
Automatic threshold selection for multi-class open set recognition
NASA Astrophysics Data System (ADS)
Scherreik, Matthew; Rigling, Brian
2017-05-01
Multi-class open set recognition is the problem of supervised classification with additional unknown classes encountered after a model has been trained. An open set classifier often has two core components. The first component is a base classifier which estimates the most likely class of a given example. The second component consists of open set logic which estimates if the example is truly a member of the candidate class. Such a system is operated in a feed-forward fashion. That is, a candidate label is first estimated by the base classifier, and the true membership of the example to the candidate class is estimated afterward. Previous works have developed an iterative threshold selection algorithm for rejecting examples from classes which were not present at training time. In those studies, a Platt-calibrated SVM was used as the base classifier, and the thresholds were applied to class posterior probabilities for rejection. In this work, we investigate the effectiveness of other base classifiers when paired with the threshold selection algorithm and compare their performance with the original SVM solution.
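The feed-forward decision rule described above (candidate class first, membership test second) reduces to a few lines. The posterior values and per-class thresholds below are illustrative assumptions; in the paper the thresholds come from an iterative selection algorithm, not from hand-picked constants.

```python
# Minimal sketch of feed-forward open-set logic: a base classifier proposes a
# candidate class, then a per-class threshold on its posterior probability
# decides whether the example truly belongs to that class.
def open_set_predict(posteriors, thresholds, unknown_label=-1):
    """posteriors: dict class -> probability; thresholds: dict class -> cutoff."""
    candidate = max(posteriors, key=posteriors.get)   # base-classifier decision
    if posteriors[candidate] >= thresholds[candidate]:
        return candidate                              # accept the candidate class
    return unknown_label                              # reject: unknown class
```

Any base classifier that outputs calibrated class probabilities can be slotted in ahead of this test, which is exactly the comparison the abstract describes.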
Gomarus, H Karin; Althaus, Monika; Wijers, Albertus A; Minderaa, Ruud B
2006-04-01
Psychophysiological correlates of selective attention and working memory were investigated in a group of 18 healthy children using a visually presented selective memory search task. Subjects had to memorize one (load1) or 3 (load3) letters (memory set) and search for these among a recognition set consisting of 4 letters only if the letters appeared in the correct (relevant) color. Event-related potentials (ERPs) as well as alpha and theta event-related synchronization and desynchronization (ERD/ERS) were derived from the EEG that was recorded during the task. In the ERP to the memory set, a prolonged load-related positivity was found. In response to the recognition set, effects of relevance were manifested in an early frontal positivity and a later frontal negativity. Effects of load were found in a search-related negativity within the attended category and a suppression of the P3-amplitude. Theta ERS was most pronounced for the most difficult task condition during the recognition set, whereas alpha ERD showed a load-effect only during memorization. The manipulation of stimulus relevance and memory load affected both ERP components and ERD/ERS. The present paradigm may supply a useful method for studying processes of selective attention and working memory and can be used to examine group differences between healthy controls and children showing psychopathology.
7 CFR 1773.4 - Borrower responsibilities.
Code of Federal Regulations, 2010 CFR
2010-01-01
... responsibilities. (a) Selection of a qualified CPA. The borrower's board of directors is responsible for the selection of a qualified CPA that meets the requirements set forth in § 1773.5. When selecting a CPA, the borrower should consider, among other matters: (1) The qualifications of CPAs available to do the work; (2...
Working Time in Comparative Perspective. Volume II: Life-Cycle Working Time and Nonstandard Work.
ERIC Educational Resources Information Center
Houseman, Susan, Ed.; Nakamura, Alice, Ed.
This is the second of two volumes of selected papers presented at the 1996 conference "Changes in Working Hours in Canada and the United States." Eleven chapters explore an expanded set of working-time issues, which may be loosely grouped under these two topics: working time over the life cycle and nonstandard work arrangements.…
ERIC Educational Resources Information Center
Muchinsky, Paul M.
1975-01-01
A work sample test can provide a high degree of content validity, and offers a practical method of screening job applicants in accordance with guidelines on employee selection procedures set forth by the Equal Employment Opportunity Commission. (MW)
Optimal Contractor Selection in Construction Industry: The Fuzzy Way
NASA Astrophysics Data System (ADS)
Krishna Rao, M. V.; Kumar, V. S. S.; Rathish Kumar, P.
2018-02-01
A purely price-based approach to contractor selection has been identified as the root cause of many serious project delivery problems. Therefore, the capability of the contractor to execute the project should be evaluated using a multiple set of selection criteria, including reputation, past performance, performance potential, financial soundness and other project-specific criteria. An industry-wide questionnaire survey was conducted with the objective of identifying the important criteria for adoption in the selection process. In this work, a fuzzy set based model was developed for contractor prequalification/evaluation, using effective criteria obtained from the perspective of construction professionals and taking the subjective judgments of decision makers into consideration. A case study consisting of four alternatives (contractors, in the present case) solicited from a public works department of Pondicherry in India is used to illustrate the effectiveness of the proposed approach. The final selection of the contractor is based on the integrated score, or Overall Evaluation Score, of the decision alternative in both the prequalification and bid evaluation stages.
Cheng, Tiejun; Li, Qingliang; Wang, Yanli; Bryant, Stephen H
2011-02-28
Aqueous solubility is recognized as a critical parameter in both early- and late-stage drug discovery. Therefore, in silico modeling of solubility has attracted extensive interest in recent years. Most previous studies have been limited to relatively small data sets with limited diversity, which in turn limits the predictability of derived models. In this work, we present a support vector machines model for the binary classification of solubility, taking advantage of the largest known public data set, which contains over 46,000 compounds with experimental solubility. Our model was optimized in combination with a reduction and recombination feature selection strategy. The best model demonstrated robust performance in both cross-validation and prediction of two independent test sets, indicating it could be a practical tool to select soluble compounds for screening, purchasing, and synthesizing. Moreover, because it relies entirely on public resources, our work may be used for comparative evaluation of solubility classification studies.
Symbiosis of executive and selective attention in working memory
Vandierendonck, André
2014-01-01
The notion of working memory (WM) was introduced to account for the usage of short-term memory resources by other cognitive tasks such as reasoning, mental arithmetic, language comprehension, and many others. This collaboration between memory and other cognitive tasks can only be achieved by a dedicated WM system that controls task coordination. To that end, WM models include executive control. Nevertheless, other attention control systems may be involved in coordination of memory and cognitive tasks calling on memory resources. The present paper briefly reviews the evidence concerning the role of selective attention in WM activities. A model is proposed in which selective attention control is directly linked to the executive control part of the WM system. The model assumes that apart from storage of declarative information, the system also includes an executive WM module that represents the current task set. Control processes are automatically triggered when particular conditions in these modules are met. As each task set represents the parameter settings and the actions needed to achieve the task goal, it will depend on the specific settings and actions whether selective attention control will have to be shared among the active tasks. Only when such sharing is required, task performance will be affected by the capacity limits of the control system involved. PMID:25152723
A comparison between computer-controlled and set work rate exercise based on target heart rate
NASA Technical Reports Server (NTRS)
Pratt, Wanda M.; Siconolfi, Steven F.; Webster, Laurie; Hayes, Judith C.; Mazzocca, Augustus D.; Harris, Bernard A., Jr.
1991-01-01
Two methods are compared for observing the heart rate (HR), metabolic equivalents, and time in the target HR zone (defined as the target HR ± 5 bpm) during 20 min of exercise at a prescribed intensity of the maximum working capacity. In one method, called set work rate exercise, the information from a graded exercise test is used to select a target HR and to calculate a corresponding constant work rate that should induce the desired HR. In the other method, the work rate is controlled by a computer algorithm to achieve and maintain a prescribed target HR. It is shown that computer-controlled exercise is an effective alternative to the traditional set work rate exercise, particularly when tight control of cardiovascular responses is necessary.
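As a rough illustration of the computer-controlled mode, the sketch below closes the loop with an incremental work-rate update against a first-order heart-rate model. All gains, time constants, and the HR model itself are assumptions for demonstration; none of them are taken from the NASA study.

```python
# Illustrative closed-loop sketch (not the study's algorithm): each step, the
# work rate is nudged in proportion to the HR error, and a simple first-order
# model describes how HR approaches the steady state for the current load.
def simulate(target_hr=140.0, rest_hr=70.0, gain=0.5, hr_per_watt=0.4,
             steps=600, dt=1.0, tau=30.0):
    """Return the final HR after driving the model toward target_hr ± 5 bpm."""
    hr, watts = rest_hr, 0.0
    for _ in range(steps):
        watts += gain * (target_hr - hr)       # feedback: adjust the work rate
        watts = max(watts, 0.0)                # an ergometer cannot go below 0 W
        hr_ss = rest_hr + hr_per_watt * watts  # assumed steady-state HR for load
        hr += (hr_ss - hr) * dt / tau          # first-order HR dynamics
    return hr
```

The set work rate method corresponds to freezing `watts` at the value predicted from a graded exercise test; the feedback loop instead keeps correcting toward the target zone as the response drifts.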
Modulation of selective attention by polarity-specific tDCS effects.
Pecchinenda, Anna; Ferlazzo, Fabio; Lavidor, Michal
2015-02-01
Selective attention relies on working memory to maintain an attention set of task priorities. Consequently, selective attention is more efficient when working memory resources are not depleted. However, there is some evidence that distractors are processed even when working memory load is low. We used tDCS to assess whether boosting the activity of the dorsolateral prefrontal cortex (DLPFC), involved in selective attention and working memory, would reduce interference from emotional distractors. Findings showed that anodal tDCS over the DLPFC was not sufficient to reduce interference from angry distractors. In contrast, cathodal tDCS over the DLPFC reduced interference from happy distractors. These findings show that altering DLPFC activity is not by itself sufficient to establish top-down control and increase selective attention efficiency; however, when the neural signal in the DLPFC is altered by cathodal tDCS, interference from emotional distractors is reduced, leading to improved performance.
SU-F-R-10: Selecting the Optimal Solution for Multi-Objective Radiomics Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Z; Folkert, M; Wang, J
2016-06-15
Purpose: To develop an evidential reasoning approach for selecting the optimal solution from a Pareto solution set obtained by a multi-objective radiomics model for predicting distant failure in lung SBRT. Methods: In the multi-objective radiomics model, both sensitivity and specificity are considered as objective functions simultaneously. A Pareto set with many feasible solutions results from the multi-objective optimization. In this work, an optimal solution Selection methodology for Multi-Objective radiomics Learning model using the Evidential Reasoning approach (SMOLER) was proposed to select the optimal solution from the Pareto set. The proposed SMOLER method uses the evidential reasoning approach to calculate the utility of each solution based on pre-set optimal solution selection rules. The solution with the highest utility is chosen as the optimal solution. In SMOLER, an optimal learning model coupled with a clonal selection algorithm was used to optimize model parameters. In this study, PET and CT image features and clinical parameters were utilized for predicting distant failure in lung SBRT. Results: In total, 126 solution sets were generated by adjusting predictive model parameters. Each Pareto set contains 100 feasible solutions. The solution selected by SMOLER within each Pareto set was compared to the manually selected optimal solution. Five-fold cross-validation was used to evaluate the optimal solution selection accuracy of SMOLER. The selection accuracies for the five folds were 80.00%, 69.23%, 84.00%, 84.00%, and 80.00%, respectively. Conclusion: An optimal solution selection methodology for a multi-objective radiomics learning model using the evidential reasoning approach (SMOLER) was proposed. Experimental results show that the optimal solution can be found in approximately 80% of cases.
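A drastically simplified stand-in for SMOLER's final step: score each Pareto solution, represented here as a (sensitivity, specificity) pair, with a utility and keep the maximizer. The linear utility and equal weights are assumptions for illustration; the paper aggregates pre-set selection rules through evidential reasoning rather than a fixed weighted sum.

```python
# Toy utility-based selection from a Pareto set of (sensitivity, specificity)
# pairs. A linear weighted utility stands in for the evidential reasoning
# aggregation used by SMOLER.
def pick_optimal(pareto, w_sens=0.5, w_spec=0.5):
    """Return the solution with the highest weighted utility."""
    return max(pareto, key=lambda s: w_sens * s[0] + w_spec * s[1])
```

With equal weights, a balanced solution beats extreme trade-offs; shifting the weights encodes a preference for one objective, which is the role the selection rules play in the full method.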
Storbeck, Justin; Watson, Philip
2014-12-01
Prior research has suggested that emotion and working memory domains are integrated, such that positive affect enhances verbal working memory, whereas negative affect enhances spatial working memory (Gray, 2004; Storbeck, 2012). Simon (1967) postulated that one feature of emotion and cognition integration would be reciprocal connectedness (i.e., emotion influences cognition and cognition influences emotion). We explored whether affective judgments and attention to affective qualities are biased by the activation of verbal and spatial working memory mind-sets. For all experiments, participants completed a 2-back verbal or spatial working memory task followed by an endorsement task (Experiments 1 & 2), word-pair selection task (Exp. 3), or attentional dot-probe task (Exp. 4). Participants who had an activated verbal, compared with spatial, working memory mind-set were more likely to endorse pictures (Exp. 1) and words (Exp. 2) as being more positive and to select the more positive word pair out of a set of word pairs that went 'together best' (Exp. 3). Additionally, people who completed the verbal working memory task took longer to disengage from positive stimuli, whereas those who completed the spatial working memory task took longer to disengage from negative stimuli (Exp. 4). Interestingly, across the 4 experiments, we observed higher levels of self-reported negative affect for people who completed the spatial working memory task, which was consistent with their endorsement and attentional bias toward negative stimuli. Therefore, emotion and working memory may have a reciprocal connectedness allowing for bidirectional influence.
A Transionospheric Communication Channel Model
1977-07-01
Anne R. Hessing; V. Elaine Hatfield (Contract F30602-75-C-0236). ...variables from a user-selected set of ionospheric state parameters. Mode II of IONSCNT extends the Mode-I results to second-order statistics for cases... describes only representative conditions for the set of input parameters selected by the user. Night-to-night departures from the calculated "mean"...
ERIC Educational Resources Information Center
Saussaye, Michael G.
2012-01-01
The purpose of this qualitative study was to explore counselor educators' perceptions of working with students unwilling to set aside their personal religious beliefs while counseling clients. Purposeful sampling was used in a snowball fashion to select participants with a minimum of one year experience as a counselor educator and who are…
Feature Selection for Chemical Sensor Arrays Using Mutual Information
Wang, X. Rosalind; Lizier, Joseph T.; Nowotny, Thomas; Berna, Amalia Z.; Prokopenko, Mikhail; Trowell, Stephen C.
2014-01-01
We address the problem of feature selection for classifying a diverse set of chemicals using an array of metal oxide sensors. Our aim is to evaluate a filter approach to feature selection with reference to previous work, which used a wrapper approach on the same data set, and established best features and upper bounds on classification performance. We selected feature sets that exhibit the maximal mutual information with the identity of the chemicals. The selected features closely match those found to perform well in the previous study using a wrapper approach to conduct an exhaustive search of all permitted feature combinations. By comparing the classification performance of support vector machines (using features selected by mutual information) with the performance observed in the previous study, we found that while our approach does not always give the maximum possible classification performance, it always selects features that achieve classification performance approaching the optimum obtained by exhaustive search. We performed further classification using the selected feature set with some common classifiers and found that, for the selected features, Bayesian Networks gave the best performance. Finally, we compared the observed classification performances with the performance of classifiers using randomly selected features. We found that the selected features consistently outperformed randomly selected features for all tested classifiers. The mutual information filter approach is therefore a computationally efficient method for selecting near optimal features for chemical sensor arrays. PMID:24595058
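The filter approach described above amounts to ranking features by mutual information with the class label and keeping the top k. Below is a self-contained sketch for discrete-valued features; the function and variable names are illustrative, and a real sensor-array pipeline would first discretize the continuous sensor responses.

```python
# Filter-style feature selection: rank each feature by the empirical mutual
# information I(X_f; Y) with the class label and keep the top k features.
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Empirical MI (in bits) between two discrete sequences of equal length."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def select_features(columns, labels, k):
    """columns: dict feature-name -> list of discrete values; returns top-k names."""
    ranked = sorted(columns,
                    key=lambda f: mutual_information(columns[f], labels),
                    reverse=True)
    return ranked[:k]
```

Unlike the wrapper approach it is compared against, this filter never trains a classifier during selection, which is the source of its computational efficiency.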
Selective Mutism: A Three-Tiered Approach to Prevention and Intervention
ERIC Educational Resources Information Center
Busse, R. T.; Downey, Jenna
2011-01-01
Selective mutism is a rare anxiety disorder that prevents a child from speaking at school or other community settings, and can be detrimental to a child's social development. School psychologists can play an important role in the prevention and treatment of selective mutism. As an advocate for students, school psychologists can work with teachers,…
When We Deal with Children; Selected Writings.
ERIC Educational Resources Information Center
Redl, Fritz
Espousing an interdisciplinary approach, the book contains selected writings, lectures, and speeches concerning clinical work with disturbed children and adolescents in institutional settings. Editorial comment introduces each of the following sections: a survey of the current status of the children's field both clinically and educationally; a…
Arab, Amir Massoud; Nourbakhsh, Mohammad Reza
2014-01-01
Shortened hamstring muscle length has been noted in persons with low back pain (LBP). Prolonged sitting postures, such as those adopted in different work settings, and a sedentary lifestyle have been associated with hamstring shortness and LBP. The purpose of this study was to investigate the effect of lifestyle and work setting on hamstring length and lumbar lordosis in subjects with and without LBP, and to identify the relationship between hamstring muscle length and lumbar lordosis in individuals with different lifestyles and work settings. A total of 508 subjects between the ages of 20 and 65 were selected. Subjects were categorized into two groups: individuals with and without LBP. A questionnaire was used to obtain information about the subjects' lifestyle and work setting. Hamstring muscle length and lumbar lordosis were measured in all subjects. The results showed no significant difference in the number of subjects with different work settings or lifestyles between individuals with and without LBP. Neither hamstring muscle length nor lumbar lordosis was affected by type of work setting or lifestyle. Our data showed a significant difference in hamstring length, and no significant difference in lumbar lordosis, between subjects with and without LBP in all categories. Lumbar lordosis did not differ between individuals with and without hamstring tightness in normal and LBP subjects with different work settings and lifestyles. The findings of this study did not support the assumption that work setting and a sedentary lifestyle lead to hamstring tightness in subjects with LBP. It seems that work setting and lifestyle were not contributing factors for hamstring tightness in subjects with LBP.
Li, Sui-Xian
2018-05-07
Previous research has shown the effectiveness of selecting filter sets from among a large set of commercial broadband filters by a vector analysis method based on maximum linear independence (MLI). However, the traditional MLI approach is suboptimal due to the need to predefine the first filter of the selected set as the one with the maximum ℓ₂ norm among all available filters. An exhaustive imaging simulation with every single filter serving as the first filter was conducted to investigate the features of the most competent filter set. From the simulation, the characteristics of the most competent filter set were discovered. Besides minimization of the condition number, the geometric features of the best-performing filter set comprise a distinct transmittance peak along the wavelength axis for the first filter, a generally uniform distribution of the filters' peaks, and substantial overlap of the transmittance curves of adjacent filters. Therefore, the best-performing filter sets can be recognized intuitively by simple vector analysis and just a few experimental verifications. A practical two-step framework for selecting the optimal filter set is recommended, which guarantees a significant enhancement of system performance. This work should be useful for optimizing the spectral sensitivity of broadband multispectral imaging sensors.
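One way to operationalize "minimize the condition number" is a greedy search over the filter transmittance matrix, seeded (as in the traditional MLI approach described above) with the maximum-norm filter. This is an illustrative sketch under that assumption, not the paper's two-step framework.

```python
# Greedy filter-set selection: columns of T are candidate filter transmittance
# curves sampled over wavelength. Starting from the max-norm filter, add at
# each step the filter that keeps the selected submatrix best conditioned.
import numpy as np

def greedy_filter_set(T, k):
    """T: (wavelengths x filters) transmittance matrix; returns k column indices."""
    n = T.shape[1]
    chosen = [int(np.argmax(np.linalg.norm(T, axis=0)))]  # seed: max-norm filter
    while len(chosen) < k:
        best, best_cond = None, np.inf
        for j in range(n):
            if j in chosen:
                continue
            cond = np.linalg.cond(T[:, chosen + [j]])     # SVD-based condition number
            if cond < best_cond:
                best_cond, best = cond, j
        chosen.append(best)
    return chosen
```

A near-duplicate of an already-chosen filter makes the submatrix nearly singular and its condition number explode, so the greedy step naturally prefers filters whose curves add new, linearly independent information.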
An Energy-Efficient Approach to Enhance Virtual Sensors Provisioning in Sensor Clouds Environments
Filho, Raimir Holanda; Rabêlo, Ricardo de Andrade L.; de Carvalho, Carlos Giovanni N.; Mendes, Douglas Lopes de S.; Costa, Valney da Gama
2018-01-01
Virtual sensors provisioning is a central issue for sensors cloud middleware since it is responsible for selecting physical nodes, usually from Wireless Sensor Networks (WSN) of different owners, to handle user’s queries or applications. Recent works perform provisioning by clustering sensor nodes based on the correlation of measurements and then selecting as few nodes as possible to preserve WSN energy. However, such works consider only homogeneous nodes (same set of sensors). Therefore, those works are not entirely appropriate for sensor clouds, which in most cases comprise heterogeneous sensor nodes. In this paper, we propose ACxSIMv2, an approach to enhance the provisioning task by considering heterogeneous environments. Two main algorithms form ACxSIMv2. The first one, ACASIMv1, creates multi-dimensional clusters of sensor nodes, taking into account the correlations between measurements rather than the physical distance between nodes, as most works in the literature do. Then, the second algorithm, ACOSIMv2, based on an Ant Colony Optimization system, selects an optimal set of sensor nodes to respond to user’s queries while satisfying all query parameters and preserving the overall energy consumption. Results from initial experiments show that the approach significantly reduces the sensor cloud energy consumption compared to traditional works, providing a solution to be considered in sensor cloud scenarios. PMID:29495406
An Energy-Efficient Approach to Enhance Virtual Sensors Provisioning in Sensor Clouds Environments.
Lemos, Marcus Vinícius de S; Filho, Raimir Holanda; Rabêlo, Ricardo de Andrade L; de Carvalho, Carlos Giovanni N; Mendes, Douglas Lopes de S; Costa, Valney da Gama
2018-02-26
Virtual sensors provisioning is a central issue for sensors cloud middleware since it is responsible for selecting physical nodes, usually from Wireless Sensor Networks (WSN) of different owners, to handle user's queries or applications. Recent works perform provisioning by clustering sensor nodes based on the correlation of measurements and then selecting as few nodes as possible to preserve WSN energy. However, such works consider only homogeneous nodes (same set of sensors). Therefore, those works are not entirely appropriate for sensor clouds, which in most cases comprise heterogeneous sensor nodes. In this paper, we propose ACxSIMv2, an approach to enhance the provisioning task by considering heterogeneous environments. Two main algorithms form ACxSIMv2. The first one, ACASIMv1, creates multi-dimensional clusters of sensor nodes, taking into account the correlations between measurements rather than the physical distance between nodes, as most works in the literature do. Then, the second algorithm, ACOSIMv2, based on an Ant Colony Optimization system, selects an optimal set of sensor nodes to respond to user's queries while satisfying all query parameters and preserving the overall energy consumption. Results from initial experiments show that the approach significantly reduces the sensor cloud energy consumption compared to traditional works, providing a solution to be considered in sensor cloud scenarios.
Covariate Selection for Multilevel Models with Missing Data
Marino, Miguel; Buxton, Orfeu M.; Li, Yi
2017-01-01
Missing covariate data hampers variable selection in multilevel regression settings. Current variable selection techniques for multiply-imputed data commonly address missingness in the predictors through list-wise deletion and stepwise-selection methods, which are problematic. Moreover, most variable selection methods are developed for independent linear regression models and do not accommodate multilevel mixed-effects regression models with incomplete covariate data. We develop a novel methodology that performs covariate selection across multiply-imputed data for multilevel random-effects models. Specifically, we propose to stack the multiply-imputed data sets from a multiple imputation procedure and to apply a group variable selection procedure through group lasso regularization to assess the overall impact of each predictor on the outcome across the imputed data sets. Simulations confirm the advantageous performance of the proposed method compared with the competing methods. We applied the method to reanalyze the Healthy Directions-Small Business cancer prevention study, which evaluated a behavioral intervention program targeting multiple risk-related behaviors in a working-class, multi-ethnic population. PMID:28239457
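The grouping idea can be sketched with a generic proximal-gradient group lasso. This is a hedged illustration, not the authors' implementation: `mi_group_lasso`, its step-size rule, and the penalty value below are assumptions. The key point it demonstrates is that one predictor's coefficients across the M imputed data sets form a single group, so the predictor is kept or dropped jointly.

```python
import numpy as np

def group_soft_threshold(v, t):
    """Block soft-thresholding: shrink a whole coefficient group toward zero."""
    norm = np.linalg.norm(v)
    if norm <= t:
        return np.zeros_like(v)
    return (1 - t / norm) * v

def mi_group_lasso(X_list, y_list, lam, n_iter=1000, lr=None):
    """Proximal-gradient group lasso across M imputed data sets.
    Row m of B holds the coefficients fit on imputation m; column j
    (variable j's coefficients over all imputations) is one group."""
    M, p = len(X_list), X_list[0].shape[1]
    B = np.zeros((M, p))
    if lr is None:
        # 1 / Lipschitz constant of the blockwise least-squares gradient
        lr = 1.0 / max(np.linalg.norm(X, 2) ** 2 for X in X_list)
    for _ in range(n_iter):
        G = np.stack([X.T @ (X @ B[m] - y)
                      for m, (X, y) in enumerate(zip(X_list, y_list))])
        B = B - lr * G
        for j in range(p):            # prox step: one group per variable
            B[:, j] = group_soft_threshold(B[:, j], lr * lam)
    return B
```

With a sufficiently large penalty, irrelevant predictors are zeroed out in every imputation simultaneously, which is the behavior the stacked-selection strategy relies on.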
Entrance and Exit Requirements of Professional Social Work Education
ERIC Educational Resources Information Center
Duehn, Wayne D.; Mayadas, Nazneen Sada
1977-01-01
A competency-based direct practice curriculum for graduate social work education is described based on: (1) interpersonal behavioral control; (2) judgment and decision making; (3) contracting and goal setting; (4) selection and application of change method; and (5) assessment of outcomes. (Author/LBH)
Science Fiction for Geographers: Selected Works.
ERIC Educational Resources Information Center
Elbow, Gary S.; Martinson, Tom L.
1980-01-01
Explains how college-level teachers of geography can use works of science fiction to help students understand geographical settings and create impressionistic pictures of a given region in their minds. Particular areas in which science fiction is useful include invented terrestrial landscapes, specialized extraterrestrial landscapes, disaster…
ERIC Educational Resources Information Center
Love, Alan C.
2010-01-01
An overlooked feature of Darwin's work is his use of "imaginary illustrations" to show that natural selection is competent to produce adaptive, evolutionary change. When set in the context of Darwin's methodology, these thought experiments provide a novel way to teach natural selection and the nature of science.
Working memory for visual features and conjunctions in schizophrenia.
Gold, James M; Wilk, Christopher M; McMahon, Robert P; Buchanan, Robert W; Luck, Steven J
2003-02-01
The visual working memory (WM) storage capacity of patients with schizophrenia was investigated using a change detection paradigm. Participants were presented with 2, 3, 4, or 6 colored bars with testing of both single feature (color, orientation) and feature conjunction conditions. Patients performed significantly worse than controls at all set sizes but demonstrated normal feature binding. Unlike controls, patient WM capacity declined at set size 6 relative to set size 4. Impairments with subcapacity arrays suggest a deficit in task set maintenance: Greater impairment for supercapacity set sizes suggests a deficit in the ability to selectively encode information for WM storage. Thus, the WM impairment in schizophrenia appears to be a consequence of attentional deficits rather than a reduction in storage capacity.
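Capacity in this kind of change detection task is commonly summarized with Cowan's K, a standard formula in the visual working memory literature (the formula is general background, not taken from this particular study):

```python
def cowan_k(set_size: int, hit_rate: float, false_alarm_rate: float) -> float:
    """Cowan's K for whole-display change detection:
    the estimated number of items held in working memory,
    K = N * (hit rate - false alarm rate)."""
    return set_size * (hit_rate - false_alarm_rate)
```

For example, with a set size of 4, a 90% hit rate and a 10% false alarm rate, K comes out near 3.2 items, close to the 3-4 item capacity typically reported for healthy adults; chance-level performance yields K = 0.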
Extending the Peak Bandwidth of Parameters for Softmax Selection in Reinforcement Learning.
Iwata, Kazunori
2016-05-11
Softmax selection is one of the most popular methods for action selection in reinforcement learning. Although various recently proposed methods may be more effective with full parameter tuning, implementing a complicated method that requires the tuning of many parameters can be difficult. Thus, softmax selection is still worth revisiting, considering the cost savings of its implementation and tuning. In fact, this method works adequately in practice with only one parameter appropriately set for the environment. The aim of this paper is to improve the variable setting of this method to extend the bandwidth of good parameters, thereby reducing the cost of implementation and parameter tuning. To achieve this, we take advantage of the asymptotic equipartition property in a Markov decision process to extend the peak bandwidth of softmax selection. Using a variety of episodic tasks, we show that our setting is effective in extending the bandwidth and that it yields a better policy in terms of stability. The bandwidth is quantitatively assessed in a series of statistical tests.
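The method being revisited is standard Boltzmann (softmax) action selection; a minimal sketch follows, with illustrative parameter values (the paper's contribution concerns how the temperature-like parameter is set, which this sketch does not reproduce):

```python
import numpy as np

def softmax_select(q_values, tau, rng):
    """Boltzmann (softmax) action selection: the temperature tau trades
    off exploration (high tau) against exploitation (low tau)."""
    prefs = np.asarray(q_values, dtype=float) / tau
    prefs -= prefs.max()          # subtract max for numerical stability
    probs = np.exp(prefs)
    probs /= probs.sum()
    action = rng.choice(len(probs), p=probs)
    return action, probs
```

The single parameter tau is exactly the quantity whose useful bandwidth the paper seeks to extend: a value that works in one environment may explore too much or too little in another.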
SPREADSHEET-BASED PROGRAM FOR ERGONOMIC ADJUSTMENT OF NOTEBOOK COMPUTER AND WORKSTATION SETTINGS.
Nanthavanij, Suebsak; Prae-Arporn, Kanlayanee; Chanjirawittaya, Sorajak; Paripoonyo, Satirajit; Rodloy, Somsak
2015-06-01
This paper discusses a computer program, ErgoNBC, which provides suggestions regarding the ergonomic settings of a notebook computer (NBC), workstation components, and selected accessories in order to help computer users assume an appropriate work posture during NBC work. From the user's body height and the NBC and workstation component data, ErgoNBC computes the recommended tilt angle of the NBC base unit, NBC screen angle, distance between the user and the NBC, seat height and work surface height. If necessary, the NBC base support, seat cushion and footrest, including their settings, are recommended. An experiment involving twenty-four university students was conducted to evaluate the recommendations provided by ErgoNBC. The Rapid Upper Limb Assessment (RULA) technique was used to analyze their work postures both before and after implementing ErgoNBC's recommendations. The results clearly showed that ErgoNBC could significantly help to improve the subjects' work postures.
[Professional psychological selection system in the Air Force - 50 years].
Pokrovskiĭ, B L
2014-08-01
Data are given on the establishment of the professional psychological selection system in the Air Force in 1958-1964 at the NIIIAM of the Air Force by the team of the psychological department under the leadership of K. K. Platonov. The names of the developers of this system and the results of their research are given. The outcome of this work was an order of the Air Force Commander introducing psychological selection at the Higher Military Aviation Schools of Pilots, starting from the 1964 intake. Subsequent work produced recommendations for the professional psychological selection of a wide range of aviation specialists in various fields and, later, of other specialists of the Armed Forces.
ERIC Educational Resources Information Center
Demaine, Jack
2006-01-01
This paper is concerned with the longstanding question of policy for those referred to nearly half a century ago by the Crowther Report as the "bottom half"; those mainly working class children who, in a sense, are "selected for failure". The issue of selection is a matter of concern in countries around the world and has been…
ES Review: Selections from 2009 and 2010
ERIC Educational Resources Information Center
Smiles, Robin, Ed.
2010-01-01
This fourth edition of the "ES Review" brings together, in one setting, some of the best work from 2009-10. It features: (1) Teacher Quality (Teachers at Work: Improving Teacher Quality Through School Design (Elena Silva); Understanding Teachers Contracts (Andrew J. Rotherham); How Teachers Unions Lost the Media (Richard Whitmire and…
Adversarial Feature Selection Against Evasion Attacks.
Zhang, Fei; Chan, Patrick P K; Biggio, Battista; Yeung, Daniel S; Roli, Fabio
2016-03-01
Pattern recognition and machine learning techniques have been increasingly adopted in adversarial settings such as spam, intrusion, and malware detection, although their security against well-crafted attacks that aim to evade detection by manipulating data at test time has not yet been thoroughly assessed. While previous work has been mainly focused on devising adversary-aware classification algorithms to counter evasion attempts, only few authors have considered the impact of using reduced feature sets on classifier security against the same attacks. An interesting, preliminary result is that classifier security to evasion may be even worsened by the application of feature selection. In this paper, we provide a more detailed investigation of this aspect, shedding some light on the security properties of feature selection against evasion attacks. Inspired by previous work on adversary-aware classifiers, we propose a novel adversary-aware feature selection model that can improve classifier security against evasion attacks, by incorporating specific assumptions on the adversary's data manipulation strategy. We focus on an efficient, wrapper-based implementation of our approach, and experimentally validate its soundness on different application examples, including spam and malware detection.
Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H
2017-07-01
Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables are used or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. 
A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in using RF to develop predictive models with large environmental data sets.
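The backward elimination procedure examined above can be sketched generically. In this hedged illustration, the `fit_score` and `importance` callables are placeholders standing in for a cross-validated (or out-of-bag) accuracy estimate and the RF importance measures; they are not the authors' code.

```python
def backward_eliminate(features, fit_score, importance, min_features=1):
    """Greedy backward elimination: repeatedly drop the least-important
    feature as long as the model score does not degrade.

    fit_score(feats)  -> float (higher is better)
    importance(feats) -> dict mapping feature -> importance
    """
    feats = list(features)
    best = fit_score(feats)
    while len(feats) > min_features:
        imp = importance(feats)
        worst = min(feats, key=lambda f: imp[f])
        trial = [f for f in feats if f != worst]
        score = fit_score(trial)
        if score < best:
            break                 # removing more features starts to hurt
        feats, best = trial, score
    return feats, best
```

Note that the stopping rule uses the same score that guides the search, which is exactly the kind of coupling that can produce the optimistically biased accuracy estimates the paper warns about; an honest assessment keeps validation folds external to the whole elimination loop.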
2013-01-01
Introduction Gender inequalities exist in work life, but little is known about their presence in relation to factors examined in occupational health settings. The aim of this study was to identify and summarize the working and employment conditions described as determinants of gender inequalities in occupational health in studies related to occupational health published between 1999 and 2010. Methods A systematic literature review was undertaken of studies available in MEDLINE, EMBASE, Sociological Abstracts, LILACS, EconLit and CINAHL between 1999 and 2010. Epidemiologic studies were selected by applying a set of inclusion criteria to the title, abstract, and complete text. The quality of the studies was also assessed. Selected studies were qualitatively analysed, resulting in a compilation of all differences between women and men in the prevalence of exposure to working and employment conditions and work-related health problems as outcomes. Results Most of the 30 studies included were conducted in Europe (n=19) and had a cross-sectional design (n=24). The most common topic analysed was related to the exposure to work-related psychosocial hazards (n=8). Employed women had more job insecurity, lower control, worse contractual working conditions and poorer self-perceived physical and mental health than men did. Conversely, employed men had a higher degree of physically demanding work, lower support, higher levels of effort-reward imbalance, higher job status, were more exposed to noise and worked longer hours than women did. Conclusions This systematic review has identified a set of working and employment conditions as determinants of gender inequalities in occupational health from the occupational health literature. These results may be useful to policy makers seeking to reduce gender inequalities in occupational health, and to researchers wishing to analyse these determinants in greater depth. PMID:23915121
Campos-Serna, Javier; Ronda-Pérez, Elena; Artazcoz, Lucia; Moen, Bente E; Benavides, Fernando G
2013-08-05
Gender inequalities exist in work life, but little is known about their presence in relation to factors examined in occupational health settings. The aim of this study was to identify and summarize the working and employment conditions described as determinants of gender inequalities in occupational health in studies related to occupational health published between 1999 and 2010. A systematic literature review was undertaken of studies available in MEDLINE, EMBASE, Sociological Abstracts, LILACS, EconLit and CINAHL between 1999 and 2010. Epidemiologic studies were selected by applying a set of inclusion criteria to the title, abstract, and complete text. The quality of the studies was also assessed. Selected studies were qualitatively analysed, resulting in a compilation of all differences between women and men in the prevalence of exposure to working and employment conditions and work-related health problems as outcomes. Most of the 30 studies included were conducted in Europe (n=19) and had a cross-sectional design (n=24). The most common topic analysed was related to the exposure to work-related psychosocial hazards (n=8). Employed women had more job insecurity, lower control, worse contractual working conditions and poorer self-perceived physical and mental health than men did. Conversely, employed men had a higher degree of physically demanding work, lower support, higher levels of effort-reward imbalance, higher job status, were more exposed to noise and worked longer hours than women did. This systematic review has identified a set of working and employment conditions as determinants of gender inequalities in occupational health from the occupational health literature. These results may be useful to policy makers seeking to reduce gender inequalities in occupational health, and to researchers wishing to analyse these determinants in greater depth.
Perceptual quality estimation of H.264/AVC videos using reduced-reference and no-reference models
NASA Astrophysics Data System (ADS)
Shahid, Muhammad; Pandremmenou, Katerina; Kondi, Lisimachos P.; Rossholm, Andreas; Lövström, Benny
2016-09-01
Reduced-reference (RR) and no-reference (NR) models for video quality estimation, using features that account for the impact of coding artifacts, spatio-temporal complexity, and packet losses, are proposed. The purpose of this study is to analyze a number of potentially quality-relevant features in order to select the most suitable set of features for building the desired models. The proposed sets of features have not been used in the literature and some of the features are used for the first time in this study. The features are employed by the least absolute shrinkage and selection operator (LASSO), which selects only the most influential of them toward perceptual quality. For comparison, we apply feature selection in the complete feature sets and ridge regression on the reduced sets. The models are validated using a database of H.264/AVC encoded videos that were subjectively assessed for quality in an ITU-T compliant laboratory. We infer that just two features selected by RR LASSO and two bitstream-based features selected by NR LASSO are able to estimate perceptual quality with high accuracy, higher than that of ridge, which uses more features. The comparisons with competing works and two full-reference metrics also verify the superiority of our models.
SU-E-J-128: Two-Stage Atlas Selection in Multi-Atlas-Based Image Segmentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, T; Ruan, D
2015-06-15
Purpose: In the new era of big data, multi-atlas-based image segmentation is challenged by heterogeneous atlas quality and the high computation burden of extensive atlas collections, demanding efficient identification of the most relevant atlases. This study aims to develop a two-stage atlas selection scheme to achieve computational economy with a performance guarantee. Methods: We develop a low-cost fusion set selection scheme by introducing a preliminary selection to trim the full atlas collection into an augmented subset, alleviating the need for extensive full-fledged registrations. More specifically, fusion set selection is performed in two successive steps: preliminary selection and refinement. An augmented subset is first roughly selected from the whole atlas collection with a simple registration scheme and the corresponding preliminary relevance metric; the augmented subset is further refined into the desired fusion set size, using full-fledged registration and the associated relevance metric. The main novelty of this work is the introduction of an inference model to relate the preliminary and refined relevance metrics, based on which the augmented subset size is rigorously derived to ensure the desired atlases survive the preliminary selection with high probability. Results: The performance and complexity of the proposed two-stage atlas selection method were assessed using a collection of 30 prostate MR images. It achieved comparable segmentation accuracy to the conventional one-stage method with full-fledged registration, but significantly reduced computation time to 1/3 (from 30.82 to 11.04 min per segmentation). Compared with an alternative one-stage cost-saving approach, the proposed scheme yielded superior performance with mean and median DSC of (0.83, 0.85) compared to (0.74, 0.78). Conclusion: This work has developed a model-guided two-stage atlas selection scheme to achieve significant cost reduction while guaranteeing high segmentation accuracy.
The benefit in both complexity and performance is expected to be most pronounced with large-scale heterogeneous data.
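The two-stage scheme reduces to a pair of ranking passes. In this hedged sketch, `cheap_score` and `full_score` stand in for the preliminary and full-fledged relevance metrics; the function and names are illustrative, not the authors' code.

```python
def two_stage_select(atlases, cheap_score, full_score,
                     augmented_size, fusion_size):
    """Stage 1: rank every atlas with a cheap relevance metric and keep
    an augmented subset; stage 2: re-rank only that subset with the
    expensive full-fledged metric and keep the final fusion set."""
    stage1 = sorted(atlases, key=cheap_score, reverse=True)[:augmented_size]
    return sorted(stage1, key=full_score, reverse=True)[:fusion_size]
```

The cost saving comes from evaluating `full_score` on only `augmented_size` candidates instead of the whole collection; the paper's inference model is what justifies choosing `augmented_size` so that the truly best atlases survive stage 1 with high probability.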
The identification of job opportunities for severely disabled sick-listed employees
2012-01-01
Background Work disability is a major problem for both the worker and society. To explore the work opportunities in regular jobs of persons low in functional abilities, we tried to identify occupations low in task demands. Because of the variety of functional abilities and of the corresponding work demands, the disabled persons need to be classified by type of disability into a limited number of subgroups. Within each subgroup, occupations judged suitable for the most seriously disabled will be selected as having a very low level of the corresponding task demands. These occupations can be applied as reference occupations to assess the presence or absence of work capacity of sick-listed employees in regular jobs, and as job opportunities for people with a specific type of functional disability. Methods Registered data from 50,931 disability assessments within the Dutch social security system were used in a second-order factor analysis to identify types of disabilities in claimants for a disability pension. Threshold values were chosen to classify claimants according to the severity of the disability. In the disability assessment procedure, a labour expert needs to select jobs with task demands not exceeding the functional abilities of the claimant. For each type of disability, the accessible jobs for the subgroup of the most severely disabled claimants were identified as lowest in the corresponding demand. Results The factor analysis resulted in four types of disabilities: general physical ability; autonomy; psychological ability; and manual skills. For each of these types of disablement, a set of four to six occupations low in task demands was selected for the subgroup of most severely disabled claimants. Because of an overlap of the sets of occupations, 13 occupations were selected in total. The percentage of claimants with at least one of the occupations of the corresponding set (the coverage) ranged from 84% to 93%.
An alternative selection of six occupations for all subgroups with even less overlap had a coverage ranging from 84% to 89% per subgroup. Conclusion This study resulted in two proposals for a set of reference occupations. Further research will be needed to compare the results of the new method of disability assessment to the results of the method presently used in practice. PMID:22394686
ERIC Educational Resources Information Center
Choi, Eunjung; Keith, Laura J.
2016-01-01
Contemporary African-American classical composers Cedric Adderley, John Lane, and Trevor Weston intertwine strands of culture and individual experience to produce musical works whose distinct designs offer cultural resources that music educators can use to integrate diversity into instructional settings. Of special interest is their ability to…
Making Facilitation Work: The Challenges on an International DBA Action Learning Set
ERIC Educational Resources Information Center
OFarrell, Jack
2018-01-01
This account relates my experiences as facilitator of an action learning set on a DBA cohort comprising international students and myself. It outlines the reasons for my selection as facilitator and describes my initial expectations and assumptions of action learning. I chart the difficulty in separating the 'what' of my own research from the…
Selection Dynamics in Joint Matching to Rate and Magnitude of Reinforcement
ERIC Educational Resources Information Center
McDowell, J. J.; Popa, Andrei; Calvin, Nicholas T.
2012-01-01
Virtual organisms animated by a selectionist theory of behavior dynamics worked on concurrent random interval schedules where both the rate and magnitude of reinforcement were varied. The selectionist theory consists of a set of simple rules of selection, recombination, and mutation that act on a population of potential behaviors by means of a…
Educational Choices and the Selection Process: Before and after Compulsory Schooling
ERIC Educational Resources Information Center
Mocetti, Sauro
2012-01-01
The aim of this paper is to analyze the selection process at work before and after compulsory schooling by assessing the determinants of school failures, dropouts, and upper secondary school decisions of young Italians. The data-set is built combining individual data by the Labor Force Survey and aggregate data on local labor markets and school…
Middle-Class Parents' Educational Work in an Academically Selective Public High School
ERIC Educational Resources Information Center
Stacey, Meghan
2016-01-01
This article reports the findings of a study on the nature of parent-school engagement at an academically selective public high school in New South Wales, Australia. Such research is pertinent given recent policies of "choice" and decentralization, making a study of local stakeholders timely. The research comprised a set of interviews…
Ma, Yue; Yin, Fei; Zhang, Tao; Zhou, Xiaohua Andrew; Li, Xiaosong
2016-01-01
Spatial scan statistics are widely used in various fields. The performance of these statistics is influenced by parameters, such as maximum spatial cluster size, and can be improved by parameter selection using performance measures. Current performance measures are based on the presence of clusters and are thus inapplicable to data sets without known clusters. In this work, we propose a novel overall performance measure called maximum clustering set-proportion (MCS-P), which is based on the likelihood of the union of detected clusters and the applied dataset. MCS-P was compared with existing performance measures in a simulation study to select the maximum spatial cluster size. Results of other performance measures, such as sensitivity and misclassification, suggest that the spatial scan statistic achieves accurate results in most scenarios with the maximum spatial cluster sizes selected using MCS-P. Given that previously known clusters are not required in the proposed strategy, selection of the optimal maximum cluster size with MCS-P can improve the performance of the scan statistic in applications without identified clusters.
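For context, the cluster likelihood at the heart of a Poisson spatial scan statistic is the Kulldorff log-likelihood ratio, sketched below. This is a standard formula from the scan statistic literature, given here as background; the paper's MCS-P measure builds on such cluster likelihoods rather than reproducing this snippet.

```python
import math

def poisson_llr(c, e, C):
    """Kulldorff's log-likelihood ratio for a candidate cluster:
    c observed cases and e expected cases inside the window,
    out of C total cases in the study area.
    Returns 0 when the window shows no excess risk (c <= e)."""
    if c <= e:
        return 0.0
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))
```

The scan statistic reports the window maximizing this ratio; the maximum spatial cluster size bounds which windows are scanned, which is the parameter MCS-P is designed to help select.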
Artificial intelligence techniques for embryo and oocyte classification.
Manna, Claudio; Nanni, Loris; Lumini, Alessandra; Pappalardo, Sebastiana
2013-01-01
One of the most relevant aspects in assisted reproduction technology is the possibility of characterizing and identifying the most viable oocytes or embryos. In most cases, embryologists select them by visual examination and their evaluation is totally subjective. Recently, due to the rapid growth in the capacity to extract texture descriptors from a given image, a growing interest has been shown in the use of artificial intelligence methods for embryo or oocyte scoring/selection in IVF programmes. This work concentrates the efforts on the possible prediction of the quality of embryos and oocytes in order to improve the performance of assisted reproduction technology, starting from their images. The artificial intelligence system proposed in this work is based on a set of Levenberg-Marquardt neural networks trained using textural descriptors (the local binary patterns). The proposed system was tested on two data sets of 269 oocytes and 269 corresponding embryos from 104 women and compared with other machine learning methods already proposed in the past for similar classification problems. Although the results are only preliminary, they show an interesting classification performance. This technique may be of particular interest in those countries where legislation restricts embryo selection.
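The local binary patterns descriptor mentioned above can be sketched directly. This is a basic 8-neighbour variant given as a hedged illustration; the study may use a different radius, sampling, or uniform-pattern variant.

```python
import numpy as np

def lbp_8(image):
    """8-neighbour local binary pattern over the interior pixels:
    each bit of the code records whether a neighbour >= the centre."""
    img = np.asarray(image, dtype=float)
    c = img[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros(c.shape, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:img.shape[0] - 1 + dy,
                 1 + dx:img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.uint8) << bit
    return code

def lbp_histogram(image):
    """Normalized 256-bin LBP histogram: the fixed-length texture
    descriptor that a classifier (e.g. a neural network) can consume."""
    h = np.bincount(lbp_8(image).ravel(), minlength=256).astype(float)
    return h / h.sum()
```

The histogram turns an image of arbitrary size into a fixed-length texture feature vector, which is the form required as input to networks like the Levenberg-Marquardt-trained ones described above.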
Tier One Performance Screen Initial Operational Test and Evaluation: 2012 Interim Report
2013-12-01
are known to predict outcomes in work settings. Because the TAPAS uses item response theory (IRT) methods to construct and score items, it can be...Qualification Test (AFQT), to select new Soldiers. Although the AFQT is useful for selecting new Soldiers, other personal attributes are important to...to be and will continue to serve as a useful metric for selecting new Soldiers, other personal attributes, in particular non-cognitive attributes
Rácz, A; Bajusz, D; Héberger, K
2015-01-01
Recent implementations of QSAR modelling software provide the user with numerous models and a wealth of information. In this work, we provide some guidance on how one should interpret the results of QSAR modelling, compare and assess the resulting models, and select the best and most consistent ones. Two QSAR datasets are applied as case studies for the comparison of model performance parameters and model selection methods. We demonstrate the capabilities of sum of ranking differences (SRD) in model selection and ranking, and identify the best performance indicators and models. While the exchange of the original training and (external) test sets does not affect the ranking of performance parameters, it provides improved models in certain cases (despite the lower number of molecules in the training set). Performance parameters for external validation are substantially separated from the other merits in SRD analyses, highlighting their value in data fusion.
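The sum of ranking differences (SRD) merit used above admits a compact sketch. The version below uses the customary row-average reference and simple ordinal ranks; function names are ours, and tie handling is deliberately naive:

```python
import numpy as np

def srd_scores(matrix, reference=None):
    """Sum of ranking differences: columns are methods/models, rows are
    objects (e.g. molecules). Each column's ranking is compared with the
    ranking of a reference (row-wise average by default); a smaller SRD
    means the model orders the objects more like the consensus."""
    X = np.asarray(matrix, dtype=float)
    ref = X.mean(axis=1) if reference is None else np.asarray(reference, float)

    def ranks(v):
        # ordinal 1-based ranks; ties broken by position (naive)
        order = np.argsort(v)
        r = np.empty_like(order)
        r[order] = np.arange(1, len(v) + 1)
        return r

    ref_ranks = ranks(ref)
    return np.array([np.abs(ranks(X[:, j]) - ref_ranks).sum()
                     for j in range(X.shape[1])])
```

Columns whose ranking matches the reference score 0; larger values indicate models farther from the consensus, which is how SRD ranks performance parameters and models.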
OligoIS: Scalable Instance Selection for Class-Imbalanced Data Sets.
García-Pedrajas, Nicolás; Perez-Rodríguez, Javier; de Haro-García, Aida
2013-02-01
In current research, an enormous amount of information is constantly being produced, which poses a challenge for data mining algorithms. Many of the problems in extremely active research areas, such as bioinformatics, security and intrusion detection, or text mining, share the following two features: large data sets and class-imbalanced distribution of samples. Although many methods have been proposed for dealing with class-imbalanced data sets, most of these methods are not scalable to the very large data sets common to those research fields. In this paper, we propose a new approach to dealing with the class-imbalance problem that is scalable to data sets with many millions of instances and hundreds of features. This proposal is based on the divide-and-conquer principle combined with application of the selection process to balanced subsets of the whole data set. This divide-and-conquer principle allows the execution of the algorithm in linear time. Furthermore, the proposed method is easy to implement using a parallel environment and can work without loading the whole data set into memory. Using 40 class-imbalanced medium-sized data sets, we will demonstrate our method's ability to improve the results of state-of-the-art instance selection methods for class-imbalanced data sets. Using three very large data sets, we will show the scalability of our proposal to millions of instances and hundreds of features.
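The divide-and-conquer principle the abstract describes can be sketched as follows. This is our own schematic, not the OligoIS code: the majority class is split into minority-sized chunks, each balanced subset is passed to an arbitrary base instance-selection routine, and the kept instances are pooled. Because every chunk has bounded size, the pass over the data is linear in the number of instances:

```python
import numpy as np

def balanced_subset_selection(X, y, select_fn, rng=None):
    """Divide-and-conquer instance selection for class imbalance (sketch).
    The majority class is split into minority-sized chunks; each chunk is
    paired with the full minority class to form a balanced subset, and a
    base selector decides which instances of that subset to keep.
    `select_fn(Xb, yb)` must return a boolean keep-mask over the subset."""
    rng = np.random.default_rng(rng)
    y = np.asarray(y)
    maj_label = max(set(y), key=lambda c: (y == c).sum())
    maj_idx = np.flatnonzero(y == maj_label)
    min_idx = np.flatnonzero(y != maj_label)
    rng.shuffle(maj_idx)

    keep = set(min_idx)            # minority instances are always retained
    chunk = len(min_idx)
    for start in range(0, len(maj_idx), chunk):   # linear number of chunks
        part = maj_idx[start:start + chunk]
        sub = np.concatenate([part, min_idx])
        mask = select_fn(X[sub], y[sub])
        keep.update(sub[mask])
    return np.array(sorted(keep))
```

Each chunk also fits in memory on its own and can be processed in parallel, matching the scalability properties claimed in the abstract.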
Case Studies Nested in Fuzzy-Set QCA on Sufficiency: Formalizing Case Selection and Causal Inference
ERIC Educational Resources Information Center
Schneider, Carsten Q.; Rohlfing, Ingo
2016-01-01
Qualitative Comparative Analysis (QCA) is a method for cross-case analyses that works best when complemented with follow-up case studies focusing on the causal quality of the solution and its constitutive terms, the underlying causal mechanisms, and potentially omitted conditions. The anchorage of QCA in set theory demands criteria for follow-up…
Reading Strategies in Hypertexts and Factors Influencing Hyperlink Selection
ERIC Educational Resources Information Center
Protopsaltis, Aristidis
2008-01-01
Previous work applying cognitive load theory has demonstrated the effect of various text/graphic/narration relations on learning using multimedia material. Other work has looked at how the degree of integration between the text and graphics influences their use. This study set out to look at how the degree of integration between text and graphics…
What Are Some Alternatives for Working Within a Regionally Adopted Science Framework?
ERIC Educational Resources Information Center
Perkes, Victor A.
Alternatives for working within a regionally adopted framework for selecting an elementary school science program are considered in this paper. The alternatives are ranked on a scale from 0 to 5 in increasing levels of modifying a set instructional pattern: Level 0, typified by indifference to any consistent program in science; Level 1, a complete…
Preservation of human performance capacity under prolonged space flight conditions
NASA Technical Reports Server (NTRS)
Yeremin, A. V.; Bogdashevskiy, R. M.; Baburin, Y. F.
1975-01-01
Prophylactic measures directed toward preservation of health and maintenance of the performance ability of a man during prolonged space flight stress center on the selection of optimum work and rest cycles, physical exercises, the use of pharmacological agents, conditioning of the cardiovascular apparatus, etc. A specially selected set of hormone and pharmacological preparations is recommended to stimulate hemopoiesis.
Singh, Jasvinder A; Dowsey, Michelle M; Dohm, Michael; Goodman, Susan M; Leong, Amye L; Scholte Voshaar, Marieke M J H; Choong, Peter F
2017-11-01
Discussion and endorsement of the OMERACT total joint replacement (TJR) core domain set for total hip replacement (THR) and total knee replacement (TKR) for endstage arthritis; and next steps for selection of instruments. The OMERACT TJR working group met at the 2016 meeting at Whistler, British Columbia, Canada. We summarized the previous systematic reviews, the preliminary OMERACT TJR core domain set and results from previous surveys. We discussed preliminary core domains for TJR clinical trials, made modifications, and identified challenges with domain measurement. Working group participants (n = 26) reviewed, clarified, and endorsed each of the inner and middle circle domains and added a range of motion domain to the research agenda. TJR were limited to THR and TKR but included all endstage hip and knee arthritis refractory to medical treatment. Participants overwhelmingly endorsed identification and evaluation of top instruments mapping to the core domains (100%) and use of subscales of validated multidimensional instruments to measure core domains for the TJR clinical trial core measurement set (92%). An OMERACT core domain set for hip/knee TJR trials has been defined and we are selecting instruments to develop the TJR clinical trial core measurement set to serve as a common foundation for harmonizing measures in TJR clinical trials.
Prefrontal Hemodynamics of Physical Activity and Environmental Complexity During Cognitive Work.
McKendrick, Ryan; Mehta, Ranjana; Ayaz, Hasan; Scheldrup, Melissa; Parasuraman, Raja
2017-02-01
The aim of this study was to assess performance and cognitive states during cognitive work in the presence of physical work and in natural settings. Authors of previous studies have examined the interaction between cognitive and physical work, finding performance decrements in working memory. Neuroimaging has revealed increases and decreases in prefrontal oxygenated hemoglobin during the interaction of cognitive and physical work. The effect of environment on cognitive-physical dual tasking has not been previously considered. Thirteen participants were monitored with wireless functional near-infrared spectroscopy (fNIRS) as they performed an auditory 1-back task while sitting, walking indoors, and walking outdoors. Relative to sitting and walking indoors, auditory working memory performance declined when participants were walking outdoors. Sitting during the auditory 1-back task increased oxygenated hemoglobin and decreased deoxygenated hemoglobin in bilateral prefrontal cortex. Walking reduced the total hemoglobin available to bilateral prefrontal cortex. An increase in environmental complexity reduced oxygenated hemoglobin and increased deoxygenated hemoglobin in bilateral prefrontal cortex. Wireless fNIRS is capable of monitoring cognitive states in naturalistic environments. Selective attention and physical work compete with executive processing. During executive processing, loading of selective attention and physical work results in deactivation of bilateral prefrontal cortex and degraded working memory performance, indicating that physical work and concomitant selective attention may supersede executive processing in the distribution of mental resources. This research informs decision-making procedures in work where working memory, physical activity, and attention interact. Where working memory is paramount, precautions should be taken to eliminate competition from physical work and selective attention.
Zambrano, Lysien I; Pereyra-Elías, Reneé; Reyes-García, Selvin Z; Fuentes, Itzel; Mayta-Tristán, Percy
2015-01-01
We sought to evaluate the intentions of Honduran medical students to emigrate or to work in a rural setting, and their association with parental education. We performed a cross-sectional, analytic study at a Honduran medical school. Student participants completed a structured questionnaire, which assessed their intentions to emigrate or work in a rural setting after finishing medical school and the highest level of education achieved by their parents. We calculated crude and adjusted prevalence ratios with their respective 95% confidence intervals. Of 868 surveys distributed, 564 were completed. The mean age of the participants was 21 (standard deviation 3) years, and 62.2% were female. Of the respondents, 16.6% intended to emigrate to work and 11.2% intended to work in a rural setting. Higher paternal education (i.e., technical, university and postgraduate training) was associated with a higher rate of intention to emigrate. Students whose fathers underwent postgraduate education were less likely to intend to work in a rural setting. For maternal education, only the postgraduate level was associated with the outcomes in some of the tested models. The frequency of students intending to emigrate was relatively low. However, the frequency of students being willing to work in rural settings was also low. Students whose parents had higher levels of education were more likely to intend to work abroad and less likely to intend to work in a rural area. These factors should be considered in medical schools' selection processes to improve retention and ensure adequate distribution of physicians.
NASA Astrophysics Data System (ADS)
Kurchatkin, I. V.; Gorshkalev, A. A.; Blagin, E. V.
2017-01-01
This article deals with methods developed for modelling the working processes in the combustion chamber of an internal combustion engine (ICE). The methods cover preparation of a 3D model of the combustion chamber, generation of the finite-element mesh, setting of boundary conditions, and customization of the solution. The M-14 aircraft radial engine was selected for modelling. A cold-blowdown cycle was carried out in the ANSYS IC Engine software. The obtained data were compared with the results of known calculation methods. A method for improving the engine's induction port was suggested.
SU-E-T-446: Group-Sparsity Based Angle Generation Method for Beam Angle Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, H
2015-06-15
Purpose: This work is to develop an effective algorithm for beam angle optimization (BAO), with the emphasis on enabling further improvement from existing treatment-dependent templates based on clinical knowledge and experience. Methods: The proposed BAO algorithm utilizes a priori beam angle templates as the initial guess and iteratively generates angular updates for this initial set (the angle generation method), with improved dose conformality as quantitatively measured by the objective function. During each iteration, we select "the test angle" in the initial set and use group-sparsity based fluence map optimization to identify "the candidate angle" for updating "the test angle": all angles in the initial set except "the test angle" (together, "the fixed set") are set free, i.e., carry no group-sparsity penalty, while the remaining angles, including "the test angle", form "the working set". "The candidate angle" is then selected as the angle in "the working set" with locally maximal group sparsity that yields the smallest objective function value, and it replaces "the test angle" if "the fixed set" plus "the candidate angle" attains a smaller objective function value under the standard fluence map optimization (with no group-sparsity regularization). The other angles in the initial set are in turn selected as "the test angle" for angular updates, and this chain of updates is iterated until a full loop identifies no new angular update. Results: Tests using the MGH public prostate dataset demonstrated the effectiveness of the proposed BAO algorithm; for example, the optimized angular set from the proposed BAO algorithm was better than the MGH template. Conclusion: A new BAO algorithm is proposed based on the angle generation method via group sparsity, with improved dose conformality over the given template.
Hao Gao was partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000) and the Shanghai Pujiang Talent Program (#14PJ1404500).
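The group-sparsity based fluence map optimization at the heart of the method can be sketched as a proximal-gradient solve of a group-lasso problem. The code below is a simplified illustration under our own naming: the dose matrix D, prescription d, and quadratic objective are placeholders, not the paper's clinical formulation, and each group collects the beamlets of one candidate angle:

```python
import numpy as np

def group_sparse_fmo(D, d, groups, lam, iters=200, step=None):
    """Fluence-map optimisation with a group-sparsity penalty (sketch):
    minimise 0.5 * ||D x - d||^2 + lam * sum_g ||x_g||_2 over x >= 0
    by proximal gradient descent. Groups driven exactly to zero
    correspond to beam angles that can be dropped."""
    D, d = np.asarray(D, float), np.asarray(d, float)
    x = np.zeros(D.shape[1])
    if step is None:
        step = 1.0 / np.linalg.norm(D, 2) ** 2   # 1 / Lipschitz constant
    for _ in range(iters):
        x = x - step * D.T @ (D @ x - d)         # gradient step on the fit
        x = np.maximum(x, 0.0)                   # fluence is non-negative
        for g in groups:                         # group soft-thresholding
            nrm = np.linalg.norm(x[g])
            x[g] = 0.0 if nrm <= step * lam else x[g] * (1 - step * lam / nrm)
    return x
```

In the algorithm above, the angles of "the fixed set" would simply be excluded from `groups` (no penalty), so only "the working set" competes for sparsity.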
ERIC Educational Resources Information Center
Dollerup, Cay, Ed.; Lindegaard, Annette, Ed.
This selection of papers starts with insights into multi- and plurilingual settings, then proceeds to discussions of aims for practical work with students, and ends with visions of future developments within translation for the mass media and the impact of machine translation. Papers are: "Interpreting at the European Commission";…
Motivation and Quality of Work Life among Secondary School EFL Teachers
ERIC Educational Resources Information Center
Baleghizadeh, Sasan; Gordani, Yahya
2012-01-01
This study set out to investigate the relationship between quality of work life and teacher motivation among 160 secondary school English as a foreign language (EFL) teachers in Tehran, Iran. In addition, 30 of the participants were randomly selected to take part in follow-up interviews which asked why they felt the way they reported. The results…
The New Urban High School: A Practitioner's Guide.
ERIC Educational Resources Information Center
Big Picture Co., Cambridge, MA.
In October 1996, the Big Picture Company set out to find six urban high schools that use school-to-work strategies as a lever for whole-school reform. In the schools finally selected for the New Urban High Schools Project, and in others examined for the study, "school-to-work" is a misnomer, because the majority of students are entering…
The Motivation to Work. Special Supplement to "The Selection of Trainees under MDTA."
ERIC Educational Resources Information Center
INDIK, BERNARD P.
The purpose of this study was to build a set of measures which would provide insight into people's "motivation to work." A systematic 10 percent sample, 1,958 persons, was drawn from the registered population of the Newark Employment Service in late 1964. A sample of 500 persons, classified into eight categories on the basis of a…
Visual statistical learning is not reliably modulated by selective attention to isolated events
Musz, Elizabeth; Weber, Matthew J.; Thompson-Schill, Sharon L.
2014-01-01
Recent studies of visual statistical learning (VSL) indicate that the visual system can automatically extract temporal and spatial relationships between objects. We report several attempts to replicate and extend earlier work (Turk-Browne et al., 2005) in which observers performed a cover task on one of two interleaved stimulus sets, resulting in learning of temporal relationships that occur in the attended stream, but not those present in the unattended stream. Across four experiments, we exposed observers to a similar or identical familiarization protocol, directing attention to one of two interleaved stimulus sets; afterward, we assessed VSL efficacy for both sets using either implicit response-time measures or explicit familiarity judgments. In line with prior work, we observe learning for the attended stimulus set. However, unlike previous reports, we also observe learning for the unattended stimulus set. When instructed to selectively attend to only one of the stimulus sets and ignore the other set, observers could extract temporal regularities for both sets. Our efforts to experimentally decrease this effect by changing the cover task (Experiment 1) or the complexity of the statistical regularities (Experiment 3) were unsuccessful. A fourth experiment using a different assessment of learning likewise failed to show an attentional effect. Simulations drawing random samples from our first three experiments (n=64) confirm that the distribution of attentional effects in our sample closely approximates the null. We offer several potential explanations for our failure to replicate earlier findings, and discuss how our results suggest limiting conditions on the relevance of attention to VSL. PMID:25172196
Development of the International Spinal Cord Injury Activities and Participation Basic Data Set.
Post, M W; Charlifue, S; Biering-Sørensen, F; Catz, A; Dijkers, M P; Horsewell, J; Noonan, V K; Noreau, L; Tate, D G; Sinnott, K A
2016-07-01
Consensus decision-making process. The objective of this study was to develop an International Spinal Cord Injury (SCI) Activities and Participation (A&P) Basic Data Set. International working group. A committee of experts was established to select and define A&P data elements to be included in this data set. A draft data set was developed and posted on the International Spinal Cord Society (ISCoS) and American Spinal Injury Association websites and was also disseminated among appropriate organizations for review. Suggested revisions were considered, and a final version of the A&P Data Set was completed. Consensus was reached to define A&P and to incorporate both performance and satisfaction ratings. Items that were considered core to each A&P domain were selected from two existing questionnaires. Four items measuring activities were selected from the Spinal Cord Independence Measure III to provide basic data on task execution in activities of daily living. Eight items were selected from the Craig Handicap Assessment and Reporting Technique to provide basic data on the frequency of participation. An additional rating of satisfaction on a three-point scale for each item completes the total of 24 A&P variables. Collection of the International SCI A&P Basic Data Set variables in all future research on SCI outcomes is advised to facilitate comparison of results across published studies from around the world. Additional standardised instruments to assess activities of daily living or participation can be administered, depending on the purpose of a particular study.
Escalona Galvis, Luis Waldo; Diaz-Montiel, Paulina; Venkataraman, Satchi
2017-01-01
Electrical Resistance Tomography (ERT) offers a non-destructive evaluation (NDE) technique that takes advantage of the inherent electrical properties in carbon fiber reinforced polymer (CFRP) composites for internal damage characterization. This paper investigates a method of optimum selection of sensing configurations for delamination detection in thick cross-ply laminates using ERT. Reduction in the number of sensing locations and measurements is necessary to minimize hardware and computational effort. The present work explores the use of an effective independence (EI) measure originally proposed for sensor location optimization in experimental vibration modal analysis. The EI measure is used for selecting the minimum set of resistance measurements among all possible combinations resulting from selecting sensing electrode pairs. Singular Value Decomposition (SVD) is applied to obtain a spectral representation of the resistance measurements in the laminate for subsequent EI based reduction to take place. The electrical potential field in a CFRP laminate is calculated using finite element analysis (FEA) applied on models for two different laminate layouts considering a set of specified delamination sizes and locations with two different sensing arrangements. The effectiveness of the EI measure in eliminating redundant electrode pairs is demonstrated by performing inverse identification of damage using the full set and the reduced set of resistance measurements. This investigation shows that the EI measure is effective for optimally selecting the electrode pairs needed for resistance measurements in ERT based damage detection. PMID:28772485
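The effective independence (EI) reduction described above can be sketched as a backward-elimination loop. This is a generic EI formulation under our own naming, not the authors' code: rows of S stand for candidate electrode-pair resistance measurements across the simulated damage cases, and the row contributing least to the independence of the leading singular vectors is discarded at each step:

```python
import numpy as np

def effective_independence_selection(S, n_keep, n_modes=None):
    """Effective independence (EI) reduction (sketch). Rows of S are
    candidate measurements, columns the damage cases. At each step the
    SVD gives a spectral basis Phi of the retained rows; since
    Phi^T Phi = I, the EI of row i is its squared-row-sum of Phi, i.e.
    the diagonal of Phi (Phi^T Phi)^-1 Phi^T. The lowest-EI (most
    redundant) row is dropped until n_keep rows remain."""
    S = np.asarray(S, dtype=float)
    idx = list(range(S.shape[0]))
    while len(idx) > n_keep:
        A = S[idx]
        U, sv, _ = np.linalg.svd(A, full_matrices=False)
        k = n_modes if n_modes is not None else int(np.sum(sv > 1e-10 * sv[0]))
        Phi = U[:, :k]
        ei = np.sum(Phi ** 2, axis=1)
        idx.pop(int(np.argmin(ei)))   # drop the least informative row
    return idx
```

Two nearly parallel rows (redundant electrode pairs) split one unit of EI between them, so one of the pair is eliminated before any independent measurement is touched.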
Knowledge mining from clinical datasets using rough sets and backpropagation neural network.
Nahato, Kindie Biredagn; Harichandran, Khanna Nehemiah; Arputharaj, Kannan
2015-01-01
The availability of clinical datasets and knowledge mining methodologies encourages researchers to pursue research in extracting knowledge from clinical datasets. Different data mining techniques have been used for mining rules, and mathematical models have been developed to assist the clinician in decision making. The objective of this research is to build a classifier that will predict the presence or absence of a disease by learning from a minimal set of attributes extracted from the clinical dataset. In this work, a rough set indiscernibility relation method with a backpropagation neural network (RS-BPNN) is used. This work has two stages. The first stage is the handling of missing values, to obtain a smooth data set, and the selection of appropriate attributes from the clinical dataset by the indiscernibility relation method. The second stage is classification using a backpropagation neural network on the selected reducts of the dataset. The classifier has been tested with the hepatitis, Wisconsin breast cancer, and Statlog heart disease datasets obtained from the University of California at Irvine (UCI) machine learning repository. The accuracy obtained from the proposed method is 97.3%, 98.6%, and 90.4% for hepatitis, breast cancer, and heart disease, respectively. The proposed system provides an effective classification model for clinical datasets.
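The indiscernibility-relation attribute selection in the first stage can be sketched as below. This is a textbook rough-set formulation under our own function names, not the RS-BPNN implementation; the brute-force reduct search is for illustration only, since practical tools use heuristics at clinical-dataset scale:

```python
from itertools import combinations

def indiscernibility(rows, attrs):
    """Partition object indices into indiscernibility classes: objects
    are indiscernible if they agree on every attribute in `attrs`."""
    classes = {}
    for i, row in enumerate(rows):
        key = tuple(row[a] for a in attrs)
        classes.setdefault(key, set()).add(i)
    return list(classes.values())

def positive_region(rows, attrs, labels):
    """Objects whose indiscernibility class is label-pure: the objects
    this attribute subset classifies without ambiguity."""
    pos = set()
    for cls in indiscernibility(rows, attrs):
        if len({labels[i] for i in cls}) == 1:
            pos |= cls
    return pos

def minimal_reduct(rows, labels):
    """Smallest attribute subset whose positive region equals that of
    the full attribute set (exhaustive search, illustration only)."""
    all_attrs = list(range(len(rows[0])))
    full_pos = positive_region(rows, all_attrs, labels)
    for size in range(1, len(all_attrs) + 1):
        for subset in combinations(all_attrs, size):
            if positive_region(rows, list(subset), labels) == full_pos:
                return list(subset)
    return all_attrs
```

The returned reduct plays the role of the "minimal set of attributes" the abstract describes: only those columns would then be passed to the backpropagation network.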
Scerri, Anthony; Innes, Anthea; Scerri, Charles
2017-08-01
Although the literature describing and evaluating training programmes in hospital settings has increased in recent years, no reviews summarise these programmes. This review sought to address this gap by collecting the current evidence on dementia training programmes directed at staff working in general hospitals. Literature from five databases was searched on the basis of a number of inclusion criteria. The selected studies were summarised, and data were extracted and compared using narrative synthesis based on a set of pre-defined categories. Methodological quality was assessed. Fourteen peer-reviewed studies were identified, the majority being pre-test post-test investigations. No randomised controlled trials were found. Methodological quality was variable, with selection bias being the major limitation. There was great variability in development and mode of delivery, although interdisciplinary, ward-based, tailor-made short sessions using experiential and active learning were the most utilised. The majority of the studies mainly evaluated learning, with few studies evaluating changes in staff behaviour/practices and patient outcomes. This review indicates that high-quality studies are needed that especially evaluate staff behaviours and patient outcomes and their sustainability over time. It also highlights measures that could be used to develop and deliver training programmes in hospital settings.
Lepre, Jorge; Rice, J Jeremy; Tu, Yuhai; Stolovitzky, Gustavo
2004-05-01
Despite the growing literature devoted to finding differentially expressed genes in assays probing different tissues types, little attention has been paid to the combinatorial nature of feature selection inherent to large, high-dimensional gene expression datasets. New flexible data analysis approaches capable of searching relevant subgroups of genes and experiments are needed to understand multivariate associations of gene expression patterns with observed phenotypes. We present in detail a deterministic algorithm to discover patterns of multivariate gene associations in gene expression data. The patterns discovered are differential with respect to a control dataset. The algorithm is exhaustive and efficient, reporting all existent patterns that fit a given input parameter set while avoiding enumeration of the entire pattern space. The value of the pattern discovery approach is demonstrated by finding a set of genes that differentiate between two types of lymphoma. Moreover, these genes are found to behave consistently in an independent dataset produced in a different laboratory using different arrays, thus validating the genes selected using our algorithm. We show that the genes deemed significant in terms of their multivariate statistics will be missed using other methods. Our set of pattern discovery algorithms including a user interface is distributed as a package called Genes@Work. This package is freely available to non-commercial users and can be downloaded from our website (http://www.research.ibm.com/FunGen).
2010-02-05
Smoke-free policies (i.e., policies that completely eliminate smoking in indoor workplaces and public places) result in health benefits, including preventing heart attacks. Preemptive legislation at the state level prohibits localities from enacting laws that vary from state law or are more stringent. A Healthy People 2010 objective (27-19) is to eliminate state laws that preempt stronger local tobacco control laws. A 2005 CDC review found that little progress was being made toward reducing the number of state laws preempting local smoking restrictions in three indoor settings: government work sites, private-sector work sites, and restaurants. These three settings were selected for analysis because they are settings that often are addressed by state and local smoking restrictions and because they are major settings where nonsmoking workers and patrons are exposed to secondhand smoke. This report updates the previous analysis and summarizes changes that occurred from December 31, 2004, to December 31, 2009, in state laws that preempt local smoke-free laws for the same three settings. During that period, the number of states preempting local smoking restrictions in at least one of these three settings decreased from 19 to 12. In contrast with the 2005 findings, this decrease indicates progress toward achieving the goal of eliminating state laws preempting local smoking restrictions. Further progress could result in additional reductions in secondhand smoke exposure.
Löbner, Margrit; Luppa, Melanie; Konnopka, Alexander; Meisel, Hans J; Günther, Lutz; Meixensberger, Jürgen; Stengler, Katarina; Angermeyer, Matthias C; König, Hans-Helmut; Riedel-Heller, Steffi G
2014-01-01
To examine rehabilitation preferences, participation and determinants for the choice of a certain rehabilitation setting (inpatient vs. outpatient) and setting-specific rehabilitation outcomes. The longitudinal observational study referred to 534 consecutive disc surgery patients (18-55 years). Face-to-face baseline interviews took place about 3.6 days after disc surgery during the acute hospital stay. 486 patients also participated in a follow-up interview via telephone three months later (dropout rate: 9%). The following instruments were used: depression and anxiety (Hospital Anxiety and Depression Scale), pain intensity (numeric analog scale), health-related quality of life (Short Form 36 Health Survey), and subjective prognosis of gainful employment (SPE scale), as well as questions on rehabilitation attendance, return to work, and number of sick leave days. The vast majority of patients undergoing surgery for a herniated disc attended a post-hospital rehabilitation treatment program (93%). Two-thirds of these patients took part in an inpatient rehabilitation program (67.9%). Physical, psychological, vocational and health-related quality of life characteristics differed widely before as well as after rehabilitation, depending on the setting. Inpatient rehabilitees were significantly older and reported more pain, worse physical quality of life, more anxiety and depression, and a worse subjective prognosis of gainful employment before rehabilitation. Pre-rehabilitation differences remained significant after rehabilitation. More than half of the outpatient rehabilitees (56%) compared to only one third of the inpatient rehabilitees (33%) returned to work three months after disc surgery (p<.001). The results suggest a "pre-selection" of patients with better health status in outpatient rehabilitation. Gaining better knowledge about setting-specific selection processes may help optimize rehabilitation allocation procedures and improve rehabilitation effects such as return to work.
Inferring the Mode of Selection from the Transient Response to Demographic Perturbations
NASA Astrophysics Data System (ADS)
Balick, Daniel; Do, Ron; Reich, David; Sunyaev, Shamil
2014-03-01
Despite substantial recent progress in theoretical population genetics, most models work under the assumption of a constant population size. Deviations from fixed population sizes are ubiquitous in natural populations, many of which experience population bottlenecks and re-expansions. The non-equilibrium dynamics introduced by a large perturbation in population size are generally viewed as a confounding factor. In the present work, we take advantage of the transient response to a population bottleneck to infer features of the mode of selection and the distribution of selective effects. We develop an analytic framework and a corresponding statistical test that qualitatively differentiates between alleles under additive and those under recessive or more general epistatic selection. This statistic can be used to bound the joint distribution of selective effects and dominance effects in any diploid sexual organism. We apply this technique to human population genetic data, and severely restrict the space of allowed selective coefficients in humans. Additionally, one can test a set of functionally or medically relevant alleles for the primary mode of selection, or determine the local regional variation in dominance coefficients along the genome.
Nursing Work in Long-Term Care: An Integrative Review.
Montayre, Jed; Montayre, Jasmine
2017-11-01
Evidence suggests that delivery of good nursing care in long-term care (LTC) facilities is reflected in nurses' descriptions of the factors and structures that affect their work. Understanding the contemporary nature of nursing work in aged care will influence policies for improving current work structures in this practice setting. The current review aims to present a contemporary perspective on RNs' work in LTC facilities. A comprehensive search and purposeful selection of the literature was conducted using the CINAHL, PubMed, Medline, Scopus, and Google Scholar databases. Nine studies were eligible for review. Common themes revealed that nursing work in aged care settings is characterized by RNs providing indirect care tasks (primarily care coordination), engaging in non-nursing activities, and having an expanded and overlapping role. As care providers, aged care RNs do not always provide direct care as part of their nursing work. The scope of RN work beyond its clinical nature, and the performance of non-nursing tasks, adds complexity to clarifying RN work roles in aged care. [Journal of Gerontological Nursing, 43(11), 41-49.]. Copyright 2017, SLACK Incorporated.
Hybrid feature selection algorithm using symmetrical uncertainty and a harmony search algorithm
NASA Astrophysics Data System (ADS)
Salameh Shreem, Salam; Abdullah, Salwani; Nazri, Mohd Zakree Ahmad
2016-04-01
Microarray technology can be used as an efficient diagnostic system to recognise diseases such as tumours or to discriminate between different types of cancers in normal tissues. This technology has received increasing attention from the bioinformatics community because of its potential in designing powerful decision-making tools for cancer diagnosis. However, the presence of thousands or tens of thousands of genes affects the predictive accuracy of this technology from the perspective of classification. Thus, a key issue in microarray data is identifying or selecting the smallest possible set of genes from the input data that can achieve good predictive accuracy for classification. In this work, we propose a two-stage selection algorithm for gene selection problems in microarray datasets, called the symmetrical uncertainty filter and harmony search algorithm wrapper (SU-HSA). Experimental results show that the SU-HSA is better than the HSA in isolation for all datasets in terms of accuracy, and achieves a lower number of genes on 6 out of 10 instances. Furthermore, the comparison with state-of-the-art methods shows that our proposed approach is able to obtain 5 (out of 10) new best results in terms of the number of selected genes, and competitive results in terms of classification accuracy.
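The filter stage of SU-HSA ranks genes by symmetrical uncertainty, a normalized mutual information, SU(X, Y) = 2·I(X; Y) / (H(X) + H(Y)). A minimal sketch of that first stage follows; the harmony-search wrapper is omitted, and `su_filter` and the toy data are illustrative, not the authors' code:

```python
import numpy as np
from collections import Counter

def entropy(x):
    """Shannon entropy (base 2) of a discrete sequence."""
    counts = np.array(list(Counter(x).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def symmetrical_uncertainty(x, y):
    """SU(X, Y) = 2 * I(X; Y) / (H(X) + H(Y)), bounded in [0, 1]."""
    hx, hy = entropy(x), entropy(y)
    hxy = entropy(list(zip(x, y)))   # joint entropy H(X, Y)
    mi = hx + hy - hxy               # mutual information I(X; Y)
    denom = hx + hy
    return 0.0 if denom == 0 else 2.0 * mi / denom

def su_filter(genes, labels, k):
    """Filter stage: rank discretized genes by SU with the class label
    and keep the top k, before handing them to a wrapper search."""
    scores = [(symmetrical_uncertainty(g, labels), i) for i, g in enumerate(genes)]
    scores.sort(reverse=True)
    return [i for _, i in scores[:k]]
```

A gene identical to the class label scores SU = 1, while an independent one scores 0, so the filter cheaply discards uninformative genes before the expensive wrapper stage.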
Applications of information theory, genetic algorithms, and neural models to predict oil flow
NASA Astrophysics Data System (ADS)
Ludwig, Oswaldo; Nunes, Urbano; Araújo, Rui; Schnitman, Leizer; Lepikson, Herman Augusto
2009-07-01
This work introduces a new information-theoretic methodology for choosing variables and their time lags in a prediction setting, particularly when neural networks are used in non-linear modeling. The first contribution of this work is the Cross Entropy Function (XEF), proposed to select input variables and their lags in order to compose the input vector of black-box prediction models. The proposed XEF method is more appropriate than the usually applied Cross Correlation Function (XCF) when the relationship among the input and output signals comes from a non-linear dynamic system. The second contribution is a method that minimizes the Joint Conditional Entropy (JCE) between the input and output variables by means of a Genetic Algorithm (GA). The aim is to take into account the dependence among the input variables when selecting the most appropriate set of inputs for a prediction problem. In short, these methods can be used to assist the selection of input training data that have the necessary information to predict the target data. The proposed methods are applied to a petroleum engineering problem: predicting oil production. Experimental results obtained with a real-world dataset are presented, demonstrating the feasibility and effectiveness of the method.
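The core idea of scoring candidate inputs and lags by an information measure rather than linear correlation can be sketched with a plain histogram estimate of mutual information standing in for the paper's XEF. The function names, binning, and lag search here are assumptions for illustration, not the authors' definitions:

```python
import numpy as np

def mutual_info(x, y, bins=8):
    """Histogram estimate of I(X; Y) in bits for two continuous series."""
    cxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = cxy / cxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def best_lag(u, y, max_lag):
    """Return the lag of input u that shares the most information with
    output y -- an information-theoretic stand-in for picking lags by
    cross-correlation, which misses non-linear dependence."""
    scores = {lag: mutual_info(u[:-lag], y[lag:]) for lag in range(1, max_lag + 1)}
    return max(scores, key=scores.get)
```

For a system where y depends non-linearly on a delayed input (e.g. y[t] = u[t-3]²), cross-correlation can be near zero at the true lag, while the mutual-information score still peaks there.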
NASA Astrophysics Data System (ADS)
Goudarzi, Nasser
2016-04-01
In this work, two new and powerful chemometrics methods are applied for the modeling and prediction of the 19F chemical shift values of some fluorinated organic compounds. The radial basis function-partial least squares (RBF-PLS) and random forest (RF) methods are employed to construct the models to predict the 19F chemical shifts. No separate variable selection method was used, because the RF method can serve as both a variable selection and a modeling technique. Effects of the important parameters affecting the RF prediction power, such as the number of trees (nt) and the number of randomly selected variables to split each node (m), were investigated. The root-mean-square errors of prediction (RMSEP) for the training set and the prediction set for the RBF-PLS and RF models were 44.70, 23.86, 29.77, and 23.69, respectively. Also, the correlation coefficients of the prediction set for the RBF-PLS and RF models were 0.8684 and 0.9313, respectively. The results obtained reveal that the RF model can be used as a powerful chemometrics tool for quantitative structure-property relationship (QSPR) studies.
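A hedged sketch of the RF part of this workflow using scikit-learn's `RandomForestRegressor`, whose `n_estimators` and `max_features` parameters correspond to the abstract's nt and m. The descriptor matrix is synthetic, since the 19F shift data are not reproduced here:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

def rmsep(y_true, y_pred):
    """Root-mean-square error of prediction."""
    return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))

# Synthetic stand-in for a molecular descriptor matrix and shift values.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 20))
y = X[:, 0] * 10 + X[:, 1] * 5 + rng.normal(scale=0.5, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# nt = number of trees, m = variables tried at each split -- the two
# tuning parameters highlighted in the abstract.
model = RandomForestRegressor(n_estimators=200, max_features=5, random_state=0)
model.fit(X_tr, y_tr)
print(rmsep(y_te, model.predict(X_te)))
```

In practice nt and m would be tuned on a validation split, and the trees' feature-importance scores provide the "built-in" variable selection the abstract alludes to.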
van Manen, Janine; Kamphuis, Jan Henk; Visbach, Geny; Ziegler, Uli; Gerritsen, Ad; Van Rossum, Bert; Rijnierse, Piet; Timman, Reinier; Verheul, Roel
2008-11-01
Treatment selection in clinical practice is a poorly understood, often largely implicit decision process, perhaps especially for patients with personality disorders. This study, therefore, investigated how intake clinicians use information about patient characteristics to select psychotherapeutic treatment for patients with personality disorder. A structured interview with a forced-choice format was administered to 27 experienced intake clinicians working in five specialist mental health care institutes in the Netherlands. Substantial consensus was evident among intake clinicians. The results revealed that none of the presented patient characteristics were deemed relevant for the selection of the suitable treatment setting. The appropriate duration and intensity are selected using severity or personal strength variables. The theoretical orientation is selected using personal strength variables.
Lionis, Christos; Papadakaki, Maria; Saridaki, Aristoula; Dowrick, Christopher; O'Donnell, Catherine A; Mair, Frances S; van den Muijsenbergh, Maria; Burns, Nicola; de Brún, Tomas; O'Reilly de Brún, Mary; van Weel-Baumgarten, Evelyn; Spiegel, Wolfgang; MacFarlane, Anne
2016-01-01
Objectives: Guidelines and training initiatives (G/TIs) are available to support communication in cross-cultural consultations but are rarely implemented in routine practice in primary care. As part of the European Union RESTORE project, our objective was to explore whether the available G/TIs make sense to migrants and other key stakeholders and whether they could collectively choose G/TIs and engage in their implementation in primary care settings.
Setting: As part of a comparative analysis of 5 linked qualitative case studies, we used purposeful and snowball sampling to recruit migrants and other key stakeholders in primary care settings in Austria, England, Greece, Ireland and the Netherlands.
Participants: A total of 78 stakeholders participated in the study (Austria 15, England 9, Ireland 11, Greece 16, Netherlands 27), covering a range of groups (migrants, general practitioners, nurses, administrative staff, interpreters, health service planners).
Primary and secondary outcome measures: We combined Normalisation Process Theory (NPT) and Participatory Learning and Action (PLA) research to conduct a series of PLA-style focus groups. Using a standardised protocol, stakeholders' discussions about a set of G/TIs were recorded on PLA commentary charts, and their selection process was recorded through a PLA direct-ranking technique. We performed inductive and deductive thematic analysis to investigate sensemaking and engagement with the G/TIs.
Results: The need for new ways of working was strongly endorsed by most stakeholders. Stakeholders considered that they were the right people to drive the work forward and were keen to enrol others to support the implementation work. This was evidenced by the democratic selection, by stakeholders in each setting, of one G/TI as a local implementation project.
Conclusions: This theoretically informed participatory approach, used across 5 countries with diverse healthcare systems, could be used in other settings to establish positive conditions for the start of implementation journeys for G/TIs to improve healthcare for migrants. PMID:27449890
Evaluation of the attentional capacities and working memory of early and late blind persons.
Pigeon, Caroline; Marin-Lamellet, Claude
2015-02-01
Although attentional processes and working memory seem to be significantly involved in the daily activities (particularly navigation) of persons who are blind, who use these abilities to compensate for their lack of vision, few studies have investigated these mechanisms in this population. The aim of this study is to evaluate the selective, sustained and divided attention, attentional inhibition and switching, and working memory of blind persons. Early blind, late blind and sighted participants completed neuropsychological tests that were designed or adapted to be achievable in the absence of vision. The results revealed that the early blind participants outperformed the sighted ones in the selective, sustained and divided attention and working memory tests, and the late blind participants outperformed the sighted participants in selective, sustained and divided attention. However, no differences were found between the blind groups and the sighted group in the attentional inhibition and switching tests. Furthermore, no differences were found between the early and late blind participants in this set of tests. These results suggest that early and late blind persons can compensate for the lack of vision by an enhancement of attentional and working memory capacities. Copyright © 2014 Elsevier B.V. All rights reserved.
Hyperspectral data discrimination methods
NASA Astrophysics Data System (ADS)
Casasent, David P.; Chen, Xuewen
2000-12-01
Hyperspectral data provide spectral response information that gives a detailed chemical, moisture, and other description of the constituent parts of an item. These new sensor data are useful in USDA product inspection. However, such data introduce problems such as the curse of dimensionality, the need to reduce the number of features used to accommodate realistic small training set sizes, and the need to employ discriminatory features and still achieve good generalization (comparable training and test set performance). Several two-step methods are compared to a new and preferable single-step spectral decomposition algorithm. Initial results on hyperspectral data for good/bad almonds and for good/bad (aflatoxin-infested) corn kernels are presented. The hyperspectral application addressed differs greatly from prior USDA work (PLS), in which the level of a specific channel constituent in food was estimated. A validation set (separate from the test set) is used in selecting algorithm parameters. Threshold parameters are varied to select the best Pc operating point. Initial results show that nonlinear features yield improved performance.
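The point about tuning on a validation set kept separate from the test set can be illustrated with a minimal numpy sketch. The nearest-mean classifier, the 60/20/20 split, and the synthetic "bands" are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def split_train_val_test(X, y, seed=0):
    """60/20/20 split: the validation set tunes parameters, while the
    test set is touched only once, keeping train and test performance
    comparable (the abstract's generalization criterion)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    n1, n2 = int(0.6 * len(y)), int(0.8 * len(y))
    return ((X[idx[:n1]], y[idx[:n1]]),
            (X[idx[n1:n2]], y[idx[n1:n2]]),
            (X[idx[n2:]], y[idx[n2:]]))

def nearest_mean_accuracy(Xtr, ytr, Xev, yev, n_features):
    """Classify with class means, using only the first n_features bands."""
    Xtr, Xev = Xtr[:, :n_features], Xev[:, :n_features]
    means = {c: Xtr[ytr == c].mean(axis=0) for c in np.unique(ytr)}
    pred = [min(means, key=lambda c: np.linalg.norm(x - means[c])) for x in Xev]
    return float(np.mean(np.asarray(pred) == yev))

def select_n_features(Xtr, ytr, Xva, yva, candidates):
    """Pick the feature count that maximizes validation accuracy."""
    return max(candidates, key=lambda k: nearest_mean_accuracy(Xtr, ytr, Xva, yva, k))
```

With many noisy bands, validation accuracy falls as uninformative features are added, so the selection step directly addresses the curse-of-dimensionality issue the abstract raises.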
Crossing levels in systems ergonomics: a framework to support 'mesoergonomic' inquiry.
Karsh, Ben-Tzion; Waterson, Patrick; Holden, Richard J
2014-01-01
In this paper we elaborate and articulate the need for what has been termed 'mesoergonomics'. In particular, we argue that the concept has the potential to bridge the gap between, and integrate, established work within the domains of micro- and macroergonomics. Mesoergonomics is defined as an open systems approach to human factors and ergonomics (HFE) theory and research whereby the relationship between variables in at least two different system levels or echelons is studied, and where the dependent variables are human factors and ergonomic constructs. We present a framework which can be used to structure a set of questions for future work and prompt further empirical and conceptual inquiry. The framework consists of four steps: (1) establishing the purpose of the mesoergonomic investigation; (2) selecting human factors and ergonomics variables; (3) selecting a specific type of mesoergonomic investigation; and (4) establishing relationships between system levels. In addition, we describe two case studies which illustrate the workings of the framework and the value of adopting a mesoergonomic perspective within HFE. The paper concludes with a set of issues which could form part of a future agenda for research within systems ergonomics. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Health Services Mobility Study, Plan of Work.
ERIC Educational Resources Information Center
City Univ. of New York Research Foundation, NY.
To determine ways and means of facilitating horizontal and vertical mobility within New York City's Health Services Administration and selected private hospitals, a systems approach was adopted. Methodology for manpower development and training in an organizational setting related to the educational system and other accrediting institutions will…
45 CFR 1311.4 - Qualifications, selection, and placement.
Code of Federal Regulations, 2013 CFR
2013-10-01
... DEVELOPMENT SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES THE ADMINISTRATION FOR CHILDREN, YOUTH AND... Start program or otherwise working in the field of child development and family services. The... with services to children and families; and (5) Other appropriate settings. (c) A Head Start Fellow who...
45 CFR 1311.4 - Qualifications, selection, and placement.
Code of Federal Regulations, 2014 CFR
2014-10-01
... DEVELOPMENT SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES THE ADMINISTRATION FOR CHILDREN, YOUTH AND... Start program or otherwise working in the field of child development and family services. The... with services to children and families; and (5) Other appropriate settings. (c) A Head Start Fellow who...
45 CFR 1311.4 - Qualifications, selection, and placement.
Code of Federal Regulations, 2012 CFR
2012-10-01
... DEVELOPMENT SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES THE ADMINISTRATION FOR CHILDREN, YOUTH AND... Start program or otherwise working in the field of child development and family services. The... with services to children and families; and (5) Other appropriate settings. (c) A Head Start Fellow who...
Summers, Rachael H; Ballinger, Claire; Nikoletou, Dimitra; Garrod, Rachel; Bruton, Anne; Leontowitsch, Miranda
2017-07-01
To explore respiratory physiotherapists' views and experiences of using goal-setting with people with chronic obstructive pulmonary disease in rehabilitation settings. A total of 17 respiratory physiotherapists with ⩾12 months of current or previous experience of working with patients with chronic obstructive pulmonary disease in a non-acute setting participated. Participants were diverse in relation to age (25-49 years), sex (13 women), experience (Agenda for Change bands 6-8) and geographic location. Data were collected via face-to-face qualitative in-depth interviews (40-70 minutes) using a semi-structured interview guide. Interview locations were selected by participants (including participants' homes, public places and the University). Interviews were audio-recorded and transcribed verbatim. Data were analysed using thematic analysis; constant comparison was made within and between accounts, and negative case analysis was used. Three themes emerged through the process of analysis: (1) 'Explaining goal-setting'; (2) 'Working with goals'; and (3) 'Influences on collaborative goal-setting'. Goal-setting practices among respiratory physiotherapists varied considerably. Collaborative goal-setting was described as challenging and was sometimes driven by service need rather than patient values. Lack of training in collaborative goal-setting at both undergraduate and postgraduate level was also seen as an issue. Respiratory physiotherapists reflected uncertainties around the use of goal-setting in their practice, and conflict between patients' goals and organisational demands. This work highlights a need for wider discussion to clarify the purpose and implementation of goal-setting in respiratory rehabilitation.
A Methodology for Selection of a Satellite Servicing Architecture. Volume 3. Appendices.
1985-12-01
draft copies of this work is greatly appreciated. ... A special measure of gratitude belongs to Major Dennis Clark for his help with...bute pair when its complementary set changed values, that pair of attributes can be considered PPI of its complementary set. It is recommended that...a system. 2-2 Mission Accomplishment: The system is desired to accomplish its mission by meeting various specifications. These specifications
Gajski, Goran; Ladeira, Carina; Gerić, Marko; Garaj-Vrhovac, Vera; Viegas, Susana
2018-02-01
Cytostatic drugs are highly cytotoxic agents used in cancer treatment and although their benefit is unquestionable, they have been recognized as hazardous to healthcare professionals in occupational settings. In a working environment, simultaneous exposure to cytostatics may occur creating a higher risk than that of a single substance. Hence, the present study evaluated the combined cyto/genotoxicity of a mixture of selected cytostatics with different mechanisms of action (MoA; 5-fluorouracil, cyclophosphamide and paclitaxel) towards human lymphocytes in vitro at a concentration range relevant for occupational as well as environmental exposure. The results suggest that the selected cytostatic drug mixture is potentially cyto/genotoxic and that it can induce cell and genome damage even at low concentrations. This indicates not only that such mixture may pose a risk to cell and genome integrity, but also that single compound toxicity data are not sufficient for the prediction of toxicity in a complex working environment. The presence of drugs in different amounts and with different MoA suggests the need to study the relationship between the presence of genotoxic components in the mixture and the resulting effects, taking into account the MoA of each component by itself. Therefore, this study provides new data sets necessary for scientifically-based risk assessments of cytostatic drug mixtures in occupational as well as environmental settings. Copyright © 2017 Elsevier Inc. All rights reserved.
Small Engine Technology (SET) Task 24 Business and Regional Aircraft System Studies
NASA Technical Reports Server (NTRS)
Lieber, Lysbeth
2003-01-01
This final report has been prepared by Honeywell Engines & Systems, Phoenix, Arizona, a unit of Honeywell International Inc., documenting work performed during the period June 1999 through December 1999 for the National Aeronautics and Space Administration (NASA) Glenn Research Center, Cleveland, Ohio, under the Small Engine Technology (SET) Program, Contract No. NAS3-27483, Task Order 24, Business and Regional Aircraft System Studies. The work performed under SET Task 24 consisted of evaluating the noise reduction benefits compared to the baseline noise levels of representative 1992 technology aircraft, obtained by applying different combinations of noise reduction technologies to five business and regional aircraft configurations. This report focuses on the selection of the aircraft configurations and noise reduction technologies, the prediction of noise levels for those aircraft, and the comparison of the noise levels with those of the baseline aircraft.
Ma, Yue; Yin, Fei; Zhang, Tao; Zhou, Xiaohua Andrew; Li, Xiaosong
2016-01-01
Spatial scan statistics are widely used in various fields. The performance of these statistics is influenced by parameters, such as maximum spatial cluster size, and can be improved by parameter selection using performance measures. Current performance measures are based on the presence of clusters and are thus inapplicable to data sets without known clusters. In this work, we propose a novel overall performance measure called maximum clustering set–proportion (MCS-P), which is based on the likelihood of the union of detected clusters and the applied dataset. MCS-P was compared with existing performance measures in a simulation study to select the maximum spatial cluster size. Results of other performance measures, such as sensitivity and misclassification, suggest that the spatial scan statistic achieves accurate results in most scenarios with the maximum spatial cluster sizes selected using MCS-P. Given that previously known clusters are not required in the proposed strategy, selection of the optimal maximum cluster size with MCS-P can improve the performance of the scan statistic in applications without identified clusters. PMID:26820646
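MCS-P itself is defined in the paper, but the building block it evaluates, the scan statistic's likelihood ratio, can be sketched for a one-dimensional Poisson setting. This is a standard Kulldorff-style formulation; the contiguous windowing scheme is an illustrative simplification of a spatial scan:

```python
import math

def poisson_llr(c, e, C):
    """Kulldorff-style Poisson log-likelihood ratio for a candidate zone
    with c observed and e expected cases, out of C total cases
    (one-sided: only excess-risk zones score)."""
    if c <= e or c == 0:
        return 0.0
    inside = c * math.log(c / e)
    outside = (C - c) * math.log((C - c) / (C - e)) if C > c else 0.0
    return inside + outside

def best_zone(counts, expected, max_size):
    """Scan contiguous windows up to max_size regions and return
    (llr, start, size). The max_size cap is the tuning parameter that
    MCS-P-style measures are designed to select."""
    C = sum(counts)
    best = (0.0, 0, 0)
    for size in range(1, max_size + 1):
        for start in range(len(counts) - size + 1):
            c = sum(counts[start:start + size])
            e = sum(expected[start:start + size])
            best = max(best, (poisson_llr(c, e, C), start, size))
    return best
```

Choosing `max_size` too small can miss large clusters, and too large a cap dilutes true clusters with surrounding regions, which is exactly the trade-off the proposed MCS-P measure addresses without requiring known clusters.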
Measuring nurses' perception of work environment: a scoping review of questionnaires.
Norman, Rebecka Maria; Sjetne, Ingeborg Strømseng
2017-01-01
Nurses' work environment has been shown to be associated with quality of care and organizational outcomes. In order to monitor the work environment, it is useful for all stakeholders to know the questionnaires that assess or evaluate conditions for delivering nursing care. The aim of this article is to review the literature for assessed survey questionnaires that measure nurses' perception of their work environment, make a brief assessment, and map the content domains included in a selection of questionnaires. The search included electronic databases of internationally published literature, international websites, and hand searches of reference lists. Eligible papers describing a questionnaire had to be: a) suitable for nurses working in direct care in general hospitals, nursing homes or home healthcare settings; b) constructed to measure work environment characteristics that are amenable to change and related to patient and organizational outcomes; and c) presented along with an assessment of their measurement properties. The search yielded 5077 unique articles. For the final synthesis, 65 articles met the inclusion criteria, covering 34 questionnaires that measure nursing work environments in different settings. Most of the questionnaires that we found were developed, and tested, for registered nurses in a general hospital setting. Six questionnaires were developed specifically for use in nursing home settings and one for home healthcare. The content domains covered by the questionnaires were both overlapping and unique, and the terminology in use was inconsistent. The most common content domains in the work environment questionnaires were supportive managers, collaborative relationships with peers, busyness, professional practice and autonomy. The findings from this review enhance the understanding of how "work environment" can be measured through an overview of existing questionnaires and domains.
Our results indicate that there are numerous work environment questionnaires with varying content.
Breast Cancer and Women's Labor Supply
Bradley, Cathy J; Bednarek, Heather L; Neumark, David
2002-01-01
Objective: To investigate the effect of breast cancer on women's labor supply.
Data Source/Study Setting: Using the 1992 Health and Retirement Study, we estimate the probability of working using probit regression and then, for women who are employed, we estimate regressions for average weekly hours worked using ordinary least squares (OLS). We control for health status by using responses to perceived health status and comorbidities. For a sample of married women, we control for spouses' employer-based health insurance. We also perform additional analyses to detect selection bias in our sample.
Principal Findings: We find that the probability of breast cancer survivors working is 10 percentage points less than that for women without breast cancer. Among women who work, breast cancer survivors work approximately three more hours per week than women who do not have cancer. Results of similar magnitude persist after health status is controlled in the analysis, and although we could not definitively rule out selection bias, we could not find evidence that our results are attributable to selection bias.
Conclusions: For some women, breast cancer may impose an economic hardship because it causes them to leave their jobs. However, for women who survive and remain working, this study failed to show a negative effect on hours worked associated with breast cancer. Perhaps the morbidity associated with certain types and stages of breast cancer and its treatment does not interfere with work. PMID:12479498
Pasley, Thomas; Poole, Phillippa
2009-04-03
To assess the level of interest in regional/rural (RR) practice among final-year Auckland medical students and to investigate the demographic characteristics and specialty intentions of these students. A questionnaire was distributed to all graduating students from The University of Auckland's School of Medicine (SOM) in 2006 and 2007. Students intending to work in an RR setting had their demographic data and intended specialty compared with students intending to work in the city. There were 186 respondents, a response rate of 71%. Of this cohort, 58% stated an intention to work in a city, 15% in an RR setting, and 27% were undecided. RR-destined students were more likely to be Maori and less likely to be Asian than their city-destined counterparts. RR students were more likely to have strong interests in general practice than students intending to work in the city. Prior to the introduction of a specific rural selection pathway, Auckland medical students have shown a similar level of interest in RR medicine compared to previous studies. However, the proportion of students interested in RR health is significantly below the current proportion of people living in RR areas. The large proportion of students undecided on career setting at graduation suggests there may be room to increase the proportion further through formative early postgraduate experiences or other incentives.
1996-01-01
Overwhelming postsplenectomy infection should be preventable if simple precautions are taken. An ad hoc working party of the British Committee for Standards in Haematology has reviewed recommendations for patients without a spleen and drawn up a consensus. Members of the working party were selected for their personal expertise and to represent relevant professional bodies. The guidelines, which are set out below, include and extend the chief medical officer's 1994 update. PMID:8601117
School-Based Clinics That Work.
ERIC Educational Resources Information Center
Public Health Service (DHHS), Rockville, MD.
This paper describes a small set of successful school-based clinics (SBCs) that provide primary health care services for the underserved and identifies factors contributing to their success. Six sites were selected on the basis of three general criteria: (1) direct involvement between the SBC and a federally-funded community health center (CHC);…
Research on the aircraft level measurement by laser tracker
NASA Astrophysics Data System (ADS)
Ye, Xiaowen; Tang, Wuzhong; Cao, Chun
2014-09-01
The measuring principle of the laser tracking system is introduced. The aircraft level measurement was completed by establishing the measurement datum mark, selecting public sites, setting up the aircraft coordinate system, and transferring stations. Laser tracking measurement technology improved work efficiency and ensured the installation precision of key components.
Effective Group Work for Elementary School-Age Children Whose Parents Are Divorcing.
ERIC Educational Resources Information Center
DeLucia-Waack, Janice; Gerrity, Deborah
2001-01-01
Parental divorce is the issue of most concern for elementary school children. This article describes interventions for children-of-divorce groups for elementary school children. Suggests guidelines related to goal setting; securing agency and parental consent; leadership planning; recruitment, screening, and selection of members; group member…
A Custom Fit with a Commissioned Song.
ERIC Educational Resources Information Center
Strempel, Eileen
1999-01-01
Discusses the benefits of commissioning a work for a choral group. Provides guidelines for music educators who commission a piece: (1) know your own needs; (2) find a composer who interests you; (3) help the composer select appropriate lyrics; (4) set a tentative schedule; (5) consider the costs. (CMK)
A review on setting appropriate reach length for biological assessment of boatable rivers
Researchers working on boatable rivers are presented with the task of selecting an appropriate stream length, or reach length, from which data will be collected. Ideally, the sampling effort applied is the minimum that will allow stated objectives to be addressed as required by a...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chochoms, Michael
2017-02-23
This course presents information for working safely with portable ladders: specifically, stepladders, extension ladders, and their derivations. Additionally, this course provides limited information on the safe use of stepstools and fixed ladders. The skills, techniques, and good practices needed for selecting, inspecting, setting up and securing, and using ladders are presented in this course.
ERIC Educational Resources Information Center
Boring, Michael R.
2011-01-01
The selection of a superintendent is among the most significant decisions any school district board of directors will make. The superintendent should be the person and office through which the direction set by the board is carried out. The working relationship that develops between the board and superintendent is thus absolutely critical and the…
Emerging Models of the New Paradigm.
ERIC Educational Resources Information Center
Howser, Lee; Schwinn, Carole
Working with the Philadelphia-based Institute of Interactive Management, several teams at Jackson Community College (JCC), in Michigan, set out in 1994 to learn and apply an interactive design methodology to selected college subsystems. Interactive design begins with understanding problems faced by the system as a whole, which in the case of JCC…
DOT National Transportation Integrated Search
1993-01-01
This 2-CD set presents data and information from the 1993 Commodity Flow Survey (CFS) on the movement of goods and products shipped by manufacturing, mining, wholesale, and selected retail establishments in the United States. The data cover domestic ...
Convergent and Divergent Validity of the Learning Transfer System Inventory
ERIC Educational Resources Information Center
Holton, Elwood F., III; Bates, Reid A.; Bookter, Annette I.; Yamkovenko, V. Bogdan
2007-01-01
The Learning Transfer System Inventory (LTSI) was developed to identify a select set of factors with the potential to substantially enhance or inhibit transfer of learning to the work environment. It has undergone a variety of validation studies, including construct, criterion, and crosscultural studies. However, the convergent and divergent…
Endophenotypes for Intelligence in Children and Adolescents
ERIC Educational Resources Information Center
van Leeuwen, Marieke; van den Berg, Stephanie M.; Hoekstra, Rosa A.; Boomsma, Dorret I.
2007-01-01
The aim of this study was to identify promising endophenotypes for intelligence in children and adolescents for future genetic studies in cognitive development. Based on the available set of endophenotypes for intelligence in adults, cognitive tasks were chosen covering the domains of working memory, processing speed, and selective attention. This…
Monitoring the capacity of working memory: Executive control and effects of listening effort
Amichetti, Nicole M.; Stanley, Raymond S.; White, Alison G.
2013-01-01
In two experiments, we used an interruption-and-recall (IAR) task to explore listeners’ ability to monitor the capacity of working memory as new information arrived in real time. In this task, listeners heard recorded word lists with instructions to interrupt the input at the maximum point that would still allow for perfect recall. Experiment 1 demonstrated that the most commonly selected segment size closely matched participants’ memory span, as measured in a baseline span test. Experiment 2 showed that reducing the sound level of presented word lists to a suprathreshold but effortful listening level disrupted the accuracy of matching selected segment sizes with participants’ memory spans. The results are discussed in terms of whether online capacity monitoring may be subsumed under other, already enumerated working memory executive functions (inhibition, set shifting, and memory updating). PMID:23400826
Jović, Ozren; Smrečki, Neven; Popović, Zora
2016-04-01
A novel quantitative prediction and variable selection method called interval ridge regression (iRR) is studied in this work. The method is applied to six data sets of FTIR, two data sets of UV-vis and one data set of DSC. The results show that models built with ridge regression on the optimal variables selected by iRR significantly outperform models built with ridge regression on all variables in both calibration (6 out of 9 cases) and validation (2 out of 9 cases). In this study, iRR is also compared with interval partial least squares regression (iPLS). iRR outperformed iPLS in validation (insignificantly in 6 out of 9 cases and significantly in one out of 9 cases for p<0.05). iRR can also be a fast alternative to iPLS, especially when the degree of complexity of the analyzed system is unknown, i.e., when the upper limit on the number of latent variables is not easily estimated for iPLS. Adulteration of hempseed (H) oil, a well-known health-beneficial nutrient, is studied in this work by mixing it with cheap and widely used oils such as soybean (So) oil, rapeseed (R) oil and sunflower (Su) oil. Binary mixture sets of hempseed oil with these three oils (HSo, HR and HSu) and a ternary mixture set of H oil, R oil and Su oil (HRSu) were considered. The obtained accuracy indicates that, using iRR on FTIR and UV-vis data, each particular oil can be quantified very successfully (in all 8 cases RMSEP<1.2%). This means that FTIR-ATR coupled with iRR can rapidly and effectively determine the level of adulteration in adulterated hempseed oil (R(2)>0.99). Copyright © 2015 Elsevier B.V. All rights reserved.
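The interval idea can be sketched in a few lines: split the wavelength axis into contiguous intervals, then greedily add the interval that most improves a ridge model's fit. This is a hypothetical simplification, not the paper's exact procedure: the function names, the plain training-error stopping criterion, and the fixed regularization strength are all illustrative (the published iRR uses its own interval-selection and validation scheme).

```python
import numpy as np

def ridge_fit(X, y, alpha):
    # Closed-form ridge solution: w = (X'X + alpha*I)^(-1) X'y
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

def interval_ridge_select(X, y, n_intervals=10, alpha=1.0):
    """Greedy forward selection of spectral intervals for ridge regression."""
    bounds = np.linspace(0, X.shape[1], n_intervals + 1).astype(int)
    intervals = [np.arange(bounds[i], bounds[i + 1]) for i in range(n_intervals)]
    selected, best_err = [], np.inf
    while True:
        errs = []
        for i in range(n_intervals):
            if i in selected:
                errs.append(np.inf)
                continue
            cols = np.concatenate([intervals[j] for j in selected + [i]])
            w = ridge_fit(X[:, cols], y, alpha)
            errs.append(np.sqrt(np.mean((X[:, cols] @ w - y) ** 2)))
        i_best = int(np.argmin(errs))
        if errs[i_best] >= best_err:
            break                      # no remaining interval improves the fit
        best_err = errs[i_best]
        selected.append(i_best)
    return sorted(selected), best_err
```

On simulated spectra whose signal lives in the first interval, the greedy pass recovers that interval immediately, which is the behavior the method relies on.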
NASA Astrophysics Data System (ADS)
Prashanth, B. N.; Roy, Kingshuk
2017-07-01
Three-dimensional (3D) maintenance data provides a link between design and technical documentation, creating interactive 3D graphical training and maintenance material. It is difficult for an operator to keep consulting bulky paper manuals, or to keep running back to a computer, while maintaining a machine, and doing so makes the maintenance work fatiguing. A 3D animation, by contrast, makes maintenance work very simple, since there is no language barrier. The research deals with the generation of 3D maintenance data for any given machine. The best tool for obtaining the 3D maintenance data is selected and analyzed, and, using that tool, a detailed process for extracting the 3D maintenance data for any machine is set out. This project aims at selecting the best tool for obtaining 3D maintenance data and at defining the detailed process for obtaining it. 3D maintenance reduces reliance on large volumes of manuals, which invite human error and make an operator's work fatiguing; hence 3D maintenance would help in training and maintenance and would increase productivity. 3Dvia, when compared with Cortona 3D and Deep Exploration, proves to be better than both: it is good at data translation and has the best renderings of the three 3D maintenance software packages. 3Dvia is very user friendly and offers various options for creating 3D animations, and its Interactive Electronic Technical Publication (IETP) integration is also better than that of the other two packages. Hence 3Dvia proves to be the best software for obtaining 3D maintenance data for any machine.
Kuligowski, Julia; Carrión, David; Quintás, Guillermo; Garrigues, Salvador; de la Guardia, Miguel
2011-01-01
The selection of an appropriate calibration set is a critical step in multivariate method development. In this work, the effect of using different calibration sets, based on a previous classification of unknown samples, on the partial least squares (PLS) regression model performance has been discussed. As an example, attenuated total reflection (ATR) mid-infrared spectra of deep-fried vegetable oil samples from three botanical origins (olive, sunflower, and corn oil), with increasing polymerized triacylglyceride (PTG) content induced by a deep-frying process were employed. The use of a one-class-classifier partial least squares-discriminant analysis (PLS-DA) and a rooted binary directed acyclic graph tree provided accurate oil classification. Oil samples fried without foodstuff could be classified correctly, independent of their PTG content. However, class separation of oil samples fried with foodstuff, was less evident. The combined use of double-cross model validation with permutation testing was used to validate the obtained PLS-DA classification models, confirming the results. To discuss the usefulness of the selection of an appropriate PLS calibration set, the PTG content was determined by calculating a PLS model based on the previously selected classes. In comparison to a PLS model calculated using a pooled calibration set containing samples from all classes, the root mean square error of prediction could be improved significantly using PLS models based on the selected calibration sets using PLS-DA, ranging between 1.06 and 2.91% (w/w).
Machine Learning Feature Selection for Tuning Memory Page Swapping
2013-09-01
…environments we set up. Figure 4.1: Updated Feature Vector List (features we added to the kernel are annotated with "(MLVM…). [1] …, Feb. 1966. [2] P. J. Denning, "The working set model for program behavior," Communications of the ACM, vol. 11, no. 5, pp. 323–333, May 1968. … [8] R. W. Carr and J. L. Hennessy, "WSClock — a simple and effective algorithm for virtual memory management," M.S. thesis, Dept. Computer Science
Abuse behavior of high-power, lithium-ion cells
NASA Astrophysics Data System (ADS)
Spotnitz, R.; Franklin, J.
Published accounts of abuse testing of lithium-ion cells and components are summarized, including modeling work. From this summary, a set of exothermic reactions is selected with corresponding estimates of heats of reaction. Using this set of reactions, along with estimated kinetic parameters and designs for high-rate batteries, models for the abuse behavior (oven, short-circuit, overcharge, nail, crush) are developed. Finally, the models are used to determine that fluorinated binder plays a relatively unimportant role in thermal runaway.
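A lumped self-heating balance of the kind such abuse models are assembled from can be sketched as a Semenov-type toy: a single Arrhenius heat source against Newtonian cooling to the oven. All parameter values below are illustrative placeholders, not the paper's estimated kinetics, and a real model would sum several reactions with reactant consumption.

```python
import numpy as np

def simulate_oven(T0, t_end=7200.0, dt=0.1, Q=2e5, A=1e9, Ea=1.2e5,
                  h=0.05, T_amb=428.0, m_cp=50.0):
    """Integrate dT/dt = (Q*A*exp(-Ea/(R*T)) - h*(T - T_amb)) / (m*cp).
    Returns the final temperature and whether thermal runaway occurred."""
    R = 8.314                                      # gas constant, J/(mol K)
    T = T0
    for _ in range(int(t_end / dt)):
        heat_gen = Q * A * np.exp(-Ea / (R * T))   # exothermic source (W)
        heat_loss = h * (T - T_amb)                # convective loss (W)
        T += dt * (heat_gen - heat_loss) / m_cp    # explicit Euler step
        if T > 1000.0:                             # runaway threshold for the sketch
            return T, True
    return T, False
```

With these illustrative numbers a cell held at 428 K runs away, while one held at 373 K settles at a mild self-heating steady state, which is the qualitative oven-test behavior such models are built to reproduce.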
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, Bri-Mathias
2016-04-08
The primary objective of this work was to create a state-of-the-art national wind resource data set and to provide detailed wind plant output data for specific sites based on that data set. Corresponding retrospective wind forecasts were also included at all selected locations. The combined information from these activities was used to create the Wind Integration National Dataset (WIND), and an extraction tool was developed to allow web-based data access.
Robust Statistics and Regularization for Feature Extraction and UXO Discrimination
2011-07-01
July 11, 2011 real data we find that this technique has an improved probability of finding all ordnance in a test data set, relative to previously...many sites. Tests on larger data sets should still be carried out. In previous work we considered a bootstrapping approach to selecting the operating...Marginalizing over x we obtain the probability that the ith order statistic in the test data belongs to the T class (55) P (T |x(i)) = ∞∫ −∞ P (T |x)p(x
Nondestructive equipment study
NASA Technical Reports Server (NTRS)
1985-01-01
Identification of existing Nondestructive Evaluation (NDE) methods that could be used in a low Earth orbit environment; evaluation of each method with respect to the set of criteria called out in the statement of work; selection of the most promising NDE methods for further evaluation; use of selected NDE methods to test samples of pressure vessel materials in a vacuum; pressure testing of a complex monolithic pressure vessel with known flaws using acoustic emissions in a vacuum; and recommendations for further studies based on analysis and testing are covered.
Gea-Caballero, Vicente; Castro-Sánchez, Enrique; Júarez-Vela, Raúl; Díaz-Herrera, Miguel Ángel; de Miguel-Montoya, Isabel; Martínez-Riera, José Ramón
Nursing work environments are key determinants of care quality. Our study aimed to evaluate the characteristics of nursing environments in primary care settings in the Canary Islands, and identify crucial components of such environments to improve quality. We conducted a cross-sectional study in primary care organisations using the Practice Environment Scale - Nursing Work Index tool. We collected sociodemographic variables, scores, and selected the essential items conducive to optimal care. Appropriate parametric and non-parametric statistical tests were used to analyse relations between variables (CI = 95%, error = 5%). One hundred and forty-four nurses participated. The mean total score was 81.6. The results for the five dimensions included in the Practice Environment Scale - Nursing Work Index ranged from 2.25 - 2.92 (Mean). Twelve key items for quality of care were selected; six were positive in the Canary Islands, two were mixed, and four negative. 7/12 items were included in Dimension 2 (fundamentals of nursing). Being a manager was statistically associated with higher scores (p<.000). Years of experience was inversely associated with scores in the 12 items (p<.021). Nursing work environments in primary care settings in the Canary Islands are comparable to others previously studied in Spain. Areas to improve were human resources and participation of nurses in management decisions. Nurse managers must be knowledgeable about their working environments so they can focus on improvements in key dimensions. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.
[Influence of Nurses' Self-leadership on Individual and Team Members' Work Role Performance].
Kim, Se Young; Kim, Eun Kyung; Kim, Byungsoo; Lee, Eunpyo
2016-06-01
The purpose of this study was to examine correlations between nurses' self-leadership and individual work role performance, and between self-leadership in nursing units and team members' work role performance. Participants were 202 conveniently selected general nurses from 5 general hospitals in Korea. The study was carried out on 35 nursing units. Data were collected during February 2015 with self-report questionnaires. Among factors affecting individual work role performance, self-expectation, self-goal setting, constructive thought, clinical career in the present nursing unit and marital status accounted for 44.0% of proficiency, while self-expectation, self-goal setting, constructive thought, and marital status accounted for 42.3% of adaptivity. Self-expectation, self-goal setting, constructive thought, self-reward, clinical career in the present nursing unit and position accounted for 26.4% of proactivity. In terms of team members' work role performance, self-reward and self-expectation in nursing units explained 29.0% of team members' proficiency, self-reward and self-expectation in nursing units explained 31.6% of team members' adaptivity, and self-reward in nursing units explained 16.8% of team members' proactivity. The results confirm that nurses' self-leadership affects not only individual work role performance but also that of team members. Accordingly, to improve nurses' work role performance in nursing units, the nursing environment should be improved through self-leadership education, and nurses' tasks rearranged so that they can experience work autonomy and challenging work.
Wu, J; Kreis, I; Griffiths, D; Darling, C
2002-01-01
Aims: To determine the association between lung function of coke oven workers and exposure to coke oven emissions. Methods: Lung function data and detailed work histories for workers in recovery coke ovens of a steelworks were extracted from a lung function surveillance system. Multiple regressions were employed to determine significant predictors for lung function indices. The first sets of lung function tests for 613 new starters were pooled to assess the selection bias. The last sets of lung function tests for 834 subjects with one or more year of coke oven history were pooled to assess determinants of lung function. Results: Selection bias associated with the recruitment process was not observed among the exposure groups. For subjects with a history of one or more years of coke oven work, each year of working in the most exposed "operation" position was associated with reductions in FEV1 of around 9 ml (p = 0.006, 95% CI: 3 ml to 16 ml) and in FVC of around 12 ml (p = 0.002, 95% CI: 4 ml to 19 ml). Negative effects of smoking on lung function were also observed. Conclusions: Exposure to coke oven emissions was found to be associated with lower FEV1 and FVC. Effects of work exposure on lung function are similar to those found in other studies. PMID:12468747
Simple communication using a SSVEP-based BCI
NASA Astrophysics Data System (ADS)
Sanchez, Guillermo; Diez, Pablo F.; Avila, Enrique; Laciar Leber, Eric
2011-12-01
The majority of brain-computer interfaces (BCIs) for communication are spellers, i.e., the user has to select letters one by one. In this work, a different approach is proposed in which the user selects words from a word set designed to answer a wide range of questions. The word selection process is commanded by a steady-state visual evoked potential (SSVEP) based BCI that allows a word to be selected in an average time of 26 s with an accuracy of 92% on average. This BCI is aimed at the first stages of rehabilitation, or even the first moments of some diseases (such as stroke), when the person is eager to communicate with family and doctors.
A comparison of the nursing practice environment in mental health and medical-surgical settings.
Roche, Michael A; Duffield, Christine M
2010-06-01
To examine the differences between characteristics of the work environment of nurses working in mental health and general acute inpatient nursing settings. Secondary analysis of data collected on 96 randomly selected medical and surgical (general) wards and six mental health wards in 24 public acute general hospitals across two Australian states between 2004 and 2006. All nurses on the participating wards were asked to complete a survey that included the Practice Environment Scale of the Nursing Work Index (NWI-PES). Responses were received from 2,556 nurses (76.3% response rate). Using the five-domain structure, comparisons were made between mental health and general nurses. Across the entire sample of nurses, those working in mental health settings scored more highly in regard to nurse-doctor relationships and staffing adequacy. Nurses in general wards reported more participation in hospital affairs, stronger leadership, and the presence of more of the foundations of nursing quality care such as access to continued education. Differences between the groups on each of the domains were statistically significant at p=.05 or greater, but not for the composite practice environment scale. A wide range of responses was seen when data were aggregated to the ward level. The work environment of mental health nurses is different from that of their colleagues working in general settings. Specific areas of the mental health environment, such as participation in the hospital, leadership, and the foundations of quality, may be enhanced to improve nurses' job satisfaction and, potentially, other nurse and patient outcomes. Factors in the medical and surgical nursing practice environment have been established as significant influences on nurse and patient outcomes. It is important to understand the existence and potential impact of these factors in mental health inpatient settings.
Shimoni, Yishai
2018-02-01
One of the goals of cancer research is to identify a set of genes that cause or control disease progression. However, although multiple such gene sets were published, these are usually in very poor agreement with each other, and very few of the genes proved to be functional therapeutic targets. Furthermore, recent findings from a breast cancer gene-expression cohort showed that sets of genes selected randomly can be used to predict survival with a much higher probability than expected. These results imply that many of the genes identified in breast cancer gene expression analysis may not be causal of cancer progression, even though they can still be highly predictive of prognosis. We performed a similar analysis on all the cancer types available in the cancer genome atlas (TCGA), namely, estimating the predictive power of random gene sets for survival. Our work shows that most cancer types exhibit the property that random selections of genes are more predictive of survival than expected. In contrast to previous work, this property is not removed by using a proliferation signature, which implies that proliferation may not always be the confounder that drives this property. We suggest one possible solution in the form of data-driven sub-classification to reduce this property significantly. Our results suggest that the predictive power of random gene sets may be used to identify the existence of sub-classes in the data, and thus may allow better understanding of patient stratification. Furthermore, by reducing the observed bias this may allow more direct identification of biologically relevant, and potentially causal, genes.
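The random-gene-set analysis can be sketched in a few lines: draw random sets, score each patient by the set's mean expression, and count how often that signature associates with outcome. The synthetic cohort below is entirely made up for illustration; it bakes in a hidden factor that leaks into many genes (a stand-in for a confounder such as proliferation), which is exactly the situation under which random sets become "predictive."

```python
import numpy as np

rng = np.random.default_rng(0)
n_patients, n_genes = 200, 1000
expr = rng.normal(size=(n_patients, n_genes))
# Hypothetical cohort: survival is driven by a hidden factor that leaks
# into 60% of the genes, so random sets pick up its signal.
factor = rng.normal(size=n_patients)
expr[:, :600] += 0.5 * factor[:, None]
survival = factor + rng.normal(scale=1.0, size=n_patients)

def random_set_score(expr, survival, set_size, n_sets, rng):
    """Fraction of random gene sets whose mean-expression signature
    correlates with the outcome at |r| > 0.2."""
    hits = 0
    for _ in range(n_sets):
        genes = rng.choice(expr.shape[1], size=set_size, replace=False)
        signature = expr[:, genes].mean(axis=1)
        r = np.corrcoef(signature, survival)[0, 1]
        hits += abs(r) > 0.2
    return hits / n_sets

print(random_set_score(expr, survival, set_size=50, n_sets=200, rng=rng))
```

Under the null (no shared factor) the printed fraction would be near zero; with the confounder present it is large, mirroring the paper's observation that random selections outperform chance expectations.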
Vařeková, Radka Svobodová; Jiroušková, Zuzana; Vaněk, Jakub; Suchomel, Šimon; Koča, Jaroslav
2007-01-01
The Electronegativity Equalization Method (EEM) is a fast approach for charge calculation. A challenging part of the EEM is the parameterization, which is performed using ab initio charges obtained for a set of molecules. The goal of our work was to perform the EEM parameterization for selected sets of organic, organohalogen and organometal molecules. We have performed the most robust parameterization published so far. The EEM parameterization was based on 12 training sets selected from a database of predicted 3D structures (NCI DIS) and from a database of crystallographic structures (CSD). Each set contained from 2000 to 6000 molecules. We have shown that the number of molecules in the training set is very important for quality of the parameters. We have improved EEM parameters (STO-3G MPA charges) for elements that were already parameterized, specifically: C, O, N, H, S, F and Cl. The new parameters provide more accurate charges than those published previously. We have also developed new parameters for elements that were not parameterized yet, specifically for Br, I, Fe and Zn. We have also performed crossover validation of all obtained parameters using all training sets that included relevant elements and confirmed that calculated parameters provide accurate charges.
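The EEM charge calculation itself (as distinct from the parameterization the abstract describes) reduces to one linear solve per molecule: each atom's effective electronegativity is equalized, subject to total-charge conservation. A minimal sketch follows; the A (electronegativity) and B (hardness) values used in the test are illustrative, not the fitted parameters from the paper.

```python
import numpy as np

def eem_charges(coords, A, B, total_charge=0.0, kappa=0.44):
    """Solve the EEM linear system for atomic partial charges.
    coords: (n, 3) atomic positions; A, B: per-atom EEM parameters."""
    n = len(coords)
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    M = np.zeros((n + 1, n + 1))
    rhs = np.zeros(n + 1)
    for i in range(n):
        M[i, i] = B[i]                    # atomic hardness term
        for j in range(n):
            if i != j:
                M[i, j] = kappa / dist[i, j]   # Coulomb interaction term
        M[i, n] = -1.0                    # common equalized electronegativity
        rhs[i] = -A[i]
    M[n, :n] = 1.0                        # charge conservation: sum(q) = Q
    rhs[n] = total_charge
    sol = np.linalg.solve(M, rhs)
    return sol[:n]                        # charges; sol[n] is the molecular chi
```

Because the per-molecule cost is a single (n+1)-dimensional solve, EEM is fast enough to sweep thousands of molecules, which is what makes the large training sets described above feasible.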
Does WIC work? The effects of WIC on pregnancy and birth outcomes.
Bitler, Marianne P; Currie, Janet
2005-01-01
Support for WIC, the Special Supplemental Nutrition Program for Women, Infants, and Children, is based on the belief that "WIC works." This consensus has lately been questioned by researchers who point out that most WIC research fails to properly control for selection into the program. This paper evaluates the selection problem using rich data from the national Pregnancy Risk Assessment Monitoring System. We show that relative to Medicaid mothers, all of whom are eligible for WIC, WIC participants are negatively selected on a wide array of observable dimensions, and yet WIC participation is associated with improved birth outcomes, even after controlling for observables and for a full set of state-year interactions intended to capture unobservables that vary at the state-year level. The positive impacts of WIC are larger among subsets of even more disadvantaged women, such as those who received public assistance last year, single high school dropouts, and teen mothers.
Badland, Hannah M.; Oliver, Melody; Kearns, Robin A.; Mavoa, Suzanne; Witten, Karen; Duncan, Mitch J.; Batty, G. David
2012-01-01
Although the neighbourhoods and health field is well established, the relationships between neighbourhood selection, neighbourhood preference, work-related travel behaviours, and transport infrastructure have not been fully explored. It is likely that understanding these complex relationships more fully will inform urban policy development, and planning for neighbourhoods that support health behaviours. Accordingly, the objective of this study was to identify associations between these variables in a sample of employed adults. Self-reported demographic, work-related transport behaviours, and neighbourhood preference data were collected from 1616 employed adults recruited from 48 neighbourhoods located across four New Zealand cities. Data were collected between April 2008 and September 2010. Neighbourhood built environment measures were generated using geographical information systems. Findings demonstrated that more people preferred to live in urban (more walkable), rather than suburban (less walkable) settings. Those living in more suburban neighbourhoods had significantly longer work commute distances and lower density of public transport stops available within the neighbourhood when compared with those who lived in more urban neighbourhoods. Those preferring a suburban style neighbourhood commuted approximately 1.5 km further to work when compared with participants preferring urban settings. Respondents who preferred a suburban style neighbourhood were less likely to take public or active transport to/from work when compared with those who preferred an urban style setting, regardless of the neighbourhood type in which they resided. 
Although it is unlikely that constructing more walkable environments will result in work-related travel behaviour change for all, providing additional highly walkable environments will help satisfy the demand for these settings, reinforce positive health behaviours, and support those amenable to change to engage in higher levels of work-related public and active transport. PMID:22784376
Marino, Miguel; Killerby, Marie; Lee, Soomi; Klein, Laura Cousino; Moen, Phyllis; Olson, Ryan; Kossek, Ellen Ernst; King, Rosalind; Erickson, Leslie; Berkman, Lisa F.; Buxton, Orfeu M.
2016-01-01
Objectives: To evaluate the effects of a workplace-based intervention on actigraphic and self-reported sleep outcomes in an extended care setting. Design: Cluster randomized trial. Setting: Extended-care (nursing) facilities. Participants: US employees and managers at nursing homes. Nursing homes were randomly selected to intervention or control settings. Intervention: The Work, Family and Health Study developed an intervention aimed at reducing work-family conflict within a 4-month work-family organizational change process. Employees participated in interactive sessions with facilitated discussions, role-playing, and games designed to increase control over work processes and work time. Managers completed training in family-supportive supervision. Measurements: Primary actigraphic outcomes included: total sleep duration, wake after sleep onset, nighttime sleep, variation in nighttime sleep, nap duration, and number of naps. Secondary survey outcomes included work-to-family conflict, sleep insufficiency, insomnia symptoms and sleep quality. Measures were obtained at baseline, 6-months and 12-months post-intervention. Results: A total of 1,522 employees and 184 managers provided survey data at baseline. Managers and employees in the intervention arm showed no significant difference in sleep outcomes over time compared to control participants. Sleep outcomes were not moderated by work-to-family conflict or presence of children in the household for managers or employees. Age significantly moderated an intervention effect on nighttime sleep among employees (p=0.040), where younger employees benefited more from the intervention. Conclusion: In the context of an extended-care nursing home workplace, the intervention did not significantly alter sleep outcomes in either managers or employees. Moderating effects of age were identified where younger employees' sleep outcomes benefited more from the intervention. PMID:28239635
On the use of feature selection to improve the detection of sea oil spills in SAR images
NASA Astrophysics Data System (ADS)
Mera, David; Bolon-Canedo, Veronica; Cotos, J. M.; Alonso-Betanzos, Amparo
2017-03-01
Fast and effective oil spill detection systems are crucial to ensure a proper response to environmental emergencies caused by hydrocarbon pollution on the ocean's surface. Typically, these systems uncover not only oil spills, but also a high number of look-alikes. The feature extraction is a critical and computationally intensive phase where each detected dark spot is independently examined. Traditionally, detection systems use an arbitrary set of features to discriminate between oil spills and look-alike phenomena. However, Feature Selection (FS) methods based on Machine Learning (ML) have proved to be very useful in real domains for enhancing the generalization capabilities of the classifiers, while discarding the existing irrelevant features. In this work, we present a generic and systematic approach, based on FS methods, for choosing a concise and relevant set of features to improve oil spill detection systems. We have compared five FS methods: Correlation-based feature selection (CFS), Consistency-based filter, Information Gain, ReliefF and Recursive Feature Elimination for Support Vector Machine (SVM-RFE). They were applied to a 141-input vector composed of features from a collection of outstanding studies. Selected features were validated via a Support Vector Machine (SVM) classifier and the results were compared with previous works. Test experiments revealed that the classifier trained with the 6-input feature vector proposed by SVM-RFE achieved the best accuracy and Cohen's kappa coefficient (87.1% and 74.06%, respectively). This is a smaller feature combination with similar or even better classification accuracy than previous works. This finding makes it possible to speed up the feature extraction phase without reducing classifier accuracy. 
Experiments also confirmed the significance of the geometrical features since 75.0% of the different features selected by the applied FS methods as well as 66.67% of the proposed 6-input feature vector belong to this category.
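The SVM-RFE step described above has a direct scikit-learn counterpart: recursively drop the feature with the smallest linear-SVM weight until the target count remains. The snippet below is a sketch on synthetic data standing in for the 141-feature dark-spot vectors (the feature indices and dimensions are illustrative, not the study's).

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import RFE

rng = np.random.default_rng(0)
# Synthetic stand-in: 20 features, of which only 3 carry the class signal.
X = rng.normal(size=(300, 20))
y = (X[:, 3] + X[:, 7] - X[:, 11] > 0).astype(int)

# Recursive Feature Elimination around a linear SVM, keeping 6 features,
# mirroring the 6-input vector the abstract reports.
selector = RFE(SVC(kernel="linear", C=1.0), n_features_to_select=6, step=1)
selector.fit(X, y)
selected = np.flatnonzero(selector.support_)
print(selected)
```

The informative features survive the elimination, which is the property that lets a 6-input vector match the accuracy of the full 141-input one.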
Zawbaa, Hossam M; Szlȩk, Jakub; Grosan, Crina; Jachowicz, Renata; Mendyk, Aleksander
2016-01-01
Poly-lactide-co-glycolide (PLGA) is a copolymer of lactic and glycolic acid. Drug release from PLGA microspheres depends not only on polymer properties but also on drug type, particle size, morphology of microspheres, release conditions, etc. Selecting a subset of relevant properties for PLGA is a challenging machine learning task as there are over three hundred features to consider. In this work, we formulate the selection of critical attributes for PLGA as a multiobjective optimization problem with the aim of minimizing the error of predicting the dissolution profile while reducing the number of attributes selected. Four bio-inspired optimization algorithms: antlion optimization, binary version of antlion optimization, grey wolf optimization, and social spider optimization are used to select the optimal feature set for predicting the dissolution profile of PLGA. Besides these, LASSO algorithm is also used for comparisons. Selection of crucial variables is performed under the assumption that both predictability and model simplicity are of equal importance to the final result. During the feature selection process, a set of input variables is employed to find minimum generalization error across different predictive models and their settings/architectures. The methodology is evaluated using predictive modeling for which various tools are chosen, such as Cubist, random forests, artificial neural networks (monotonic MLP, deep learning MLP), multivariate adaptive regression splines, classification and regression tree, and hybrid systems of fuzzy logic and evolutionary computations (fugeR). The experimental results are compared with the results reported by Szlȩk. We obtain a normalized root mean square error (NRMSE) of 15.97% versus 15.4%, and the number of selected input features is smaller, nine versus eleven.
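The scalarized objective that such wrapper methods optimize can be sketched as prediction error plus a feature-count penalty. In the sketch below, plain random search stands in for the bio-inspired optimizers, ordinary least squares stands in for the predictive models, and `lam` is an assumed trade-off weight; none of these choices are the paper's.

```python
import numpy as np

def fitness(mask, X, y, lam=0.01):
    """Error plus a penalty on the number of selected features: a common
    scalarization of the paper's two objectives (accuracy vs. simplicity)."""
    if mask.sum() == 0:
        return np.inf
    Xs = X[:, mask]
    # ordinary least squares as a cheap stand-in for the predictive models
    w, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rmse = np.sqrt(np.mean((Xs @ w - y) ** 2))
    nrmse = rmse / (y.max() - y.min())
    return nrmse + lam * mask.sum()

def random_binary_search(X, y, n_iter=300, seed=0):
    """Random search over binary feature masks, standing in for the
    antlion / grey wolf / social spider optimizers."""
    rng = np.random.default_rng(seed)
    best_mask, best_fit = None, np.inf
    for _ in range(n_iter):
        mask = rng.random(X.shape[1]) < 0.3     # random candidate subset
        f = fitness(mask, X, y)
        if f < best_fit:
            best_mask, best_fit = mask, f
    return best_mask, best_fit
```

Any of the named metaheuristics can be dropped in place of the random search; only the way candidate masks are proposed changes, while the fitness trade-off stays the same.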
Caregivers of Infants and Toddlers: Instructor's Guide.
ERIC Educational Resources Information Center
Texas Tech Univ., Lubbock. Home Economics Instructional Materials Center.
This guide for postsecondary child development instructors is intended for use in courses on caring for infants and toddlers in a child care setting. The materials are most effective when coordinated with a carefully selected textbook. Access to a quality care center for laboratory work is essential. An introduction describes the instructor's…
Nutrition, Health, and Safety for Child Caregivers: Instructor's Guide.
ERIC Educational Resources Information Center
Texas Tech Univ., Lubbock. Home Economics Instructional Materials Center.
This guide for postsecondary child development instructors is intended for use in courses on nutrition, health, and safety in a child care setting. The materials are most effective when coordinated with carefully selected textbooks. Access to a quality care center for laboratory work is essential. An introduction describes the instructor's guide…
The Development of a Planning, Programming and Budgeting System. Technical Report.
ERIC Educational Resources Information Center
Appelquist, Claes-Goran; Zandren, S.
In CERI's program on institutional management in higher education, eight universities were brought together to set up teams within their institutions to work on their respective pre-selected problem areas. The planning, programming and budgeting system (PPBS) was developed as a management tool which would improve effectiveness by increasing the…
Pathways to a STEMM Profession
ERIC Educational Resources Information Center
Miller, Jon D.; Kimmel, Linda G.
2012-01-01
The inadequate number of American young adults selecting a scientific or engineering profession continues to be a major national concern. Using data from the 23-year record of the Longitudinal Study of American Youth (LSAY) and working within the social learning paradigm, this analysis uses a set of 21 variables to predict young people's…
ES Review: Selections from 2008 & 2009
ERIC Educational Resources Information Center
Smiles, Robin, Ed.
2009-01-01
This third edition of the "ES Review" brings together, in one setting, some of the best work from 2008-09. It features: (1) K-12 Accountability (Measuring Skills for the 21st Century (Elena Silva); Beyond the Bubble: Technology and the Future of Student Assessment (Bill Tucker); Testing the Limits (Bill Tucker); Changing the Game: The…
Military Curricula for Vocational and Technical Education. X-Ray Specialist, 10-16.
ERIC Educational Resources Information Center
Department of the Army, Washington, DC.
These instructor and student materials for a postsecondary course in radiography are one of a number of military-developed curriculum packages selected for adaptation to vocational instruction and curriculum development in civilian settings. This course is designed to provide a working knowledge of radiography that will enable students to perform…
Family Bonding with Universities. NBER Working Paper No. 15493
ERIC Educational Resources Information Center
Meer, Jonathan; Rosen, Harvey S.
2009-01-01
One justification offered for legacy admissions policies at universities is that they bind entire families to the university. Proponents maintain that these policies have a number of benefits, including increased donations from members of these families. We use a rich set of data from an anonymous selective research institution to investigate…
Insights into the Control of Attentional Set in ADHD Using the Attentional Blink Paradigm
ERIC Educational Resources Information Center
Mason, Deanna J.; Humphreys, Glyn W.; Kent, Lindsey
2005-01-01
Background: Previous work on visual selective attention in Attention Deficit Hyperactivity Disorder (ADHD) has utilised spatial search paradigms. This study compared ADHD to control children on a temporal search task using Rapid Serial Visual Presentation (RSVP). In addition, the effects of irrelevant singleton distractors on search performance…
Classification of burn wounds using support vector machines
NASA Astrophysics Data System (ADS)
Acha, Begona; Serrano, Carmen; Palencia, Sergio; Murillo, Juan Jose
2004-05-01
The purpose of this work is to improve a previous method developed by the authors for classifying burn wounds by depth. The inputs to the system are color and texture information, as these are the characteristics physicians observe in order to give a diagnosis. Our previous work consisted of segmenting the burn wound from the rest of the image and classifying the burn by its depth. In this paper we focus on the classification problem only. We had already proposed using a Fuzzy-ARTMAP neural network (NN); however, we may take advantage of newer, powerful classification tools such as Support Vector Machines (SVM). We apply a five-fold cross-validation scheme to divide the database into training and validation sets. We then apply a feature selection method for each classifier, which gives the set of features that yields the smallest classification error for that classifier. The features used for classification are first-order statistical parameters extracted from the L*, u*, and v* color components of the image. The feature selection algorithms used are the Sequential Forward Selection (SFS) and Sequential Backward Selection (SBS) methods. As the data in this problem are not linearly separable, the SVM was trained using several different kernels. The validation process shows that the SVM, when using a Gaussian kernel of variance 1, outperforms the other classifiers, yielding a classification error rate of 0.7%, whereas the Fuzzy-ARTMAP NN attained 1.6%.
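The greedy wrapper search behind sequential forward selection, as used in the abstract above, can be sketched generically; the `score_fn` interface below is an assumption (any cross-validated classifier score would slot in), not the authors' implementation:

```python
import numpy as np

def forward_select(X, y, score_fn, n_features):
    """Greedy sequential forward selection (SFS).

    score_fn(X_subset, y) returns a quality score (higher is better);
    at each round the single feature that most improves the score is added."""
    selected = []
    remaining = list(range(X.shape[1]))
    while remaining and len(selected) < n_features:
        best_f, best_s = None, -np.inf
        for f in remaining:
            s = score_fn(X[:, selected + [f]], y)
            if s > best_s:
                best_f, best_s = f, s
        selected.append(best_f)
        remaining.remove(best_f)
    return selected
```

SBS is the mirror image: start from the full feature set and greedily drop the feature whose removal hurts the score least.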
Perry, H B
1980-01-01
This study describes the effects of personal background and work setting variables upon the job characteristics of a national sample of 939 physician assistants. These data were obtained from a 1974 survey of members of the physician assistant profession and were assessed by means of path analysis. The analysis yielded the following major findings: (1) job characteristics became more favorable with increasing experience as a physician assistant, (2) employment in primary care fields resulted in job characteristics at least as favorable as those found in employment in other specialties, (3) military physician assistants reported greater patient care responsibility but lower levels of occupational prestige and career opportunities, and (4) women physician assistants earned less (even after controlling for number of hours worked) and knew of fewer available alternative job opportunities than their male colleagues.
The child welfare system: through the eyes of public health nurses.
Schneiderman, Janet U
2005-01-01
This qualitative descriptive study investigates how public health nurses working within the child welfare system view the organization and the organization's effect on their case management practice. Semistructured interviews were conducted utilizing the Bolman-Deal Organizational Model. This model identifies four frames of an organization: symbolic, human resources, political, and structural. A purposive sample of nine nurses and one social worker was selected to participate in comprehensive interviews. Data analysis identified two main themes. The first theme was the presence of organizational structural barriers to providing case management. The second theme was the lack of political influence by the nurses to change the structure of the organization; hence, their skills could be more completely utilized. Public health nurses who work in child welfare will need to systematically analyze their role within the organization and understand how to work in "host settings." Nursing educators need to prepare public health nurses to work in non-health care settings by teaching organizational analysis.
O’Donnell, Emily M.; Berkman, Lisa F.; Subramanian, Sv
2012-01-01
Objective Supervisor-level policies and the presence of a manager engaged in an employee’s need to achieve work/family balance, or “supervisory support,” may benefit employee health, including self-reported pain. Methods We conducted a census of employees at four selected extended-care facilities in the Boston metropolitan region (n= 368). Supervisory support was assessed through interviews with managers and pain was employee-reported. Results Our multilevel logistic models indicate that employees with managers who report the lowest levels of support for work/family balance experience twice as much overall pain as employees with managers who report high levels of support. Conclusions Low supervisory support for work/family balance is associated with an increased prevalence of employee-reported pain in extended-care facilities. We recommend that manager-level policies and practices receive additional attention as a potential risk factor for poor health in this setting. PMID:22892547
Criteria for the assessment of analyser practicability
Biosca, C.; Galimany, R.
1993-01-01
This article lists the theoretical criteria that need to be considered to assess the practicability of an automatic analyser. Two essential sets of criteria should be taken into account when selecting an automatic analyser: ‘reliability’ and ‘practicability’. Practicability covers the features that provide information about the suitability of an analyser for specific working conditions. These practicability criteria are classified in this article and include the environment; work organization; versatility and flexibility; safety controls; staff training; and maintenance and operational costs. PMID:18924972
Otterman, Nicoline; Veerbeek, Janne; Schiemanck, Sven; van der Wees, Philip; Nollet, Frans; Kwakkel, Gert
2017-07-01
To select relevant and feasible instruments for the revision of the Dutch clinical practice guideline for physical therapy in patients with stroke. In this implementation study, a comprehensive proposal for ICF categories and matching instruments was developed, based on reliability and validity. Relevant instruments were then selected in a consensus round by 11 knowledge brokers who were responsible for the implementation of the selected instruments. The feasibility of the selected instruments was tested by 36 physical therapists in different work settings within stroke services. Finally, instruments that were deemed relevant and feasible were included in the revised guideline. A total of 28 instruments were recommended for inclusion in the revised guideline. Nineteen instruments were retained from the previous guideline. Ten new instruments were tested in clinical practice, seven of which were found feasible. Two more instruments were added after critical appraisal of the set of measurement instruments. The revised guideline contains 28 relevant and feasible instruments selected and tested in clinical practice by physical therapists. Further education and implementation are needed to integrate the instruments into clinical practice. Further research is proposed for developing and implementing a core set of measurement instruments to be used at fixed time points to establish data registries that allow for continuous improvement of rehabilitation for stroke patients. Implications for Rehabilitation: The revised Dutch Stroke Physical Therapy Guideline recommends a total of 28 instruments that are relevant and feasible for the clinical practice of physical therapists in the different settings of stroke rehabilitation. The selection of instruments in daily practice should be part of the clinical reasoning process of PTs and be tailored to individual patients' needs and the degree of priority of the affected ICF category. Suggested education strategies for further integration of instruments into the daily practice of PTs in stroke rehabilitation are 'training on the job' and 'peer assessment in clinical situations'.
Occupational Safety Review of High Technology Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee Cadwallader
2005-01-31
This report contains reviews of operating experiences, selected accident events, and industrial safety performance indicators that document the performance of the major US DOE magnetic fusion experiments and particle accelerators. These data are useful for establishing a baseline occupational safety level at mature research facilities with known sets of safety rules and regulations. Some of the issues discussed are radiation safety, electromagnetic energy exposure events, and some of the more widespread issues of working at height, equipment fires, confined space work, electrical work, and other industrial hazards. Nuclear power plant industrial safety data are also included for comparison.
Linear and nonlinear variable selection in competing risks data.
Ren, Xiaowei; Li, Shanshan; Shen, Changyu; Yu, Zhangsheng
2018-06-15
The subdistribution hazard model for competing risks data has been applied extensively in clinical research. Variable selection methods for linear effects in competing risks data have been studied over the past decade, but there is no existing work on selecting potential nonlinear effects for the subdistribution hazard model. We propose a two-stage procedure to select linear and nonlinear covariates simultaneously and estimate the selected covariate effects. We use a spectral decomposition approach to distinguish the linear and nonlinear parts of each covariate and adaptive LASSO to select each of the two components. Extensive numerical studies demonstrate that the proposed procedure achieves good selection accuracy in the first stage and small estimation biases in the second stage. The proposed method is applied to analyze a cardiovascular disease data set with competing causes of death. Copyright © 2018 John Wiley & Sons, Ltd.
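The adaptive LASSO used in the second stage above has a well-known closed form in the orthonormal-design special case: weighted soft-thresholding of the OLS coefficients, with weights that shrink large effects less than small ones. The sketch below illustrates only that special case and is not the authors' estimator:

```python
import numpy as np

def adaptive_lasso_orthonormal(beta_ols, lam, gamma=1.0):
    """Adaptive LASSO solution under an orthonormal design: each OLS
    coefficient is soft-thresholded with a penalty weight proportional to
    1/|beta_ols|^gamma, so strong signals are penalized less than weak ones."""
    beta_ols = np.asarray(beta_ols, dtype=float)
    weights = 1.0 / (np.abs(beta_ols) ** gamma + 1e-12)  # eps guards div-by-zero
    thresh = lam * weights
    return np.sign(beta_ols) * np.maximum(np.abs(beta_ols) - thresh, 0.0)
```

The data-dependent weights are what give the adaptive LASSO its oracle property: a small coefficient (likely noise) gets a large threshold and is zeroed, while a large coefficient is nearly unbiased.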
Long-Term Memory and the Control of Attentional Control
Mayr, Ulrich; Kuhns, David; Hubbard, Jason
2014-01-01
Task-switch costs and in particular the switch-cost asymmetry (i.e., the larger costs of switching to a dominant than a non-dominant task) are usually explained in terms of trial-to-trial carry-over of task-specific control settings. Here we argue that task switches are just one example of situations that trigger a transition from working-memory maintenance to updating, thereby opening working memory to interference from long-term memory. We used a new paradigm that requires selecting a spatial location either on the basis of a central cue (i.e., endogenous control of attention) or a peripheral, sudden onset (i.e., exogenous control of attention). We found a strong cost asymmetry that occurred even after short interruptions of otherwise single-task blocks (Exp. 1-3), but that was much stronger when participants had experienced the competing task under conditions of conflict (Exp. 1-2). Experiment 3 showed that the asymmetric costs were due to interruptions per se, rather than to associative interference tied to specific interruption activities. Experiment 4 generalized the basic pattern across interruptions varying in length or control demands and Experiment 5 across primary tasks with response-selection conflict rather than attentional conflict. Combined, the results support a model in which costs of selecting control settings arise when (a) potentially interfering memory traces have been encoded in long-term memory and (b) working-memory is forced from a maintenance mode into an updating mode (e.g., through task interruptions), thereby allowing unwanted retrieval of the encoded memory traces. PMID:24650696
Exploring Shared Governance for an Academic Nursing Setting.
Boswell, Carol; Opton, Laura; Owen, Donna C
2017-04-01
A beneficial work environment influences staff and employee satisfaction and contributes to enhanced organizational execution. This article communicates a literature review of the existing knowledge to describe the potential influence of job satisfaction, empowerment, and work engagement on a faculty and staff academic shared governance model and impact on the improvement of a healthy work environment. References from PubMed (from 1975 to 2014) and ERIC (from 2006 to 2016), along with manuscripts included in the reference lists of the selected articles, served as the basis of the review. Definitions and descriptions of shared governance in academic settings suggested the potential influence on the development of a faculty and staff shared governance model within a school of nursing on job satisfaction, empowerment, and work engagement, resulting in the maintenance of a healthy work environment. This shift is essential for the resilience of nursing academia and the building of novel, more inclusive approaches to innovation that tap into the talent and skill of all organizational members. [J Nurs Educ. 2017;56(4):197-203.]. Copyright 2017, SLACK Incorporated.
Khazaei, Hamid; Street, Kenneth; Bari, Abdallah; Mackay, Michael; Stoddard, Frederick L.
2013-01-01
Efficient methods to explore plant agro-biodiversity for climate change adaptive traits are urgently required. The focused identification of germplasm strategy (FIGS) is one such approach. FIGS works on the premise that germplasm is likely to reflect the selection pressures of the environment in which it developed. Environmental parameters describing plant germplasm collection sites are used as selection criteria to improve the probability of uncovering useful variation. This study was designed to test the effectiveness of FIGS to search a large faba bean (Vicia faba L.) collection for traits related to drought adaptation. Two sets of faba bean accessions were created, one from moisture-limited environments, and the other from wetter sites. The two sets were grown under well watered conditions and leaf morpho-physiological traits related to plant water use were measured. Machine-learning algorithms split the accessions into two groups based on the evaluation data and the groups created by this process were compared to the original climate-based FIGS sets. The sets defined by trait data were in almost perfect agreement with the FIGS sets, demonstrating that ecotypic differentiation driven by moisture availability has occurred within the faba bean genepool. Leaflet and canopy temperature as well as relative water content contributed more than other traits to the discrimination between sets, indicating their utility as drought-tolerance selection criteria for faba bean germplasm. This study supports the assertion that FIGS could be an effective tool to enhance the discovery of new genes for abiotic stress adaptation. PMID:23667581
Novel ion channel targets in atrial fibrillation.
Hancox, Jules C; James, Andrew F; Marrion, Neil V; Zhang, Henggui; Thomas, Dierk
2016-08-01
Atrial fibrillation (AF) is the most common arrhythmia in humans. It is progressive and the development of electrical and structural remodeling makes early intervention desirable. Existing antiarrhythmic pharmacological approaches are not always effective and can produce unwanted side effects. Additional atrial-selective antiarrhythmic strategies are therefore desirable. Evidence for three novel ion channel atrial-selective therapeutic targets is evaluated: atrial-selective fast sodium channel current (INa) inhibition; small conductance calcium-activated potassium (SK) channels; and two-pore (K2P) potassium channels. Data from animal models support atrial-ventricular differences in INa kinetics and also suggest atrial-ventricular differences in sodium channel β subunit expression. Further work is required to determine whether intrinsic atrial-ventricular differences in human INa exist or whether functional differences occur due to distinct atrial and ventricular action and resting potentials. SK and K2P channels (particularly K2P 3.1) offer potentially attractive atrial-selective targets. Work is needed to identify the underlying basis of SK current that contributes to (patho)physiological atrial repolarization and settings in which SK inhibition is anti- versus pro-arrhythmic. Although K2P3.1 appears to be a promising target with comparatively selective drugs for experimental use, a lack of selective pharmacology hinders evaluation of other K2P channels as potential atrial-selective targets.
Comparison of data used for setting occupational exposure limits.
Schenk, Linda
2010-01-01
It has previously been shown that occupational exposure limits (OELs) for the same substance can vary significantly between different standard-setters. The work presented in this paper identifies the steps in the process towards establishing an OEL and how variations in those processes could account for these differences. This study selects for further scrutiny substances for which the level of OELs vary by a factor of 100, focussing on 45 documents concerning 14 substances from eight standard-setters. Several of the OELs studied were more than 20 years old and based on outdated knowledge. Furthermore, different standard-setters sometimes based their OELs on different sets of data, and data availability alone could not explain all differences in the selection of data sets used by standard-setters. While the interpretation of key studies did not differ significantly in standard-setters' documentations, the evaluations of the key studies' quality did. Also, differences concerning the critical effect coincided with differences in the level of OELs for half of the substances.
Sexual orientation data collection and progress toward Healthy People 2010.
Sell, R L; Becker, J B
2001-06-01
Without scientifically obtained data and published reports, it is difficult to raise awareness and acquire adequate resources to address the health concerns of lesbian, gay, and bisexual Americans. The Department of Health and Human Services must recognize gaps in its information systems regarding sexual orientation data and take immediate steps to monitor and eliminate health disparities as delineated in Healthy People 2010. A paper supported by funding from the Office of the Assistant Secretary for Planning and Evaluation explores these concerns and suggests that the department (1) create work groups to examine the collection of sexual orientation data; (2) create a set of guiding principles to govern the process of selecting standard definitions and measures; (3) recognize that racial/ethnic, immigrant-status, age, socioeconomic, and geographic differences must be taken into account when standard measures of sexual orientation are selected; (4) select a minimum set of standard sexual orientation measures; and (5) develop a long-range strategic plan for the collection of sexual orientation data. PMID:11392926
Online medical symbol recognition using a Tablet PC
NASA Astrophysics Data System (ADS)
Kundu, Amlan; Hu, Qian; Boykin, Stanley; Clark, Cheryl; Fish, Randy; Jones, Stephen; Moore, Stephen
2011-01-01
In this paper we describe a scheme to enhance the usability of a Tablet PC's handwriting recognition system by including medical symbols that are not a part of the Tablet PC's symbol library. The goal of this work is to make handwriting recognition more useful for medical professionals accustomed to using medical symbols in medical records. To demonstrate that this new symbol recognition module is robust and expandable, we report results on both a medical symbol set and an expanded symbol test set which includes selected mathematical symbols.
Habitat selection and movements of Piping Plover broods suggest a tradeoff between breeding stages
Wiltermuth, Mark T.; Anteau, Michael J.; Sherfy, Mark H.; Pearse, Aaron T.
2015-01-01
In precocial birds, adults select breeding areas using cues associated with habitat characteristics that are favorable for nesting success and chick survival, but there may be tradeoffs in habitat selection between these breeding stages. Here we describe habitat selection and intra-territory movements of 53 Piping Plover (Charadrius melodus) broods (320 observations) during the 2007–2008 breeding seasons on mainland- and island-shoreline habitats at Lake Sakakawea, North Dakota, USA. We used remotely sensed habitat characteristics to separately examine habitat selection and movements at two spatiotemporal scales to account for potential confounding effects of nest-site selection on brood-rearing habitat use. The scales used were (1) the entire brood-rearing period within available brood-rearing areas and (2) 2-day observation intervals within age-specific discrete habitat selection choice sets. Analyses at both scales indicated that broods selected areas that were non-vegetated, moderately level, and nearer to the shoreline. Rate of brood movement increased with age up to 5 days, then stabilized; broods that hatched >50 m away from the shoreline moved toward the shoreline. Brood movements were greater when broods were in vegetated areas, when the brood-rearing area was of greater topographic complexity, and when broods aged 6–25 days were further from the shoreline. Using inferences from our results and those of previously published work, we postulate how a potential tradeoff in habitat selection between nesting and brood-rearing can contribute to an ecological trap in a novel habitat. This work, in the context of published work, suggests that plover breeding habitat is a complex of both nesting and brood-rearing habitats and provides a basis for making remotely sensed abundance estimates of suitable breeding habitat for Piping Plovers.
ERIC Educational Resources Information Center
Harrison, R. W.; And Others
The worksheets have been developed for use with any production occupational or work experience record book for high school vocational agriculture programs. Separate units have been developed for each of 11 areas in ornamental horticulture, so the student and teacher can select the appropriate one, or several, for the experiences planned by the…
ERIC Educational Resources Information Center
Rice, Karla K.
This study was conducted to examine the transition of newcomer teachers from Pacific Rim countries as they entered selected California school settings. Twelve teachers from China, Hong Kong, Korea, Mexico, and Vietnam were the research participants, and all had had prior teaching experience in their native countries. As the researcher and the…
Students' Task Interpretation and Conceptual Understanding in an Electronics Laboratory
ERIC Educational Resources Information Center
Rivera-Reyes, Presentacion; Lawanto, Oenardi; Pate, Michael L.
2017-01-01
Task interpretation is a critical first step for students in the process of self-regulated learning, and a key determinant when they set goals in their learning and select strategies in assigned work. This paper focuses on the explicit and implicit aspects of task interpretation based on Hadwin's model. Laboratory activities improve students'…
Proposed Ordinance for the Regulation of Cable Television. Working Draft.
ERIC Educational Resources Information Center
Chicago City Council, IL.
A model ordinance is proposed for the regulation of cable television in the city of Chicago. It defines the language of the ordinance, sets forth the method of granting franchises, and describes the terms of the franchises. The duties of a commission to regulate cable television are listed and the method of selecting commission members is…
ERIC Educational Resources Information Center
North Carolina State Univ., Raleigh. Center for Accessible Housing.
The purpose of this project was to develop and present a set of recommendations for supplements to the existing Uniform Federal Accessibility Standards (UFAS) that would apply to environments used by children with physical disabilities. The work was subdivided into six phases: (1) review of selected codes, standards and guidelines; (2) review of…
Construction and Maintenance of Classroom Aquaria. Marine Science Curriculum Aid No. 2.
ERIC Educational Resources Information Center
Lee, Richard S.
This manual introduces teachers to the biological systems at work in a marine aquarium. It provides guidance in selection of the tanks, specifically discussing the effect of capacity on the well-being of the occupants. It guides the teacher in setting up aeration, filtering, lighting, and temperature control for the aquarium. It also advises on…
A Retrospective Examination of a University's Thirteen Years in Latin America.
ERIC Educational Resources Information Center
Blood, Ronald E.; And Others
The context of educational reform within which U.S. higher education has worked in selected Latin American countries is examined, with attention directed to the specific experience of the University of New Mexico. The evolution of the Latin American Programs in Education office (LAPE) in the university setting, the organizational milieu in which…
Bokulich, Nicholas A.
2013-01-01
Ultra-high-throughput sequencing (HTS) of fungal communities has been restricted by short read lengths and primer amplification bias, slowing the adoption of newer sequencing technologies for fungal community profiling. To address these issues, we evaluated the performance of several common internal transcribed spacer (ITS) primers and designed a novel primer set and work flow for simultaneous quantification and species-level interrogation of fungal consortia. Primer comparison and validation were performed in silico and by sequencing a “mock community” of mixed yeast species to explore the challenges of amplicon length and amplification bias for reconstructing defined yeast community structures. The amplicon size and distribution of this primer set are smaller than for all preexisting ITS primer sets, maximizing sequencing coverage of hypervariable ITS domains by very-short-amplicon, high-throughput sequencing platforms. This feature also enables the optional integration of quantitative PCR (qPCR) directly into the HTS preparatory work flow by substituting qPCR with these primers for standard PCR, yielding quantification of individual community members. The complete work flow described here, utilizing any of the qualified primer sets evaluated, can rapidly profile mixed fungal communities, and it capably reconstructed well-characterized beer and wine fermentation fungal communities. PMID:23377949
Cárdenas, V; Cordobés, M; Blanco, M; Alcalà, M
2015-10-10
The pharmaceutical industry is under stringent regulations on quality control of its products because quality is critical for both the production process and consumer safety. According to the framework of "process analytical technology" (PAT), a complete understanding of the process and stepwise monitoring of manufacturing are required. Near infrared spectroscopy (NIRS) combined with chemometrics has lately proven efficient, useful, and robust for pharmaceutical analysis. One crucial step in developing effective NIRS-based methodologies is selecting an appropriate calibration set to construct models affording accurate predictions. In this work, we developed calibration models for a pharmaceutical formulation during its three manufacturing stages: blending, compaction, and coating. A novel methodology is proposed for selecting the calibration set, the "process spectrum", into which physical changes in the samples at each stage are algebraically incorporated. We also established a "model space" defined by Hotelling's T(2) and Q-residuals statistics for outlier identification (inside/outside the defined space) in order to select objectively the factors to be used in calibration set construction. The results obtained confirm the efficacy of the proposed methodology for stepwise pharmaceutical quality control, and the relevance of the study as a guideline for implementing this easy and fast methodology in the pharmaceutical industry. Copyright © 2015 Elsevier B.V. All rights reserved.
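A "model space" bounded by Hotelling's T² and Q-residual statistics is conventionally derived from a PCA model: T² measures distance within the retained components, Q the squared residual outside them. A minimal numpy sketch of the two statistics (the centering and component count are illustrative choices, not the paper's settings):

```python
import numpy as np

def t2_and_q(X, n_components):
    """Hotelling's T^2 and Q-residual statistics from a PCA model of X."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T                        # loadings
    T = Xc @ P                                     # scores
    var = (s[:n_components] ** 2) / (len(X) - 1)   # variance per component
    t2 = np.sum(T ** 2 / var, axis=1)              # distance inside the model
    resid = Xc - T @ P.T
    q = np.sum(resid ** 2, axis=1)                 # distance from the model plane
    return t2, q
```

A sample with large Q lies off the calibration model's subspace (a new kind of variation), while a large T² flags an extreme but in-model sample; thresholding both defines the inside/outside decision described above.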
Impact of Neutrino Opacities on Core-collapse Supernova Simulations
NASA Astrophysics Data System (ADS)
Kotake, Kei; Takiwaki, Tomoya; Fischer, Tobias; Nakamura, Ko; Martínez-Pinedo, Gabriel
2018-02-01
The accurate description of neutrino opacities is central to both the core-collapse supernova (CCSN) phenomenon and the validity of the explosion mechanism itself. In this work, we study in a systematic fashion the role of a variety of well-selected neutrino opacities in CCSN simulations where the multi-energy, three-flavor neutrino transport is solved using the isotropic diffusion source approximation (IDSA) scheme. To verify our code, we first present results from one-dimensional (1D) simulations following the core collapse, bounce, and ∼250 ms postbounce of a 15 M⊙ star using a standard set of neutrino opacities by Bruenn. A detailed comparison with published results supports the reliability of our three-flavor IDSA scheme using the standard opacity set. We then investigate in 1D simulations how individual opacity updates lead to differences with the baseline run with the standard opacity set. Through detailed comparisons with previous work, we check the validity of our implementation of each update in a step-by-step manner. Individual neutrino opacities with the largest impact on the overall evolution in 1D simulations are selected for systematic comparisons in our two-dimensional (2D) simulations. Special attention is given to the criterion of explodability in the 2D models. We discuss the implications of these results as well as their limitations and the requirements for future, more elaborate CCSN modeling.
Improving accuracy and power with transfer learning using a meta-analytic database.
Schwartz, Yannick; Varoquaux, Gaël; Pallier, Christophe; Pinel, Philippe; Poline, Jean-Baptiste; Thirion, Bertrand
2012-01-01
Typical cohorts in brain imaging studies are not large enough for systematic testing of all the information contained in the images. To build testable working hypotheses, investigators thus rely on analysis of previous work, sometimes formalized in a so-called meta-analysis. In brain imaging, this approach underlies the specification of regions of interest (ROIs) that are usually selected on the basis of the coordinates of previously detected effects. In this paper, we propose to use a database of images, rather than coordinates, and frame the problem as transfer learning: learning a discriminant model on a reference task to apply it to a different but related new task. To facilitate statistical analysis of small cohorts, we use a sparse discriminant model that selects predictive voxels on the reference task and thus provides a principled procedure to define ROIs. The benefits of our approach are twofold. First it uses the reference database for prediction, i.e., to provide potential biomarkers in a clinical setting. Second it increases statistical power on the new task. We demonstrate on a set of 18 pairs of functional MRI experimental conditions that our approach gives good prediction. In addition, on a specific transfer situation involving different scanners at different locations, we show that voxel selection based on transfer learning leads to higher detection power on small cohorts.
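The voxel-selection idea, a sparse discriminant model whose nonzero coefficients define candidate ROIs, can be sketched with a plain L1-penalised logistic regression fitted by proximal gradient descent. The tiny synthetic data, penalty value, and function name below are illustrative assumptions, not the authors' actual model or fMRI data:

```python
import math
import random

def l1_logistic(X, y, lam=0.1, step=0.05, iters=1000):
    """L1-penalised logistic regression via proximal gradient (ISTA).

    Nonzero weights indicate selected features ("voxels" in the sketch).
    """
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(iters):
        grad = [0.0] * p
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi))
            err = 1.0 / (1.0 + math.exp(-z)) - yi  # predicted prob minus label
            for j in range(p):
                grad[j] += err * xi[j] / n
        # gradient step followed by soft-thresholding (the L1 proximal map)
        for j in range(p):
            wj = w[j] - step * grad[j]
            w[j] = math.copysign(max(abs(wj) - step * lam, 0.0), wj)
    return w

random.seed(0)
# feature 0 carries the class signal; features 1..4 are pure noise
X, y = [], []
for _ in range(200):
    label = random.randint(0, 1)
    row = [2.0 * label - 1.0 + random.gauss(0, 0.3)]
    row += [random.gauss(0, 1) for _ in range(4)]
    X.append(row)
    y.append(label)

w = l1_logistic(X, y)
selected = [j for j in range(5) if abs(w[j]) > 1e-6]  # the "ROI" of this sketch
```

In the transfer-learning setting, the sparse model would be fitted on the large reference task and the surviving features reused as the ROI for the small new cohort.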
Phillips, Tim; Ferguson, Eamonn; Rijsdijk, Fruhling
2010-11-01
Altruistic behaviour raises major questions for psychology and biology. One hypothesis proposes that human altruistic behaviour evolved as a result of sexual selection. Mechanisms that seek to explain how sexual selection works suggest genetic influence acting on both the mate preference for the trait and the preferred trait itself. We used a twin study to estimate whether genetic effects influenced responses to psychometric scales measuring mate preference towards altruistic traits (MPAT) and the preferred trait (i.e., 'altruistic personality'). As predicted, we found significant genetic effects influencing variation in both. We also predicted that individuals expressing stronger MPAT and 'altruistic personality' would have mated at a greater frequency in ancestral populations. We found evidence for this in that 67% of the covariance in the phenotypic correlation between the two scales was associated with significant genetic effects. Both sets of findings are thus consistent with the hypothesized link between sexual selection and human altruism towards non-kin. We discuss how this study contributes to our understanding of altruistic behaviour and how further work might extend this understanding.
Liu, S.; Bremer, P. -T; Jayaraman, J. J.; ...
2016-06-04
Linear projections are one of the most common approaches to visualize high-dimensional data. Since the space of possible projections is large, existing systems usually select a small set of interesting projections by ranking a large set of candidate projections based on a chosen quality measure. However, while highly ranked projections can be informative, some lower ranked ones could offer important complementary information. Therefore, selection based on ranking may miss projections that are important to provide a global picture of the data. Here, the proposed work fills this gap by presenting the Grassmannian Atlas, a framework that captures the global structures of quality measures in the space of all projections, which enables a systematic exploration of many complementary projections and provides new insights into the properties of existing quality measures.
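The baseline ranking step that the Grassmannian Atlas improves upon can be sketched as follows. Projected variance stands in for whichever quality measure a real system would use, and only axis-aligned projections are enumerated; both are simplifying assumptions:

```python
import itertools
import random

def rank_projections(X, k=2):
    """Score every axis-aligned k-D projection by total projected variance
    (a stand-in for a projection quality measure) and rank best-first."""
    n, d = len(X), len(X[0])
    means = [sum(col) / n for col in zip(*X)]
    var = [sum((row[j] - means[j]) ** 2 for row in X) / n for j in range(d)]
    scored = [(sum(var[j] for j in axes), axes)
              for axes in itertools.combinations(range(d), k)]
    return sorted(scored, reverse=True)

random.seed(1)
# 4-D data whose spread shrinks with axis index
X = [[random.gauss(0, s) for s in (3.0, 2.0, 1.0, 0.5)] for _ in range(500)]
ranking = rank_projections(X)  # ranking[0] is the top-scored axis pair
```

Selecting only the head of such a ranking is exactly what can hide complementary low-ranked views; the Atlas instead maps how the quality measure varies over the whole space of projections.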
Xu, Ronghui; Hou, Jue; Chambers, Christina D
2018-06-01
Our work was motivated by small cohort studies on the risk of birth defects in infants born to pregnant women exposed to medications. We controlled for confounding using propensity scores (PS). The extremely rare events setting renders matching or stratification infeasible. In addition, the PS itself may be formed via different approaches to select confounders from a relatively long list of potential confounders. We carried out simulation experiments to compare different combinations of approaches: IPW or regression adjustment, with 1) inclusion of all potential confounders without selection, 2) selection based on the univariate association between the candidate variable and the outcome, or 3) selection based on change in effects (CIE). The simulation showed that IPW without selection leads to extremely large variances in the estimated odds ratio, which helps to explain the empirical data analysis results that we had observed. Copyright © 2018 Elsevier Inc. All rights reserved.
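A minimal sketch of the IPW estimator under discussion, with a simulated confounded data set. For clarity it uses the true propensity score, whereas in the study the PS would be estimated from selected confounders; the names and parameter values are hypothetical:

```python
import math
import random

def ipw_effect(y, t, e):
    """Horvitz-Thompson style IPW estimate of the average treatment effect."""
    n = len(y)
    treated = sum(yi * ti / ei for yi, ti, ei in zip(y, t, e)) / n
    control = sum(yi * (1 - ti) / (1 - ei) for yi, ti, ei in zip(y, t, e)) / n
    return treated - control

random.seed(42)
n, y, t, e = 5000, [], [], []
for _ in range(n):
    z = random.gauss(0, 1)                  # confounder
    p = 1.0 / (1.0 + math.exp(-z))          # true propensity score
    ti = 1 if random.random() < p else 0    # treatment depends on z
    yi = 2.0 * ti + z + random.gauss(0, 1)  # true treatment effect = 2
    y.append(yi); t.append(ti); e.append(p)

naive = (sum(yi for yi, ti in zip(y, t) if ti) / sum(t)
         - sum(yi for yi, ti in zip(y, t) if not ti) / (n - sum(t)))
ipw = ipw_effect(y, t, e)  # weighting removes the confounding bias
```

The large-variance problem the abstract reports arises when estimated propensities approach 0 or 1, so that a few inverse weights dominate the sums above.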
Yang, Kai-Fu; Li, Chao-Yi; Li, Yong-Jie
2015-01-01
Both neurons with orientation-selective and with non-selective surround inhibition have been observed in the primary visual cortex (V1) of primates and cats. Though the inhibition arising from the surround region (known as the non-classical receptive field, nCRF) is considered to play a critical role in visual perception, the specific roles of orientation-selective and non-selective inhibition in the task of contour detection are less well known. To clarify this question, we first carried out a computational analysis of the contour detection performance of V1 neurons with different types of surround inhibition, on the basis of which we then proposed two integrated models to evaluate their roles in this perceptual task by combining the two types of surround inhibition in two different ways. The two models were evaluated with synthetic images and a set of challenging natural images, and the results show that both integrated models outperform the typical models with orientation-selective or non-selective inhibition alone. The findings of this study suggest that V1 neurons with different types of center–surround interaction work in cooperative and adaptive ways, at least when extracting organized structures from cluttered natural scenes. This work is expected to inspire efficient phenomenological models for engineering applications in the field of computational machine vision. PMID:26136664
Automatic parameter selection for feature-based multi-sensor image registration
NASA Astrophysics Data System (ADS)
DelMarco, Stephen; Tom, Victor; Webb, Helen; Chao, Alan
2006-05-01
Accurate image registration is critical for applications such as precision targeting, geo-location, change-detection, surveillance, and remote sensing. However, the increasing volume of image data is exceeding the current capacity of human analysts to perform manual registration. This image data glut necessitates the development of automated approaches to image registration, including algorithm parameter value selection. Proper parameter value selection is crucial to the success of registration techniques. The appropriate algorithm parameters can be highly scene and sensor dependent. Therefore, robust algorithm parameter value selection approaches are a critical component of an end-to-end image registration algorithm. In previous work, we developed a general framework for multisensor image registration which includes feature-based registration approaches. In this work we examine the problem of automated parameter selection. We apply the automated parameter selection approach of Yitzhaky and Peli to select parameters for feature-based registration of multisensor image data. The approach consists of generating multiple feature-detected images by sweeping over parameter combinations and using these images to generate estimated ground truth. The feature-detected images are compared to the estimated ground truth images to generate ROC points associated with each parameter combination. We develop a strategy for selecting the optimal parameter set by choosing the parameter combination corresponding to the optimal ROC point. We present numerical results showing the effectiveness of the approach using registration of collected SAR data to reference EO data.
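The final selection step, choosing the parameter combination with the best ROC point, can be sketched as below. Distance to the ideal corner (FPR = 0, TPR = 1) is one common optimality criterion, though the paper's exact criterion may differ, and the parameter names and values shown are hypothetical:

```python
import math

def best_operating_point(roc):
    """Pick the parameter combination whose (FPR, TPR) point lies closest
    to the ideal ROC corner (FPR=0, TPR=1)."""
    return min(roc, key=lambda item: math.hypot(item[1], 1.0 - item[2]))

# hypothetical (params, FPR, TPR) triples from sweeping feature-detector settings
candidates = [
    ({"sigma": 1.0, "thresh": 0.2}, 0.30, 0.95),
    ({"sigma": 2.0, "thresh": 0.4}, 0.10, 0.85),
    ({"sigma": 3.0, "thresh": 0.6}, 0.05, 0.60),
]
best = best_operating_point(candidates)  # the middle setting wins here
```

In the full pipeline, the FPR/TPR pairs come from comparing each parameter combination's feature-detected image against the estimated ground truth built from the whole sweep.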
Resource selection for an interdisciplinary field: a methodology.
Jacoby, Beth E; Murray, Jane; Alterman, Ina; Welbourne, Penny
2002-10-01
The Health Sciences and Human Services Library of the University of Maryland developed and implemented a methodology to evaluate print and digital resources for social work. Although this methodology was devised for the interdisciplinary field of social work, the authors believe it may lend itself to resource selection in other interdisciplinary fields. The methodology was developed in response to the results of two separate surveys conducted in late 1999, which indicated improvement was needed in the library's graduate-level social work collections. Library liaisons evaluated the print collection by identifying forty-five locally relevant Library of Congress subject headings and then using these subjects or synonymous terms to compare the library's titles to collections of peer institutions, publisher catalogs, and Amazon.com. The collection also was compared to social work association bibliographies, ISI Journal Citation Reports, and major social work citation databases. An approval plan for social work books was set up to assist in identifying newly published titles. The library acquired new print and digital social work resources as a result of the evaluation, thus improving both print and digital collections for its social work constituents. Visibility of digital resources was increased by cataloging individual titles in aggregated electronic journal packages and listing each title on the library Web page.
Stålring, Jonna C; Carlsson, Lars A; Almeida, Pedro; Boyer, Scott
2011-01-01
Background Machine learning has a vast range of applications. In particular, advanced machine learning methods are routinely and increasingly used in quantitative structure activity relationship (QSAR) modeling. QSAR data sets often encompass tens of thousands of compounds and the size of proprietary, as well as public data sets, is rapidly growing. Hence, there is a demand for computationally efficient machine learning algorithms, easily available to researchers without extensive machine learning knowledge. In granting the scientific principles of transparency and reproducibility, Open Source solutions are increasingly acknowledged by regulatory authorities. Thus, an Open Source state-of-the-art high performance machine learning platform, interfacing multiple, customized machine learning algorithms for both graphical programming and scripting, to be used for large scale development of QSAR models of regulatory quality, is of great value to the QSAR community. Results This paper describes the implementation of the Open Source machine learning package AZOrange. AZOrange is specially developed to support batch generation of QSAR models in providing the full work flow of QSAR modeling, from descriptor calculation to automated model building, validation and selection. The automated work flow relies upon the customization of the machine learning algorithms and a generalized, automated model hyper-parameter selection process. Several high performance machine learning algorithms are interfaced for efficient data set specific selection of the statistical method, promoting model accuracy. Using the high performance machine learning algorithms of AZOrange does not require programming knowledge as flexible applications can be created, not only at a scripting level, but also in a graphical programming environment. 
Conclusions AZOrange is a step towards meeting the needs for an Open Source high performance machine learning platform, supporting the efficient development of highly accurate QSAR models fulfilling regulatory requirements. PMID:21798025
Chee, H; Rampal, K
2003-01-01
Aims: To determine the relation between sick leave and selected exposure variables among women semiconductor workers. Methods: This was a cross sectional survey of production workers from 18 semiconductor factories. Those selected had to be women, direct production operators up to the level of line leader, and Malaysian citizens. Sick leave and exposure to physical and chemical hazards were determined by self reporting. Three sick leave variables were used; number of sick leave days taken in the past year was the variable of interest in logistic regression models where the effects of age, marital status, work task, work schedule, work section, and duration of work in factory and work section were also explored. Results: Marital status was strongly linked to the taking of sick leave. Age, work schedule, and duration of work in the factory were significant confounders only in certain cases. After adjusting for these confounders, chemical and physical exposures, with the exception of poor ventilation and smelling chemicals, showed no significant relation to the taking of sick leave within the past year. Work section was a good predictor for taking sick leave, as wafer polishing workers faced higher odds of taking sick leave for each of the three cut off points of seven days, three days, and not at all, while parts assembly workers also faced significantly higher odds of taking sick leave. Conclusion: In Malaysia, the wafer fabrication factories only carry out a limited portion of the work processes, in particular, wafer polishing and the processes immediately prior to and following it. This study, in showing higher illness rates for workers in wafer polishing compared to semiconductor assembly, has implications for the governmental policy of encouraging the setting up of wafer fabrication plants with the full range of work processes. PMID:12660374
Horigian, Viviana E; Anderson, Austen R; Szapocznik, José
2016-09-01
In this article, we review the research evidence generated over 40 years on Brief Strategic Family Therapy illustrating the NIH stages of intervention development and highlighting the translational process. Basic research (Stage 0) led to the discovery of the characteristics of the population and the nature of the problems that needed to be addressed. This step informed the selection of an intervention model that addressed the problems presented by the population, but in a fashion that was congruent with the population's culture, defined in terms of its value orientations. From this basic research, an intervention that integrated structural and strategic elements was selected and refined through testing (Stage I). The second stage of translation (Stage II) included efficacy trials of a specialized engagement module that responded to challenges to the provision of services. It also included several other efficacy trials that documented the effects of the intervention, mostly in research settings or with research therapists. Stages III/IV in the translational process led to the testing of the effectiveness of the intervention in real-world settings with community therapists and some oversight from the developer. This work revealed that an implementation/organizational intervention was required to achieve fidelity and sustainability of the intervention in real-world settings. The work is currently in Stage V in which new model development led to an implementation intervention that can ensure fidelity and sustainability. Future research will evaluate the effectiveness of the current implementation model in increasing adoption, fidelity, and long-term sustainability in real-world settings. © 2016 Family Process Institute.
Sutton, Steven C; Hu, Mingxiu
2006-05-05
Many mathematical models have been proposed for establishing an in vitro/in vivo correlation (IVIVC). The traditional IVIVC model building process consists of 5 steps: deconvolution, model fitting, convolution, prediction error evaluation, and cross-validation. This is a time-consuming process, and typically a few models at most are tested for any given data set. The objectives of this work were to (1) propose a statistical tool to screen models for further development of an IVIVC, (2) evaluate the performance of each model under different circumstances, and (3) investigate the effectiveness of common statistical model selection criteria for choosing IVIVC models. A computer program was developed to explore which model(s) would be most likely to work well with a random variation from the original formulation. The process used Monte Carlo simulation techniques to build IVIVC models. Data-based model selection criteria (Akaike Information Criterion [AIC], R2) and the probability of passing the Food and Drug Administration "prediction error" requirement were calculated. Several real data sets representing a broad range of release profiles are used to illustrate the process and to demonstrate the advantages of this automated approach over the traditional one. The Hixson-Crowell and Weibull models were often preferred over the linear model. When evaluating whether a Level A IVIVC model was possible, the model selection criterion AIC generally selected the best model. We believe that the approach we proposed may be a rapid tool to determine which IVIVC model (if any) is the most applicable.
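The model-screening idea can be sketched by fitting several dissolution models to one release profile and comparing AIC values. The closed-form/linearised fits and the synthetic Weibull profile below are simplifying assumptions for illustration, not the authors' Monte Carlo procedure:

```python
import math

def fit_linear(t, f):
    """Least-squares line f ~ m*t + c; returns (predictions, n_params)."""
    n = len(t)
    mt, mf = sum(t) / n, sum(f) / n
    m = (sum((ti - mt) * (fi - mf) for ti, fi in zip(t, f))
         / sum((ti - mt) ** 2 for ti in t))
    c = mf - m * mt
    return [m * ti + c for ti in t], 2

def fit_weibull(t, f):
    """Weibull F = 1 - exp(-(t/a)^b), fitted on the linearised scale
    log(-log(1-F)) = b*log(t) - b*log(a)."""
    x = [math.log(ti) for ti in t]
    z = [math.log(-math.log(1.0 - fi)) for fi in f]
    n = len(t)
    mx, mz = sum(x) / n, sum(z) / n
    b = (sum((xi - mx) * (zi - mz) for xi, zi in zip(x, z))
         / sum((xi - mx) ** 2 for xi in x))
    log_a = mx - mz / b
    return [1.0 - math.exp(-math.exp(b * (xi - log_a))) for xi in x], 2

def fit_hixson_crowell(t, f):
    """Hixson-Crowell cube-root law: 1 - (1-F)^(1/3) = k*t, one parameter."""
    y = [1.0 - (1.0 - fi) ** (1.0 / 3.0) for fi in f]
    k = sum(yi * ti for yi, ti in zip(y, t)) / sum(ti * ti for ti in t)
    return [1.0 - (1.0 - k * ti) ** 3 for ti in t], 1

def aic(f, pred, n_params):
    n = len(f)
    rss = sum((fi - pi) ** 2 for fi, pi in zip(f, pred))
    return n * math.log(rss / n) + 2 * n_params

t = [0.5 * i for i in range(1, 11)]
# fraction dissolved from a Weibull profile plus small deterministic noise
f = [1.0 - math.exp(-((ti / 2.0) ** 1.5)) + 0.01 * math.sin(i)
     for i, ti in enumerate(t)]
fits = {name: fn(t, f) for name, fn in
        [("linear", fit_linear), ("weibull", fit_weibull),
         ("hixson-crowell", fit_hixson_crowell)]}
best = min(fits, key=lambda name: aic(f, *fits[name]))
```

Wrapping such a comparison in a Monte Carlo loop over perturbed profiles, as the abstract describes, then estimates how often each candidate model would pass the prediction-error requirement.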
Gomes, Adriano de Araújo; Alcaraz, Mirta Raquel; Goicoechea, Hector C; Araújo, Mario Cesar U
2014-02-06
In this work the Successive Projection Algorithm is presented for interval selection in N-PLS for three-way data modeling. The proposed algorithm combines the noise-reduction properties of PLS with the possibility of discarding uninformative variables in SPA. In addition, the second-order advantage can be achieved by the residual bilinearization (RBL) procedure when an unexpected constituent is present in a test sample. For this purpose, SPA was modified in order to select intervals for use in trilinear PLS. The ability of the proposed algorithm, namely iSPA-N-PLS, was evaluated on one simulated and two experimental data sets, comparing the results to those obtained by N-PLS. In the simulated system, two analytes were quantitated in two test sets, with and without an unexpected constituent. In the first experimental system, the determination of four fluorophores (l-phenylalanine; l-3,4-dihydroxyphenylalanine; 1,4-dihydroxybenzene and l-tryptophan) was conducted with excitation-emission data matrices. In the second experimental system, quantitation of ofloxacin was performed in water samples containing two other uncalibrated quinolones (ciprofloxacin and danofloxacin) by high performance liquid chromatography with a UV-vis diode array detector. For comparison purposes, a GA algorithm coupled with N-PLS/RBL was also used in this work. In most of the studied cases iSPA-N-PLS proved to be a promising tool for selection of variables in second-order calibration, generating models with smaller RMSEP, when compared to both the global model using all of the sensors in two dimensions and GA-NPLS/RBL. Copyright © 2013 Elsevier B.V. All rights reserved.
Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines
Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu
2016-01-01
In a complex system, condition monitoring (CM) can track the working status of the system. The condition is mainly sensed by the pre-deployed sensors in/on the system. Most existing works study how to utilize the condition information to predict upcoming anomalies, faults, or failures. There is also some research which focuses on the faults or anomalies of the sensing element (i.e., sensor) to enhance the system reliability. However, existing approaches ignore the correlation between the sensor selection strategy and data anomaly detection, which can also improve the system reliability. To address this issue, we study a new scheme which includes a sensor selection strategy and data anomaly detection by utilizing information theory and Gaussian Process Regression (GPR). The sensors that are more appropriate for the system CM are first selected. Then, mutual information is utilized to weight the correlation among different sensors. The anomaly detection is carried out by using the correlation of sensor data. The sensor data sets that are utilized to carry out the evaluation are provided by the National Aeronautics and Space Administration (NASA) Ames Research Center and have been used as Prognostics and Health Management (PHM) challenge data in 2008. By comparing the two different sensor selection strategies, the effectiveness of the selection method for data anomaly detection is demonstrated. PMID:27136561
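The mutual-information step can be sketched as below: a plugin MI estimate on binned data ranks a hypothetical informative sensor above a noisy one. The sensor names and data are assumptions, and the paper's GPR-based detection stage is omitted here:

```python
import math
import random
from collections import Counter

def mutual_information(xs, ys, bins=4):
    """Plugin mutual-information estimate (in nats) between two sequences,
    after equal-width binning of each."""
    def binned(v):
        lo, hi = min(v), max(v)
        return [min(int((x - lo) / (hi - lo) * bins), bins - 1) for x in v]
    bx, by = binned(xs), binned(ys)
    n = len(xs)
    pxy = Counter(zip(bx, by))        # joint bin counts
    px, py = Counter(bx), Counter(by)  # marginal bin counts
    # I(X;Y) = sum p(x,y) * log( p(x,y) / (p(x) p(y)) )
    return sum(c / n * math.log(c * n / (px[a] * py[b]))
               for (a, b), c in pxy.items())

random.seed(7)
target = [random.gauss(0, 1) for _ in range(500)]
sensors = {
    "informative": [t + random.gauss(0, 0.2) for t in target],  # tracks the target
    "noisy":       [random.gauss(0, 1) for _ in target],        # independent noise
}
ranking = sorted(sensors, key=lambda s: mutual_information(sensors[s], target),
                 reverse=True)
```

Ranking sensors this way before anomaly detection is one plausible reading of the selection stage; the published scheme additionally weights the pairwise correlations among the selected sensors.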
Mastenbroek, N J J M; Demerouti, E; van Beukelen, P; Muijtjens, A M M; Scherpbier, A J J A; Jaarsma, A D C
2014-02-15
The Job Demands-Resources model (JD-R model) was used as the theoretical basis of a tailormade questionnaire to measure the psychosocial work environment and personal resources of recently graduated veterinary professionals. According to the JD-R model, two broad categories of work characteristics that determine employee wellbeing can be distinguished: job demands and job resources. Recently, the JD-R model has been expanded by integrating personal resource measures into the model. Three semistructured group interviews with veterinarians active in different work domains were conducted to identify relevant job demands, job resources and personal resources. These demands and resources were organised in themes (constructs). For measurement purposes, a set of questions ('a priori scale') was selected from the literature for each theme. The full set of a priori scales was included in a questionnaire that was administered to 1760 veterinary professionals. Exploratory factor analysis and reliability analysis were conducted to arrive at the final set of validated scales (final scales). 860 veterinarians (73 per cent females) participated. The final set of scales consisted of seven job demands scales (32 items), nine job resources scales (41 items), and six personal resources scales (26 items) which were considered to represent the most relevant potential predictors of work-related wellbeing in this occupational group. The procedure resulted in a tailormade questionnaire: the Veterinary Job Demands and Resources Questionnaire (Vet-DRQ). The use of valid theory and validated scales enhances opportunities for comparative national and international research.
Jürgen Stock: From One End of the Andes to the Other
NASA Astrophysics Data System (ADS)
Vivas, A. K.; Stock, M. J.
2015-05-01
Jürgen Stock (1923-2004) will always be remembered for his work on astronomical site testing. He led the efforts to find the best place for CTIO, and his work had a large influence in the setting of other observatories in Chile. He was the first director of CTIO (1963-1966). After his time in Chile, he moved to the other end of the Andes and was in charge of the site selection and the construction of the only professional observatory in Venezuela, the Llano del Hato National Observatory.
Automated sample plan selection for OPC modeling
NASA Astrophysics Data System (ADS)
Casati, Nathalie; Gabrani, Maria; Viswanathan, Ramya; Bayraktar, Zikri; Jaiswal, Om; DeMaris, David; Abdo, Amr Y.; Oberschmidt, James; Krause, Andreas
2014-03-01
It is desired to reduce the time required to produce metrology data for calibration of Optical Proximity Correction (OPC) models while maintaining or improving how well the collected data represent the types of patterns that occur in real circuit designs. Previous work based on clustering in geometry and/or image parameter space has shown some benefit over strictly manual or intuitive selection, but leads to arbitrary pattern exclusion or selection, which may not best represent the product. Formulating pattern selection as an optimization problem, which co-optimizes a number of objective functions reflecting modelers' insight and expertise, has been shown to produce models of quality equivalent to the traditional plan of record (POR) set, but in less time.
Kastenmayer, Robin J; Moore, Rashida M; Bright, Allison L; Torres-Cruz, Rafael; Elkins, William R
2012-01-01
In the interval between the publication of the seventh and eighth editions of the Guide for the Care and Use of Laboratory Animals (Guide), much has changed with regard to the regulation and funding of highly pathogenic biologic agents and toxins (Select Agents). Funding of research involving highly pathogenic agents has increased dramatically during this time, thus increasing the demand for facilities capable of supporting this work. The eighth edition of the Guide briefly mentions Select Agents and provides a limited set of references. Here we provide some background information regarding the relevant laws and regulations, as well as an overview of the programmatic requirements pertaining to the use of Select Agents, with a focus on use in animals. PMID:22776191
Forest conditions and trends in the northern United States
Stephen R. Shifley; Francisco X. Aguilar; Nianfu Song; Susan I. Stewart; David J. Nowak; Dale D. Gormanson; W. Keith Moser; Sherri Wormstead; Eric J. Greenfield
2012-01-01
This section describes current conditions and trends for the 20 Northern States by focusing on selected characteristics associated with forest sustainability. Its format is based upon a set of 64 indicators within 7 broad criteria that the United States and 11 other countries have adopted under the auspices of the Montréal Process Working Group on Criteria and...
ERIC Educational Resources Information Center
Husnaeni
2016-01-01
Students' mathematical critical thinking ability is a competency that must be mastered. Learning to think critically means using mental processes such as attention, categorization, selection, and judgement. Critical thinking ability gives proper guidance in thinking and working, and assists in determining the relationship between…
ERIC Educational Resources Information Center
Ingleby, Ewan
2018-01-01
This article explores the perceptions of professional development held by a selection of early years educators who have experience of working in statutory and private early years settings in the north of England. The research participants (n = 20) reflected on their experiences of professional development in early years. The research process is…
ERIC Educational Resources Information Center
Fink, C. Dennis; And Others
Recent efforts to assess complex human performances in various work settings are reviewed. The review is based upon recent psychological, educational, and industrial literature, and technical reports sponsored by the military services. A few selected military and industrial locations were also visited in order to learn about current research and…
Training set selection for the prediction of essential genes.
Cheng, Jian; Xu, Zhao; Wu, Wenwu; Zhao, Li; Li, Xiangchen; Liu, Yanlin; Tao, Shiheng
2014-01-01
Various computational models have been developed to transfer annotations of gene essentiality between organisms. However, despite the increasing number of microorganisms with well-characterized sets of essential genes, selection of appropriate training sets for predicting the essential genes of poorly studied or newly sequenced organisms remains challenging. In this study, a machine learning approach was applied reciprocally to predict the essential genes in 21 microorganisms. Results showed that training set selection greatly influenced predictive accuracy. We determined four criteria for training set selection: (1) essential genes in the selected training set should be reliable; (2) the growth conditions in which essential genes are defined should be consistent in training and prediction sets; (3) species used as the training set should be closely related to the target organism; and (4) organisms used as training and prediction sets should exhibit similar phenotypes or lifestyles. We then analyzed the performance of an incomplete training set and an integrated training set with multiple organisms. We found that the size of the training set should be at least 10% of the total genes to yield accurate predictions. Additionally, the integrated training sets exhibited a remarkable increase in stability and accuracy compared with single sets. Finally, we compared the performance of the integrated training sets with the four criteria and with random selection. The results revealed that a rational selection of training sets based on our criteria yields better performance than random selection. Thus, our results provide empirical guidance on training set selection for the identification of essential genes on a genome-wide scale.
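The four selection criteria lend themselves to a simple scoring sketch. The equal weights, the field names, and the toy organism records below are illustrative assumptions, not values from the study:

```python
# Hypothetical scoring of candidate training organisms against the four
# criteria named in the abstract. Weights and record fields are invented.

def criteria_score(candidate, target):
    score = 0.0
    score += 1.0 if candidate["reliable_essentiality_data"] else 0.0          # (1)
    score += 1.0 if candidate["growth_condition"] == target["growth_condition"] else 0.0  # (2)
    score += 1.0 / (1.0 + candidate["phylo_distance_to_target"])              # (3)
    score += 1.0 if candidate["lifestyle"] == target["lifestyle"] else 0.0    # (4)
    return score

def rank_training_sets(candidates, target):
    """Best candidate training organisms first."""
    return sorted(candidates, key=lambda c: criteria_score(c, target), reverse=True)
```

A practical extension of the sketch, following the abstract's size finding, would also require the chosen training set to contain at least 10% of the target's total genes.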
What are the implications of implementation science for medical education?
Price, David W.; Wagner, Dianne P.; Krane, N. Kevin; Rougas, Steven C.; Lowitt, Nancy R.; Offodile, Regina S.; Easdown, L. Jane; Andrews, Mark A. W.; Kodner, Charles M.; Lypson, Monica; Barnes, Barbara E.
2015-01-01
Background Derived from multiple disciplines and established in industries outside of medicine, Implementation Science (IS) seeks to move evidence-based approaches into widespread use to enable improved outcomes to be realized as quickly as possible by as many as possible. Methods This review highlights selected IS theories and models, chosen based on the experience of the authors, that could be used to plan and deliver medical education activities to help learners better implement and sustain new knowledge and skills in their work settings. Results IS models, theories and approaches can help medical educators promote and determine their success in achieving desired learner outcomes. We discuss the importance of incorporating IS into the training of individuals, teams, and organizations, and employing IS across the medical education continuum. Challenges and specific strategies for the application of IS in educational settings are also discussed. Conclusions Utilizing IS in medical education can help us better achieve changes in competence, performance, and patient outcomes. IS should be incorporated into curricula across disciplines and across the continuum of medical education to facilitate implementation of learning. Educators should start by selecting, applying, and evaluating the teaching and patient care impact of one or two IS strategies in their work. PMID:25911282
Garcia Hejl, Carine; Ramirez, Jose Manuel; Vest, Philippe; Chianea, Denis; Renard, Christophe
2014-09-01
Laboratories working towards accreditation by the International Standards Organization (ISO) 15189 standard are required to demonstrate the validity of their analytical methods. The different guidelines set by various accreditation organizations make it difficult to provide objective evidence that an in-house method is fit for the intended purpose. Besides, the required performance characteristics tests and acceptance criteria are not always detailed. The laboratory must choose the most suitable validation protocol and set the acceptance criteria. Therefore, we propose a validation protocol to evaluate the performance of an in-house method. As an example, we validated the process for the detection and quantification of lead in whole blood by electrothermal absorption spectrometry. The fundamental parameters tested were, selectivity, calibration model, precision, accuracy (and uncertainty of measurement), contamination, stability of the sample, reference interval, and analytical interference. We have developed a protocol that has been applied successfully to quantify lead in whole blood by electrothermal atomic absorption spectrometry (ETAAS). In particular, our method is selective, linear, accurate, and precise, making it suitable for use in routine diagnostics.
A mathematical framework for the selection of an optimal set of peptides for epitope-based vaccines.
Toussaint, Nora C; Dönnes, Pierre; Kohlbacher, Oliver
2008-12-01
Epitope-based vaccines (EVs) have a wide range of applications: from therapeutic to prophylactic approaches, from infectious diseases to cancer. The development of an EV is based on the knowledge of target-specific antigens from which immunogenic peptides, so-called epitopes, are derived. Such epitopes form the key components of the EV. Due to regulatory, economic, and practical concerns the number of epitopes that can be included in an EV is limited. Furthermore, as the major histocompatibility complex (MHC) binding these epitopes is highly polymorphic, every patient possesses a set of MHC class I and class II molecules of differing specificities. A peptide combination effective for one person can thus be completely ineffective for another. This renders the optimal selection of these epitopes an important and interesting optimization problem. In this work we present a mathematical framework based on integer linear programming (ILP) that allows the formulation of various flavors of the vaccine design problem and the efficient identification of optimal sets of epitopes. Out of a user-defined set of predicted or experimentally determined epitopes, the framework selects the set with the maximum likelihood of eliciting a broad and potent immune response. Our ILP approach allows an elegant and flexible formulation of numerous variants of the EV design problem. In order to demonstrate this, we show how common immunological requirements for a good EV (e.g., coverage of epitopes from each antigen, coverage of all MHC alleles in a set, or avoidance of epitopes with high mutation rates) can be translated into constraints or modifications of the objective function within the ILP framework. An implementation of the algorithm outperforms a simple greedy strategy as well as a previously suggested evolutionary algorithm and has runtimes on the order of seconds for typical problem sizes.
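The core selection problem can be illustrated without an ILP solver: on a toy instance, exhaustive search over fixed-size subsets expresses the same objective (maximize total predicted immunogenicity) and the same kind of constraint (cover every MHC allele in a target set). The epitope names, scores, and allele labels below are invented:

```python
from itertools import combinations

# Toy stand-in for the ILP formulation: choose k epitopes maximizing total
# predicted immunogenicity, subject to covering every required MHC allele.
# The authors solve this class of problem with integer linear programming;
# brute force suffices only for tiny illustrative instances like this one.

def select_epitopes(epitopes, k, required_alleles):
    """epitopes: list of (name, immunogenicity, set_of_alleles_bound)."""
    best, best_score = None, -1.0
    for combo in combinations(epitopes, k):
        covered = set().union(*(e[2] for e in combo))
        if not required_alleles <= covered:
            continue  # constraint: every required allele must be covered
        score = sum(e[1] for e in combo)
        if score > best_score:
            best, best_score = combo, score
    return best, best_score
```

Extra immunological requirements (per-antigen coverage, avoiding high-mutation epitopes) would appear here as further `continue` filters, mirroring the constraints added in the ILP framework.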
Adaptive Batch Mode Active Learning.
Chakraborty, Shayok; Balasubramanian, Vineeth; Panchanathan, Sethuraman
2015-08-01
Active learning techniques have gained popularity to reduce human effort in labeling data instances for inducing a classifier. When faced with large amounts of unlabeled data, such algorithms automatically identify the exemplar and representative instances to be selected for manual annotation. More recently, there have been attempts toward a batch mode form of active learning, where a batch of data points is simultaneously selected from an unlabeled set. Real-world applications require adaptive approaches for batch selection in active learning, depending on the complexity of the data stream in question. However, the existing work in this field has primarily focused on static or heuristic batch size selection. In this paper, we propose two novel optimization-based frameworks for adaptive batch mode active learning (BMAL), where the batch size as well as the selection criteria are combined in a single formulation. We exploit gradient-descent-based optimization strategies as well as properties of submodular functions to derive the adaptive BMAL algorithms. The solution procedures have the same computational complexity as existing state-of-the-art static BMAL techniques. Our empirical results on the widely used VidTIMIT and the mobile biometric (MOBIO) data sets portray the efficacy of the proposed frameworks and also certify the potential of these approaches in being used for real-world biometric recognition applications.
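One way to sketch the submodular side of such batch selection is a greedy facility-location objective, with an adaptive stopping rule standing in for the paper's optimization-based batch-size selection. The similarity measure and the gain threshold below are illustrative assumptions, not the BMAL formulations themselves:

```python
import numpy as np

# Hedged sketch: greedy selection under a facility-location objective
# (selected points should be representative of the unlabeled pool), with
# the batch size decided adaptively — we stop when the marginal coverage
# gain of the next point drops below a threshold.

def greedy_adaptive_batch(X, min_gain):
    # negative pairwise distance as a similarity matrix (illustrative choice)
    sim = -np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    n = len(X)
    selected, cover = [], np.full(n, -np.inf)
    while True:
        gains = [(np.maximum(cover, sim[:, j]).sum() - cover.sum())
                 if j not in selected else -np.inf for j in range(n)]
        j = int(np.argmax(gains))
        if gains[j] < min_gain:
            break  # adaptive stop: next point adds too little coverage
        selected.append(j)
        cover = np.maximum(cover, sim[:, j])
    return selected
```

On a pool with two well-separated clusters, the sketch selects one point per cluster and then stops on its own, which is the adaptive behaviour the paper argues for.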
Nonequilibrium steady states of ideal bosonic and fermionic quantum gases.
Vorberg, Daniel; Wustmann, Waltraut; Schomerus, Henning; Ketzmerick, Roland; Eckardt, André
2015-12-01
We investigate nonequilibrium steady states of driven-dissipative ideal quantum gases of both bosons and fermions. We focus on systems of sharp particle number that are driven out of equilibrium either by the coupling to several heat baths of different temperature or by time-periodic driving in combination with the coupling to a heat bath. Within the framework of (Floquet-)Born-Markov theory, several analytical and numerical methods are described in detail. This includes a mean-field theory in terms of occupation numbers, an augmented mean-field theory taking into account also nontrivial two-particle correlations, and quantum-jump-type Monte Carlo simulations. For the case of the ideal Fermi gas, these methods are applied to simple lattice models and the possibility of achieving exotic states via bath engineering is pointed out. The largest part of this work is devoted to bosonic quantum gases and the phenomenon of Bose selection, a nonequilibrium generalization of Bose condensation, where multiple single-particle states are selected to acquire a large occupation [Phys. Rev. Lett. 111, 240405 (2013)]. In this context, among others, we provide a theory for transitions where the set of selected states changes, describe an efficient algorithm for finding the set of selected states, investigate beyond-mean-field effects, and identify the dominant mechanisms for heat transport in the Bose-selected state.
NASA Astrophysics Data System (ADS)
Lehmann, Rüdiger; Lösler, Michael
2017-12-01
Geodetic deformation analysis can be interpreted as a model selection problem. The null model indicates that no deformation has occurred. It is opposed to a number of alternative models, which stipulate different deformation patterns. A common way to select the right model is the usage of a statistical hypothesis test. However, since we have to test a series of deformation patterns, this must be a multiple test. As an alternative solution for the test problem, we propose the p-value approach. Another approach arises from information theory. Here, the Akaike information criterion (AIC) or some alternative is used to select an appropriate model for a given set of observations. Both approaches are discussed and applied to two test scenarios: a synthetic levelling network and the Delft test data set. It is demonstrated that they work but behave differently, sometimes even producing different results. Hypothesis tests are well-established in geodesy, but may suffer from an unfavourable choice of the decision error rates. The multiple test also suffers from statistical dependencies between the test statistics, which are neglected. Both problems are overcome by applying information criteria such as the AIC.
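For least-squares adjustment with Gaussian errors, the AIC can be written as n·ln(RSS/n) + 2k, where RSS is the residual sum of squares and k the number of estimated parameters. A minimal sketch, with invented residuals for a null model and two hypothetical deformation patterns:

```python
import math

# AIC-based model selection for a least-squares fit. The candidate
# "deformation models" and their RSS values are purely illustrative.

def aic(rss, n, k):
    """AIC for a Gaussian least-squares fit with n observations, k parameters."""
    return n * math.log(rss / n) + 2 * k

def select_model(models):
    """models: dict name -> (rss, n, k); returns the name with the lowest AIC."""
    return min(models, key=lambda m: aic(*models[m]))
```

Note how the penalty term works in the toy data below: the four-parameter model fits slightly better than the two-parameter one, but not by enough to justify its extra parameters, so the AIC prefers the simpler deformation pattern.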
An improved method to detect correct protein folds using partial clustering.
Zhou, Jianjun; Wishart, David S
2013-01-16
Structure-based clustering is commonly used to identify correct protein folds among candidate folds (also called decoys) generated by protein structure prediction programs. However, traditional clustering methods exhibit a poor runtime performance on large decoy sets. We hypothesized that a more efficient "partial" clustering approach in combination with an improved scoring scheme could significantly improve both the speed and performance of existing candidate selection methods. We propose a new scheme that performs rapid but incomplete clustering on protein decoys. Our method detects structurally similar decoys (measured using either Cα RMSD or GDT-TS score) and extracts representatives from them without assigning every decoy to a cluster. We integrated our new clustering strategy with several different scoring functions to assess both the performance and speed in identifying correct or near-correct folds. Experimental results on 35 Rosetta decoy sets and 40 I-TASSER decoy sets show that our method can improve the correct fold detection rate as assessed by two different quality criteria. This improvement is significantly better than two recently published clustering methods, Durandal and Calibur-lite. Speed and efficiency testing shows that our method can handle much larger decoy sets and is up to 22 times faster than Durandal and Calibur-lite. The new method, named HS-Forest, avoids the computationally expensive task of clustering every decoy, yet still allows superior correct-fold selection. Its improved speed, efficiency and decoy-selection performance should enable structure prediction researchers to work with larger decoy sets and significantly improve their ab initio structure prediction performance.
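The partial-clustering idea — extracting a representative without assigning every decoy to a cluster — can be sketched roughly as follows. Plain Euclidean distance on toy coordinates stands in for Cα RMSD, and the pivot choice is left to the caller; none of this reproduces HS-Forest itself:

```python
import math

# Hedged sketch of "partial" clustering: instead of clustering every
# decoy, examine only a few pivot decoys, count each pivot's neighbours
# within a distance threshold, and return the pivot with the densest
# neighbourhood as the representative of the (presumed) correct fold.

def partial_cluster(decoys, pivots, threshold):
    """decoys: dict name -> coordinates; pivots: candidate decoy names."""
    best_pivot, best_size = None, -1
    for p in pivots:
        neighbors = [d for d in decoys
                     if math.dist(decoys[p], decoys[d]) <= threshold]
        if len(neighbors) > best_size:
            best_pivot, best_size = p, len(neighbors)
    return best_pivot, best_size
```

The saving is that distances are computed only from the pivots, not between all pairs of decoys, which is what makes this kind of scheme viable on large decoy sets.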
Lionis, Christos; Papadakaki, Maria; Saridaki, Aristoula; Dowrick, Christopher; O'Donnell, Catherine A; Mair, Frances S; van den Muijsenbergh, Maria; Burns, Nicola; de Brún, Tomas; O'Reilly de Brún, Mary; van Weel-Baumgarten, Evelyn; Spiegel, Wolfgang; MacFarlane, Anne
2016-07-22
Guidelines and training initiatives (G/TIs) are available to support communication in cross-cultural consultations but are rarely implemented in routine practice in primary care. As part of the European Union RESTORE project, our objective was to explore whether the available G/TIs make sense to migrants and other key stakeholders and whether they could collectively choose G/TIs and engage in their implementation in primary care settings. As part of a comparative analysis of 5 linked qualitative case studies, we used purposeful and snowball sampling to recruit migrants and other key stakeholders in primary care settings in Austria, England, Greece, Ireland and the Netherlands. A total of 78 stakeholders participated in the study (Austria 15, England 9, Ireland 11, Greece 16, Netherlands 27), covering a range of groups (migrants, general practitioners, nurses, administrative staff, interpreters, health service planners). We combined Normalisation Process Theory (NPT) and Participatory Learning and Action (PLA) research to conduct a series of PLA style focus groups. Using a standardised protocol, stakeholders' discussions about a set of G/TIs were recorded on PLA commentary charts and their selection process was recorded through a PLA direct-ranking technique. We performed inductive and deductive thematic analysis to investigate sensemaking and engagement with the G/TIs. The need for new ways of working was strongly endorsed by most stakeholders. Stakeholders considered that they were the right people to drive the work forward and were keen to enrol others to support the implementation work. This was evidenced by the democratic selection by stakeholders in each setting of one G/TI as a local implementation project. 
This theoretically informed participatory approach used across 5 countries with diverse healthcare systems could be used in other settings to establish positive conditions for the start of implementation journeys for G/TIs to improve healthcare for migrants. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Pluto, Delores M; Phillips, Martha M; Matson-Koffman, Dyann; Shepard, Dennis M; Raczynski, James M; Brownstein, J Nell
2004-04-01
Investigators in South Carolina and Alabama assessed the availability of data for measuring 31 policy and environmental indicators for heart disease and stroke prevention. The indicators were intended to determine policy and environmental support for adopting heart disease and stroke prevention guidelines and selected risk factors in 4 settings: community, school, work site, and health care. Research teams used literature searches and key informant interviews to explore the availability of data sources for each indicator. Investigators documented the following 5 qualities for each data source identified: 1) the degree to which the data fit the indicator; 2) the frequency and regularity with which data were collected; 3) the consistency of data collected across time; 4) the costs (time, money, personnel) associated with data collection or access; and 5) the accessibility of data. Among the 31 indicators, 11 (35%) have readily available data sources and 4 (13%) have sources that could provide partial measurement. Data sources are available for most indicators in the school setting and for tobacco control policies in all settings. Data sources for measuring policy and environmental indicators for heart disease and stroke prevention are limited in availability. Effort and resources are required to develop and implement mechanisms for collecting state and local data on policy and environmental indicators in different settings. The level of work needed to expand data sources is comparable to the extensive work already completed in the school setting and for tobacco control.
Can Geostatistical Models Represent Nature's Variability? An Analysis Using Flume Experiments
NASA Astrophysics Data System (ADS)
Scheidt, C.; Fernandes, A. M.; Paola, C.; Caers, J.
2015-12-01
The lack of understanding of the Earth's geological and physical processes governing sediment deposition renders subsurface modeling subject to large uncertainty. Geostatistics is often used to model uncertainty because of its capability to stochastically generate spatially varying realizations of the subsurface. These methods can generate a range of realizations of a given pattern - but how representative are these of the full natural variability? And how can we identify the minimum set of images that represent this natural variability? Here we use this minimum set to define the geostatistical prior model: a set of training images that represent the range of patterns generated by autogenic variability in the sedimentary environment under study. The proper definition of the prior model is essential in capturing the variability of the depositional patterns. This work starts with a set of overhead images from an experimental basin that showed ongoing autogenic variability. We use the images to analyze the essential characteristics of this suite of patterns. In particular, our goal is to define a prior model (a minimal set of selected training images) such that geostatistical algorithms, when applied to this set, can reproduce the full measured variability. A necessary prerequisite is to define a measure of variability. In this study, we measure variability using a dissimilarity distance between the images. The distance indicates whether two snapshots contain similar depositional patterns. To reproduce the variability in the images, we apply an MPS algorithm to the set of selected snapshots of the sedimentary basin that serve as training images. The training images are chosen from among the initial set by using the distance measure to ensure that only dissimilar images are chosen. Preliminary investigations show that MPS can reproduce fairly accurately the natural variability of the experimental depositional system.
Furthermore, the selected training images provide process information. They fall into three basic patterns: a channelized end member, a sheet flow end member, and one intermediate case. These represent the continuum between autogenic bypass or erosion, and net deposition.
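Choosing training images "by using the distance measure to ensure that only dissimilar images are chosen" resembles a greedy dissimilarity filter. A minimal sketch, with feature vectors and Euclidean distance standing in as hypothetical substitutes for the snapshots and the study's pattern dissimilarity:

```python
import math

# Greedy dissimilarity filter: keep an image only if it is at least
# min_distance away from every image already kept, so the retained set
# spans the observed variability without near-duplicates.

def select_dissimilar(images, min_distance):
    selected = []
    for img in images:
        if all(math.dist(img, s) >= min_distance for s in selected):
            selected.append(img)
    return selected
```

On the toy data below, two near-duplicate snapshots are dropped and one representative survives per distinct pattern, which is the property the prior model needs.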
Environmental diversity as a surrogate for species representation.
Beier, Paul; de Albuquerque, Fábio Suzart
2015-10-01
Because many species have not been described and most species ranges have not been mapped, conservation planners often use surrogates for conservation planning, but evidence for surrogate effectiveness is weak. Surrogates are well-mapped features such as soil types, landforms, occurrences of an easily observed taxon (discrete surrogates), and well-mapped environmental conditions (continuous surrogate). In the context of reserve selection, the idea is that a set of sites selected to span diversity in the surrogate will efficiently represent most species. Environmental diversity (ED) is a rarely used surrogate that selects sites to efficiently span multivariate ordination space. Because it selects across continuous environmental space, ED should perform better than discrete surrogates (which necessarily ignore within-bin and between-bin heterogeneity). Despite this theoretical advantage, ED appears to have performed poorly in previous tests of its ability to identify 50 × 50 km cells that represented vertebrates in Western Europe. Using an improved implementation of ED, we retested ED on Western European birds, mammals, reptiles, amphibians, and combined terrestrial vertebrates. We also tested ED on data sets for plants of Zimbabwe, birds of Spain, and birds of Arizona (United States). Sites selected using ED represented European mammals no better than randomly selected cells, but they represented species in the other 7 data sets with 20% to 84% effectiveness. This far exceeds the performance in previous tests of ED, and exceeds the performance of most discrete surrogates. We believe ED performed poorly in previous tests because those tests considered only a few candidate explanatory variables and used suboptimal forms of ED's selection algorithm. 
We suggest future work on ED focus on analyses at finer grain sizes more relevant to conservation decisions, explore the effect of selecting the explanatory variables most associated with species turnover, and investigate whether nonclimate abiotic variables can provide useful surrogates in an ED framework. © 2015 Society for Conservation Biology.
[Renal denervation in resistant hypertension: proposal for a common multidisciplinary attitude].
Muller, Olivier; Qanadli, Salah D; Waeber, Bernard; Wuerzner, Grégoire
2012-05-30
The prevalence of resistant hypertension ranges between 5-30%. Patients with resistant hypertension are at increased risk of cardiovascular events. Radiofrequency renal denervation is a recent and promising technique that can be used in the setting of resistant hypertension. However, long-term safety and efficacy data are lacking and evidence to use this procedure outside the strict setting of resistant hypertension is missing. The aim of the article is to propose a common work-up for nephrologists, hypertensiologists, cardiologists and interventional radiologists in order to avoid inappropriate selection of patients and a possible misuse of this procedure.
Stochastic subset selection for learning with kernel machines.
Rhinelander, Jason; Liu, Xiaoping P
2012-06-01
Kernel machines have gained much popularity in applications of machine learning. Support vector machines (SVMs) are a subset of kernel machines and generalize well for classification, regression, and anomaly detection tasks. The training procedure for traditional SVMs involves solving a quadratic programming (QP) problem. The QP problem scales superlinearly in computational effort with the number of training samples and is often used for the offline batch processing of data. Kernel machines operate by retaining a subset of observed data during training. The data vectors contained within this subset are referred to as support vectors (SVs). The work presented in this paper introduces a subset selection method for the use of kernel machines in online, changing environments. Our algorithm works by using a stochastic indexing technique when selecting a subset of SVs when computing the kernel expansion. The work described here is novel because it separates the selection of kernel basis functions from the training algorithm used. The subset selection algorithm presented here can be used in conjunction with any online training technique. It is important for online kernel machines to be computationally efficient due to the real-time requirements of online environments. Our algorithm is an important contribution because it scales linearly with the number of training samples and is compatible with current training techniques. Our algorithm outperforms standard techniques in terms of computational efficiency and provides increased recognition accuracy in our experiments. We provide results from experiments using both simulated and real-world data sets to verify our algorithm.
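The stochastic-indexing idea can be sketched as evaluating the kernel expansion over a random subset of support vectors, rescaled so that the subset sum estimates the full expansion. The Gaussian kernel, the rescaling, and all names below are illustrative assumptions, not the authors' implementation:

```python
import math
import random

# Hedged sketch: kernel expansion over a stochastically indexed subset of
# support vectors, so per-query cost scales with the subset size rather
# than with all retained data.

def rbf(x, sv, gamma=1.0):
    """Gaussian (RBF) kernel on scalars — an illustrative choice."""
    return math.exp(-gamma * (x - sv) ** 2)

def stochastic_expansion(x, svs, alphas, subset_size, rng):
    idx = rng.sample(range(len(svs)), subset_size)  # stochastic indexing
    # rescale so the subset sum is an unbiased estimate of the full expansion
    scale = len(svs) / subset_size
    return scale * sum(alphas[i] * rbf(x, svs[i]) for i in idx)
```

When the subset is the whole SV set, the estimate coincides with the full expansion; smaller subsets trade a little accuracy for proportionally less computation per query.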
Strange Beta: Chaotic Variations for Indoor Rock Climbing Route Setting
NASA Astrophysics Data System (ADS)
Phillips, Caleb; Bradley, Elizabeth
2011-04-01
In this paper we apply chaotic systems to the task of sequence variation for the purpose of aiding humans in setting indoor rock climbing routes. This work expands on prior work where similar variations were used to assist in dance choreography and music composition. We present a formalization for transcription of rock climbing problems and a variation generator that is tuned for this domain and addresses some confounding problems, including a new approach to automatic selection of initial conditions. We analyze our system with a large blinded study in a commercial climbing gym in cooperation with experienced climbers and expert route setters. Our results show that our system is capable of assisting a human setter in producing routes that are at least as good as, and in some cases better than, those produced traditionally.
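A chaotic map can drive sequence variation along the lines the paper describes: a logistic-map trajectory decides which moves in a transcribed route to perturb and what to substitute. The move vocabulary, the threshold, and the map parameters below are invented for illustration and are not the paper's transcription scheme:

```python
# Illustrative sketch of chaos-driven sequence variation: iterate the
# logistic map x <- r*x*(1-x) and use the trajectory both to decide which
# positions of a route to vary and to index a replacement move.

def logistic_trajectory(x0, n, r=3.99):
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

def vary_route(route, moves, x0=0.3):
    traj = logistic_trajectory(x0, len(route))
    # perturb a position only when the chaotic state is high (assumed rule)
    return [moves[int(x * len(moves))] if x > 0.7 else move
            for move, x in zip(route, traj)]
```

Because the logistic map at r near 4 is chaotic, nearby initial conditions x0 yield quickly diverging variation patterns — which is what makes automatic selection of initial conditions (a problem the paper addresses) matter.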
Subspace techniques to remove artifacts from EEG: a quantitative analysis.
Teixeira, A R; Tome, A M; Lang, E W; Martins da Silva, A
2008-01-01
In this work we discuss and apply projective subspace techniques to both multichannel and single-channel recordings. The single-channel approach is based on singular spectrum analysis (SSA), and the multichannel approach uses the extended infomax algorithm as implemented in the open-source toolbox EEGLAB. Both approaches are evaluated using artificial mixtures of a set of selected EEG signals. The latter were selected visually so that the dominant activity lies in one of the characteristic bands of an electroencephalogram (EEG). The evaluation is performed in both the time and frequency domains, using correlation coefficients and the coherence function, respectively.
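As a minimal sketch of the single-channel SSA idea (window length, component choice, and the synthetic signal are illustrative assumptions, not the paper's settings): embed the signal into a trajectory (Hankel) matrix, take its SVD, and rebuild the signal from a chosen subset of components by diagonal averaging.

```python
import numpy as np

def ssa_reconstruct(x, window=20, components=(0, 1)):
    n = len(x)
    k = n - window + 1
    # trajectory (Hankel) matrix: lagged copies of the signal as columns
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # keep only the selected rank-one components
    Xr = sum(s[j] * np.outer(U[:, j], Vt[j]) for j in components)
    # diagonal (Hankel) averaging back to a 1-D series
    rec = np.zeros(n)
    cnt = np.zeros(n)
    for i in range(window):
        for j in range(k):
            rec[i + j] += Xr[i, j]
            cnt[i + j] += 1
    return rec / cnt

t = np.linspace(0, 1, 200)
clean = np.sin(2 * np.pi * 5 * t)   # stand-in for a dominant EEG rhythm
noisy = clean + 0.5 * np.random.default_rng(1).standard_normal(200)
denoised = ssa_reconstruct(noisy)
# the two leading components capture the oscillation, suppressing broadband noise
print(np.mean((denoised - clean) ** 2) < np.mean((noisy - clean) ** 2))
```

An oscillation occupies a pair of leading singular vectors, which is why two components are kept here; real EEG artifact removal would choose the subspace from the data.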
Allden, K; Jones, L; Weissbecker, I; Wessells, M; Bolton, P; Betancourt, T S; Hijazi, Z; Galappatti, A; Yamout, R; Patel, P; Sumathipala, A
2009-01-01
The Working Group on Mental Health and Psychosocial Support was convened as part of the 2009 Harvard Humanitarian Action Summit. The Working Group chose to focus on ethical issues in mental health and psychosocial research and programming in humanitarian settings. The Working Group built on previous work and recommendations, such as the Inter-Agency Standing Committee's Guidelines on Mental Health and Psychosocial Support in Emergency Settings. The objective of this working group was to address one of the factors contributing to the deficiency of research, and the need to develop the evidence base on mental health and psychosocial support interventions during complex emergencies, by proposing ethical research guidelines. Outcomes research is vital for effective program development in emergency settings, but to date no comprehensive ethical guidelines exist for guiding such research efforts. Working Group members conducted literature reviews which included peer-reviewed publications, agency reports, and relevant guidelines on the following topics: general ethical principles in research, cross-cultural issues, research in resource-poor countries, and specific populations such as trauma and torture survivors, refugees, minorities, children and youth, and the mentally ill. Working Group members also shared key points regarding ethical issues encountered in their own research and fieldwork. The group adopted a broad definition of the term "research", which encompasses needs assessments and data gathering, as well as monitoring and evaluation. The guidelines are conceptualized as applying to formal and informal processes of assessment and evaluation in which researchers as well as most service providers engage. The group reached consensus that it would be unethical not to conduct research and evaluate outcomes of mental health and psychosocial interventions in emergency settings, given that the current evidence base for such interventions is very limited.
Overarching themes and issues generated by the group for further study and articulation included: purpose and benefits of research, issues of validity, neutrality, risk, subject selection and participation, confidentiality, consent, and dissemination of results. The group outlined several key topics and recommendations that address ethical issues in conducting mental health and psychosocial research in humanitarian settings. The group views this set of recommendations as a living document to be further developed and refined based on input from colleagues representing different regions of the globe with an emphasis on input from colleagues from low-resource countries.
Executive and Perceptual Distraction in Visual Working Memory
2017-01-01
The contents of visual working memory are likely to reflect the influence of both executive control resources and information present in the environment. We investigated whether executive attention is critical in the ability to exclude unwanted stimuli by introducing concurrent potentially distracting irrelevant items to a visual working memory paradigm, and manipulating executive load using simple or more demanding secondary verbal tasks. Across 7 experiments varying in presentation format, timing, stimulus set, and distractor number, we observed clear disruptive effects of executive load and visual distraction, but relatively minimal evidence supporting an interactive relationship between these factors. These findings are in line with recent evidence using delay-based interference, and suggest that different forms of attentional selection operate relatively independently in visual working memory. PMID:28414499
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Jeff; Rylander, Matthew; Boemer, Jens
The fourth solicitation of the California Solar Initiative (CSI) Research, Development, Demonstration and Deployment (RD&D) Program, established by the California Public Utilities Commission (CPUC), supported the Electric Power Research Institute (EPRI), National Renewable Energy Laboratory (NREL), and Sandia National Laboratories (SNL), which, with data provided by Pacific Gas and Electric (PG&E), Southern California Edison (SCE), and San Diego Gas and Electric (SDG&E), conducted research to determine optimal default settings for distributed energy resource advanced inverter controls. The inverter functions studied are aligned with those developed by the California Smart Inverter Working Group (SIWG) and those being considered by the IEEE 1547 Working Group. The advanced inverter controls examined to improve the distribution system response included power factor, volt-var, and volt-watt control; those examined to improve the transmission system response included frequency and voltage ride-through as well as dynamic voltage support. This CSI RD&D project developed methods to derive distribution-focused advanced inverter control settings, selected a diverse set of feeders to evaluate the methods through detailed analysis, and evaluated the effectiveness of each method developed. Inverter settings focused on transmission system performance were also evaluated and verified. Based on the findings of this work, the suggested advanced inverter settings, and the methods to determine settings, can be used to improve the accommodation of distributed energy resources (PV specifically). The voltage impact from PV can be mitigated using power factor, volt-var, or volt-watt control, while the bulk system impact can be improved with frequency/voltage ride-through.
Density-dependent effects of ants on selection for bumble bee pollination in Polemonium viscosum.
Galen, Candace; Geib, Jennifer C
2007-05-01
Mutualisms are commonly exploited by cheater species that usurp rewards without providing reciprocal benefits. Yet most studies of selection between mutualist partners ignore interactions with third species and consequently overlook the impact of cheaters on evolution in the mutualism. Here, we explicitly investigate how the abundance of nectar-thieving ants (cheaters) influences selection in a pollination mutualism between bumble bees and the alpine skypilot, Polemonium viscosum. As suggested in past work with this species, bumble bees accounted for most of the seed production (78% +/- 6% [mean +/- SE]) in our high tundra study population and, in the absence of ants, exerted strong selection for large flowers. We tested for indirect effects of ant abundance on seed set through bumble bee pollination services (pollen delivery and pollen export) and a direct effect through flower damage. Ants reduced seed set per flower by 20% via flower damage. As ant density increased within experimental patches, the rate of flower damage rose, but pollen delivery and export did not vary significantly, showing that indirect effects of increased cheater abundance on pollinator service are negligible in this system. To address how ants affect selection for plant participation in the pollination mutualism we tested the impact of ant abundance on selection for bumble bee-mediated pollination. Results show that the impact of ants on fitness (seed set) accruing under bumble bee pollination is density dependent in P. viscosum. Selection for bumble bee pollination declined with increasing ant abundance in experimental patches, as predicted if cheaters constrain fitness returns of mutualist partner services. We also examined how ant abundance influences selection on flower size, a key component of plant investment in bumble bee pollination. We predicted that direct effects of ants would constrain bumble bee selection for large flowers. 
However, selection on flower size was significantly positive over a wide range of ant abundance (20-80% of plants visited by ants daily). Although high cheater abundance reduces the fitness returns of bumble bee pollination, it does not completely eliminate selection for bumble bee attraction in P. viscosum.
Wang, Xun; Sun, Beibei; Liu, Boyang; Fu, Yaping; Zheng, Pan
2017-01-01
Experimental design focuses on describing or explaining the multifactorial interactions that are hypothesized to reflect the variation. The design introduces conditions that may directly affect the variation, where particular conditions are purposely selected for observation. Combinatorial design theory deals with the existence, construction, and properties of systems of finite sets whose arrangements satisfy generalized concepts of balance and/or symmetry. In this work, borrowing the concept of "balance" from combinatorial design theory, a novel method for designing multifactorial biochemical experiments is proposed, in which balanced templates from combinatorial design are used to select the conditions for observation. Balanced experimental data that covers all the influencing factors can thus be obtained for further processing, for example as a training set for machine learning models. Finally, software based on the proposed method is developed for designing experiments that cover each influencing factor a given number of times.
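The "balance" property can be illustrated in a deliberately simplified form (the paper uses balanced templates from combinatorial design, not this toy): in a cyclic Latin square over n treatment levels, every level occurs exactly once in each row and each column, so no factor level is over-observed.

```python
def latin_square(n):
    # cyclic construction: row r is the sequence 0..n-1 shifted by r
    return [[(r + c) % n for c in range(n)] for r in range(n)]

sq = latin_square(4)
for row in sq:
    print(row)

# balance check: every level appears once per row and once per column
assert all(sorted(row) == list(range(4)) for row in sq)
assert all(sorted(col) == list(range(4)) for col in zip(*sq))
```

Latin squares are one of the simplest balanced designs; the same coverage-counting idea extends to the balanced incomplete block designs of combinatorial design theory.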
Schofield, Thomas J; Martin, Monica J; Conger, Katherine J; Neppl, Tricia M; Donnellan, M Brent; Conger, Rand D
2011-01-01
The interactionist model (IM) of human development (R. D. Conger & M. B. Donnellan, 2007) proposes that the association between socioeconomic status (SES) and human development involves a dynamic interplay that includes both social causation (SES influences human development) and social selection (individual characteristics affect SES). Using a multigenerational data set involving 271 families, the current study finds empirical support for the IM. Adolescent personality characteristics indicative of social competence, goal-setting, hard work, and emotional stability predicted later SES, parenting, and family characteristics that were related to the positive development of a third-generation child. Processes of both social selection and social causation appear to account for the association between SES and dimensions of human development indicative of healthy functioning across multiple generations. © 2011 The Authors. Child Development © 2011 Society for Research in Child Development, Inc.
Matzel, Louis D.; Light, Kenneth R.; Wass, Christopher; Colas-Zelin, Danielle; Denman-Brice, Alexander; Waddel, Adam C.; Kolata, Stefan
2011-01-01
Learning, attentional, and perseverative deficits are characteristic of cognitive aging. In this study, genetically diverse CD-1 mice underwent longitudinal training in a task asserted to tax working memory capacity and its dependence on selective attention. Beginning at 3 mo of age, animals were trained for 12 d to perform in a dual radial-arm maze task that required the mice to remember and operate on two sets of overlapping guidance (spatial) cues. As previously reported, this training resulted in an immediate (at 4 mo of age) improvement in the animals' aggregate performance across a battery of five learning tasks. Subsequently, these animals received an additional 3 d of working memory training at 3-wk intervals for 15 mo (totaling 66 training sessions), and at 18 mo of age were assessed on a selective attention task, a second set of learning tasks, and variations of those tasks that required the animals to modify the previously learned response. Both attentional and learning abilities (on passive avoidance, active avoidance, and reinforced alternation tasks) were impaired in aged animals that had not received working memory training. Likewise, these aged animals exhibited consistent deficits when required to modify a previously instantiated learned response (in reinforced alternation, active avoidance, and spatial water maze). In contrast, these attentional, learning, and perseverative deficits were attenuated in aged animals that had undergone lifelong working memory exercise. These results suggest that general impairments of learning, attention, and cognitive flexibility may be mitigated by a cognitive exercise regimen that requires chronic attentional engagement. PMID:21521768
Marino, Miguel; Killerby, Marie; Lee, Soomi; Klein, Laura Cousino; Moen, Phyllis; Olson, Ryan; Kossek, Ellen Ernst; King, Rosalind; Erickson, Leslie; Berkman, Lisa F; Buxton, Orfeu M
2016-12-01
To evaluate the effects of a workplace-based intervention on actigraphic and self-reported sleep outcomes in an extended care setting. Cluster randomized trial. Extended-care (nursing) facilities. US employees and managers at nursing homes. Nursing homes were randomly assigned to the intervention or control condition. The Work, Family and Health Study developed an intervention aimed at reducing work-family conflict within a 4-month work-family organizational change process. Employees participated in interactive sessions with facilitated discussions, role-playing, and games designed to increase control over work processes and work time. Managers completed training in family-supportive supervision. Primary actigraphic outcomes included: total sleep duration, wake after sleep onset, nighttime sleep, variation in nighttime sleep, nap duration, and number of naps. Secondary survey outcomes included work-to-family conflict, sleep insufficiency, insomnia symptoms and sleep quality. Measures were obtained at baseline, 6 months and 12 months post-intervention. A total of 1,522 employees and 184 managers provided survey data at baseline. Managers and employees in the intervention arm showed no significant difference in sleep outcomes over time compared to control participants. Sleep outcomes were not moderated by work-to-family conflict or presence of children in the household for managers or employees. Age significantly moderated an intervention effect on nighttime sleep among employees (p=0.040), where younger employees benefited more from the intervention. In the context of an extended-care nursing home workplace, the intervention did not significantly alter sleep outcomes in either managers or employees. Moderating effects of age were identified where younger employees' sleep outcomes benefited more from the intervention.
How do nurse practitioners work in primary health care settings? A scoping review.
Grant, Julian; Lines, Lauren; Darbyshire, Philip; Parry, Yvonne
2017-10-01
This scoping review explores the work of nurse practitioners in primary health care settings in developed countries and critiques their contribution to improved health outcomes. A scoping review design was employed and included development of a research question, identification of potentially relevant studies, selection of relevant studies, charting data, collating, summarising and reporting findings. An additional step was added to evaluate the methodological rigor of each study. Data sources included literature identified by a search of electronic databases conducted in September 2015 (CINAHL, Informit, Web of Science, Scopus and Medline) and repeated in July 2016. Additional studies were located through hand searching and authors' knowledge of other relevant studies. 74 articles from eight countries were identified, with the majority emanating from the United States of America. Nurse practitioners working in communities provided care mostly in primary care centres (n=42), but also in community centres (n=6), outpatient departments (n=6), homes (n=5), schools (n=3), child abuse clinics (n=1), via communication technologies (n=6), and through combined face-to-face and communication technologies (n=5). The scope of nurse practitioner work varied on a continuum from being targeted towards a specific disease process or managing individual health and wellbeing needs in a holistic manner. Enhanced skills included co-ordination, collaboration, education, counselling, connecting clients with services and advocacy. Measures used to evaluate outcomes varied widely from physiological data (n=25), hospital admissions (n=10), use of health services (n=15), self-reported health (n=13), behavioural change (n=14), patient satisfaction (n=17), cost savings (n=3) and mortality/morbidity (n=5). 
The majority of nurse practitioners working in community settings did so within a selective model of primary health care with some examples of nurse practitioners contributing to comprehensive models of primary health care. Nurse practitioners predominantly worked with populations defined by an illness with structured protocols for curative and rehabilitative care. Nurse practitioner work that also incorporated promotive activities targeted improving social determinants of health for people rendered vulnerable due to ethnicity, Aboriginal identity, socioeconomic disadvantage, remote location, gender and aging. Interventions were at individual and community levels with outcomes including increased access to care, cost savings and salutogenic characteristics of empowerment for social change. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Palou, Anna; Miró, Aira; Blanco, Marcelo; Larraz, Rafael; Gómez, José Francisco; Martínez, Teresa; González, Josep Maria; Alcalà, Manel
2017-06-01
Even though the feasibility of using near infrared (NIR) spectroscopy combined with partial least squares (PLS) regression for prediction of physico-chemical properties of biodiesel/diesel blends has been widely demonstrated, including in the calibration sets the whole variability of diesel samples from diverse production origins remains an important challenge when constructing the models. This work presents a useful strategy for the systematic selection of calibration sets of samples of biodiesel/diesel blends from diverse origins, based on a binary code, principal component analysis (PCA) and the Kennard-Stone algorithm. Results show that using this methodology the models can keep their robustness over time. PLS calculations have been done using both specialized chemometric software and the software of the NIR instrument installed in the plant, and both produced RMSEP values below the reproducibility of the reference methods. The models have been tested for on-line simultaneous determination of seven properties: density, cetane index, fatty acid methyl esters (FAME) content, cloud point, boiling point at 95% of recovery, flash point and sulphur.
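The Kennard-Stone step can be sketched compactly (a generic version, not the authors' exact implementation; the random data stands in for PCA scores): greedily pick calibration samples that are maximally spread, starting from the pair farthest apart, then repeatedly adding the sample whose minimum distance to the already-selected set is largest.

```python
import numpy as np

def kennard_stone(X, n_select):
    # all pairwise Euclidean distances
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # start with the two most distant samples
    i, j = np.unravel_index(np.argmax(d), d.shape)
    selected = [int(i), int(j)]
    remaining = [r for r in range(len(X)) if r not in selected]
    while len(selected) < n_select:
        # for each candidate: distance to its nearest already-selected sample
        min_d = d[np.ix_(remaining, selected)].min(axis=1)
        pick = remaining[int(np.argmax(min_d))]   # maximin choice
        selected.append(pick)
        remaining.remove(pick)
    return selected

X = np.random.default_rng(2).normal(size=(50, 3))  # stand-in for PCA scores
cal = kennard_stone(X, 10)
print(len(cal), len(set(cal)))  # 10 10 — ten distinct samples
```

Because the selection is maximin in score space, the calibration set spans the outer envelope of the sample population, which is what lets the model absorb samples from diverse production origins.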
Willecke, N; Szepes, A; Wunderlich, M; Remon, J P; Vervaet, C; De Beer, T
2017-04-30
The overall objective of this work is to understand how excipient characteristics influence the process and product performance for a continuous twin-screw wet granulation process. The knowledge gained through this study is intended to be used for a Quality by Design (QbD)-based formulation design approach and formulation optimization. A total of 9 preferred fillers and 9 preferred binders were selected for this study. The selected fillers and binders were extensively characterized regarding their physico-chemical and solid state properties using 21 material characterization techniques. Subsequently, principal component analysis (PCA) was performed on the data sets of filler and binder characteristics in order to reduce the variety of single characteristics to a limited number of overarching properties. Four principal components (PC) explained 98.4% of the overall variability in the fillers data set, while three principal components explained 93.4% of the overall variability in the data set of binders. Both PCA models allowed in-depth evaluation of similarities and differences in the excipient properties. Copyright © 2017. Published by Elsevier B.V.
Optimal SVM parameter selection for non-separable and unbalanced datasets.
Jiang, Peng; Missoum, Samy; Chen, Zhao
2014-10-01
This article presents a study of three validation metrics used for the selection of optimal parameters of a support vector machine (SVM) classifier in the case of non-separable and unbalanced datasets. This situation is often encountered when the data is obtained experimentally or clinically. The three metrics selected in this work are the area under the ROC curve (AUC), accuracy, and balanced accuracy. These validation metrics are tested using computational data only, which enables the creation of fully separable sets of data. This way, non-separable datasets, representative of a real-world problem, can be created by projection onto a lower dimensional sub-space. The knowledge of the separable dataset, unknown in real-world problems, provides a reference to compare the three validation metrics using a quantity referred to as the "weighted likelihood". As an application example, the study investigates a classification model for hip fracture prediction. The data is obtained from a parameterized finite element model of a femur. The performance of the various validation metrics is studied for several levels of separability, ratios of unbalance, and training set sizes.
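The three validation metrics, and why plain accuracy misleads on unbalanced data, can be shown from scratch (a textbook illustration, not the paper's code): accuracy, balanced accuracy as the mean of per-class recalls, and AUC via the rank (Mann-Whitney) formulation.

```python
import numpy as np

def accuracy(y, yhat):
    return float(np.mean(y == yhat))

def balanced_accuracy(y, yhat):
    # mean recall over classes, so the minority class counts equally
    recalls = [np.mean(yhat[y == c] == c) for c in np.unique(y)]
    return float(np.mean(recalls))

def auc(y, scores):
    # probability a random positive outscores a random negative (ties count half)
    pos, neg = scores[y == 1], scores[y == 0]
    wins = sum((p > neg).sum() + 0.5 * (p == neg).sum() for p in pos)
    return float(wins / (len(pos) * len(neg)))

# 90/10 unbalanced labels; a degenerate classifier that always predicts class 0
y = np.array([0] * 90 + [1] * 10)
yhat = np.zeros_like(y)
print(accuracy(y, yhat))           # 0.9 — looks good
print(balanced_accuracy(y, yhat))  # 0.5 — reveals the failure on the minority class

# a perfect scorer separates the classes completely
scores = np.where(y == 1, 0.8, 0.2)
print(auc(y, scores))              # 1.0
```

This is exactly the trap the abstract targets: on unbalanced data, a majority-class classifier earns high accuracy while balanced accuracy and AUC expose it.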
1999-12-01
strategies that lead to sustained competitive advantage (set of factors or capabilities that allows firms to consistently outperform their rivals). This...a strategy-supportive culture, creating an effective organizational structure, redirecting marketing efforts, preparing budgets, developing and...evaluates if strategies are working well. This is important since external and internal factors are constantly changing. Three fundamental strategy
ERIC Educational Resources Information Center
Meer, Jonathan; Rosen, Harvey S.
2008-01-01
An ongoing controversy in the literature on the economics of higher education centers on whether the success of a school's athletic program affects alumni donations. This paper uses a unique data set to investigate this issue. The data contain detailed information about donations made by alumni of a selective research university as well as a…
Radvansky, Gabriel A.; D’Mello, Sidney K.; Abbott, Robert G.; ...
2016-01-27
The Fluid Events Model is aimed at predicting changes in the actions people take on a moment-by-moment basis. In contrast with other research on action selection, this work does not investigate why some course of action was selected, but rather the likelihood of discontinuing the current course of action and selecting another in the near future. This is done using both task-based and experience-based factors. Prior work evaluated this model in the context of trial-by-trial, independent, interactive events, such as choosing how to copy a figure of a line drawing. In this paper, we extend this model to more covert event experiences, such as reading narratives, as well as to continuous interactive events, such as playing a video game. To this end, the model was applied to existing data sets of reading time and event segmentation for written and picture stories. It was also applied to existing data sets of performance in a strategy board game, an aerial combat game, and a first person shooter game in which a participant's current state was dependent on prior events. The results revealed that the model predicted behavior changes well, taking into account both the theoretically defined structure of the described events, as well as a person's prior experience. Hence, theories of event cognition can benefit from efforts that take into account not only how events in the world are structured, but also how people experience those events.
Ferrario, M M; Cesana, G
2009-01-01
Due to new legislation, the assessment of work stress has become compulsory in Italy for all enterprises. Work stress has become a leading health problem in work settings all over Europe. The two major approaches, expert-based direct observation and measurement of perceived job strain, are briefly introduced, emphasizing their strengths and weaknesses. Among the methods to assess perceived job stress, Karasek's Job Content Questionnaire (JCQ) has been extensively used in Italy, and the available results support its use because it is reliable and able to pick up major sources of constraint at work. In addition, because reference levels are now available, comparisons are possible for either public or private enterprises. Acknowledging the complexity of carrying out a reliable assessment of work stress, a multiphase approach is emphasised: first, an analysis of current data can be used to estimate levels of turnover, down-sizing, outsourcing, extra hours, shift work, sickness absenteeism, changes of job titles, work accidents, and work-related diseases. In a second step, the JCQ can be used to assess stress perception in groups of workers recognised as at risk in the first phase and in control groups. Finally, when constraining conditions emerge, further investigations are required, including intervention of experts in work organisation analysis and clinical psychological examination of selected workers, to separate work-related from personal psychological problems and health consequences.
Gunathunga, M W
2016-10-17
Cognitive ergonomics in the workplace has become a serious concern, given the need to keep people happy at work while maintaining high productivity. Hence, it is worth exploring how the outcomes of lifestyle-based mind development programs can bring about happiness in the workplace while keeping productivity and quality of services high. The objective of the present work was to test a body-mind technique to improve cognitive ergonomics in a health care work setting. The principal investigator explored many body-mind techniques before selecting the present method of "insight meditation", which he mastered before applying it with a group of scholars who made it a part of their lifestyle. Later it was introduced to a sample of 500 volunteer health personnel in the western province to generate a ripple effect of happiness at work. Initial qualitative information indicated improvement in some aspects of cognitive ergonomics among those who practiced it. There was relief from stress during the practice sessions and improvements in commitment to work and in team spirit. A demand for further training was observed. A quasi-experimental study to test the improvements is underway. Health workers showed interest in the mind training, and potential benefits to individuals and institutions were observed.
Vermeerbergen, Lander; Van Hootegem, Geert; Benders, Jos
2017-02-01
Ongoing shortages of care workers, together with an ageing population, make it of utmost importance to increase the quality of working life in nursing homes. Since the 1970s, normalised and small-scale nursing homes have been increasingly introduced to provide care in a family and homelike environment, potentially providing a richer work life for care workers as well as improved living conditions for residents. 'Normalised' refers to the opportunities given to residents to live in a manner as close as possible to the everyday life of persons not needing care. The study purpose is to provide a synthesis and overview of empirical research comparing the quality of working life - together with related work and health outcomes - of professional care workers in normalised small-scale nursing homes as compared to conventional large-scale ones. A systematic review of qualitative and quantitative studies. A systematic literature search (April 2015) was performed using the electronic databases Pubmed, Embase, PsycInfo, CINAHL and Web of Science. References and citations were tracked to identify additional, relevant studies. We identified 825 studies in the selected databases. After checking the inclusion and exclusion criteria, nine studies were selected for review. Two additional studies were selected after reference and citation tracking. Three studies were excluded after requesting more information on the research setting. The findings from the individual studies suggest that levels of job control and job demands (all but "time pressure") are higher in normalised small-scale homes than in conventional large-scale nursing homes. Additionally, some studies suggested that social support and work motivation are higher, while risks of burnout and mental strain are lower, in normalised small-scale nursing homes. Other studies found no differences or even opposing findings. 
The studies reviewed showed that these inconclusive findings can be attributed to care workers in some normalised small-scale homes experiencing isolation and too high job demands in their work roles. This systematic review suggests that normalised small-scale homes are a good starting point for creating a higher quality of working life in the nursing home sector. Higher job control enables care workers to manage higher job demands in normalised small-scale homes. However, some jobs would benefit from interventions to address care workers' perceptions of too low social support and of too high job demands. More research is needed to examine strategies to enhance these working life issues in normalised small-scale settings. Copyright © 2016 Elsevier Ltd. All rights reserved.
Junne, Florian; Rieger, Monika; Michaelis, Martina; Nikendei, Christoph; Gündel, Harald; Zipfel, Stephan; Rothermund, Eva
2017-04-01
Psycho-mental stressors and increased perceived stress in workplace settings may determine the onset and course of stress-related mental and psychosomatic disorders. Three distinct models have been widely used to describe psycho-mental stressors: the Demand-Control Model by Karasek and Theorell, the Effort-Reward-Imbalance Model by Siegrist, and the Model of Organisational Justice. The interactional or social dimension in workplace settings can be seen as a cross-sectional dimension to the above-mentioned models; here, social conflicts and mobbing, as specific forms of interactional problems, are of importance. Besides measures of primary prevention, which can be derived from applying the above-mentioned models, attention is increasingly paid to secondary and tertiary preventive measures in workplace settings. Concepts such as the psychosomatic consultation-hour in the workplace context have been shown to be effective measures for the early detection of people at risk or of early stages of, e.g., stress-related psychosomatic disorders. Furthermore, step-wise reintegration of members of the workforce plays an important role in the effort to retain the ability to work and the workplace of individuals who have suffered from stress-related mental disorders, as it must be stressed that working and social interactions at the workplace may well be a resource that enhances and stimulates psycho-mental well-being and mental health. This CME article describes the above-mentioned models and discusses selected perspectives on preventive measures to avoid stress-related mental disorders in members of the workforce. © Georg Thieme Verlag KG Stuttgart · New York.
Diagnosing and ranking retinopathy disease level using diabetic fundus image recuperation approach.
Somasundaram, K; Rajendran, P Alli
2015-01-01
Retinal fundus images are widely used in diagnosing different types of eye diseases. Existing methods such as Feature Based Macular Edema Detection (FMED) and Optimally Adjusted Morphological Operator (OAMO) effectively detected the presence of exudation in fundus images and identified the true positive ratio of exudates detection, respectively. These methods, however, did not add a more detailed feature selection technique to the system for the detection of diabetic retinopathy. To categorize the exudates, a Diabetic Fundus Image Recuperation (DFIR) method based on a sliding window approach is developed in this work to select the features of the optic cup in digital retinal fundus images. The DFIR feature selection uses a collection of sliding windows with varying range to obtain the features based on the histogram value using a Group Sparsity Nonoverlapping Function. Using a support vector model in the second phase, the DFIR method, based on a Spiral Basis Function, effectively ranks the diabetic retinopathy disease level. The ranking of disease level on each candidate set provides a promising basis for developing a practically automated and assisted diabetic retinopathy diagnosis system. Experimental work on digital fundus images evaluates the DFIR method on factors such as sensitivity, ranking efficiency, and feature selection time.
Zhan, Xue-yan; Zhao, Na; Lin, Zhao-zhou; Wu, Zhi-sheng; Yuan, Rui-juan; Qiao, Yan-jiang
2014-12-01
An appropriate algorithm for calibration set selection is one of the key technologies for a good NIR quantitative model. There are different algorithms for calibration set selection, such as the Random Sampling (RS) algorithm, the Conventional Selection (CS) algorithm, the Kennard-Stone (KS) algorithm and the Sample set Partitioning based on joint x-y distance (SPXY) algorithm. However, systematic comparisons among these algorithms are lacking. In the present paper, NIR quantitative models to determine the asiaticoside content in Centella total glucosides were established, for which seven indexes were classified and selected, and the effects of the CS, KS and SPXY algorithms for calibration set selection on the accuracy and robustness of the models were investigated. The accuracy indexes of NIR quantitative models with the calibration set selected by the SPXY algorithm were significantly different from those with calibration sets selected by the CS or KS algorithm, while the robustness indexes, such as RMSECV and |RMSEP-RMSEC|, were not significantly different. Therefore, the SPXY algorithm for calibration set selection can improve the predictive accuracy of NIR quantitative models for determining asiaticoside content in Centella total glucosides without significantly affecting the robustness of the models, which provides a reference for determining the appropriate calibration set selection algorithm when NIR quantitative models are established for solid systems of traditional Chinese medicine.
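The Kennard-Stone algorithm mentioned above admits a compact sketch: it seeds the calibration set with the two most mutually distant samples and then repeatedly adds the sample whose minimum distance to the selected set is largest. The following is a minimal illustration, not the paper's implementation:

```python
import numpy as np

def kennard_stone(X, k):
    """Select k calibration samples from X (n x p) with Kennard-Stone:
    start from the two most distant samples, then repeatedly add the
    sample whose minimum distance to the selected set is largest."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    i, j = np.unravel_index(np.argmax(d), d.shape)              # most distant pair
    selected = [i, j]
    while len(selected) < k:
        remaining = [r for r in range(len(X)) if r not in selected]
        # minimum distance from each remaining sample to the selected set
        min_d = d[np.ix_(remaining, selected)].min(axis=1)
        selected.append(remaining[int(np.argmax(min_d))])
    return selected
```

For example, `kennard_stone(np.array([[0.0], [1.0], [2.0], [10.0]]), 3)` first takes the extreme pair (indices 0 and 3) and then the sample farthest from both.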
Moore, Katherine Sledge; Weissman, Daniel H
2010-08-01
In the present study, we investigated whether involuntarily directing attention to a target-colored distractor causes the corresponding attentional set to enter a limited-capacity focus of attention, thereby facilitating the identification of a subsequent target whose color matches the same attentional set. As predicted, in Experiment 1, contingent attentional capture effects from a target-colored distractor were only one half to one third as large when subsequent target identification relied on the same (vs. a different) attentional set. In Experiment 2, this effect was eliminated when all of the target colors matched the same attentional set, arguing against bottom-up perceptual priming of the distractor's color as an alternative account of our findings. In Experiment 3, this effect was reversed when a target-colored distractor appeared after the target, ruling out a feature-based interference account of our findings. We conclude that capacity limitations in working memory strongly influence contingent attentional capture when multiple attentional sets guide selection.
Silva, Adão; Gameiro, Atílio
2014-01-01
We present in this work a low-complexity algorithm to solve the sum rate maximization problem in multiuser MIMO broadcast channels with downlink beamforming. Our approach decouples the user selection problem from the resource allocation problem and its main goal is to create a set of quasiorthogonal users. The proposed algorithm exploits physical metrics of the wireless channels that can be easily computed in such a way that a null space projection power can be approximated efficiently. Based on the derived metrics we present a mathematical model that describes the dynamics of the user selection process, which renders the user selection problem into an integer linear program. Numerical results show that our approach is highly efficient at forming groups of quasiorthogonal users when compared to previously proposed algorithms in the literature. Our user selection algorithm achieves a large portion (90%) of the optimum user selection sum rate for a moderate number of active users.
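The greedy construction of a quasi-orthogonal user set can be sketched as follows. The paper's exact channel metrics are not reproduced here; as an assumed stand-in, each candidate is scored by its null-space projection power, i.e. the squared norm of the channel component orthogonal to the already selected users:

```python
import numpy as np

def greedy_quasi_orthogonal(H, n_users):
    """Greedily build a set of quasi-orthogonal users.
    H: (K, M) matrix of real user channel row-vectors. At each step pick
    the user whose channel component orthogonal to the span of the
    selected channels has the largest squared norm."""
    K, M = H.shape
    selected, basis = [], []
    for _ in range(n_users):
        best, best_p, best_g = None, -1.0, None
        for k in range(K):
            if k in selected:
                continue
            g = H[k].astype(float)
            for b in basis:                 # project out selected directions
                g = g - (g @ b) * b
            p = g @ g                       # null-space projection power
            if p > best_p:
                best, best_p, best_g = k, p, g
        selected.append(best)
        basis.append(best_g / np.linalg.norm(best_g))
    return selected
```

With channels `[1,0]`, `[0,1]`, `[1,1]`, the strongest user (`[1,1]`) is taken first, and the next pick is whichever remaining user keeps the set closest to orthogonal.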
NASA Astrophysics Data System (ADS)
Broderick, Scott R.; Santhanam, Ganesh Ram; Rajan, Krishna
2016-08-01
As the size of databases has significantly increased, whether through high throughput computation or through informatics-based modeling, the challenge of selecting the optimal material for specific design requirements has also arisen. Given the multiple, and often conflicting, design requirements, this selection process is not as trivial as sorting the database for a given property value. We suggest that the materials selection process should minimize selector bias, as well as take data uncertainty into account. For this reason, we discuss and apply decision theory for identifying chemical additions to Ni-base alloys. We demonstrate and compare results for both a computational array of chemistries and standard commercial superalloys. We demonstrate how we can use decision theory to select the best chemical additions for enhancing both property and processing, which would not otherwise be easily identifiable. This work is one of the first examples of introducing the mathematical framework of set theory and decision analysis into the domain of the materials selection process.
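One standard ingredient of multi-objective selection of the kind described, where sorting on a single property is insufficient, is non-dominated (Pareto) filtering. The sketch below is a generic illustration and an assumption, not the decision-theoretic procedure of the paper:

```python
def pareto_front(items):
    """Return indices of non-dominated candidates. Each item is a tuple
    of objectives to maximize; candidate i is dominated if some j is >=
    in every objective and strictly > in at least one."""
    front = []
    for i, a in enumerate(items):
        dominated = any(
            all(bj >= aj for aj, bj in zip(a, b))
            and any(bj > aj for aj, bj in zip(a, b))
            for j, b in enumerate(items) if j != i
        )
        if not dominated:
            front.append(i)
    return front
```

For instance, among candidates `(1, 2)`, `(2, 1)`, `(2, 2)`, `(0, 0)` only `(2, 2)` survives, whereas `(1, 2)` and `(2, 1)` alone would both be kept as genuine trade-offs.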
Actuator placement for active sound and vibration control of cylinders
NASA Technical Reports Server (NTRS)
Kincaid, Rex K.
1995-01-01
Active structural acoustic control is a method in which the control inputs (used to reduce interior noise) are applied directly to a vibrating structural acoustic system. The control concept modeled in this work is the application of in-plane force inputs to piezoceramic patches bonded to the wall of a vibrating cylinder. The cylinder is excited by an exterior noise source, an acoustic monopole, located near the outside of the cylinder wall. The goal is to determine the force inputs and sites for the piezoelectric actuators so that (1) the interior noise is effectively damped; (2) the level of vibration of the cylinder shell is not increased; and (3) the power requirements needed to drive the actuators are not excessive. We studied external monopole excitations at two frequencies: 100 Hz, where the interior acoustic field is driven in multiple, off-resonance cylinder cavity modes, and 200 Hz, characterized by both near- and off-resonance cylinder vibration modes that couple effectively with a single, dominant, low-order acoustic cavity mode at resonance. Previous work has focused almost exclusively on meeting objective (1), solving a complex least-squares problem to arrive at an optimal force vector for a given set of actuator sites. In addition, it has been noted that when the cavity mode couples with cylinder vibration modes (our 200 Hz case), control spillover may occur in higher-order cylinder shell vibrational modes. How to determine the best set of actuator sites to meet objectives (1)-(3) is the main contribution of our research effort. The selection of the best set of actuator sites from a set of potential sites is done via two metaheuristics, simulated annealing and tabu search. Each of these metaheuristics partitions the set of potential actuator sites into two disjoint sets: those that are selected to control the noise (on) and those that are not (off).
Next, each metaheuristic attempts to improve this initial solution by calculating the change in the objective value when one selected actuator site is turned off and one actuator site that previously was not selected is turned on. All such pairwise exchanges are performed and the exchange that improves the objective the most is made. Eventually the search is unable to improve the objective value and a local optimum (with respect to pairwise exchanges) is reached. Both simulated annealing and tabu search provide mechanisms to escape local optima and allow the search to continue until (hopefully) a global optimum is found. Our experiments with the 100 Hz and 200 Hz cases confirm that both metaheuristics are able to uncover better solutions than those selected based upon engineering judgement alone. In addition, the high quality solutions generated by these metaheuristics, when minimizing interior noise, do not further excite the cylinder shell. Thus, we are able to meet objective (2) without imposing an additional constraint or forming a multiobjective performance measure. An additional observation is that in many cases the amplitude and phase values for several chosen actuator sites were nearly identical. This natural grouping means that fewer control channels are needed and the resulting control system is simpler. Currently no power requirements have been set, so objective (3) cannot be addressed. A set of experiments is planned with a laboratory test article (a cylinder). For these experiments the transfer matrices will be generated experimentally. It is hoped that the predicted performance of the best actuator sites found by our metaheuristics will correlate well with the measured performance.
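The best-improvement pairwise-exchange step described above can be sketched generically. Here `objective` is a hypothetical placeholder for the acoustic cost evaluation, and the simulated-annealing/tabu rules for escaping local optima are omitted:

```python
def pairwise_exchange(sites, k, objective):
    """Best-improvement local search over k-subsets of candidate sites.
    Starts from the first k sites; repeatedly performs the single on/off
    swap that lowers `objective` the most, until no swap improves.
    Annealing or tabu search would add escape rules on top of this."""
    on = set(sites[:k])
    off = set(sites[k:])
    best = objective(on)
    improved = True
    while improved:
        improved = False
        for a in list(on):
            for b in list(off):
                cand = (on - {a}) | {b}       # turn a off, b on
                val = objective(cand)
                if val < best:
                    best_swap, best = (a, b), val
                    improved = True
        if improved:                           # commit the best swap found
            a, b = best_swap
            on.remove(a); off.remove(b)
            on.add(b); off.add(a)
    return sorted(on), best
```

With a toy objective that penalizes distance from a target pair of sites, the search walks from the initial subset to the optimal one in two exchanges.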
Iterative h-minima-based marker-controlled watershed for cell nucleus segmentation.
Koyuncu, Can Fahrettin; Akhan, Ece; Ersahin, Tulin; Cetin-Atalay, Rengul; Gunduz-Demir, Cigdem
2016-04-01
Automated microscopy imaging systems facilitate high-throughput screening in molecular cellular biology research. The first step of these systems is cell nucleus segmentation, which has a great impact on the success of the overall system. The marker-controlled watershed is a technique commonly used by previous studies for nucleus segmentation. These studies define their markers by finding regional minima on the intensity/gradient and/or distance transform maps. They typically use the h-minima transform beforehand to suppress noise on these maps. The selection of the h value is critical; unnecessarily small values do not sufficiently suppress the noise, resulting in false and oversegmented markers, and unnecessarily large ones suppress too many pixels, causing missing and undersegmented markers. Because cell nuclei show different characteristics within an image, the same h value may not work to define correct markers for all the nuclei. To address this issue, in this work, we propose a new watershed algorithm that iteratively identifies its markers, considering a set of different h values. In each iteration, the proposed algorithm defines a set of candidates using a particular h value and selects the markers from those candidates provided that they fulfill the size requirement. Working with widefield fluorescence microscopy images, our experiments reveal that the use of multiple h values in our iterative algorithm leads to better segmentation results compared to its counterparts. © 2016 International Society for Advancement of Cytometry.
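The iterative idea can be sketched as follows, assuming a simple plateau-based minima detector, a loop-based morphological reconstruction for the h-minima transform, and size bounds `smin`/`smax` standing in for the paper's exact marker definition:

```python
import numpy as np
from scipy import ndimage as ndi

def suppress_shallow_minima(img, h):
    """h-minima transform via morphological reconstruction by erosion:
    iteratively erode (img + h), clipping from below at img, until the
    result stops changing. Minima shallower than h are filled in."""
    rec = img + h
    while True:
        nxt = np.maximum(ndi.grey_erosion(rec, size=(3, 3), mode='nearest'), img)
        if np.array_equal(nxt, rec):
            return rec
        rec = nxt

def iterative_markers(img, h_values, smin, smax):
    """Collect markers over several h values: a connected minimum
    plateau becomes a marker only if its size lies in [smin, smax] and
    it does not overlap a marker found at an earlier h."""
    markers = np.zeros(img.shape, dtype=int)
    next_id = 1
    for h in h_values:
        rec = suppress_shallow_minima(img, h)
        # plateau-based local minima of the suppressed map
        minima = rec == ndi.grey_erosion(rec, size=(3, 3), mode='nearest')
        labels, n = ndi.label(minima)
        for lab in range(1, n + 1):
            region = labels == lab
            if smin <= region.sum() <= smax and not markers[region].any():
                markers[region] = next_id
                next_id += 1
    return markers
```

On a toy map with one deep and one shallow minimum, a large h keeps only the deep basin, and a later, smaller h adds the shallow one without disturbing markers already found.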
Zerillo, Jessica A; Schouwenburg, Maartje G; van Bommel, Annelotte C M; Stowell, Caleb; Lippa, Jacob; Bauer, Donna; Berger, Ann M; Boland, Gilles; Borras, Josep M; Buss, Mary K; Cima, Robert; Van Cutsem, Eric; van Duyn, Eino B; Finlayson, Samuel R G; Hung-Chun Cheng, Skye; Langelotz, Corinna; Lloyd, John; Lynch, Andrew C; Mamon, Harvey J; McAllister, Pamela K; Minsky, Bruce D; Ngeow, Joanne; Abu Hassan, Muhammad R; Ryan, Kim; Shankaran, Veena; Upton, Melissa P; Zalcberg, John; van de Velde, Cornelis J; Tollenaar, Rob
2017-05-01
Global health systems are shifting toward value-based care in an effort to drive better outcomes in the setting of rising health care costs. This shift requires a common definition of value, starting with the outcomes that matter most to patients. The International Consortium for Health Outcomes Measurement (ICHOM), a nonprofit initiative, was formed to define standard sets of outcomes by medical condition. In this article, we report the efforts of ICHOM's working group in colorectal cancer. The working group was composed of multidisciplinary oncology specialists in medicine, surgery, radiation therapy, palliative care, nursing, and pathology, along with patient representatives. Through a modified Delphi process over 8 months (July 8, 2015 to February 29, 2016), ICHOM led the working group to a consensus on a final recommended standard set. The process was supported by a systematic PubMed literature review (1042 randomized clinical trials and guidelines from June 3, 2005, to June 3, 2015), a patient focus group (11 patients with early and metastatic colorectal cancer convened during a teleconference in August 2015), and a patient validation survey (among 276 patients with and survivors of colorectal cancer between October 15, 2015, and November 4, 2015). After consolidating the findings of the literature review and the focus group meeting, a list of 40 outcomes was presented to the working group and underwent voting. The final recommendation includes outcomes in the following categories: survival and disease control, disutility of care, degree of health, and quality of death. Selected case-mix factors were recommended to be collected at baseline to facilitate comparison of results across treatments and health care professionals. A standardized set of patient-centered outcome measures to inform value-based health care in colorectal cancer was developed. Pilot efforts are under way to measure the standard set among members of the working group.
Consultation and participation with children in healthy schools: choice, conflict and context.
Duckett, Paul; Kagan, Carolyn; Sixsmith, Judith
2010-09-01
In this paper we report on our use of a participatory research methodology to consult with children in the UK on how to improve pupil well-being in secondary schools, framed within the wider social policy context of healthy schools. We worked with children on the selection of our research methods and sought to voice the views of children to a local education authority to improve the design of school environments. The consultation process ultimately failed not because the children were unforthcoming with their views on either methods or well-being in schools, but because of difficulties in how their views were received by adults. We show how the socio-economic, cultural and political context in which those difficulties were set might have led to the eventual breakdown of the consultation process, and we draw out a number of possible implications for consultative and participatory work with children in school settings.
Umar, Amara; Javaid, Nadeem; Ahmad, Ashfaq; Khan, Zahoor Ali; Qasim, Umar; Alrajeh, Nabil; Hayat, Amir
2015-06-18
Performance enhancement of Underwater Wireless Sensor Networks (UWSNs) in terms of throughput maximization, energy conservation and Bit Error Rate (BER) minimization is a potential research area. However, limited available bandwidth, high propagation delay, highly dynamic network topology, and high error probability lead to performance degradation in these networks. In this regard, many cooperative communication protocols have been developed that investigate either the physical layer or the Medium Access Control (MAC) layer; however, the network layer is still unexplored. More specifically, cooperative routing has not yet been jointly considered with sink mobility. Therefore, this paper aims to enhance network reliability and efficiency via dominating-set-based cooperative routing and sink mobility. The proposed work is validated via simulations, which show relatively improved performance of our proposed work in terms of the selected performance metrics.
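As a rough illustration of the dominating-set ingredient (not the paper's routing protocol), a greedy dominating set over a neighbourhood map can be computed like this:

```python
def greedy_dominating_set(adj):
    """Greedy approximation of a minimum dominating set: repeatedly
    pick the node that covers (itself plus its neighbours) the most
    not-yet-covered nodes. adj maps each node to its neighbour set."""
    uncovered = set(adj)
    dom = []
    while uncovered:
        best = max(adj, key=lambda v: len(({v} | adj[v]) & uncovered))
        dom.append(best)
        uncovered -= {best} | adj[best]
    return dom
```

For a star graph the hub alone dominates everything; for a 5-node path the greedy rule settles on the two interior nodes.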
Temperature based Restricted Boltzmann Machines
NASA Astrophysics Data System (ADS)
Li, Guoqi; Deng, Lei; Xu, Yi; Wen, Changyun; Wang, Wei; Pei, Jing; Shi, Luping
2016-01-01
Restricted Boltzmann machines (RBMs), which apply graphical models to learning a probability distribution over a set of inputs, have attracted much attention recently since being proposed as building blocks of multi-layer learning systems called deep belief networks (DBNs). Note that temperature is a key factor of the Boltzmann distribution that RBMs originate from. However, none of the existing schemes has considered the impact of temperature in the graphical model of DBNs. In this work, we propose temperature based restricted Boltzmann machines (TRBMs), which reveal that temperature is an essential parameter controlling the selectivity of the firing neurons in the hidden layers. We theoretically prove that the effect of temperature can be adjusted by setting the parameter of the sharpness of the logistic function in the proposed TRBMs. The performance of RBMs can be improved by adjusting the temperature parameter of TRBMs. This work provides comprehensive insight into deep belief networks and deep learning architectures from a physical point of view.
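The temperature parameter acting as the sharpness of the logistic function can be illustrated in a few lines; the weight names and toy setup below are assumptions, not the paper's model:

```python
import numpy as np

def sample_hidden(v, W, b, T=1.0, rng=None):
    """Hidden-unit activation of a temperature-parameterized RBM:
    scaling the logistic input by 1/T sharpens (low T, more selective
    units) or flattens (high T, probabilities near 0.5) the response.
    Returns the activation probabilities and one Bernoulli sample."""
    if rng is None:
        rng = np.random.default_rng(0)
    p = 1.0 / (1.0 + np.exp(-(v @ W + b) / T))
    return p, (rng.random(p.shape) < p).astype(int)
```

Lowering T pushes a positively driven unit's probability toward 1 and a negatively driven unit's toward 0, which is the "selectivity" effect described above.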
Training a whole-book LSTM-based recognizer with an optimal training set
NASA Astrophysics Data System (ADS)
Soheili, Mohammad Reza; Yousefi, Mohammad Reza; Kabir, Ehsanollah; Stricker, Didier
2018-04-01
Despite the recent progress in OCR technologies, whole-book recognition is still a challenging task, in particular for old and historical books, where unknown font faces or the low quality of paper and print add to the challenge. Therefore, pre-trained recognizers and generic methods do not usually perform up to the required standards, and performance usually degrades for larger-scale recognition tasks, such as a whole book. Such reportedly low error-rate methods turn out to require a great deal of manual correction. Generally, such methodologies do not make effective use of concepts such as redundancy in whole-book recognition. In this work, we propose to train Long Short-Term Memory (LSTM) networks on a minimal training set obtained from the book to be recognized. We show that by clustering all the sub-words in the book and using the sub-word cluster centers as the training set for the LSTM network, we can train models that outperform any identical network trained on randomly selected pages of the book. In our experiments, we also show that although the sub-word cluster centers are equivalent to about 8 pages of text for a 101-page book, an LSTM network trained on such a set performs competitively compared to an identical network trained on a set of 60 randomly selected pages of the book.
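The cluster-center selection step can be sketched with a plain k-means over assumed sub-word feature vectors; the paper's actual clustering and features are not reproduced here:

```python
import numpy as np

def training_set_from_clusters(features, k, iters=20, seed=0):
    """Pick a compact training set: run k-means over sub-word feature
    vectors, then take, per cluster, the sub-word nearest the centroid.
    Returns sorted indices into `features`, one per cluster."""
    rng = np.random.default_rng(seed)
    X = np.asarray(features, dtype=float)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):                       # Lloyd iterations
        d = np.linalg.norm(X[:, None] - centers[None], axis=-1)
        assign = d.argmin(axis=1)
        for c in range(k):
            if (assign == c).any():
                centers[c] = X[assign == c].mean(axis=0)
    d = np.linalg.norm(X[:, None] - centers[None], axis=-1)
    reps = [int(d[:, c].argmin()) for c in range(k)]
    return sorted(set(reps))
```

On two well-separated groups of feature vectors, the selected training set contains exactly one representative sub-word from each group.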
DECIDE: a software for computer-assisted evaluation of diagnostic test performance.
Chiecchio, A; Bo, A; Manzone, P; Giglioli, F
1993-05-01
The evaluation of the performance of clinical tests is a complex problem involving different steps and many statistical tools, not always structured in an organic and rational system. This paper presents software which provides an organic system of statistical tools to help evaluate clinical test performance. The program allows (a) the building and organization of a working database, (b) the selection of the minimal set of tests with the maximum information content, (c) the search for the model best fitting the distribution of the test values, (d) the selection of the optimal diagnostic cut-off value of the test for every positive/negative situation, and (e) the evaluation of the performance of combinations of correlated and uncorrelated tests. The uncertainty associated with all the variables involved is evaluated. The program works in an MS-DOS environment with an EGA or higher-performing graphics card.
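Step (d), selecting an optimal diagnostic cut-off, can be illustrated with Youden's J statistic, one common criterion; the abstract does not state which criterion DECIDE uses, so this choice is an assumption:

```python
def best_cutoff(values, labels):
    """Choose the diagnostic cut-off maximizing Youden's J =
    sensitivity + specificity - 1, scanning each observed value as a
    candidate threshold (test positive when value >= cutoff)."""
    pos = [v for v, y in zip(values, labels) if y]
    neg = [v for v, y in zip(values, labels) if not y]
    best_c, best_j = None, -2.0
    for c in sorted(set(values)):
        sens = sum(v >= c for v in pos) / len(pos)
        spec = sum(v < c for v in neg) / len(neg)
        j = sens + spec - 1
        if j > best_j:
            best_c, best_j = c, j
    return best_c, best_j
```

With perfectly separated groups the scan lands on the smallest value of the positive group, where both sensitivity and specificity equal 1.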
An autonomous organic reaction search engine for chemical reactivity.
Dragone, Vincenza; Sans, Victor; Henson, Alon B; Granda, Jaroslaw M; Cronin, Leroy
2017-06-09
The exploration of chemical space for new reactivity, reactions and molecules is limited by the need for separate work-up-separation steps searching for molecules rather than reactivity. Herein we present a system that can autonomously evaluate chemical reactivity within a network of 64 possible reaction combinations and aims for new reactivity, rather than a predefined set of targets. The robotic system combines chemical handling, in-line spectroscopy and real-time feedback and analysis with an algorithm that is able to distinguish and select the most reactive pathways, generating a reaction selection index (RSI) without need for separate work-up or purification steps. This allows the automatic navigation of a chemical network, leading to previously unreported molecules while needing only to do a fraction of the total possible reactions without any prior knowledge of the chemistry. We show the RSI correlates with reactivity and is able to search chemical space using the most reactive pathways.
Woolf-King, Sarah E.; Maisto, Stephen; Carey, Michael; Vanable, Peter
2013-01-01
Experimental research on sexual decision making is limited, despite the public health importance of such work. We describe formative work conducted in advance of an experimental study designed to evaluate the effects of alcohol intoxication and sexual arousal on risky sexual decision making among men who have sex with men. In Study 1, we describe the procedures for selecting and validating erotic film clips (to be used for the experimental manipulation of arousal). In Study 2, we describe the tailoring of two interactive role-play videos to be used to measure risk perception and communication skills in an analog risky sex situation. Together, these studies illustrate a method for creating experimental stimuli to investigate sexual decision making in a laboratory setting. Research using this approach will support experimental research that affords a stronger basis for drawing causal inferences regarding sexual decision making.
Trait anxiety and impaired control of reflective attention in working memory.
Hoshino, Takatoshi; Tanno, Yoshihiko
2016-01-01
The present study investigated whether the control of reflective attention in working memory (WM) is impaired in individuals with high trait anxiety. We focused on the consequences of refreshing, a simple reflective process of thinking briefly about a just-activated representation in mind, on the subsequent processing of verbal stimuli. Participants performed a selective refreshing task, in which they initially refreshed or read one word from a three-word set, and then refreshed a non-selected item from the initial phase or read aloud a new word. High trait anxiety individuals exhibited greater latencies when refreshing a word after having refreshed a word from the same list of semantic associates. The same pattern was observed for reading a new word after prior refreshing. These findings suggest that high trait anxiety individuals have difficulty resolving interference from active distractors when directing reflective attention towards contents in WM or processing a visually presented word.
Nonequilibrium steady states of ideal bosonic and fermionic quantum gases
NASA Astrophysics Data System (ADS)
Vorberg, Daniel; Wustmann, Waltraut; Schomerus, Henning; Ketzmerick, Roland; Eckardt, André
2015-12-01
We investigate nonequilibrium steady states of driven-dissipative ideal quantum gases of both bosons and fermions. We focus on systems of sharp particle number that are driven out of equilibrium either by the coupling to several heat baths of different temperature or by time-periodic driving in combination with the coupling to a heat bath. Within the framework of (Floquet-)Born-Markov theory, several analytical and numerical methods are described in detail. This includes a mean-field theory in terms of occupation numbers, an augmented mean-field theory taking into account also nontrivial two-particle correlations, and quantum-jump-type Monte Carlo simulations. For the case of the ideal Fermi gas, these methods are applied to simple lattice models and the possibility of achieving exotic states via bath engineering is pointed out. The largest part of this work is devoted to bosonic quantum gases and the phenomenon of Bose selection, a nonequilibrium generalization of Bose condensation, where multiple single-particle states are selected to acquire a large occupation [Phys. Rev. Lett. 111, 240405 (2013), 10.1103/PhysRevLett.111.240405]. In this context, among others, we provide a theory for transitions where the set of selected states changes, describe an efficient algorithm for finding the set of selected states, investigate beyond-mean-field effects, and identify the dominant mechanisms for heat transport in the Bose-selected state.
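The mean-field theory in terms of occupation numbers mentioned in this abstract evolves the bosonic rate equation dn_i/dt = Σ_j [R_ij n_j (1 + n_i) − R_ji n_i (1 + n_j)], whose steady state exhibits Bose selection when the rates break detailed balance. The sketch below integrates this equation for a small, randomly chosen rate matrix; the rates and system size are illustrative, not taken from the paper, and the Euler integration is a minimal stand-in for the authors' numerical methods.

```python
import numpy as np

def mean_field_steady_state(R, N, steps=100_000, dt=1e-4):
    """Evolve dn_i/dt = sum_j [R_ij n_j (1+n_i) - R_ji n_i (1+n_j)]
    to a steady state. R[i, j] is the rate for a jump j -> i
    (illustrative values). Total particle number N is conserved."""
    M = R.shape[0]
    n = np.full(M, N / M)                 # start from uniform occupation
    for _ in range(steps):
        gain = (R @ n) * (1.0 + n)        # sum_j R_ij n_j (1 + n_i)
        loss = n * (R.T @ (1.0 + n))      # sum_j R_ji n_i (1 + n_j)
        n = n + dt * (gain - loss)
    return n

rng = np.random.default_rng(0)
R = rng.random((6, 6))
np.fill_diagonal(R, 0.0)                  # no self-jumps
n = mean_field_steady_state(R, N=100.0)
```

Because gain and loss terms cancel pairwise when summed over states, the total occupation stays fixed while the distribution relaxes, typically concentrating most particles in a subset of "selected" states.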
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, H; Wang, J; Chuong, M
2015-06-15
Purpose: To evaluate the role of mid-treatment and post-treatment FDG-PET/CT in predicting progression-free survival (PFS) and distant metastasis (DM) of anal cancer patients treated with chemoradiotherapy (CRT). Methods: 17 anal cancer patients treated with CRT were retrospectively studied. The median prescription dose was 56 Gy (range, 50–62.5 Gy). All patients underwent FDG-PET/CT scans before and after CRT. 16 of the 17 patients had an additional FDG-PET/CT image at 3–5 weeks into the treatment (denoted as mid-treatment FDG-PET/CT). 750 features were extracted from these three sets of scans, which included both traditional PET/CT measures (SUVmax, SUVpeak, tumor diameters, etc.) and spatial-temporal PET/CT features, which comprehensively quantify a tumor's FDG uptake intensity and distribution, spatial variation (texture), geometric properties, and their temporal changes relative to baseline. 26 clinical parameters (age, gender, TNM stage, histology, GTV dose, etc.) were also analyzed. Advanced analytics were developed, including methods to select an optimal set of predictors and a model-selection engine that identifies the most accurate machine learning algorithm for predictive analysis. Results: Comparing the baseline + mid-treatment PET/CT set to the baseline + post-treatment PET/CT set, 14 predictors were selected from each feature group. The same three clinical parameters (tumor size, T stage, and whether 5-FU was held during any cycle of chemotherapy) and two traditional measures (pre-CRT SUVmin and SUVmedian) were selected by both predictor groups. A different mix of spatial-temporal PET/CT features was selected. Using the 14 predictors and Naive Bayes, the mid-treatment PET/CT set achieved 87.5% accuracy (2 PFS patients misclassified, all local recurrence and DM patients correctly classified).
Post-treatment PET/CT set achieved 94.0% accuracy (all PFS and DM patients correctly predicted, 1 local recurrence patient misclassified) with logistic regression, neural network, or support vector machine models. Conclusion: Applying a radiomics approach to either mid-treatment or post-treatment PET/CT could achieve high accuracy in predicting anal cancer treatment outcomes. This work was supported in part by the National Cancer Institute Grant R01CA172638.
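The Naive Bayes step used for the mid-treatment prediction above can be illustrated with a minimal Gaussian Naive Bayes classifier. This sketch uses synthetic two-class data as a stand-in for the selected PET/CT predictors; the class means and feature counts are assumptions for illustration, not the study's data.

```python
import numpy as np

def fit_gaussian_nb(X, y):
    """Fit per-class feature means, variances, and priors --
    the Gaussian Naive Bayes model form."""
    stats = {}
    for c in np.unique(y):
        Xc = X[y == c]
        stats[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, len(Xc) / len(X))
    return stats

def predict_gaussian_nb(stats, X):
    """Pick the class maximizing log P(x|c) + log P(c), with
    features treated as conditionally independent Gaussians."""
    classes, scores = list(stats), []
    for c in classes:
        mu, var, prior = stats[c]
        ll = -0.5 * np.sum(np.log(2 * np.pi * var) + (X - mu) ** 2 / var, axis=1)
        scores.append(ll + np.log(prior))
    return np.array([classes[i] for i in np.argmax(np.array(scores), axis=0)])

rng = np.random.default_rng(1)
X0 = rng.normal(0.0, 1.0, (40, 5))   # e.g. a "progression-free" group
X1 = rng.normal(2.0, 1.0, (40, 5))   # e.g. a "recurrence/metastasis" group
X = np.vstack([X0, X1])
y = np.array([0] * 40 + [1] * 40)
model = fit_gaussian_nb(X, y)
acc = float((predict_gaussian_nb(model, X) == y).mean())
```

With well-separated synthetic groups the classifier recovers the labels almost perfectly, which is why such a simple model can perform well once a strong predictor set has been selected.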
Ladshaw, Austin P.; Ivanov, Alexander S.; Das, Sadananda; ...
2018-03-27
Nuclear power is a relatively carbon-free energy source that has the capacity to be utilized today in an effort to stem the tides of global warming. The growing demand for nuclear energy, however, could put significant strain on our uranium ore resources, and the mining activities utilized to extract that ore can leave behind long-term environmental damage. A potential solution to enhance the supply of uranium fuel is to recover uranium from seawater using amidoximated adsorbent fibers. This technology has been studied for decades but is currently plagued by the material’s relatively poor selectivity of uranium over its main competitor vanadium. In this work, we investigate the binding schemes between uranium, vanadium, and the amidoxime functional groups on the adsorbent surface. Using quantum chemical methods, binding strengths are approximated for a set of complexation reactions between uranium and vanadium with amidoxime functionalities. Those approximations are then coupled with a comprehensive aqueous adsorption model developed in this work to simulate the adsorption of uranium and vanadium under laboratory conditions. Experimental adsorption studies with uranium and vanadium over a wide pH range are performed, and the data collected are compared against simulation results to validate the model. It was found that coupling ab initio calculations with process level adsorption modeling provides accurate predictions of the adsorption capacity and selectivity of the sorbent materials. Furthermore, this work demonstrates that this multiscale modeling paradigm could be utilized to aid in the selection of superior ligands or ligand compositions for the selective capture of metal ions. Therefore, this first-principles integrated modeling approach opens the door to the in silico design of next-generation adsorbents with potentially superior efficiency and selectivity for uranium over vanadium in seawater.
Ladshaw, Austin P; Ivanov, Alexander S; Das, Sadananda; Bryantsev, Vyacheslav S; Tsouris, Costas; Yiacoumi, Sotira
2018-04-18
Nuclear power is a relatively carbon-free energy source that has the capacity to be utilized today in an effort to stem the tides of global warming. The growing demand for nuclear energy, however, could put significant strain on our uranium ore resources, and the mining activities utilized to extract that ore can leave behind long-term environmental damage. A potential solution to enhance the supply of uranium fuel is to recover uranium from seawater using amidoximated adsorbent fibers. This technology has been studied for decades but is currently plagued by the material's relatively poor selectivity of uranium over its main competitor vanadium. In this work, we investigate the binding schemes between uranium, vanadium, and the amidoxime functional groups on the adsorbent surface. Using quantum chemical methods, binding strengths are approximated for a set of complexation reactions between uranium and vanadium with amidoxime functionalities. Those approximations are then coupled with a comprehensive aqueous adsorption model developed in this work to simulate the adsorption of uranium and vanadium under laboratory conditions. Experimental adsorption studies with uranium and vanadium over a wide pH range are performed, and the data collected are compared against simulation results to validate the model. It was found that coupling ab initio calculations with process level adsorption modeling provides accurate predictions of the adsorption capacity and selectivity of the sorbent materials. Furthermore, this work demonstrates that this multiscale modeling paradigm could be utilized to aid in the selection of superior ligands or ligand compositions for the selective capture of metal ions. Therefore, this first-principles integrated modeling approach opens the door to the in silico design of next-generation adsorbents with potentially superior efficiency and selectivity for uranium over vanadium in seawater.
Disequilibrium and human capital in pharmacy labor markets: evidence from four states.
Cline, Richard R
2003-01-01
To estimate the association between pharmacists' stocks of human capital (work experience and education), practice setting, demographics, and wage rates in the overall labor market and to estimate the association between these same variables and wage rates within six distinct pharmacy employment sectors. Wage estimation is used as a proxy measure of demand for pharmacists' services. Descriptive survey analysis. Illinois, Minnesota, Ohio, and Wisconsin. Licensed pharmacists working 30 or more hours per week. Analysis of data collected with cross-sectional mail surveys conducted in four states. Hourly wage rates for all pharmacists working 30 or more hours per week and hourly wage rates for pharmacists employed in large chain, independent, mass-merchandiser, hospital, health maintenance organization (HMO), and other settings. A total of 2,235 responses were received, for an adjusted response rate of 53.1%. Application of exclusion criteria left 1,450 responses from full-time pharmacists to analyze. Results from estimations of wages in the pooled sample and for pharmacists in the hospital setting suggest that advanced training and years of experience are associated positively with higher hourly wages. Years of experience were also associated positively with higher wages in independent and other settings, while neither advanced education nor experience was related to wages in large chain, mass-merchandiser, or HMO settings. Overall, the market for full-time pharmacists' labor is competitive, and employers pay wage premiums to those with larger stocks of human capital, especially advanced education and more years of pharmacy practice experience. The evidence supports the hypothesis that demand is exceeding supply in select employment sectors.
DRUMS: Disk Repository with Update Management and Select option for high throughput sequencing data.
Nettling, Martin; Thieme, Nils; Both, Andreas; Grosse, Ivo
2014-02-04
New technologies for analyzing biological samples, like next generation sequencing, are producing a growing amount of data together with quality scores. Moreover, software tools (e.g., for mapping sequence reads), calculating transcription factor binding probabilities, estimating epigenetic modification enriched regions, or determining single nucleotide polymorphisms increase this amount of position-specific DNA-related data even further. Hence, requesting data becomes challenging and expensive and is often implemented using specialized hardware. In addition, picking specific data as fast as possible becomes increasingly important in many fields of science. The general problem of handling big data sets was addressed by developing specialized databases like HBase, HyperTable or Cassandra. However, these database solutions also require specialized or distributed hardware, leading to expensive investments. To the best of our knowledge, there is no database capable of (i) storing billions of position-specific DNA-related records, (ii) performing fast and resource-saving requests, and (iii) running on single standard computer hardware. Here, we present DRUMS (Disk Repository with Update Management and Select option), satisfying demands (i)-(iii). It tackles the weaknesses of traditional databases while handling position-specific DNA-related data in an efficient manner. DRUMS is capable of storing up to billions of records. Moreover, it focuses on optimizing related single lookups and range requests, which are needed constantly for computations in bioinformatics. To validate the power of DRUMS, we compare it to the widely used MySQL database. The test setting considers two biological data sets. We use standard desktop hardware as the test environment. DRUMS outperforms MySQL in writing and reading records by a factor of two up to a factor of 10000. Furthermore, it can work with significantly larger data sets.
Our work focuses on mid-sized data sets up to several billion records without requiring cluster technology. Storing position-specific data is a general problem and the concept we present here is a generalized approach. Hence, it can be easily applied to other fields of bioinformatics.
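The access pattern DRUMS optimizes, range requests over position-specific records, can be illustrated with a toy in-memory store: keep records sorted by position so that a range request is a binary search plus one contiguous (sequential) read. This is a simplified sketch of the idea, not DRUMS itself.

```python
import bisect

class PositionStore:
    """Toy position-keyed store: records kept sorted so a range
    request is a binary search plus one contiguous slice
    (standing in for a sequential disk read)."""

    def __init__(self, records):
        self.records = sorted(records)          # (position, value) pairs
        self.keys = [p for p, _ in self.records]

    def range(self, start, end):
        """Return all records with start <= position <= end."""
        lo = bisect.bisect_left(self.keys, start)
        hi = bisect.bisect_right(self.keys, end)
        return self.records[lo:hi]

store = PositionStore([(10, 'a'), (3, 'b'), (7, 'c'), (15, 'd')])
hits = store.range(5, 12)
```

Here `hits` is `[(7, 'c'), (10, 'a')]`: the two records inside the requested position window, returned without scanning the whole store.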
Trimming the UCERF2 hazard logic tree
Porter, Keith A.; Field, Edward H.; Milner, Kevin
2012-01-01
The Uniform California Earthquake Rupture Forecast 2 (UCERF2) is a fully time‐dependent earthquake rupture forecast developed with sponsorship of the California Earthquake Authority (Working Group on California Earthquake Probabilities [WGCEP], 2007; Field et al., 2009). UCERF2 contains 480 logic‐tree branches reflecting choices among nine modeling uncertainties in the earthquake rate model shown in Figure 1. For seismic hazard analysis, it is also necessary to choose a ground‐motion‐prediction equation (GMPE) and set its parameters. Choosing among four next‐generation attenuation (NGA) relationships results in a total of 1920 hazard calculations per site. The present work is motivated by a desire to reduce the computational effort involved in a hazard analysis without understating uncertainty. We set out to assess which branching points of the UCERF2 logic tree contribute most to overall uncertainty, and which might be safely ignored (set to only one branch) without significantly biasing results or affecting some useful measure of uncertainty. The trimmed logic tree will have all of the original choices from the branching points that contribute significantly to uncertainty, but only one arbitrarily selected choice from the branching points that do not.
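The branch arithmetic above (480 rate-model branches times 4 GMPEs gives 1920 hazard calculations per site) and the trimming idea can be sketched with a hypothetical, much smaller logic tree. The branching-point names, option values, and the spread-based trimming rule below are all illustrative assumptions, not the UCERF2 weights or the paper's actual uncertainty measure.

```python
from itertools import product
import math

# Hypothetical branching points: name -> alternative hazard factors.
branch_points = {
    "magnitude_area_relation": [0.9, 1.0, 1.1],
    "deformation_model":       [0.95, 1.05],
    "aperiodicity":            [0.7, 1.0, 1.3],
    "gmpe":                    [0.8, 1.0, 1.2, 1.4],
}

# Full tree size: the product of option counts (3 * 2 * 3 * 4 = 72 here;
# 480 * 4 = 1920 in the real UCERF2-plus-GMPE tree).
n_branches = math.prod(len(v) for v in branch_points.values())

def spread(name):
    """Crude stand-in for a branching point's uncertainty
    contribution: the spread of its hazard factors."""
    vals = branch_points[name]
    return max(vals) - min(vals)

# Trim: keep all options only at branching points whose spread matters;
# collapse the rest to a single arbitrarily selected option.
trimmed = {name: (opts if spread(name) > 0.15 else [opts[0]])
           for name, opts in branch_points.items()}
n_trimmed = math.prod(len(v) for v in trimmed.values())

# Enumerating the trimmed tree now visits far fewer branches.
trimmed_branches = list(product(*trimmed.values()))
```

With these illustrative numbers, only `deformation_model` is collapsed, halving the tree from 72 to 36 branches while retaining every option at the branching points that dominate the spread.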
Miller, Alexander L; Lopez, Linda; Gonzalez, Jodi M; Dassori, Albana; Bond, Gary; Velligan, Dawn
2008-11-01
Applying research findings to community mental health practices is slowed by provider concerns that research participants often differ from community populations in duration of illness, comorbid conditions, and illness severity. Selecting participants from community settings makes research results demonstrably relevant, but researchers and community providers can be mistrustful of one another, feeling that the other has little understanding of their needs and work. This mistrust impedes patient referrals for research. This column describes a program to increase researcher knowledge of community clinic procedures through structured interactions with clinic personnel. Follow-up interviews indicate improved attitudes and cooperation of researchers and community providers.
ERIC Educational Resources Information Center
Kelly, Dennis; Soyibo, Kola
2005-01-01
This study was designed to find out if students taught food and nutrition concepts using the lecture method and practical work would perform significantly better than their counterparts taught with the lecture and teacher demonstrations and the lecture method only. The sample comprised 114 Jamaican 10th-graders (56 boys, 58 girls) selected from…
Real-Time Speech/Music Classification With a Hierarchical Oblique Decision Tree
2008-04-01
Real-Time Speech/Music Classification with a Hierarchical Oblique Decision Tree. Jun Wang, Qiong Wu, Haojiang Deng, Qin Yan, Institute of Acoustics... real-time speech/music classification with a hierarchical oblique decision tree. A set of discrimination features in the frequency domain are selected... handle signals without discrimination and cannot work properly in the presence of multimedia signals. This paper proposes a real-time speech/music
Causal Networks with Selectively Influenced Components
2012-02-29
influences a different vertex. If so, the form of a processing tree accounting for the data can be determined. Prior to the work on the grant, processing... their order. Processing trees were found to account well for data in the literature on immediate ordered recall and on effects of sleep and... ordered in the network) or concurrent (unordered). Ordinarily, for a given data set, if one directed acyclic network can account for the data
ERIC Educational Resources Information Center
Morgan, Hani
2015-01-01
Online education in K-12 settings has increased considerably in recent years, but there is little research supporting its use at this level. Online courses help students learn at their own pace, select different locations to do their work, and choose flexible times to complete assignments. However, some students learn best in a face-to-face…
Slow and fast solar wind - data selection and statistical analysis
NASA Astrophysics Data System (ADS)
Wawrzaszek, Anna; Macek, Wiesław M.; Bruno, Roberto; Echim, Marius
2014-05-01
In this work we consider the important problem of selecting slow and fast solar wind data measured in situ by the Ulysses spacecraft during two solar minima (1995-1997, 2007-2008) and a solar maximum (1999-2001). To recognise the different types of solar wind we use the following set of parameters: radial velocity, proton density, proton temperature, the distribution of charge states of oxygen ions, and the compressibility of the magnetic field. We show how this data-selection scheme works on Ulysses data. In the next step we consider the chosen intervals of fast and slow solar wind and perform a statistical analysis of the fluctuating magnetic field components. In particular, we check the possibility of identifying the inertial range by considering the scale dependence of the third- and fourth-order scaling exponents of the structure functions. We examine how the size of the inertial range depends on heliographic latitude, heliocentric distance, and the phase of the solar cycle. Research supported by the European Community's Seventh Framework Programme (FP7/2007 - 2013) under grant agreement no 313038/STORM.
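The structure-function analysis described above computes S_q(τ) = ⟨|B(t+τ) − B(t)|^q⟩ and reads the scaling exponent ζ(q) off the log-log slope over the inertial range. The sketch below applies this to a synthetic Brownian signal (for which ζ(2) = 1 exactly), as a stand-in for the Ulysses magnetic field components.

```python
import numpy as np

def structure_function(b, q, lags):
    """S_q(tau) = <|b(t+tau) - b(t)|^q> for a 1-D signal at the
    given integer lags."""
    return np.array([np.mean(np.abs(b[lag:] - b[:-lag]) ** q) for lag in lags])

rng = np.random.default_rng(2)
b = np.cumsum(rng.normal(size=200_000))   # Brownian walk: zeta(2) = 1 expected
lags = np.array([2 ** k for k in range(1, 10)])
s2 = structure_function(b, 2, lags)

# Scaling exponent = slope of log S_2 versus log tau.
zeta2 = float(np.polyfit(np.log(lags), np.log(s2), 1)[0])
```

For real solar wind data one would fit the slope only over the sub-range of lags where it is constant, which is precisely how the extent of the inertial range is identified.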
Path scheduling for multiple mobile actors in wireless sensor network
NASA Astrophysics Data System (ADS)
Trapasiya, Samir D.; Soni, Himanshu B.
2017-05-01
In wireless sensor networks (WSNs), energy is the main constraint. In this work we address this issue for single as well as multiple mobile-actor sensor networks. We propose a Rendezvous Point Selection Scheme (RPSS) in which Rendezvous Nodes are selected by a set-covering approach and, from these, Rendezvous Points are selected so as to reduce the tour length. The mobile actor's tour is scheduled to pass through those Rendezvous Points by solving a Travelling Salesman Problem (TSP). We also propose a novel rendezvous-node rotation scheme for fair utilisation of all the nodes. We compare RPSS with the Stationary Actor scheme as well as RD-VT, RD-VT-SMT and WRP-SMT on performance metrics such as energy consumption, network lifetime, and route length, and find better outcomes in all cases for a single actor. We also apply RPSS to multiple-mobile-actor cases, namely Multi-Actor Single Depot (MASD) termination and Multi-Actor Multiple Depot (MAMD) termination, and observe through extensive simulation that MAMD saves network energy in an optimised way and enhances network lifetime compared to all the other schemes.
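The two algorithmic ingredients named in the abstract, a set-covering selection of Rendezvous Nodes followed by a TSP tour through the Rendezvous Points, can be sketched with standard heuristics. The greedy cover and nearest-neighbour tour below are generic stand-ins for RPSS, with made-up coverage sets and coordinates.

```python
import math

def greedy_set_cover(universe, subsets):
    """Greedy set-cover approximation: repeatedly pick the node
    whose coverage set contains the most still-uncovered sensors."""
    uncovered, chosen = set(universe), []
    while uncovered:
        best = max(subsets, key=lambda s: len(subsets[s] & uncovered))
        chosen.append(best)
        uncovered -= subsets[best]
    return chosen

def nearest_neighbour_tour(points, start=0):
    """Cheap TSP heuristic for the actor's tour through the
    selected points: always visit the nearest unvisited point."""
    todo, tour = set(range(len(points))) - {start}, [start]
    while todo:
        last = points[tour[-1]]
        nxt = min(todo, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        todo.remove(nxt)
    return tour

# Hypothetical candidate nodes 'A'..'C', each covering some sensors 1..6.
subsets = {'A': {1, 2, 3}, 'B': {3, 4}, 'C': {4, 5, 6}}
rv = greedy_set_cover({1, 2, 3, 4, 5, 6}, subsets)
tour = nearest_neighbour_tour([(0, 0), (5, 0), (1, 1)])
```

Here the greedy cover picks `['A', 'C']`, which already covers all six sensors, and the tour from point 0 visits the nearby point 2 before the distant point 1, shortening the actor's route relative to the index order.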
Developing New TCOs for Renewable Applications
NASA Astrophysics Data System (ADS)
Ginley, David
2013-03-01
Transparent conducting oxides (TCOs) are enabling for a broad range of optoelectronic technologies. Conductivity and transparency are critical, but so are many other factors, including carrier type, processing conditions, work function, chemical stability, and interface properties. The historical set of materials cannot meet all these needs. This has driven a renaissance in new materials development and approaches to transparent contacts. We will discuss these new developments in general and in the context of photovoltaics specifically. We will present results on new materials and also on the development of bilayer structures that enable charge-selective contacts. The materials set includes amorphous materials for hybrid solar cells such as InZnO and ZnSnO, Nb- and Ta-doped TiO2 as high-refractive-index TCOs, and thin n- and p-type oxides as electron- and hole-selective contacts, as has been demonstrated for organic photovoltaics. This work is supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, under Contract No. DE-AC36-08GO28308 to NREL as a part of the DOE Energy Frontier Research Center ``Center for Inverse Design'' and through the US Department of Energy under Contract no. DOE-AC36-08GO28308 through the National Center for Photovoltaics.
Study of Intermediate Age (~10-30 Myr) Open Clusters
NASA Astrophysics Data System (ADS)
Olguin, Lorenzo; Michel, Raul; Contreras, Maria; Hernandez, Jesus; Schuster, William; Chavarria-Kleinhenn, Carlos
2013-07-01
We present the study of a sample of intermediate-age open clusters (age ~ 10-30 Myr) using optical (UBVRI) and infrared photometric data. Optical photometry was obtained as part of the San Pedro Martir Open Clusters Project (SPM-OCP; Schuster et al. 2007; Michel et al. 2013). Infrared photometry was retrieved from the 2MASS public data archive and the WISE database. Open clusters included in the SPM-OCP were selected from the catalogues presented by Dias et al. (2002) and Froebrich, Scholz & Raftery (2007). One of the main goals of the SPM-OCP is to compile a self-consistent and homogeneous set of cluster fundamental parameters such as reddening, distance, age, and metallicity whenever possible. In this work, we have analyzed a set of 25 clusters from the SPM-OCP with estimated ages between 10 and 30 Myr. Derived fundamental parameters for each cluster in the sample, as well as an example of typical color-color and color-magnitude diagrams, are presented. Kinematic membership was established by using proper motion data taken from the literature. Based on infrared photometry, we have searched each cluster for candidate stars that may possess a circumstellar disk. For the selected candidates, a follow-up spectroscopic study is being carried out. This work was partially supported by UNAM-PAPIIT grant IN-109311.
The Performance of Short-Term Heart Rate Variability in the Detection of Congestive Heart Failure
Barros, Allan Kardec; Ohnishi, Noboru
2016-01-01
Congestive heart failure (CHF) is a cardiac disease associated with a decreasing capacity of the cardiac output. It has been shown that CHF is the main cause of cardiac death around the world. Some works proposed to discriminate CHF subjects from healthy subjects using either the electrocardiogram (ECG) or heart rate variability (HRV) from long-term recordings. In this work, we propose an alternative framework to discriminate CHF from healthy subjects by using short-term HRV intervals based on 256 continuous RR samples. Our framework uses a matching pursuit algorithm based on Gabor functions. From the selected Gabor functions, we derived a set of features that are input into a hybrid framework, which uses a genetic algorithm and a k-nearest neighbour classifier to select the subset of features with the best classification performance. The performance of the framework is analyzed using both the Fantasia and CHF databases from the Physionet archives, which are composed of 40 healthy volunteers and 29 CHF subjects, respectively. From a set of 16 nonstandard features, the proposed framework reaches an overall accuracy of 100% with five features. Our results suggest that hybrid frameworks whose classifier algorithms are based on genetic algorithms outperform well-known classifier methods. PMID:27891509
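The wrapper idea in this abstract, scoring candidate feature subsets by the accuracy of a k-nearest-neighbour classifier, can be illustrated without the genetic algorithm. The sketch below substitutes a greedy forward search for the GA (a deliberate simplification) and scores subsets by leave-one-out 1-NN accuracy on synthetic data in which only one feature is informative.

```python
import numpy as np

def loo_knn_accuracy(X, y):
    """Leave-one-out 1-NN accuracy: the fitness of a feature
    subset in the wrapper approach."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(D, np.inf)           # exclude each point from its own vote
    return float(np.mean(y[np.argmin(D, axis=1)] == y))

def forward_select(X, y, k):
    """Greedy forward selection (a stand-in for the paper's
    genetic algorithm): add the feature that best improves
    the 1-NN fitness until k features are chosen."""
    chosen = []
    while len(chosen) < k:
        rest = [f for f in range(X.shape[1]) if f not in chosen]
        best = max(rest, key=lambda f: loo_knn_accuracy(X[:, chosen + [f]], y))
        chosen.append(best)
    return chosen

rng = np.random.default_rng(3)
y = np.repeat([0, 1], 30)
X = rng.normal(size=(60, 8))
X[:, 2] += 3 * y                          # only feature 2 separates the classes
feats = forward_select(X, y, 2)
```

The search recovers the single informative feature first, which is the behaviour a GA-driven subset search exploits at larger scale.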
Prompting one low-fat, high-fiber selection in a fast-food restaurant.
Wagner, J L; Winett, R A
1988-01-01
Evidence increasingly links a high-fat, low-fiber diet to coronary heart disease and certain site cancers, indicating a need for large-scale dietary change. Studies showing the effectiveness of particular procedures in specific settings are important at this point. The present study, using an A-B-A-B design and sales data from computerized cash registers, replicated and extended previous work by showing that inexpensive prompts (i.e., signs and fliers) in a national fast-food restaurant could increase the sales of salads, a low-fat, high-fiber menu selection. Suggestions also are made pertinent to more widespread use of the procedures.
Improved building up a model of toxicity towards Pimephales promelas by the Monte Carlo method.
Toropova, Alla P; Toropov, Andrey A; Raskova, Maria; Raska, Ivan
2016-12-01
By optimizing so-called correlation weights of attributes of the simplified molecular input-line entry system (SMILES), quantitative structure-activity relationship (QSAR) models for toxicity towards Pimephales promelas are established. A new SMILES attribute has been utilized in this work. This attribute is a molecular descriptor which reflects (i) the presence of different kinds of bonds (double, triple, and stereochemical bonds); (ii) the presence of nitrogen, oxygen, sulphur, and phosphorus atoms; and (iii) the presence of fluorine, chlorine, bromine, and iodine atoms. The statistical characteristics of the best model are the following: n=226, r²=0.7630, RMSE=0.654 (training set); n=114, r²=0.7024, RMSE=0.766 (calibration set); n=226, r²=0.6292, RMSE=0.870 (validation set). A new criterion to select a preferable split into the training and validation sets is suggested and discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
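The two statistics reported per split above, the squared correlation coefficient r² and the root-mean-square error, are straightforward to compute. The sketch below evaluates them on synthetic predicted-versus-observed values standing in for the toxicity data.

```python
import numpy as np

def r2_rmse(y_true, y_pred):
    """Squared correlation coefficient and root-mean-square error,
    the per-split statistics quoted in QSAR model summaries."""
    r = np.corrcoef(y_true, y_pred)[0, 1]
    rmse = float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
    return float(r * r), rmse

# Synthetic stand-in for observed vs. model-predicted toxicity values.
rng = np.random.default_rng(4)
y = rng.normal(size=226)
pred = y + rng.normal(scale=0.5, size=226)   # deliberately imperfect model
r2, rmse = r2_rmse(y, pred)
```

Computing the pair separately on training, calibration, and validation subsets, as the abstract does, is what reveals whether a model's fit degrades on data it was not tuned on.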
[Research progress of probe design software of oligonucleotide microarrays].
Chen, Xi; Wu, Zaoquan; Liu, Zhengchun
2014-02-01
DNA microarrays have become an essential medical genetic diagnostic tool thanks to their high throughput, miniaturization and automation. The design and selection of oligonucleotide probes are critical for preparing gene chips of high quality. Several probe design software packages have been developed and are available to perform this work. Each package targets different kinds of sequences and shows different advantages and limitations. In this article, the research and development of these packages are reviewed against three main criteria: specificity, sensitivity and melting temperature (Tm). In addition, based on experimental results from the literature, the packages are classified according to their applications. This review will be helpful for users choosing an appropriate probe-design software. It will also reduce the costs of microarrays, improve the application efficiency of microarrays, and promote both the research and development (R&D) and commercialization of high-performance probe design software.
Taming Big Data: An Information Extraction Strategy for Large Clinical Text Corpora.
Gundlapalli, Adi V; Divita, Guy; Carter, Marjorie E; Redd, Andrew; Samore, Matthew H; Gupta, Kalpana; Trautner, Barbara
2015-01-01
Concepts of interest for clinical and research purposes are not uniformly distributed in clinical text available in electronic medical records. The purpose of our study was to identify filtering techniques to select 'high yield' documents for increased efficacy and throughput. Using two large corpora of clinical text, we demonstrate the identification of 'high yield' document sets in two unrelated domains: homelessness and indwelling urinary catheters. For homelessness, the high yield set includes homeless program and social work notes. For urinary catheters, concepts were more prevalent in notes from hospitalized patients; nursing notes accounted for a majority of the high yield set. This filtering will enable customization and refining of information extraction pipelines to facilitate extraction of relevant concepts for clinical decision support and other uses.
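The document-filtering strategy described above, selecting note types in which the target concept is most prevalent, can be sketched as a prevalence ranking. The toy documents, note types, and the 20% prevalence threshold below are illustrative assumptions, not the study's corpora or pipeline.

```python
from collections import defaultdict

def high_yield_note_types(docs, has_concept, min_prevalence=0.2):
    """Rank note types by the fraction of their documents that
    contain the target concept; keep types above a threshold."""
    counts = defaultdict(lambda: [0, 0])          # type -> [with_concept, total]
    for doc in docs:
        c = counts[doc["type"]]
        c[0] += has_concept(doc)
        c[1] += 1
    prevalence = {t: hits / total for t, (hits, total) in counts.items()}
    return sorted((t for t, p in prevalence.items() if p >= min_prevalence),
                  key=lambda t: -prevalence[t])

docs = [
    {"type": "social_work", "text": "patient is homeless"},
    {"type": "social_work", "text": "shelter referral arranged"},
    {"type": "radiology",   "text": "chest x-ray normal"},
    {"type": "nursing",     "text": "homeless, needs housing support"},
    {"type": "nursing",     "text": "vitals stable overnight"},
]
flag = lambda d: any(w in d["text"] for w in ("homeless", "shelter"))
high_yield = high_yield_note_types(docs, flag)
```

In this toy corpus the social work and nursing notes form the high-yield set while radiology is filtered out, mirroring how the study narrows a large corpus before running the full extraction pipeline.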
Copper-catalyzed selective hydroamination reactions of alkynes
Shi, Shi-Liang; Buchwald, Stephen L.
2014-01-01
The development of selective reactions that utilize easily available and abundant precursors for the efficient synthesis of amines is a longstanding goal of chemical research. Despite the centrality of amines in a number of important research areas, including medicinal chemistry, total synthesis and materials science, a general, selective, and step-efficient synthesis of amines is still needed. In this work we describe a set of mild catalytic conditions utilizing a single copper-based catalyst that enables the direct preparation of three distinct and important amine classes (enamines, α-chiral branched alkylamines, and linear alkylamines) from readily available alkyne starting materials with high levels of chemo-, regio-, and stereoselectivity. This methodology was applied to the asymmetric synthesis of rivastigmine and the formal synthesis of several other pharmaceutical agents, including duloxetine, atomoxetine, fluoxetine, and tolterodine. PMID:25515888
Basnayake, Shiromani W V; Moyle, Richard; Birch, Robert G
2011-03-01
Amenability to the tissue culture stages required for gene transfer, selection and plant regeneration is the main determinant of genetic transformation efficiency via particle bombardment into sugarcane. The technique is moving from the experimental phase, where it is sufficient to work in a few amenable genotypes, to practical application in a diverse and changing set of elite cultivars. Therefore, we investigated the response to the callus initiation, proliferation, regeneration and selection steps required for microprojectile-mediated transformation in a diverse set of Australian sugarcane cultivars. Twelve of the 16 tested cultivars were sufficiently amenable to existing routine tissue-culture conditions for practical genetic transformation. Three cultivars required adjustments to 2,4-D levels during callus proliferation, geneticin concentration during selection, and/or light intensity during regeneration. One cultivar gave an extreme necrotic response in leaf spindle explants and produced no callus tissue under the tested culture conditions. It was helpful to obtain spindle explants for tissue culture from plants with a good water supply for growth, especially for genotypes that were harder to culture. It was generally possible to obtain several independent transgenic plants per bombardment, with time in callus culture limited to 11-15 weeks. A caution with this efficient transformation system is that separate shoots arose from different primary transformed cells in more than half of the tested calli after selection for geneticin resistance. The results across this diverse cultivar set are likely to be a useful guide to key variables for rapid optimisation of tissue culture conditions for efficient genetic transformation of other sugarcane cultivars.
Chalfoun, J; Majurski, M; Peskin, A; Breen, C; Bajcsy, P; Brady, M
2015-10-01
New microscopy technologies are enabling image acquisition of terabyte-sized data sets consisting of hundreds of thousands of images. In order to retrieve and analyze the biological information in these large data sets, segmentation is needed to detect the regions containing cells or cell colonies. Our work with hundreds of large images (each 21,000×21,000 pixels) requires a segmentation method that: (1) yields high segmentation accuracy, (2) is applicable to multiple cell lines with various densities of cells and cell colonies, and several imaging modalities, (3) can process large data sets in a timely manner, (4) has a low memory footprint and (5) has a small number of user-set parameters that do not require adjustment during the segmentation of large image sets. None of the currently available segmentation methods meet all these requirements. Segmentation based on image gradient thresholding is fast and has a low memory footprint. However, existing techniques that automate the selection of the gradient image threshold do not work across image modalities, multiple cell lines, and a wide range of foreground/background densities (requirement 2) and all failed the requirement for robust parameters that do not require re-adjustment with time (requirement 5). We present a novel and empirically derived image gradient threshold selection method for separating foreground and background pixels in an image that meets all the requirements listed above. We quantify the difference between our approach and existing ones in terms of accuracy, execution speed, memory usage and number of adjustable parameters on a reference data set. This reference data set consists of 501 validation images with manually determined segmentations and image sizes ranging from 0.36 Megapixels to 850 Megapixels. It includes four different cell lines and two image modalities: phase contrast and fluorescent. 
Our new technique, called Empirical Gradient Threshold (EGT), is derived from this reference data set with a 10-fold cross-validation method. EGT segments cells or colonies with resulting Dice accuracy index measurements above 0.92 for all cross-validation data sets. EGT results have also been visually verified on a much larger data set that includes bright field and Differential Interference Contrast (DIC) images, 16 cell lines and 61 time-sequence data sets, for a total of 17,479 images. This method is implemented as an open-source plugin to ImageJ as well as a standalone executable that can be downloaded from the following link: https://isg.nist.gov/. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.
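A minimal sketch of gradient-threshold segmentation in the spirit of the approach described above (this is not the published EGT derivation; the percentile cutoff and the hole-filling step are illustrative assumptions):

```python
import numpy as np
from scipy import ndimage

def gradient_threshold_segment(image, percentile=25.0):
    """Label foreground pixels by thresholding the image gradient magnitude.
    The percentile value is a placeholder; EGT derives its threshold
    empirically from a reference data set."""
    img = image.astype(float)
    # Sobel gradient magnitude
    gx = ndimage.sobel(img, axis=0)
    gy = ndimage.sobel(img, axis=1)
    mag = np.hypot(gx, gy)
    # Keep the top `percentile` percent of gradient magnitudes
    thresh = np.percentile(mag, 100.0 - percentile)
    mask = mag > thresh
    # Fill enclosed regions so object interiors count as foreground
    return ndimage.binary_fill_holes(mask)
```

A usage pattern would be to run this over each tile of a large image, which keeps the memory footprint low since only the gradient of one tile is held at a time.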
Halabi, Susan; Lin, Chen-Yen; Kelly, W. Kevin; Fizazi, Karim S.; Moul, Judd W.; Kaplan, Ellen B.; Morris, Michael J.; Small, Eric J.
2014-01-01
Purpose: Prognostic models for overall survival (OS) for patients with metastatic castration-resistant prostate cancer (mCRPC) are dated and do not reflect significant advances in treatment options available for these patients. This work developed and validated an updated prognostic model to predict OS in patients receiving first-line chemotherapy. Methods: Data from a phase III trial of 1,050 patients with mCRPC were used (Cancer and Leukemia Group B CALGB-90401 [Alliance]). The data were randomly split into training and testing sets. A separate phase III trial served as an independent validation set. Adaptive least absolute shrinkage and selection operator selected eight factors prognostic for OS. A predictive score was computed from the regression coefficients and used to classify patients into low- and high-risk groups. The model was assessed for its predictive accuracy using the time-dependent area under the curve (tAUC). Results: The model included Eastern Cooperative Oncology Group performance status, disease site, lactate dehydrogenase, opioid analgesic use, albumin, hemoglobin, prostate-specific antigen, and alkaline phosphatase. Median OS values in the high- and low-risk groups, respectively, in the testing set were 17 and 30 months (hazard ratio [HR], 2.2; P < .001); in the validation set they were 14 and 26 months (HR, 2.9; P < .001). The tAUCs were 0.73 (95% CI, 0.70 to 0.73) and 0.76 (95% CI, 0.72 to 0.76) in the testing and validation sets, respectively. Conclusion: An updated prognostic model for OS in patients with mCRPC receiving first-line chemotherapy was developed and validated on an external set. This model can be used to predict OS, as well as to better select patients to participate in trials on the basis of their prognosis. PMID:24449231
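The score-then-dichotomize step described above can be sketched as follows. The coefficient names and values here are hypothetical placeholders for illustration only, not the published model's eight fitted factors:

```python
# Hypothetical coefficients; the published model's fitted values
# are not reproduced here.
coef = {"ecog_ps": 0.45, "ldh_log": 0.60, "albumin": -0.35, "hemoglobin": -0.25}

def risk_score(patient):
    """Linear predictor: sum of coefficient times covariate value."""
    return sum(coef[k] * patient[k] for k in coef)

def classify(patients, cutoff):
    """Split patients into low/high risk at a cutoff, e.g. the
    training-set median of the risk scores."""
    return ["high" if risk_score(p) >= cutoff else "low" for p in patients]
```

In the study's design, the cutoff is fixed on the training split and then applied unchanged to the testing and external validation sets, which is what makes the median-OS comparison between groups honest.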
Analysis of genetic association using hierarchical clustering and cluster validation indices.
Pagnuco, Inti A; Pastore, Juan I; Abras, Guillermo; Brun, Marcel; Ballarin, Virginia L
2017-10-01
It is usually assumed that co-expressed genes suggest co-regulation in the underlying regulatory network. Determining sets of co-expressed genes, based on some criterion of similarity, is therefore an important task. This task is usually performed by clustering algorithms, where the genes are clustered into meaningful groups based on their expression values in a set of experiments. In this work, we propose a method to find sets of co-expressed genes, based on cluster validation indices as a measure of similarity for individual gene groups, and a combination of variants of hierarchical clustering to generate the candidate groups. We evaluated its ability to retrieve significant sets on simulated correlated data and on real genomic data, where performance is measured by its ability to detect co-regulated sets relative to a full search. Additionally, we analyzed the quality of the best-ranked groups using an online bioinformatics tool that provides network information for the selected genes. Copyright © 2017 Elsevier Inc. All rights reserved.
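A rough sketch of the approach as described: hierarchical clustering over a correlation dissimilarity, with a cluster validation index used to pick the dendrogram cut. The silhouette index and the candidate range of cluster counts are illustrative choices, not necessarily the indices or variants the authors combined:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist
from sklearn.metrics import silhouette_score

def candidate_gene_groups(expr, method="average", k_range=range(2, 6)):
    """expr: genes x experiments matrix. Returns (labels, score) for the
    dendrogram cut that maximizes the silhouette index."""
    d = pdist(expr, metric="correlation")   # 1 - Pearson r as dissimilarity
    z = linkage(d, method=method)
    best = None
    for k in k_range:
        labels = fcluster(z, t=k, criterion="maxclust")
        s = silhouette_score(expr, labels, metric="correlation")
        if best is None or s > best[1]:
            best = (labels, s)
    return best
```

Ranking individual groups by their own index value, rather than only scoring the whole partition, is the step that lets this act as a search for significant sets rather than a single global clustering.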
A graphical, rule based robotic interface system
NASA Technical Reports Server (NTRS)
Mckee, James W.; Wolfsberger, John
1988-01-01
The ability of a human to take control of a robotic system is essential in any use of robots in space in order to handle unforeseen changes in the robot's work environment or scheduled tasks. But in cases in which the work environment is known, a human controlling a robot's every move by remote control is both time consuming and frustrating. A system is needed in which the user can give the robotic system commands to perform tasks but need not tell the system how. To be useful, this system should be able to plan and perform the tasks faster than a telerobotic system. The interface between the user and the robot system must be natural and meaningful to the user. A high level user interface program under development at the University of Alabama, Huntsville, is described. A graphical interface is proposed in which the user selects objects to be manipulated by selecting representations of the object on projections of a 3-D model of the work environment. The user may move in the work environment by changing the viewpoint of the projections. The interface uses a rule based program to transform user selection of items on a graphics display of the robot's work environment into commands for the robot. The program first determines if the desired task is possible given the abilities of the robot and any constraints on the object. If the task is possible, the program determines what movements the robot needs to make to perform the task. The movements are transformed into commands for the robot. The information defining the robot, the work environment, and how objects may be moved is stored in a set of data bases accessible to the program and displayable to the user.
Five Guidelines for Selecting Hydrological Signatures
NASA Astrophysics Data System (ADS)
McMillan, H. K.; Westerberg, I.; Branger, F.
2017-12-01
Hydrological signatures are index values derived from observed or modeled series of hydrological data such as rainfall, flow or soil moisture. They are designed to extract relevant information about hydrological behavior, such as to identify dominant processes, and to determine the strength, speed and spatiotemporal variability of the rainfall-runoff response. Hydrological signatures play an important role in model evaluation. They allow us to test whether particular model structures or parameter sets accurately reproduce the runoff generation processes within the watershed of interest. Most modeling studies use a selection of different signatures to capture different aspects of the catchment response, for example evaluating overall flow distribution as well as high and low flow extremes and flow timing. Such studies often choose their own set of signatures, or may borrow subsets of signatures used in multiple other works. The link between signature values and hydrological processes is not always straightforward, leading to uncertainty and variability in hydrologists' signature choices. In this presentation, we aim to encourage a more rigorous approach to hydrological signature selection, which considers the ability of signatures to represent hydrological behavior and underlying processes for the catchment and application in question. To this end, we propose a set of guidelines for selecting hydrological signatures. We describe five criteria that any hydrological signature should conform to: Identifiability, Robustness, Consistency, Representativeness, and Discriminatory Power. We describe an example of the design process for a signature, assessing possible signature designs against the guidelines above. Due to their ubiquity, we chose a signature related to the Flow Duration Curve, selecting the FDC mid-section slope as a proposed signature to quantify catchment overall behavior and flashiness. 
We demonstrate how assessment against each guideline could be used to compare or choose between alternative signature definitions. We believe that reaching a consensus on selection criteria for hydrological signatures will assist modelers to choose between competing signatures, facilitate comparison between hydrological studies, and help hydrologists to fully evaluate their models.
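The FDC mid-section slope discussed above is commonly computed from log-flows at two exceedance percentiles; a sketch under one common definition (the 33%/66% bounds are a frequent convention, and variants exist in the literature):

```python
import numpy as np

def fdc_midslope(flows, lo=0.33, hi=0.66):
    """Mid-section slope of the flow duration curve: the difference of
    log-flows at the lo and hi exceedance probabilities, divided by the
    width of the interval. Higher values indicate flashier catchments."""
    q = np.asarray(flows, dtype=float)
    # Exceedance probability p corresponds to the (1 - p) quantile
    q_lo = np.quantile(q, 1.0 - lo)
    q_hi = np.quantile(q, 1.0 - hi)
    return (np.log(q_lo) - np.log(q_hi)) / (hi - lo)
```

Restricting the slope to the mid-section is itself an application of the Robustness guideline: the FDC tails are sensitive to individual extreme observations, while the mid-section is comparatively stable between observation periods.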
Adaptive training diminishes distractibility in aging across species.
Mishra, Jyoti; de Villers-Sidani, Etienne; Merzenich, Michael; Gazzaley, Adam
2014-12-03
Aging is associated with deficits in the ability to ignore distractions, which has not yet been remediated by any neurotherapeutic approach. Here, in parallel auditory experiments with older rats and humans, we evaluated a targeted cognitive training approach that adaptively manipulated distractor challenge. Training resulted in enhanced discrimination abilities in the setting of irrelevant information in both species that was driven by selectively diminished distraction-related errors. Neural responses to distractors in auditory cortex were selectively reduced in both species, mimicking the behavioral effects. Sensory receptive fields in trained rats exhibited improved spectral and spatial selectivity. Frontal theta measures of top-down engagement with distractors were selectively restrained in trained humans. Finally, training gains generalized to group and individual level benefits in aspects of working memory and sustained attention. Thus, we demonstrate converging cross-species evidence for training-induced selective plasticity of distractor processing at multiple neural scales, benefitting distractor suppression and cognitive control. Copyright © 2014 Elsevier Inc. All rights reserved.
Preconception sex selection for non-medical and intermediate reasons: ethical reflections.
de Wert, G; Dondorp, W
2010-01-01
Sex selection for non-medical reasons is forbidden in many countries. Focusing on preconception sex selection, the authors first observe that it is unclear what should count as a 'medical reason' in this context and argue for the existence of 'intermediate reasons' that do not fit well within the rigid distinction between 'medical' and 'non-medical'. The article further provides a critical review of the arguments for the prohibition of sex selection for non-medical reasons and finds that none of these are conclusive. The authors conclude that the ban should be reconsidered, but also that existing societal concerns about possible harmful effects should be taken seriously. Measures to this effect may include limiting the practice to couples who already have at least one child of the sex opposite to that which they now want to select ('family balancing'). Finally, a difficult set of questions is raised by concerns about the reliability and unproven (long-term) safety of the only technology (flow cytometry) proven to work.
Selection of organisms for the co-evolution-based study of protein interactions.
Herman, Dorota; Ochoa, David; Juan, David; Lopez, Daniel; Valencia, Alfonso; Pazos, Florencio
2011-09-12
The prediction and study of protein interactions and functional relationships based on similarity of phylogenetic trees, exemplified by the mirrortree and related methodologies, is widely used. Although dependence between the performance of these methods and the set of organisms used to build the trees was suspected, so far nobody has assessed it in an exhaustive way, and, in general, previous works used as many organisms as possible. In this work we assess the effect of using different sets of organisms (chosen according to various phylogenetic criteria) on the performance of this methodology in detecting protein interactions of different natures. We show that the performance of three mirrortree-related methodologies depends on the set of organisms used for building the trees, and it is not always directly related to the number of organisms in a simple way. Certain subsets of organisms seem to be more suitable for the prediction of certain types of interactions. This relationship between type of interaction and the optimal set of organisms for detecting it makes sense in the light of the phylogenetic distribution of the organisms and the nature of the interactions. In order to obtain optimal performance when predicting protein interactions, it is recommended to use different sets of organisms depending on the available computational resources and data, as well as the type of interactions of interest.
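The mirrortree similarity underlying these methods is, in its simplest form, a correlation between the two protein families' inter-organism distance matrices; a minimal sketch of that core computation (tree building and organism-set selection are omitted):

```python
import numpy as np

def mirrortree_score(dist_a, dist_b):
    """Pearson correlation between the upper triangles of two symmetric
    inter-organism distance matrices, one per protein family. Both must
    be indexed by the same ordered set of organisms; a high correlation
    suggests co-evolution and hence possible interaction."""
    n = dist_a.shape[0]
    iu = np.triu_indices(n, k=1)
    return np.corrcoef(dist_a[iu], dist_b[iu])[0, 1]
```

Changing the organism set changes which rows/columns enter the matrices, which is exactly the degree of freedom whose effect the study quantifies.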
The Impact of Normalization Methods on RNA-Seq Data Analysis
Zyprych-Walczak, J.; Szabelska, A.; Handschuh, L.; Górczak, K.; Klamecka, K.; Figlerowicz, M.; Siatkowski, I.
2015-01-01
High-throughput sequencing technologies, such as the Illumina HiSeq, are powerful new tools for investigating a wide range of biological and medical problems. The massive and complex data sets produced by the sequencers create a need for the development of statistical and computational methods that can tackle the analysis and management of data. Data normalization is one of the most crucial steps of data processing, and this process must be carefully considered as it has a profound effect on the results of the analysis. In this work, we focus on a comprehensive comparison of five normalization methods related to sequencing depth, widely used for transcriptome sequencing (RNA-seq) data, and their impact on the results of gene expression analysis. Based on this study, we suggest a universal workflow that can be applied for the selection of the optimal normalization procedure for any particular data set. The described workflow includes calculation of the bias and variance values for the control genes, sensitivity and specificity of the methods, and classification errors, as well as generation of the diagnostic plots. Combining the above information facilitates the selection of the most appropriate normalization method for the studied data sets and determines which methods can be used interchangeably. PMID:26176014
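Two sequencing-depth normalizations commonly compared in such studies can be sketched as follows (total-count scaling and DESeq-style median-of-ratios size factors; these are illustrations of the class of method, not necessarily the five methods this paper evaluates):

```python
import numpy as np

def total_count_norm(counts):
    """Scale each sample (column) to the mean library size."""
    lib = counts.sum(axis=0)
    return counts * (lib.mean() / lib)

def median_of_ratios_norm(counts):
    """DESeq-style size factors: over genes with no zero counts, take the
    median ratio of each sample's count to the per-gene geometric mean."""
    pos = (counts > 0).all(axis=1)
    logc = np.log(counts[pos])
    log_gmean = logc.mean(axis=1, keepdims=True)
    size_factors = np.exp(np.median(logc - log_gmean, axis=0))
    return counts / size_factors
```

The median-of-ratios variant is robust to a handful of very highly expressed genes dominating the library size, which is the usual failure mode of plain total-count scaling.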
Approximate l-fold cross-validation with Least Squares SVM and Kernel Ridge Regression
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwards, Richard E; Zhang, Hao; Parker, Lynne Edwards
2013-01-01
Kernel methods have difficulty scaling to large modern data sets. The scalability issues stem from the computational and memory requirements of working with a large matrix. These requirements have been addressed over the years by using low-rank kernel approximations or by improving the solvers' scalability. However, Least Squares Support Vector Machines (LS-SVM), a popular SVM variant, and Kernel Ridge Regression still have several scalability issues. In particular, the O(n^3) computational complexity for solving a single model, and the overall computational complexity associated with tuning hyperparameters, are still major problems. We address these problems by introducing an O(n log n) approximate l-fold cross-validation method that uses a multi-level circulant matrix to approximate the kernel. In addition, we prove our algorithm's computational complexity and present empirical runtimes on data sets with approximately 1 million data points. We also validate our approximate method's effectiveness at selecting hyperparameters on real world and standard benchmark data sets. Lastly, we provide experimental results on using a multi-level circulant kernel approximation to solve LS-SVM problems with hyperparameters selected using our method.
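For context, the exact O(n^3) solve that the circulant approximation accelerates can be sketched as follows. This is the simplified kernel-ridge form (LS-SVM proper adds a bias term and an equality constraint, which are omitted here):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between row sets X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_fit(X, y, lam=1e-2, gamma=1.0):
    """Direct dense solve of (K + lam*I) alpha = y: O(n^3) time,
    O(n^2) memory -- the baseline cost the paper attacks."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def lssvm_predict(X_train, alpha, X_new, gamma=1.0):
    """Prediction is a kernel expansion over the training points."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

Tuning (lam, gamma) by l-fold cross-validation multiplies this cost by the number of folds and grid points, which is why an O(n log n) approximate CV is the interesting quantity.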
The upper bound of abutment scour defined by selected laboratory and field data
Benedict, Stephen; Caldwell, Andral W.
2015-01-01
The U.S. Geological Survey, in cooperation with the South Carolina Department of Transportation, conducted a field investigation of abutment scour in South Carolina and used that data to develop envelope curves defining the upper bound of abutment scour. To expand upon this previous work, an additional cooperative investigation was initiated to combine the South Carolina data with abutment-scour data from other sources and evaluate the upper bound of abutment scour with the larger data set. To facilitate this analysis, a literature review was made to identify potential sources of published abutment-scour data, and selected data, consisting of 446 laboratory and 331 field measurements, were compiled for the analysis. These data encompassed a wide range of laboratory and field conditions and represent field data from 6 states within the United States. The data set was used to evaluate the South Carolina abutment-scour envelope curves. Additionally, the data were used to evaluate a dimensionless abutment-scour envelope curve developed by Melville (1992), highlighting the distinct difference in the upper bound for laboratory and field data. The envelope curves evaluated in this investigation provide simple but useful tools for assessing the potential maximum abutment-scour depth in the field setting.
NASA Technical Reports Server (NTRS)
McConnell, Joshua B.
2000-01-01
The scientific exploration of Mars will require the collection and return of subterranean samples to Earth for examination. This necessitates the use of some type of device or devices that can effectively penetrate the Martian surface, collect suitable samples and return them to the surface in a manner consistent with imposed scientific constraints. The first opportunity for such a device will occur on the 2003 and 2005 Mars Sample Return missions being performed by NASA. This paper reviews the work completed on the compilation of a database of viable penetrating and sampling devices, the performance of a system-level trade study comparing selected devices to a set of prescribed parameters, and the employment of a metric for the evaluation and ranking of the traded penetration and sampling devices with respect to possible usage on the 2003 and 2005 sample return missions. The trade study performed is based on a select set of scientific, engineering, programmatic and socio-political criteria. The use of a metric for the various penetration and sampling devices will act to expedite current and future device selection.
Davy, Jonathan; Göbel, Matthias
2013-01-01
This study compared the effects of a 1-h self-selected recovery period to those of a standard night shift arrangement (with a total break time of 1 h) over a simulated three-day night shift schedule in a laboratory setting. Results showed that the inclusion of the flexible nap scheme resulted in higher performance output, improvements in physiological strain responses and reduced sleepiness during each night shift and generally over the three-night cycle. Certain variables also revealed the impact of napping, compared with the standard rest break condition, on the circadian rhythm. The sleep diary records show that the inclusion of the current intervention did not significantly reduce daytime recovery sleep. The results suggest that the potential benefits of flexible napping may outweigh the logistical effort it requires in a workplace environment. Consensus on appropriate napping strategies for shift work remains a challenge. This simulated night shift laboratory study sought to determine the effects of a 1-h self-selected nap opportunity relative to a normal shift set-up. The nap improved performance and decreased sleepiness, without affecting daytime sleep.
Nana, Roger; Hu, Xiaoping
2010-01-01
k-space-based reconstruction in parallel imaging depends on the reconstruction kernel setting, including its support. An optimal choice of the kernel depends on the calibration data, coil geometry and signal-to-noise ratio, as well as the criterion used. In this work, data consistency, imposed by the shift invariance requirement of the kernel, is introduced as a goodness measure of k-space-based reconstruction in parallel imaging and demonstrated. Data consistency error (DCE) is calculated as the sum of squared difference between the acquired signals and their estimates obtained based on the interpolation of the estimated missing data. A resemblance between DCE and the mean square error in the reconstructed image was found, demonstrating DCE's potential as a metric for comparing or choosing reconstructions. When used for selecting the kernel support for generalized autocalibrating partially parallel acquisition (GRAPPA) reconstruction and the set of frames for calibration as well as the kernel support in temporal GRAPPA reconstruction, DCE led to improved images over existing methods. Data consistency error is efficient to evaluate, robust for selecting reconstruction parameters and suitable for characterizing and optimizing k-space-based reconstruction in parallel imaging.
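Once the missing k-space data have been re-estimated, the DCE itself reduces to a masked sum of squared residuals at the acquired locations; a minimal sketch (the GRAPPA kernel re-estimation step that produces the estimates is omitted):

```python
import numpy as np

def data_consistency_error(acquired, estimated, mask):
    """DCE: sum of squared differences between acquired k-space samples
    and their re-estimates, evaluated only where data were acquired.
    acquired/estimated: complex k-space arrays; mask: True at sampled
    locations. Lower DCE suggests a more shift-invariant kernel."""
    diff = acquired[mask] - estimated[mask]
    return float(np.sum(np.abs(diff) ** 2))
```

In use, one would compute the DCE for each candidate kernel support (or calibration-frame set), and select the candidate with the smallest value, mirroring how the paper ranks reconstructions without needing a fully sampled reference.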
A primer for effective organization of professional conferences.
Werner, Susan E; Kenefick, Colleen
2005-01-01
Organizing a successful conference is a tremendous commitment requiring extensive preparation and teamwork. It is tempting but dangerous to underestimate the details needed to coordinate an outstanding event. Conferences follow a natural life cycle: proposal, gaining administrative support, planning, implementing, and finally evaluating outcomes. These guidelines identify the tasks and areas of responsibility involved, including setting objectives, budgeting, selecting a venue, publicity, programming, and working with vendors.
Infrastructure dynamics: A selected bibliography
NASA Technical Reports Server (NTRS)
Dajani, J. S.; Bencosme, A. J.
1978-01-01
The term infrastructure is used to denote the set of life support and public service systems that is necessary for the development and growth of human settlements. Included are some basic references in the field of dynamic simulation, as well as a number of relevant applications in the area of infrastructure planning. The intent is to enable the student or researcher to quickly identify such applications to the extent necessary for initiating further work in the field.
UAV Swarm Behavior Modeling for Early Exposure of Failure Modes
2016-09-01
The MP modeling environment provides a breakdown of all potential event traces. Given that the research questions call for the revelation of potential failure modes, MP was selected as the modeling environment because it provides a substantial set of results and data.
Nexo, Mette Andersen; Watt, Torquil; Bonnema, Steen Joop; Hegedüs, Laszlo; Rasmussen, Åse Krogh; Feldt-Rasmussen, Ulla; Bjorner, Jakob Bue
2015-07-01
We aimed to identify the best approach to work ability assessment in patients with thyroid disease by evaluating the factor structure, measurement equivalence, known-groups validity, and predictive validity of a broad set of work ability items. Based on the literature and interviews with thyroid patients, 24 work ability items were selected from previous questionnaires, revised, or developed anew. Items were tested among 632 patients with thyroid disease (non-toxic goiter, toxic nodular goiter, Graves' disease (with or without orbitopathy), autoimmune hypothyroidism, and other thyroid diseases), 391 of whom had participated in a study 5 years previously. Responses to selected items were compared to general population data. We used confirmatory factor analyses for categorical data, logistic regression analyses and tests of differential item functioning, and head-to-head comparisons of relative validity in distinguishing known groups. Although all work ability items loaded on a common factor, the optimal factor solution included five factors: role physical, role emotional, thyroid-specific limitations, work limitations (without disease attribution), and work performance. The scale on thyroid-specific limitations showed the most power in distinguishing clinical groups and time since diagnosis. A global single item proved useful for comparisons with the general population, and a thyroid-specific item predicted labor market exclusion within the next 5 years (OR 5.0, 95 % CI 2.7-9.1). Items on work limitations with attribution to thyroid disease were most effective in detecting impact on work ability and showed good predictive validity. Generic work ability items remain useful for general population comparisons.
Investigation of materials for inert electrodes in aluminum electrodeposition cells
NASA Astrophysics Data System (ADS)
Haggerty, J. S.; Sadoway, D. R.
1987-09-01
Work was divided into two major efforts. The first was the growth and characterization of specimens; the second was Hall cell performance testing. Cathode and anode materials were the subjects of investigation. Preparation of specimens included growth of single crystals and synthesis of ultra-high-purity powders. Special attention was paid to ferrites, as they were considered to be the most promising anode materials. Ferrite anode corrosion rates were studied, and the electrical conductivities of a set of copper-manganese ferrites were measured. Float-zone, pendant-drop cryolite experiments were undertaken because unsatisfactory choices of candidate materials were being made on the basis of a flawed set of selection criteria applied to an incomplete and sometimes inaccurate data base. This experiment was constructed to determine whether the apparatus used for float-zone crystal growth could be adapted to study a variety of cryolite-based melts and their interactions with candidate inert anode materials, driven by our perception that the basis for prior selection of candidate materials was inadequate. Results are presented.
Working fluid selection for space-based two-phase heat transport systems
NASA Technical Reports Server (NTRS)
Mclinden, Mark O.
1988-01-01
The working fluid for externally-mounted, space-based two-phase heat transport systems is considered. A sequence of screening criteria involving freezing and critical point temperatures, latent heat of vaporization and vapor density is applied to a data base of 860 fluids. The thermal performance of the 52 fluids that pass this preliminary screening is then ranked according to their impact on the weight of a reference system. Upon considering other nonthermal criteria (flammability, toxicity, and chemical stability), a final set of 10 preferred fluids is obtained. The effects of variations in system parameters are investigated for these 10 fluids by means of a factorial design.
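The sequential screening step can be sketched as a chain of property filters. The three fluid records and the cutoff values below are illustrative assumptions, not the paper's 860-fluid database or its actual thresholds:

```python
# Hypothetical property records (approximate handbook values) for illustration.
fluids = [
    {"name": "ammonia",  "t_freeze_K": 195.5, "t_crit_K": 405.4, "h_fg_kJ_kg": 1371.0},
    {"name": "methanol", "t_freeze_K": 175.6, "t_crit_K": 512.6, "h_fg_kJ_kg": 1104.0},
    {"name": "water",    "t_freeze_K": 273.2, "t_crit_K": 647.1, "h_fg_kJ_kg": 2257.0},
]

def screen(fluids, t_min=230.0, t_max=390.0, h_min=200.0):
    """Keep fluids that stay liquid over the assumed operating range
    (freezing point below t_min, critical point above t_max) and have
    adequate latent heat of vaporization."""
    return [f["name"] for f in fluids
            if f["t_freeze_K"] < t_min
            and f["t_crit_K"] > t_max
            and f["h_fg_kJ_kg"] > h_min]
```

With these placeholder cutoffs, water is rejected on its freezing point despite its high latent heat, which is the kind of trade the sequential screen makes before the weight-based thermal ranking.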
Gesture Analysis for Astronomy Presentation Software
NASA Astrophysics Data System (ADS)
Robinson, Marc A.
Astronomy presentation software in a planetarium setting provides a visually stimulating way to introduce varied scientific concepts, including computer science concepts, to a wide audience. However, the underlying computational complexity and opportunities for discussion are often overshadowed by the brilliance of the presentation itself. To bring this discussion back out into the open, a method needs to be developed to make the computer science applications more visible. This thesis introduces the GAAPS system, which endeavors to implement free-hand gesture-based control of astronomy presentation software, with the goal of providing that talking point to begin the discussion of computer science concepts in a planetarium setting. The GAAPS system incorporates gesture capture and analysis in a unique environment presenting unique challenges, and introduces a novel algorithm called a Bounding Box Tree to create and select features for this particular gesture data. This thesis also analyzes several different machine learning techniques to determine a well-suited technique for the classification of this particular data set, with an artificial neural network being chosen as the implemented algorithm. The results of this work will allow for the desired introduction of computer science discussion into the specific setting used, as well as provide for future work pertaining to gesture recognition with astronomy presentation software.
Lilley, Rebbecca; Feyer, Anne-Marie; Firth, Hilda; Cunningham, Chris; Paul, Charlotte
2010-02-01
Changes to work and the impact of these changes on worker health and safety have been significant. A core surveillance data set is needed to understand the impact of working conditions and work environments. Yet there is little harmony amongst international surveys and a critical lack of guidance identifying the best directions for surveillance efforts. This paper describes the establishment of an instrument suitable for use as a hazard surveillance tool for New Zealand workers. An iterative process of critical review was undertaken to create a dimensional framework and select specific measures from existing instruments. Pilot testing to ascertain participant acceptability of the questions was undertaken. The final questionnaire includes measures of socio-demographic characteristics, occupational history, work organisation, and physicochemical, ergonomic and psychosocial hazards. Outcome measures were also included. A robust New Zealand hazard surveillance questionnaire was developed, comprehensively covering the key measures of work organisation and work environments that impact upon worker health and safety outcomes. Recommendations are made for the measures of work organisation, work environment and health outcomes that should be captured in work environment surveillance.
Liver Full Reference Set Application: David Lubman - Univ of Michigan (2011) — EDRN Public Portal
In this work we will perform the next step in biomarker development and validation. This step will be the Phase 2 validation of glycoproteins that have passed Phase 1 blinded validation, using ELISA kits based on target glycoproteins selected in our previous work. This will be done in a large Phase 2 sample set obtained in a multicenter study funded by the EDRN. The assays will be performed in our research lab in the Center for Cancer Proteomics at the University of Michigan Medical Center. This study will include patients in whom serum was stored for future validation and includes samples from early HCC (n = 158), advanced cases (n = 214) and cirrhotic controls (n = 417). These samples will be supplied by the EDRN (per Dr. Jo Ann Rinaudo) and will be analyzed in a blinded fashion by Dr. Feng from the Fred Hutchinson Cancer Center. This Phase 2 study was designed to have more than 90% power at a one-sided 5% type-I error rate for comparing the joint sensitivity and specificity for differentiating early stage HCC from cirrhotic patients between AFP and a new marker. Sample sizes of 200 for early stage HCC and 400 for cirrhotics were required to achieve the stated power (14). We will select our candidates for this larger-phase validation set based on the results of previous work. These will include HGF and CD14, and the results of these assays will be used to evaluate the performance of each of these markers and of the combinations HGF + CD14 and AFP + HGF. Each assay will be repeated three times for each marker and will also be performed for AFP as the standard for comparison. 250 µL of each sample is requested for analysis.
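The quoted design (90% power at a one-sided 5% type-I error) rests on a joint sensitivity/specificity comparison. As a rough illustration of how such sample sizes arise, here is the standard normal-approximation formula for a one-sided comparison of two proportions; the example sensitivities 0.60 and 0.75 are invented, and this simplified formula is not the study's actual calculation and will not reproduce its n of 200/400:

```python
from statistics import NormalDist
from math import sqrt, ceil

def n_per_group(p1, p2, alpha=0.05, power=0.90):
    """Normal-approximation sample size per group for a one-sided
    test comparing two proportions (e.g., two sensitivities)."""
    z_a = NormalDist().inv_cdf(1 - alpha)   # one-sided critical value
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2)))
    return ceil((num / (p1 - p2)) ** 2)

n = n_per_group(0.60, 0.75)
print(n)
```

Smaller differences between the two proportions drive the required n up quadratically, which is why validation sets of several hundred per group are common.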
Choo, Esther K; Kass, Dara; Westergaard, Mary; Watts, Susan H; Berwald, Nicole; Regan, Linda; Promes, Susan B; Clem, Kathleen J; Schneider, Sandra M; Kuhn, Gloria J; Abbuhl, Stephanie; Nobay, Flavia
2016-11-01
Women in medicine continue to experience disparities in earnings, promotion, and leadership roles. There are few guidelines in place defining organization-level factors that promote a supportive workplace environment beneficial to women in emergency medicine (EM). We assembled a working group with the goal of developing specific and feasible recommendations to support women's professional development in both community and academic EM settings. We formed a working group from the leadership of two EM women's organizations, the Academy of Women in Academic Emergency Medicine (AWAEM) and the American Association of Women Emergency Physicians (AAWEP). Through a literature search and discussion, working group members identified four domains where organizational policies and practices supportive of women were needed: 1) global approaches to supporting the recruitment, retention, and advancement of women in EM; 2) recruitment, hiring, and compensation of women emergency physicians; 3) supporting development and advancement of women in EM; and 4) physician health and wellness (in the context of pregnancy, childbirth, and maternity leave). Within each of these domains, the working group created an initial set of specific recommendations. The working group then recruited a stakeholder group of EM physician leaders across the country, selecting for diversity in practice setting, geographic location, age, race, and gender. Stakeholders were asked to score and provide feedback on each of the recommendations. Recommendations were retained by the working group if they achieved high rates of approval from the stakeholder group: those with >80% agreement on importance and >50% agreement on feasibility were kept. Finally, recommendations were posted in an open online forum (blog) and public commentary was invited. An initial set of 29 potential recommendations was created by the working group. 
After stakeholder voting and feedback, 16 final recommendations were retained. Recommendations were refined through qualitative comments from stakeholders and blog respondents. Using a consensus building process that included male and female stakeholders from both academic and community EM settings, we developed recommendations for organizations to implement to create a workplace environment supportive of women in EM that were perceived as acceptable and feasible. This process may serve as a model for other medical specialties to establish clear, discrete organization-level practices aimed at supporting women physicians. © 2016 by the Society for Academic Emergency Medicine.
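The retention rule described above (>80% stakeholder agreement on importance and >50% on feasibility) amounts to a simple filter; a minimal sketch with invented recommendations and agreement rates:

```python
# Hypothetical sketch of the retention rule: keep a recommendation only
# if more than 80% of stakeholders rate it important and more than 50%
# rate it feasible. Texts and scores are invented placeholders.
recommendations = [
    {"text": "transparent salary bands", "important": 0.92, "feasible": 0.71},
    {"text": "on-site lactation rooms",  "important": 0.88, "feasible": 0.44},
    {"text": "blinded CV screening",     "important": 0.73, "feasible": 0.80},
]

retained = [r["text"] for r in recommendations
            if r["important"] > 0.80 and r["feasible"] > 0.50]
print(retained)
```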
Nguyen, Thanh-Tung; Huang, Joshua; Wu, Qingyao; Nguyen, Thuy; Li, Mark
2015-01-01
Single-nucleotide polymorphism (SNP) selection and identification are the most important tasks in genome-wide association data analysis. The problem is difficult because genome-wide association data are very high dimensional and a large portion of the SNPs in the data are irrelevant to the disease. Advanced machine learning methods have been successfully used in genome-wide association studies (GWAS) to identify genetic variants that have relatively large effects in some common, complex diseases. Among them, the most successful is Random Forests (RF). Despite performing well in terms of prediction accuracy on some data sets of moderate size, RF still struggles in GWAS at selecting informative SNPs and building accurate prediction models. In this paper, we propose a new two-stage quality-based sampling method in random forests, named ts-RF, for SNP subspace selection in GWAS. The method first applies p-value assessment to find a cut-off point that separates the SNPs into informative and irrelevant groups. The informative group is further divided into two sub-groups: highly informative and weakly informative SNPs. When sampling the SNP subspace for building trees for the forest, only SNPs from these two sub-groups are taken into account. The feature subspaces used to split a tree node therefore always contain highly informative SNPs. This approach enables one to generate more accurate trees with a lower prediction error while reducing the risk of overfitting. It allows one to detect interactions of multiple SNPs with the diseases, and to reduce the dimensionality and the amount of genome-wide association data needed for learning the RF model. 
Extensive experiments on two genome-wide SNP data sets (Parkinson case-control data comprising 408,803 SNPs and Alzheimer case-control data comprising 380,157 SNPs) and 10 gene data sets demonstrated that the proposed model significantly reduced prediction errors and outperformed most existing state-of-the-art random forest methods. The proposed model identified the top 25 SNPs in the Parkinson data set, including four interesting genes associated with neurological disorders. The presented approach has been shown to be effective in selecting informative sub-groups of SNPs potentially associated with diseases where traditional statistical approaches might fail. The new RF works well for data where the number of case-control objects is much smaller than the number of SNPs, a typical situation in gene data and GWAS. Experimental results demonstrated the effectiveness of the proposed RF model, which outperformed state-of-the-art RFs, including Breiman's RF, GRRF and wsRF.
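The two-stage sampling idea behind ts-RF can be sketched roughly as follows; the p-values, the cut-offs separating "highly" from "weakly" informative SNPs, and the subspace sizes are invented placeholders, not the paper's actual settings:

```python
import random

# Rough sketch of two-stage quality-based subspace sampling: SNPs are
# split by a p-value cut-off into informative vs. irrelevant, the
# informative group is split again into highly vs. weakly informative,
# and each tree-node feature subspace is drawn only from those two
# sub-groups, always including some highly informative SNPs.
random.seed(0)

p_values = {"snp%d" % i: random.random() for i in range(100)}  # invented
CUTOFF, STRONG = 0.05, 0.01

informative = {s: p for s, p in p_values.items() if p < CUTOFF}
highly = [s for s, p in informative.items() if p < STRONG]
weakly = [s for s, p in informative.items() if p >= STRONG]

def sample_subspace(n_highly=2, n_weakly=3):
    """Draw a feature subspace for splitting one tree node."""
    sub = random.sample(highly, min(n_highly, len(highly)))
    sub += random.sample(weakly, min(n_weakly, len(weakly)))
    return sub

subspace = sample_subspace()
print(len(subspace))
```

Because irrelevant SNPs never enter a subspace, every candidate split variable carries some signal, which is the mechanism the paper credits for lower prediction error.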
Löbner, Margrit; Luppa, Melanie; Konnopka, Alexander; Meisel, Hans J.; Günther, Lutz; Meixensberger, Jürgen; Stengler, Katarina; Angermeyer, Matthias C.; König, Hans-Helmut; Riedel-Heller, Steffi G.
2014-01-01
Objective To examine rehabilitation preferences, participation, determinants for the choice of a certain rehabilitation setting (inpatient vs. outpatient) and setting-specific rehabilitation outcomes. Methods The longitudinal observational study included 534 consecutive disc surgery patients (18–55 years). Face-to-face baseline interviews took place about 3.6 days after disc surgery during the acute hospital stay. 486 patients also participated in a follow-up interview via telephone three months later (dropout rate: 9%). The following instruments were used: depression and anxiety (Hospital Anxiety and Depression Scale), pain intensity (numeric analog scale), health-related quality of life (Short Form 36 Health Survey), subjective prognosis of gainful employment (SPE scale), as well as questions on rehabilitation attendance, return to work, and number of sick leave days. Results The vast majority of patients undergoing surgery for a herniated disc attended a post-hospital rehabilitation treatment program (93%). Two-thirds of these patients took part in an inpatient rehabilitation program (67.9%). Physical, psychological, vocational and health-related quality of life characteristics differed widely before as well as after rehabilitation depending on the setting. Inpatient rehabilitees were significantly older and reported more pain, worse physical quality of life, more anxiety and depression and a worse subjective prognosis of gainful employment before rehabilitation. Pre-rehabilitation differences remained significant after rehabilitation. More than half of the outpatient rehabilitees (56%) compared to only one third of the inpatient rehabilitees (33%) returned to work three months after disc surgery (p<.001). Conclusion The results suggest a “pre-selection” of patients with better health status into outpatient rehabilitation. 
Gaining better knowledge about setting-specific selection processes may help optimize rehabilitation allocation procedures and improve rehabilitation outcomes such as return to work. PMID:24598904
Methodology for extracting local constants from petroleum cracking flows
Chang, Shen-Lin; Lottes, Steven A.; Zhou, Chenn Q.
2000-01-01
A methodology provides for the extraction of local chemical kinetic model constants for use in a reacting flow computational fluid dynamics (CFD) computer code with chemical kinetic computations to optimize the operating conditions or design of the system, including retrofit design improvements to existing systems. The coupled CFD and kinetics code is used in combination with data obtained from a matrix of experimental tests to extract the kinetic constants. Local fluid dynamic effects are implicitly included in the extracted local kinetic constants for each particular application system to which the methodology is applied. The extracted local kinetic model constants work well over a fairly broad range of operating conditions for specific and complex reaction sets in specific and complex reactor systems. While disclosed in terms of use in a Fluid Catalytic Cracking (FCC) riser, the inventive methodology has application in virtually any reaction set to extract constants for any particular application and reaction set formulation. The methodology includes the steps of: (1) selecting the test data sets for various conditions; (2) establishing the general trend of the parametric effect on the measured product yields; (3) calculating product yields for the selected test conditions using coupled computational fluid dynamics and chemical kinetics; (4) adjusting the local kinetic constants to match calculated product yields with experimental data; and (5) validating the determined set of local kinetic constants by comparing the calculated results with experimental data from additional test runs at different operating conditions.
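Step (4), adjusting a kinetic constant until the calculated yield matches measurement, can be illustrated with a closed-form stand-in for the coupled CFD calculation; the first-order yield model, residence time, and measured yield below are invented, and the patent couples this loop to a full CFD solve rather than a formula:

```python
from math import exp

# Minimal illustration of kinetic-constant adjustment: tune k so the
# calculated product yield matches a measured yield from one test run.
def calc_yield(k, residence_time=2.0):
    """Placeholder first-order conversion model: y = 1 - exp(-k*t)."""
    return 1.0 - exp(-k * residence_time)

def fit_k(measured, lo=0.0, hi=5.0, iters=60):
    """Bisection on k, valid because calc_yield is monotonic in k."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if calc_yield(mid) < measured:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

k = fit_k(0.70)  # 0.70 = invented measured yield
print(round(calc_yield(k), 4))
```

Step (5)'s validation would then re-run `calc_yield` (in practice, the CFD code) with the fitted constants at held-out operating conditions and compare against the additional test data.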
Callwood, Alison; Cooke, Debbie; Allan, Helen
2014-12-01
Published research has demonstrated that the multiple mini-interview (MMI) is a reliable assessment instrument in medical and nursing student selection. There is a dearth of evidence specifically relating to the advancement and subsequent evaluation of MMIs in the context of student midwife selection. To develop, pilot and examine the reliability of MMIs in pre-registration student midwife selection in a UK setting. DeVellis' framework for questionnaire development underpinned the generation of MMI scenarios. BSc (Hons) Midwifery Studies students at a Higher Education Institution in the UK volunteered to participate in 'mock' MMI circuits during the first week of their programme. An eight station model was piloted. Communication skills were rated at each station as a generic attribute. Station specific attributes assessed included: compassion and empathy; respect for difference and diversity; honesty and integrity; intellectual curiosity and reflective nature; advocacy; respect for privacy and dignity; team working and initiative; the role of the midwife and motivation to become a midwife. Participants' responses to scenario questions were rated on a 7 point scale. Cronbach's alpha scores measuring internal consistency ranged from 0.91 to 0.97. The systematic development of the MMI model and scenarios resulted in 'excellent' reliability across all stations. These findings endorse the MMI technique as a reliable alternative to the personal interview in informing final decisions in pre-registration student midwife selection. Copyright © 2014 Elsevier Ltd. All rights reserved.
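Cronbach's alpha, the internal-consistency statistic reported above, is computed from the item variances and the variance of candidates' total scores; a sketch on invented station ratings (rows are candidates, columns are rated items):

```python
from statistics import pvariance

# Cronbach's alpha for one hypothetical MMI station; ratings invented.
ratings = [
    [6, 5, 6, 7],
    [4, 4, 5, 4],
    [7, 6, 7, 6],
    [3, 4, 3, 4],
    [5, 5, 6, 5],
]

k = len(ratings[0])                                    # number of items
item_vars = [pvariance(col) for col in zip(*ratings)]  # per-item variance
total_var = pvariance([sum(row) for row in ratings])   # variance of totals
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(round(alpha, 2))
```

Here alpha comes out around 0.94, i.e. in the 'excellent' range the study reports; values rise when items co-vary strongly across candidates.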
Design and fabrication of forward-swept counterrotation blade configuration for wind tunnel testing
NASA Technical Reports Server (NTRS)
Nichols, G. H.
1994-01-01
Work performed by GE Aircraft on advanced counterrotation blade configuration concepts for high-speed turboprop systems is described. Primary emphasis was placed on theoretically and experimentally evaluating the aerodynamic, aeromechanical, and acoustic performance of GE-defined counterrotating blade concepts. Several blade design concepts were considered. Feasibility studies were conducted to evaluate a forward-swept versus an aft-swept blade application and how the given blade design would affect interaction between rotors. Two blade designs were initially selected. Both designs involved in-depth aerodynamic, aeromechanical, mechanical, and acoustic analyses followed by the fabrication of forward-swept, forward rotor blade sets to be wind tunnel tested with an aft-swept, aft rotor blade set. A third blade set was later produced from a NASA design that was based on wind tunnel test results from the first two blade sets. This blade set had a stiffer outer ply material added to the original blade design, in order to reach the design point operating line. Detailed analyses, feasibility studies, and fabrication procedures for all blade sets are presented.
Health technology assessment process of a cardiovascular medical device in four different settings.
Olry de Labry Lima, Antonio; Espín Balbino, Jaime; Lemgruber, Alexandre; Caro Martínez, Araceli; García-Mochón, Leticia; Martín Ruiz, Eva; Lessa, Fernanda
2017-10-01
Health technology assessment (HTA) is a tool to support the decision-making process. The aim is to describe the methods and processes used in reimbursement decision making for drug-eluting stents (DES) in four different settings. DES was selected as the technology under study according to criteria agreed on by a working group. A survey of key informants was designed. DES was evaluated following well-structured HTA processes. Nonetheless, scope for improvement was observed in relation to the data considered for the final decision, the transparency and inclusiveness of the process, as well as the methods employed. This study is thus an attempt to describe the HTA processes applied to a well-known medical device.
Greedy Algorithms for Nonnegativity-Constrained Simultaneous Sparse Recovery
Kim, Daeun; Haldar, Justin P.
2016-01-01
This work proposes a family of greedy algorithms to jointly reconstruct a set of vectors that are (i) nonnegative and (ii) simultaneously sparse with a shared support set. The proposed algorithms generalize previous approaches that were designed to impose these constraints individually. Similar to previous greedy algorithms for sparse recovery, the proposed algorithms iteratively identify promising support indices. In contrast to previous approaches, the support index selection procedure has been adapted to prioritize indices that are consistent with both the nonnegativity and shared support constraints. Empirical results demonstrate for the first time that the combined use of simultaneous sparsity and nonnegativity constraints can substantially improve recovery performance relative to existing greedy algorithms that impose less signal structure. PMID:26973368
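The combined selection rule (shared support across signals plus nonnegativity) can be illustrated with a toy greedy loop. The orthonormal dictionary and signals below are invented, and real implementations solve a nonnegative least-squares subproblem at each step rather than the trivial coefficient update an orthonormal dictionary permits:

```python
# Toy sketch of nonnegativity-constrained simultaneous greedy recovery:
# at each iteration, score every dictionary atom by its summed *positive*
# correlation across all signals (shared support + nonnegativity), add
# the best unselected index to the joint support, then deflate the
# residuals with a clipped (nonnegative) coefficient.
atoms = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]          # orthonormal columns
signals = [[2.0, -0.7, 0.5], [1.0, -0.4, 0.3]]     # share support {0, 2}

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

support = []
residuals = [list(y) for y in signals]
for _ in range(2):                                  # target joint sparsity 2
    scores = [sum(max(0.0, dot(r, a)) for r in residuals) for a in atoms]
    best = max(range(len(atoms)),
               key=lambda j: -1.0 if j in support else scores[j])
    support.append(best)
    for r in residuals:                             # nonneg coefficient, deflate
        c = max(0.0, dot(r, atoms[best]))
        for i in range(len(r)):
            r[i] -= c * atoms[best][i]

print(sorted(support))
```

Atom 1 correlates negatively with both residuals, so the positive-part scoring never selects it: that is the sense in which the nonnegativity constraint steers support selection rather than being applied only afterward.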
NASA Technical Reports Server (NTRS)
Crutcher, H. L.; Falls, L. W.
1976-01-01
Sets of experimentally determined or routinely observed data provide information about the past, present and, hopefully, future sets of similarly produced data. An infinite set of statistical models exists which may be used to describe the data sets. The normal distribution is one model. If it serves at all, it serves well. If a data set, or a transformation of the set, representative of a larger population can be described by the normal distribution, then valid statistical inferences can be drawn. There are several tests which may be applied to a data set to determine whether the univariate normal model adequately describes the set. The chi-square test based on Pearson's work in the late nineteenth and early twentieth centuries is often used. Like all tests, it has some weaknesses which are discussed in elementary texts. Extension of the chi-square test to the multivariate normal model is provided. Tables and graphs permit easier application of the test in the higher dimensions. Several examples, using recorded data, illustrate the procedures. Tests of maximum absolute differences, mean sum of squares of residuals, runs and changes of sign are included in these tests. Dimensions one through five with selected sample sizes 11 to 101 are used to illustrate the statistical tests developed.
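Pearson's chi-square test for the univariate normal model, as described above, can be sketched in a few lines; the data, binning, and critical value below are illustrative (with only 20 points the tail bins have small expected counts, so this is a toy demonstration rather than a rigorous test):

```python
from statistics import NormalDist, mean, stdev

# Invented measurements to be checked against the normal model.
data = [4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.2, 5.0, 4.95, 5.05,
        4.85, 5.15, 4.9, 5.1, 5.0, 4.75, 5.25, 5.0, 4.9, 5.1]

m, s = mean(data), stdev(data)
dist = NormalDist(m, s)
bins = [float("-inf"), m - 2*s, m - 0.5*s, m + 0.5*s, m + 2*s, float("inf")]

observed = [sum(1 for x in data if lo <= x < hi)
            for lo, hi in zip(bins, bins[1:])]
expected = [len(data) * (dist.cdf(hi) - dist.cdf(lo))
            for lo, hi in zip(bins, bins[1:])]

# Pearson statistic; 5.991 is the 95th percentile of chi-square with
# 5 bins - 1 - 2 estimated parameters = 2 degrees of freedom.
chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
print(chi2 < 5.991)
```

The multivariate extension the report develops replaces the scalar z-scores with quadratic forms in the inverse covariance matrix, but the bin-count comparison has the same structure.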
Neural bases of orthographic long-term memory and working memory in dysgraphia
Purcell, Jeremy; Hillis, Argye E.; Capasso, Rita; Miceli, Gabriele
2016-01-01
Spelling a word involves the retrieval of information about the word’s letters and their order from long-term memory as well as the maintenance and processing of this information by working memory in preparation for serial production by the motor system. While it is known that brain lesions may selectively affect orthographic long-term memory and working memory processes, relatively little is known about the neurotopographic distribution of the substrates that support these cognitive processes, or the lesions that give rise to the distinct forms of dysgraphia that affect these cognitive processes. To examine these issues, this study uses a voxel-based mapping approach to analyse the lesion distribution of 27 individuals with dysgraphia subsequent to stroke, who were identified on the basis of their behavioural profiles alone, as suffering from deficits only affecting either orthographic long-term or working memory, as well as six other individuals with deficits affecting both sets of processes. The findings provide, for the first time, clear evidence of substrates that selectively support orthographic long-term and working memory processes, with orthographic long-term memory deficits centred in either the left posterior inferior frontal region or left ventral temporal cortex, and orthographic working memory deficits primarily arising from lesions of the left parietal cortex centred on the intraparietal sulcus. These findings also contribute to our understanding of the relationship between the neural instantiation of written language processes and spoken language, working memory and other cognitive skills. PMID:26685156
Black, Jeffrey J; Dolan, Andrew; Harper, Jason B; Aldous, Leigh
2018-06-06
Solvate ionic liquids are a relatively new class of liquids produced by combining a coordinating solvent with a salt. They have a variety of uses, and their suitability for each depends upon the ratio of salt to coordinating solvent. This work investigates the Kamlet-Taft solvent parameters, the NMR chemical shifts of constituent nuclei, and the thermoelectrochemistry of a selected set of solvate ionic liquids produced from glymes (methyl-terminated oligomers of ethylene glycol) and lithium bis(trifluoromethylsulfonyl)imide at two different compositions. The aim is to improve the understanding of the interactions occurring in these ionic liquids to help select suitable solvate ionic liquids for future applications.
Part 1. Statistical Learning Methods for the Effects of Multiple Air Pollution Constituents.
Coull, Brent A; Bobb, Jennifer F; Wellenius, Gregory A; Kioumourtzoglou, Marianthi-Anna; Mittleman, Murray A; Koutrakis, Petros; Godleski, John J
2015-06-01
The United States Environmental Protection Agency (U.S. EPA*) currently regulates individual air pollutants on a pollutant-by-pollutant basis, adjusted for other pollutants and potential confounders. However, the National Academies of Science concluded that a multipollutant regulatory approach that takes into account the joint effects of multiple constituents is likely to be more protective of human health. Unfortunately, the large majority of existing research had focused on health effects of air pollution for one pollutant or for one pollutant with control for the independent effects of a small number of copollutants. Limitations in existing statistical methods are at least partially responsible for this lack of information on joint effects. The goal of this project was to fill this gap by developing flexible statistical methods to estimate the joint effects of multiple pollutants, while allowing for potential nonlinear or nonadditive associations between a given pollutant and the health outcome of interest. We proposed Bayesian kernel machine regression (BKMR) methods as a way to simultaneously achieve the multifaceted goals of variable selection, flexible estimation of the exposure-response relationship, and inference on the strength of the association between individual pollutants and health outcomes in a health effects analysis of mixtures. We first developed a BKMR variable-selection approach, which we call component-wise variable selection, to make estimating such a potentially complex exposure-response function possible by effectively using two types of penalization (or regularization) of the multivariate exposure-response surface. Next we developed an extension of this first variable-selection approach that incorporates knowledge about how pollutants might group together, such as multiple constituents of particulate matter that might represent a common pollution source category. 
This second grouped, or hierarchical, variable-selection procedure is applicable when groups of highly correlated pollutants are being studied. To investigate the properties of the proposed methods, we conducted three simulation studies designed to evaluate the ability of BKMR to estimate environmental mixtures responsible for health effects under potentially complex but plausible exposure-response relationships. An attractive feature of our simulation studies is that we used actual exposure data rather than simulated values. This real-data simulation approach allowed us to evaluate the performance of BKMR and several other models under realistic joint distributions of multipollutant exposure. The simulation studies compared the two proposed variable-selection approaches (component-wise and hierarchical variable selection) with each other and with existing frequentist treatments of kernel machine regression (KMR). After the simulation studies, we applied the newly developed methods to an epidemiologic data set and to a toxicologic data set. To illustrate the applicability of the proposed methods to human epidemiologic data, we estimated associations between short-term exposures to fine particulate matter constituents and blood pressure in the Maintenance of Balance, Independent Living, Intellect, and Zest in the Elderly (MOBILIZE) Boston study, a prospective cohort study of elderly subjects. To illustrate the applicability of these methods to animal toxicologic studies, we analyzed data on the associations between both blood pressure and heart rate in canines exposed to a composition of concentrated ambient particles (CAPs) in a study conducted at the Harvard T. H. Chan School of Public Health (the Harvard Chan School; formerly Harvard School of Public Health; Bartoli et al. 2009). We successfully developed the theory and computational tools required to apply the proposed methods to the motivating data sets. 
Collectively, the three simulation studies showed that component-wise variable selection can identify important pollutants within a mixture as long as the correlations among pollutant concentrations are low to moderate. The hierarchical variable-selection method was more effective in high-dimension, high-correlation settings. Variable selection in existing frequentist KMR models can incur inflated type I error rates, particularly when pollutants are highly correlated. The analyses of the MOBILIZE data yielded evidence of a linear and additive association of black carbon (BC) or Cu exposure with standing diastolic blood pressure (DBP), and a linear association of S exposure with standing systolic blood pressure (SBP). Cu is thought to be a marker of urban road dust associated with traffic; and S is a marker of power plant emissions or regional long-range transported air pollution or both. Therefore, these analyses of the MOBILIZE data set suggest that emissions from these three source categories were most strongly associated with hemodynamic responses in this cohort. In contrast, in the Harvard Chan School canine study, after controlling for an overall effect of CAPs exposure, we did not observe any associations between DBP or SBP and any elemental concentrations. Instead, we observed strong evidence of an association between Mn concentrations and heart rate in that heart rate increased linearly with increasing concentrations of Mn. According to the positive matrix factorization (PMF) source apportionment analyses of the multipollutant data set from the Harvard Chan School Boston Supersite, Mn loads on the two factors that represent the mobile and road dust source categories. The results of the BKMR analyses in both the MOBILIZE and canine studies were similar to those from existing linear mixed model analyses of the same multipollutant data because the effects have linear and additive forms that could also have been detected using standard methods. 
This work provides several contributions to the KMR literature. First, to our knowledge this is the first time KMR methods have been used to estimate the health effects of multipollutant mixtures. Second, we developed a novel hierarchical variable-selection approach within BKMR that is able to account for the structure of the mixture and systematically handle highly correlated exposures. The analyses of the epidemiologic and toxicologic data on associations between fine particulate matter constituents and blood pressure or heart rate demonstrated associations with constituents that are typically associated with traffic emissions, power plants, and long-range transported pollutants. The simulation studies showed that the BKMR methods proposed here work well for small to moderate data sets; more work is needed to develop computationally fast methods for large data sets. This will be a goal of future work.
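The kernel machine idea at the heart of BKMR represents the exposure-response surface h(z) through a kernel over multipollutant exposure vectors. The sketch below is a simple Gaussian kernel-weighted (Nadaraya-Watson) smoother with invented exposure and outcome data; it illustrates only the kernel representation, not the Bayesian estimation or variable-selection priors the report develops:

```python
from math import exp

def gauss_kernel(z1, z2, rho=1.0):
    """Gaussian kernel over multipollutant exposure vectors."""
    return exp(-sum((a - b) ** 2 for a, b in zip(z1, z2)) / rho)

# Invented (BC, Cu) exposure pairs and a blood-pressure-like outcome.
exposures = [(0.1, 0.2), (0.5, 0.4), (0.9, 0.8), (0.3, 0.7)]
outcomes = [70.0, 74.0, 80.0, 73.0]

def h(z):
    """Kernel-weighted estimate of the response surface at exposure z."""
    w = [gauss_kernel(z, zi) for zi in exposures]
    return sum(wi * yi for wi, yi in zip(w, outcomes)) / sum(w)

est = h((0.5, 0.4))
print(70.0 < est < 80.0)
```

Because nearby exposure profiles receive larger weights, h can capture nonlinear and nonadditive joint effects without specifying their form; BKMR's variable-selection step additionally learns per-pollutant weights inside the kernel.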
DAT/SERT Selectivity of Flexible GBR 12909 Analogs Modeled Using 3D-QSAR Methods
Gilbert, Kathleen M.; Boos, Terrence L.; Dersch, Christina M.; Greiner, Elisabeth; Jacobson, Arthur E.; Lewis, David; Matecka, Dorota; Prisinzano, Thomas E.; Zhang, Ying; Rothman, Richard B.; Rice, Kenner C.; Venanzi, Carol A.
2007-01-01
The dopamine reuptake inhibitor GBR 12909 (1-{2-[bis(4-fluorophenyl)methoxy]ethyl}-4-(3-phenylpropyl)piperazine, 1) and its analogs have been developed as tools to test the hypothesis that selective dopamine transporter (DAT) inhibitors will be useful therapeutics for cocaine addiction. This 3D-QSAR study focuses on the effect of substitutions in the phenylpropyl region of 1. CoMFA and CoMSIA techniques were used to determine a predictive and stable model for the DAT/serotonin transporter (SERT) selectivity (represented by pKi (DAT/SERT)) of a set of flexible analogs of 1, most of which have eight rotatable bonds. In the absence of a rigid analog to use as a 3D-QSAR template, six conformational families of analogs were constructed from six pairs of piperazine and piperidine template conformers identified by hierarchical clustering as representative molecular conformations. Three models stable to y-value scrambling were identified after a comprehensive CoMFA and CoMSIA survey with Region Focusing. Test set correlation validation led to an acceptable model, with q2 = 0.508, standard error of prediction = 0.601, two components, r2 = 0.685, standard error of estimate = 0.481, F value = 39, percent steric contribution = 65, and percent electrostatic contribution = 35. A CoMFA contour map identified areas of the molecule that affect pKi (DAT/SERT). This work outlines a protocol for deriving a stable and predictive model of the biological activity of a set of very flexible molecules. PMID:17127069
Experiments on automatic classification of tissue malignancy in the field of digital pathology
NASA Astrophysics Data System (ADS)
Pereira, J.; Barata, R.; Furtado, Pedro
2017-06-01
Automated analysis of histological images helps diagnose and further classify breast cancer. Fully automated approaches can be used to pinpoint images for further analysis by the medical doctor. But tissue images are especially challenging for either manual or automated approaches, due to mixed patterns and textures, where malignant regions are sometimes difficult to detect unless they are in very advanced stages. Some of the major challenges are related to irregular and very diffuse patterns, as well as the difficulty of defining winning features and classifier models. Although the diffuse nature of the tissue also makes correct segmentation into regions difficult, it is still crucial to extract low-level features over individual regions instead of the whole image, and to select the features with the best outcomes. In this paper we report on our experiments building a region classifier with a simple subspace division and a feature selection model that improves results over image-wide and/or limited feature sets. Experimental results show modest accuracy for a set of classifiers applied over the whole image, while the conjunction of image division, per-region low-level feature extraction and feature selection, together with the use of a neural network classifier, achieved the best levels of accuracy for the dataset and settings we used in the experiments. Future work involves deep learning techniques, adding structure semantics and embedding the approach as a tumor-finding helper in a practical Medical Imaging Application.
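The pipeline of region division, per-region low-level feature extraction, and feature selection can be sketched as follows; the 4x4 "image", the 2x2 regions, and the variance-spread selection criterion are invented stand-ins for the paper's actual features and selection model:

```python
# Sketch: divide an image into regions, extract simple low-level
# features per region, then keep the feature with the largest spread
# across regions as a crude stand-in for feature selection.
image = [
    [10, 12, 200, 210],
    [11, 13, 205, 215],
    [90, 92, 95, 93],
    [91, 93, 94, 96],
]

def regions(img, size=2):
    """Yield flattened size x size blocks of the image."""
    for r in range(0, len(img), size):
        for c in range(0, len(img[0]), size):
            yield [img[r + i][c + j] for i in range(size) for j in range(size)]

def features(region):
    m = sum(region) / len(region)
    var = sum((x - m) ** 2 for x in region) / len(region)
    return {"mean": m, "variance": var, "max": max(region)}

feats = [features(reg) for reg in regions(image)]

def spread(name):
    vals = [f[name] for f in feats]
    return max(vals) - min(vals)

best = max(("mean", "variance", "max"), key=spread)
print(len(feats), best)
```

In the paper's setting the per-region feature vectors would then feed a neural network classifier; here the grid split simply shows why region-level features can separate patches (e.g. bright vs. dark blocks) that an image-wide average would blur together.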
Affect-Aware Adaptive Tutoring Based on Human-Automation Etiquette Strategies.
Yang, Euijung; Dorneich, Michael C
2018-06-01
We investigated adapting the interaction style of intelligent tutoring system (ITS) feedback based on human-automation etiquette strategies. Most ITSs adapt the content difficulty level, adapt the feedback timing, or provide extra content when they detect cognitive or affective decrements. Our previous work demonstrated that changing the interaction style via different feedback etiquette strategies has differential effects on students' motivation, confidence, satisfaction, and performance, and that the best etiquette strategy also depended on user frustration. Based on these findings, a rule set was developed that systematically selected the proper etiquette strategy to address one of four learning factors (motivation, confidence, satisfaction, and performance) under two different levels of user frustration. We explored whether etiquette strategy selection based on this rule set (systematic) or random changes in etiquette strategy for a given level of frustration affected the four learning factors. Participants solved mathematics problems under different frustration conditions with feedback whose etiquette strategy changed either systematically or randomly. The results demonstrated that feedback with etiquette strategies chosen systematically via the rule set could selectively target and improve motivation, confidence, satisfaction, and performance more than changing etiquette strategies randomly. The systematic adaptation was effective regardless of the participant's level of frustration. If computer tutors can vary their interaction style to effectively mitigate negative emotions, then ITS designers would have one more mechanism with which to design affect-aware adaptations that provide the proper responses in situations where human emotions affect the ability to learn.
Ouyang, Qin; Zhao, Jiewen; Chen, Quansheng
2015-01-01
The non-sugar solids (NSS) content is one of the most important nutrition indicators of Chinese rice wine. This study proposed a rapid method for measuring NSS content in Chinese rice wine using near infrared (NIR) spectroscopy. We also systematically studied efficient spectral variable selection algorithms for the modeling step. A new algorithm, synergy interval partial least squares with competitive adaptive reweighted sampling (Si-CARS-PLS), was proposed for modeling. The performance of the final model was evaluated using the root mean square error of calibration (RMSEC) and correlation coefficient (Rc) in the calibration set, and the root mean square error of prediction (RMSEP) and correlation coefficient (Rp) in the prediction set. The optimum Si-CARS-PLS model was achieved with 7 PLS factors and 18 variables, giving Rc=0.95 and RMSEC=1.12 in the calibration set, and Rp=0.95 and RMSEP=1.22 in the prediction set. In addition, the Si-CARS-PLS algorithm showed its superiority when compared with algorithms commonly used in multivariate calibration. This work demonstrates that NIR spectroscopy combined with a suitable multivariate calibration algorithm has high potential for rapid measurement of NSS content in Chinese rice wine. Copyright © 2015 Elsevier B.V. All rights reserved.
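The RMSEC/RMSEP and Rc/Rp figures of merit above are straightforward to compute once a model's predictions are in hand. A minimal stdlib sketch (illustrative only; this is not the authors' Si-CARS-PLS implementation):

```python
import math

def rmse(y_true, y_pred):
    """Root mean square error (RMSEC on the calibration set, RMSEP on prediction)."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def pearson_r(x, y):
    """Correlation coefficient (Rc / Rp in the abstract)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```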
Random sampling of elementary flux modes in large-scale metabolic networks.
Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel
2012-09-15
The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. dmachado@deb.uminho.pt Supplementary data are available at Bioinformatics online.
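The key step the abstract describes is the per-iteration filter that keeps the working set of mode combinations from growing exponentially while giving every candidate the same probability of being kept. A toy sketch of that filtering step (the authors' actual implementation is the emsampler package linked above):

```python
import random

def filter_candidates(candidates, cap, rng=random):
    """Filtering step: keep at most `cap` of the newly generated mode
    combinations, each with equal probability of selection, so the sample
    stays unbiased and the working set cannot explode combinatorially."""
    if len(candidates) <= cap:
        return list(candidates)
    return rng.sample(candidates, cap)
```

Calling this once per iteration of a canonical-basis-style enumeration bounds memory use at `cap` combinations while preserving uniformity over the candidates seen at that iteration.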
NASA Astrophysics Data System (ADS)
Filizola, Marta; Villar, Hugo O.; Loew, Gilda H.
2001-04-01
Compounds that bind with significant affinity to the opioid receptor types, δ, μ, and κ, with different combinations of activation and inhibition at these three receptors could be promising behaviorally selective agents. Working on this hypothesis, the chemical moieties common to three different sets of opioid receptor agonists with significant affinity for each of the three receptor types δ, μ, or κ were identified. Using a distance analysis approach, common geometric arrangements of these chemical moieties were found for selected δ, μ, or κ opioid agonists. The chemical and geometric commonalities among agonists at each opioid receptor type were then compared with a non-specific opioid recognition pharmacophore recently developed. The comparison provided identification of the additional requirements for activation of δ, μ, and κ opioid receptors. The distance analysis approach was able to clearly discriminate κ-agonists, while global molecular properties for all compounds were calculated to identify additional requirements for activation of δ and μ receptors. Comparisons of the combined geometric and physicochemical properties calculated for each of the three sets of agonists allowed the determination of unique requirements for activation of each of the three opioid receptors. These results can be used to improve the activation selectivity of known opioid agonists and as a guide for the identification of novel selective opioid ligands with potential therapeutic usefulness.
A P2P Botnet detection scheme based on decision tree and adaptive multilayer neural networks.
Alauthaman, Mohammad; Aslam, Nauman; Zhang, Li; Alasem, Rafe; Hossain, M A
2018-01-01
In recent years, Botnets have become a popular vehicle for carrying and spreading malicious code on the Internet. This malicious code paves the way for many fraudulent activities, including spam mail, distributed denial-of-service attacks and click fraud. While many Botnets are set up using a centralized communication architecture, peer-to-peer (P2P) Botnets can adopt a decentralized architecture, using an overlay network to exchange command-and-control data, which makes their detection even more difficult. This work presents a method of P2P Bot detection based on an adaptive multilayer feed-forward neural network in cooperation with decision trees. A classification and regression tree is applied as a feature selection technique to select relevant features. With these features, a multilayer feed-forward neural network training model is created using a resilient back-propagation learning algorithm. A comparison of feature set selection based on the decision tree, principal component analysis and the ReliefF algorithm indicated that the neural network model with decision-tree-based feature selection has better identification accuracy along with lower false positive rates. The usefulness of the proposed approach is demonstrated by conducting experiments on real network traffic datasets. In these experiments, an average detection rate of 99.08% with a false positive rate of 0.75% was observed.
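The CART-style feature selection described above ranks features by how much a split on each one reduces class impurity. A minimal stdlib sketch of Gini-gain ranking for binary features and binary (bot/benign) labels (illustrative only; the paper pairs the selected features with a resilient-backpropagation neural network, which is omitted here):

```python
def gini(labels):
    """Gini impurity of a list of 0/1 class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n          # fraction of positive (bot) samples
    return 1.0 - p * p - (1 - p) * (1 - p)

def gini_gain(feature_column, labels):
    """Impurity reduction from splitting on one binary feature,
    the criterion a CART-style tree uses to choose features."""
    left = [y for x, y in zip(feature_column, labels) if x == 0]
    right = [y for x, y in zip(feature_column, labels) if x == 1]
    n = len(labels)
    weighted = (len(left) / n) * gini(left) + (len(right) / n) * gini(right)
    return gini(labels) - weighted

def rank_features(X_columns, labels, k):
    """Return indices of the k features with the highest Gini gain."""
    gains = [(gini_gain(col, labels), i) for i, col in enumerate(X_columns)]
    gains.sort(reverse=True)
    return [i for _, i in gains[:k]]
```

A feature that perfectly separates the classes gets the maximum gain (0.5 for balanced binary labels), while a feature independent of the labels gets a gain near zero.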
Ahmed, Shaheen; Iftekharuddin, Khan M; Vossough, Arastoo
2011-03-01
Our previous work suggests that fractal texture features are useful for detecting pediatric brain tumors in multimodal MRI. In this study, we systematically investigate the efficacy of using several different image features, such as intensity, fractal texture, and level-set shape, in segmentation of posterior-fossa (PF) tumors in pediatric patients. We explore the effectiveness of four different feature selection and three different segmentation techniques, respectively, to discriminate tumor regions from normal tissue in multimodal brain MRI. We further study the selective fusion of these features for improved PF tumor segmentation. Our results suggest that the Kullback-Leibler divergence measure for feature ranking and selection and the expectation maximization algorithm for feature fusion and tumor segmentation offer the best results for the patient data in this study. We show that for the T1 and fluid attenuation inversion recovery (FLAIR) MRI modalities, the best PF tumor segmentation is obtained using a texture feature such as multifractional Brownian motion (mBm), while that for T2 MRI is obtained by fusing level-set shape with intensity features. In multimodality fused MRI (T1, T2, and FLAIR), the mBm feature offers the best PF tumor segmentation performance. We use different similarity metrics to evaluate the quality and robustness of these selected features for PF tumor segmentation in MRI for ten pediatric patients.
A hybrid feature selection approach for the early diagnosis of Alzheimer’s disease
NASA Astrophysics Data System (ADS)
Gallego-Jutglà, Esteve; Solé-Casals, Jordi; Vialatte, François-Benoît; Elgendi, Mohamed; Cichocki, Andrzej; Dauwels, Justin
2015-02-01
Objective. Recently, significant advances have been made in the early diagnosis of Alzheimer’s disease (AD) from electroencephalography (EEG). However, choosing suitable measures is a challenging task. Among other measures, frequency relative power (RP) and loss of complexity have been used with promising results. In the present study we investigate the early diagnosis of AD using synchrony measures and frequency RP on EEG signals, examining the changes found in different frequency ranges. Approach. We first explore the use of a single feature for computing the classification rate (CR), looking for the best frequency range. Then, we present a multiple-feature classification system that outperforms all previous results using a feature selection strategy. These two approaches are tested on two different databases, one containing mild cognitive impairment (MCI) and healthy subjects (patients' age: 71.9 ± 10.2, healthy subjects' age: 71.7 ± 8.3), and the other containing Mild AD and healthy subjects (patients' age: 77.6 ± 10.0, healthy subjects' age: 69.4 ± 11.5). Main results. Using a single feature to compute CRs we achieve a performance of 78.33% for the MCI data set and 97.56% for Mild AD. Results are clearly improved using the multiple-feature classification, where a CR of 95% is found for the MCI data set using 11 features, and 100% for the Mild AD data set using four features. Significance. The new feature selection method described in this work may be a reliable tool that could help to design a realistic system that does not require prior knowledge of a patient's status. With that aim, we explore the standardization of features for the MCI and Mild AD data sets with promising results.
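Relative power (RP), one of the single features discussed above, is simply the spectral power inside a frequency band divided by the total power. A minimal sketch on a discretized power spectrum (function name and band convention are mine):

```python
def relative_power(power_spectrum, freqs, band):
    """Relative power (RP): power within a frequency band divided by total power.
    `power_spectrum` and `freqs` are parallel lists; `band` is (lo, hi) in Hz."""
    lo, hi = band
    band_power = sum(p for f, p in zip(freqs, power_spectrum) if lo <= f < hi)
    total = sum(power_spectrum)
    return band_power / total
```

Scanning `band` over candidate frequency ranges and computing a classification rate per band corresponds to the single-feature search the abstract describes.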
Di Ruggiero, Erica; Cohen, Joanna E; Cole, Donald C
2014-07-01
Global labour markets continue to undergo significant transformations resulting from socio-political instability combined with rises in structural inequality, employment insecurity, and poor working conditions. Confronted by these challenges, global institutions are providing policy guidance to protect and promote the health and well-being of workers. This article provides an account of how the International Labour Organization's Decent Work Agenda contributes to the work policy agendas of the World Health Organization and the World Bank. This qualitative study involved semi-structured interviews with representatives from three global institutions--the International Labour Organization (ILO), the World Health Organization and the World Bank. Of the 25 key informants invited to participate, 16 took part in the study. Analysis for key themes was followed by interpretation using selected agenda setting theories. Interviews indicated that through the Decent Work Agenda, the International Labour Organization is shaping the global policy narrative about work among UN agencies, and that the pursuit of decent work and the Agenda were perceived as important goals with the potential to promote just policies. The Agenda was closely linked to the World Health Organization's conception of health as a human right. However, decent work was consistently identified by World Bank informants as ILO terminology in contrast to terms such as job creation and job access. The limited evidence base and its conceptual nature were offered as partial explanations for why the Agenda has yet to fully influence other global institutions. Catalytic events such as the economic crisis were identified as creating the enabling conditions to influence global work policy agendas. Our evidence aids our understanding of how an issue like decent work enters and stays on the policy agendas of global institutions, using the Decent Work Agenda as an illustrative example. 
Catalytic events and policy precedents were found to contribute positively to agenda setting. Questions remain, however, across key informants about the robustness of the underlying evidence base for this Agenda and what meaningful impacts have been realized on the ground as a result.
NASA Technical Reports Server (NTRS)
Luckring, James M.; Rizzi, Arthur; Davis, M. Bruce
2014-01-01
A coordinated project has been underway to improve CFD predictions of slender airframe aerodynamics. The work is focused on two flow conditions and leverages a unique flight data set obtained with an F-16XL aircraft. These conditions, a low-speed high angle-of-attack case and a transonic low angle-of-attack case, were selected from a prior prediction campaign wherein the CFD failed to provide acceptable results. In this paper the background, objectives and approach to the current project are presented. The work embodies predictions from multiple numerical formulations that are contributed from multiple organizations, and the context of this campaign to other multi-code, multi-organizational efforts is included. The relevance of this body of work toward future supersonic commercial transport concepts is also briefly addressed.
Learning with Professionals. Selected Works from the Joint Military Intelligence College
2005-07-01
ten, Berlin Game, Mexico Set, and London Match (all published by Alfred A. Knopf) are the first three and most rewarding. The tenth volume, Winter... Mexico in the late 1980. The police chief is conducting a murder investigation of the estranged wife of a U.S. ambassador. His inquiries point to a...occasional lapses with Mexico that ended in violence) and broad expanses of ocean to the east and west. Whenever threats from within have manifested
Overall Heat Transfer Coefficients for a Horizontal Cylinder in a Fluidized Bed.
1984-04-01
The distribution system is composed of 2 in. PVC pipe and fittings arranged in a convenient air-tight geometry. Pressure regulators, pressure gauges...uniform fluidization. After passing through the beads, the air is exhausted to the outside by means of galvanized duct work. Fluidized Bed...design is the matching with the copper cylinder of outer diameters, the fastening with recessed set screws, their length and the material selection. In
Detection of Erroneous Payments Utilizing Supervised And Unsupervised Data Mining Techniques
2004-09-01
will look at which statistical analysis technique will work best in developing and enhancing existing erroneous payment models . Chapter I and II... payment models that are used for selection of records to be audited. The models are set up such that if two or more records have the same payment...Identification Number, Invoice Number and Delivery Order Number are not compared. The DM0102 Duplicate Payment Model will be analyzed in this thesis
1986-11-01
Various setting agents have been used to treat industrial wastes and flue gas desulfurization sludges. These include cement, lime, kiln dust, blast furnace...will determine the type of leachate control strategy that can be successfully implemented. Potential leachate control strategies include site selection...AND ADDRESS 10. PROGRAM ELEMENT, PROJECT, TASK same AREA & WORK UNIT NUMBERS 1I. CONTROLLING OFFICE NAME AND ADDRESS 12. REPORT DATE U.S. Army Corps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draeger, E. W.
The Advanced Architecture and Portability Specialists team (AAPS) worked with a select set of LLNL application teams to develop and/or implement a portability strategy for next-generation architectures. The team also investigated new and updated programming models and helped develop programming abstractions targeting maintainability and performance portability. Significant progress was made on both fronts in FY17, resulting in multiple applications being significantly more prepared for the next-generation machines than before.
Do the contents of working memory capture attention? Yes, but cognitive control matters.
Han, Suk Won; Kim, Min-Shik
2009-10-01
There has been a controversy over whether working memory can guide attentional selection. Some researchers have reported that the contents of working memory guide attention automatically in visual search (D. Soto, D. Heinke, G. W. Humphreys, & M. J. Blanco, 2005). On the other hand, G. F. Woodman and S. J. Luck (2007) reported that they could not find any evidence of attentional capture by working memory. In the present study, we sought an integrative explanation for these different sets of results. We report evidence for attentional capture by working memory, but this effect was eliminated when search was perceptually demanding or when the onset of the search was delayed long enough for cognitive control of search to be implemented. We suggest that perceptual difficulty and the time course of cognitive control are important factors that determine when information in working memory influences attention. PsycINFO Database Record (c) 2009 APA, all rights reserved.
Large storms: Airglow and related measurements. VLF observations, volume 4
NASA Technical Reports Server (NTRS)
1981-01-01
The data presented show the typical values and range of ionospheric and magnetospheric characteristics, as viewed from 1400 km with the ISIS 2 instruments. The definition of each data set depends partly on geophysical parameters and partly on satellite operating mode. Preceding the data set is a description of the organizational parameters and a review of the objectives and general characteristics of the data set. The data are shown as a selection from 12 different data formats. Each data set has a different selection of formats, but uniformity of a given format selection is preserved throughout each data set. Each data set consists of a selected number of passes, each comprising the format combination that is most appropriate for the particular data set. Descriptions of the ISIS 2 instruments are provided.
Goal setting education and counseling practices of diabetes educators.
Malemute, Charlene L; Shultz, Jill Armstrong; Ballejos, Miriam; Butkus, Sue; Early, Kathaleen Briggs
2011-01-01
The purpose of this study was to identify goal setting education practices used by diabetes educators working with type 2 diabetes patients. Data were collected by a mail questionnaire with 179 diabetes educators purposively selected from the 2008 American Association of Diabetes Educators membership listing. Many diabetes educators (52%) reported that more than 75% of their patients set goals for diabetes control. Independent factor patterns for the frequency of information collected from the patient for the first diabetes education session showed that educators either focused on patients' self-management practices (exercise and dietary practices, knowledge, and social impacts of diabetes) or issues with learning about self-management, such as understanding the patient's learning style and motivation for managing diabetes. Factor patterns overall showed diverse approaches to working with patients, including strategies used with patients struggling with dietary goals and the importance of tasks to complete during the first patient session. Although most educators reported practices that were largely patient centered as promoted by the American Diabetes Association (ADA) and models of chronic disease management, patterns of practice suggest that diabetes educators vary considerably in how they apply education practices, especially with dietary self-management education.
Choosy but not chaste: multiple mating in human females.
Scelza, Brooke A
2013-01-01
When Charles Darwin set out to relate his theory of evolution by natural selection to humans he discovered that a complementary explanation was needed to properly understand the great variation seen in human behavior. The resulting work, The Descent of Man and Selection in Relation to Sex, laid out the defining principles and evidence of sexual selection. In brief, this work is best known for illuminating the typically male strategy of intrasexual competition and the typically female response of intersexual choice. While these sexual stereotypes were first laid out by Darwin, they grew in importance when, years later, A. J. Bateman, in a careful study of Drosophila mating strategies, noted that multiple mating appeared to provide great benefit to male reproductive success, but to have no such effect on females. As a result, female choice soon became synonymous with being coy, and only males were thought to gain from promiscuous behavior. However, the last thirty years of research have served to question much of the traditional wisdom about sex differences proposed by Darwin and Bateman, illuminating the many ways that women (and females more generally) can and do engage in multiple mating. Copyright © 2013 Wiley Periodicals, Inc.
Binding and strategic selection in working memory: a lifespan dissociation.
Sander, Myriam C; Werkle-Bergner, Markus; Lindenberger, Ulman
2011-09-01
Working memory (WM) shows a gradual increase during childhood, followed by accelerating decline from adulthood to old age. To examine these lifespan differences more closely, we asked 34 children (10-12 years), 40 younger adults (20-25 years), and 39 older adults (70-75 years) to perform a color change detection task. Load levels and encoding durations were varied for displays including targets only (Experiment 1) or targets plus distracters (Experiment 2, investigating a subsample of Experiment 1). WM performance was lower in older adults and children than in younger adults. Longer presentation times were associated with better performance in all age groups, presumably reflecting increasing effects of strategic selection mechanisms on WM performance. Children outperformed older adults when encoding times were short, and distracter effects were larger in children and older adults than in younger adults. We conclude that strategic selection in WM develops more slowly during childhood than basic binding operations, presumably reflecting the delay in maturation of frontal versus medio-temporal brain networks. In old age, both sets of mechanisms decline, reflecting senescent change in both networks. We discuss similarities to episodic memory development and address open questions for future research.
Telework as an employment option for people with disabilities.
Murray, B; Kenny, S
1990-01-01
This feasibility study, based on intensive casework, examined the potential of home-based teleworking arrangements for people with severe physical disabilities. Eleven teleworking arrangements, each involving a unique combination of work, working conditions and worker characteristics, were set up in different parts of Ireland and monitored over periods ranging from 6 to 18 months. Eight were still operational at the end of the project and, with one exception, were set to continue as longer-term arrangements. Outcomes from this action-research project suggest that teleworking is a feasible form of employment for such persons--provided care is taken over selection of workers, identification of work that is suited to the telework format and management of telework units by employers. They also suggest that teleworking arrangements can be quite flexible, ranging from examples in which work is performed mainly from home to those which combine home-based activity with varying degrees of conventional office-based activity. It is concluded that telework will create new opportunities for people with severe disabilities, as well as enabling others who become disabled during employment to retain their jobs. However, it is important that workers are appropriately trained in the use of computers and advanced telecommunications and, in many cases, home-delivered training is required.
Differential recruitment of executive resources during mind wandering.
Kam, Julia W Y; Handy, Todd C
2014-05-01
Recent research has shown that mind wandering recruits executive resources away from the external task towards inner thoughts. No studies, however, have determined whether executive functions are drawn away in a unitary manner during mind wandering episodes, or whether there is variation in the specific functions impacted. Accordingly, we examined whether mind wandering differentially modulates three core executive functions: response inhibition, updating of working memory, and mental set shifting. In three experiments, participants performed one of these three executive function tasks and reported their attentional state as either on-task or mind wandering at random intervals. We found that mind wandering led to poorer performance in the response inhibition and working memory tasks, but not the set-shifting task. These findings suggest that mind wandering does not recruit executive functions in a monolithic manner. Rather, it appears to selectively engage certain executive functions, which may reflect the adaptive maintenance of ongoing task performance. Copyright © 2014 Elsevier Inc. All rights reserved.
Anderson, James S M; Ayers, Paul W
2018-06-30
Generalizing our recent work on relativistic generalizations of the quantum theory of atoms in molecules, we present the general setting under which the principle of stationary action for a region leads to open quantum subsystems. The approach presented here is general and works for any Hamiltonian, and when a reasonable Lagrangian is selected, it often leads to the integral of the Laplacian of the electron density on the region vanishing as a necessary condition for the zero-flux surface. Alternatively, with this method, one can design a Lagrangian that leads to a surface of interest (though this Lagrangian may not be, and indeed probably will not be, "reasonable"). For any reasonable Lagrangian for the electronic wave function and any two-component method (related by integration by parts to the Hamiltonian) considered, the Bader definition of an atom is recaptured. © 2018 Wiley Periodicals, Inc.
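The necessary condition mentioned above can be written compactly. In the notation below (mine, following standard QTAIM usage), Ω is the open subsystem, ρ the electron density, and n the outward surface normal; by the divergence theorem, the vanishing Laplacian integral is the Bader zero-flux condition in integrated form:

```latex
\int_{\Omega} \nabla^{2}\rho(\mathbf{r})\, d\mathbf{r}
  \;=\; \oint_{\partial\Omega} \nabla\rho(\mathbf{r}) \cdot \mathbf{n}(\mathbf{r})\, dS
  \;=\; 0 ,
```

and the pointwise zero-flux condition, ∇ρ(r)·n(r) = 0 for all r on ∂Ω, which defines a Bader atom, is sufficient (though not necessary) for the integrated condition to hold.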
Electrophysiological Evidence for a Sensory Recruitment Model of Somatosensory Working Memory.
Katus, Tobias; Grubert, Anna; Eimer, Martin
2015-12-01
Sensory recruitment models of working memory assume that information storage is mediated by the same cortical areas that are responsible for the perceptual processing of sensory signals. To test this assumption, we measured somatosensory event-related brain potentials (ERPs) during a tactile delayed match-to-sample task. Participants memorized a tactile sample set at one task-relevant hand to compare it with a subsequent test set on the same hand. During the retention period, a sustained negativity (tactile contralateral delay activity, tCDA) was elicited over primary somatosensory cortex contralateral to the relevant hand. The amplitude of this component increased with memory load and was sensitive to individual limitations in memory capacity, suggesting that the tCDA reflects the maintenance of tactile information in somatosensory working memory. The tCDA was preceded by a transient negativity (N2cc component) with a similar contralateral scalp distribution, which is likely to reflect selection of task-relevant tactile stimuli at the encoding stage. The temporal sequence of N2cc and tCDA components mirrors previous observations from ERP studies of working memory in vision. The finding that the sustained somatosensory delay period activity varies as a function of memory load supports a sensory recruitment model for spatial working memory in touch. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Chui, Wing Hong; Cheng, Kevin Kwok-yin
2013-02-01
Although there have been a handful of studies examining the work of chaplains and prison volunteers in Western settings, few have endeavored to conduct research into the experiences of religious workers in Asian penitentiaries. To fill this gap, this article reports on exploratory research examining the work of a selected group of religious workers in Hong Kong prisons. A total of 17 religious workers were interviewed: 10 prison chaplains and 7 Buddhist volunteers who paid regular prison visits. Qualitative findings generated from in-depth interviews present three themes: the range of religious activities performed, the importance of religion for the rehabilitation of inmates, and the hope of continued religious support to prisoners after discharge. The significance of this research is that it sheds light on the understudied work of prison chaplains and volunteers in Hong Kong and portrays the differences between the work of the Christian ministry and that of the Buddhist volunteers.
Ihmaid, Saleh K; Ahmed, Hany E A; Zayed, Mohamed F; Abadleh, Mohammed M
2016-01-30
The main step in a successful drug discovery pipeline is the identification of small potent compounds that selectively bind to the target of interest with high affinity. However, there is still a shortage of efficient and accurate computational methods with the capability to study, and hence predict, compound selectivity properties. In this work, we propose an affordable machine learning method to perform compound selectivity classification and prediction. For this purpose, we collected compounds with reported activity and built a selectivity database of 153 cathepsin K and S inhibitors that are considered of medicinal interest. This database comprises three compound sets: two selective ones (K/S and S/K) and one non-selective one (KS). We subjected this database to the selectivity classification tool 'Emergent Self-Organizing Maps' to explore its capability to differentiate inhibitors selective for one cathepsin over the other. The method exhibited good clustering performance for selective ligands, with high accuracy (up to 100 %). Among the possibilities, BAPs and MACCS molecular structural fingerprints were used for the classification. The results demonstrated the method's ability to support structure-selectivity relationship interpretation, and selectivity markers were identified for the design of further novel inhibitors with high activity and target selectivity.
Data preprocessing methods of FT-NIR spectral data for the classification of cooking oil
NASA Astrophysics Data System (ADS)
Ruah, Mas Ezatul Nadia Mohd; Rasaruddin, Nor Fazila; Fong, Sim Siong; Jaafar, Mohd Zuli
2014-12-01
This recent work describes data pre-processing methods for FT-NIR spectroscopy datasets of cooking oil and its quality parameters using chemometrics. Pre-processing of near-infrared (NIR) spectral data has become an integral part of chemometrics modelling. Hence, this work is dedicated to investigating the utility and effectiveness of pre-processing algorithms, namely row scaling, column scaling and single scaling with Standard Normal Variate (SNV). The combinations of these scaling methods have an impact on exploratory analysis and classification via Principal Component Analysis (PCA) plots. The samples were divided into palm oil and non-palm cooking oil. The classification model was built using FT-NIR cooking oil spectra in absorbance mode over the range 4000-14000 cm-1. A Savitzky-Golay derivative was applied before developing the classification model. The data were then separated into a training set and a test set using the Duplex method. The number of samples in each class was kept equal to 2/3 of the class with the minimum number of samples. The t-statistic was then employed as a variable selection method to identify the variables significant to the classification models. The data pre-processing was evaluated in terms of the modified silhouette width (mSW), PCA and the percentage correctly classified (%CC). The results show that different pre-processing strategies lead to substantial differences in model performance. The effects of the pre-processing methods, i.e. row scaling, column standardisation and single scaling with Standard Normal Variate, are indicated by mSW and %CC. At two PCs, all five classifiers gave high %CC except Quadratic Distance Analysis.
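The row-wise SNV transform described above can be sketched in a few lines of numpy: each spectrum (row) is individually centered and scaled, which removes multiplicative scatter differences between samples. The function name `snv` is ours, not from the paper.

```python
import numpy as np

def snv(spectra):
    """Standard Normal Variate: center and scale each spectrum (row) individually."""
    spectra = np.asarray(spectra, dtype=float)
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, ddof=1, keepdims=True)
    return (spectra - mean) / std

# Toy example: 3 spectra over 5 wavenumber channels; rows 2 and 3 are
# scaled copies of row 1, mimicking multiplicative scatter effects.
X = np.array([[1.0, 2.0, 3.0, 4.0, 5.0],
              [2.0, 4.0, 6.0, 8.0, 10.0],
              [0.5, 1.0, 1.5, 2.0, 2.5]])
X_snv = snv(X)
# After SNV every row has mean 0 and unit variance, so the three
# spectra collapse onto the same profile.
```

Because SNV operates on one row at a time, it can be applied before or after splitting into training and test sets without leaking information between them.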
Ensemble based on static classifier selection for automated diagnosis of Mild Cognitive Impairment.
Nanni, Loris; Lumini, Alessandra; Zaffonato, Nicolò
2018-05-15
Alzheimer's disease (AD) is the most common cause of neurodegenerative dementia in the elderly population. Scientific research is very active in the challenge of designing automated approaches to achieve an early and certain diagnosis. Recently an international competition among AD predictors was organized: "A Machine learning neuroimaging challenge for automated diagnosis of Mild Cognitive Impairment" (MLNeCh). This competition is based on pre-processed sets of T1-weighted Magnetic Resonance Images (MRI) to be classified into four categories: stable AD, individuals with MCI who converted to AD, individuals with MCI who did not convert to AD, and healthy controls. In this work, we propose a method to perform early diagnosis of AD, which is evaluated on the MLNeCh dataset. Since the automatic classification of AD is based on feature vectors of high dimensionality, different techniques of feature selection/reduction are compared in order to avoid the curse-of-dimensionality problem; the classification method is then obtained as a combination of Support Vector Machines trained on different clusters of data extracted from the whole training set. The multi-classifier approach proposed in this work outperforms all the stand-alone methods tested in our experiments. The final ensemble is based on a set of classifiers, each trained on a different cluster of the training data. The proposed ensemble has the great advantage of performing well using a very reduced version of the data (the reduction factor is more than 90%). The MATLAB code for the ensemble of classifiers will be made publicly available to other researchers for future comparisons. Copyright © 2017 Elsevier B.V. All rights reserved.
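The cluster-then-train ensemble idea can be sketched as follows. This is a hedged illustration, not the paper's exact pipeline (the paper uses MATLAB; the clustering algorithm, kernel, and soft-voting combination rule below are our illustrative choices, and the function names are ours).

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def train_cluster_ensemble(X, y, n_clusters=3, seed=0):
    """Partition the training data with k-means, then train one SVM per
    cluster (single-class clusters are skipped as degenerate)."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(X)
    models = []
    for c in range(n_clusters):
        mask = km.labels_ == c
        if len(np.unique(y[mask])) < 2:
            continue
        models.append(SVC(kernel="rbf", probability=True,
                          random_state=seed).fit(X[mask], y[mask]))
    return models

def predict_ensemble(models, X):
    """Combine the per-cluster SVMs by averaging their class probabilities."""
    probs = np.mean([m.predict_proba(X) for m in models], axis=0)
    return probs.argmax(axis=1)
```

Training one model per cluster lets each SVM specialize on a locally homogeneous subset of the data, which is the intuition behind the ensemble's good performance on the reduced feature vectors.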
Mohd Shukoor, Nor Shuhada; Mohd Tamrin, Shamsul Bahri; Guan, Ng Yee; Mohd Suadi Nata, Dayana Hazwani
2018-05-22
Hard hats are among the personal protective equipment (PPE) used in many industries to reduce the impact of falling objects on the skull and to prevent head and brain injuries. However, the practice of wearing a safety helmet during working hours is still low, owing to the physical discomfort perceived by safety helmet users. Given the unpopularity of the current hard hat, workers' general perceptions concerning its use, together with its measurements, are the determining factors in the development of a new hard hat. A cross-sectional study was conducted in which 132 male oil palm harvesters between 19 and 60 years of age were selected from among the employees of the same oil palm harvesting company. A set of questionnaires was developed to collect their socio-demographic information as well as their perceptions of comfort and the prevalence of head injury. In addition, a set of measuring instruments, including Martin's anthropometry set, was used for head measurement and data collection in respect of the current hard hat. In this research, six respondents were randomly selected to attend an interview session for qualitative assessment. Based on the questionnaires, the unpopularity of the hard hat was largely influenced by factors related to poor design in general and, specifically, poor ventilation (64%), load (67%), and physical discomfort (42%). The measurements of the anthropometric parameters and the dimensions of the hard hat also showed a significant mismatch. The unpopularity of the current hard hat among oil palm harvesters stemmed from the discomfort of wearing it, which shows that the development of a new hard hat could lead to better usage and a greater likelihood of a hard hat being worn throughout the working day.
Mafra, Valéria; Kubo, Karen S.; Alves-Ferreira, Marcio; Ribeiro-Alves, Marcelo; Stuart, Rodrigo M.; Boava, Leonardo P.; Rodrigues, Carolina M.; Machado, Marcos A.
2012-01-01
Real-time reverse transcription PCR (RT-qPCR) has emerged as an accurate and widely used technique for expression profiling of selected genes. However, obtaining reliable measurements depends on the selection of appropriate reference genes for gene expression normalization. The aim of this work was to assess the expression stability of 15 candidate genes to determine which set of reference genes is best suited for transcript normalization in citrus in different tissues and organs and leaves challenged with five pathogens (Alternaria alternata, Phytophthora parasitica, Xylella fastidiosa and Candidatus Liberibacter asiaticus). We tested traditional genes used for transcript normalization in citrus and orthologs of Arabidopsis thaliana genes described as superior reference genes based on transcriptome data. geNorm and NormFinder algorithms were used to find the best reference genes to normalize all samples and conditions tested. Additionally, each biotic stress was individually analyzed by geNorm. In general, FBOX (encoding a member of the F-box family) and GAPC2 (GAPDH) constituted the most stable candidate gene set assessed under the different conditions and subsets tested, while CYP (cyclophilin), TUB (tubulin) and CtP (cathepsin) were the least stably expressed genes found. Validation of the most suitable reference genes for normalizing the expression level of the WRKY70 transcription factor in leaves infected with Candidatus Liberibacter asiaticus showed that arbitrary use of reference genes without previous testing could lead to misinterpretation of data. Our results revealed FBOX, SAND (a SAND family protein), GAPC2 and UPL7 (ubiquitin protein ligase 7) to be superior reference genes, and we recommend their use in studies of gene expression in citrus species and relatives. This work constitutes the first systematic analysis for the selection of superior reference genes for transcript normalization in different citrus organs and under biotic stress. PMID:22347455
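The geNorm stability measure M applied above can be sketched in numpy: for each gene pair, take the standard deviation across samples of the log2 expression ratio, then average over partners. The function name `genorm_m` is ours, and the input is assumed to be relative expression quantities on a linear scale.

```python
import numpy as np

def genorm_m(expr):
    """geNorm expression-stability measure M for candidate reference genes.

    expr: (n_samples, n_genes) array of relative expression quantities.
    M_j is the mean, over all other genes k, of the standard deviation
    across samples of log2(a_j / a_k). Lower M means a more stable gene.
    """
    log_expr = np.log2(np.asarray(expr, dtype=float))
    n_genes = log_expr.shape[1]
    M = np.empty(n_genes)
    for j in range(n_genes):
        ratios = log_expr[:, [j]] - log_expr   # log2(a_j / a_k) for every k
        v = ratios.std(axis=0, ddof=1)         # pairwise variation V_jk
        M[j] = np.delete(v, j).mean()          # average over k != j
    return M
```

In the full geNorm procedure the least stable gene (highest M) is dropped and M recomputed iteratively, until the most stable pair remains.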
Puertas, E Benjamín; Rivera, Tamara Y
2016-11-01
To 1) describe patterns of specialty choice; 2) investigate relationships between career selection and selected demographic indicators; and 3) identify salary perception, factors that influence career choice in primary care, and factors that influence desired location of future medical practice. The study used a mixed-methods approach that included a cross-sectional questionnaire survey applied to 234 last-year medical students in Honduras (September 2014), and semi-structured interviews with eight key informants (October 2014). Statistical analysis included chi-square and factor analysis. An alpha level of 0.05 was used to determine significance. In the qualitative analysis, several codes were associated with each other, and five major themes emerged. Primary care careers were the preferred choice for 8.1% of students, who preferred urban settings for future practice location. The perceived salary of specialties other than primary care was significantly higher than those of general practitioners, family practitioners, and pediatricians (P < 0.001). Participants considered "making a difference," income, teaching, prestige, and challenging work the most important factors influencing career choice. Practice in ambulatory settings was significantly associated with a preference for primary care specialties (P < 0.05). Logistic regression analysis found that factors related to patient-based care were statistically significant for selecting primary care (P = 0.006). The qualitative analysis further endorsed the survey findings, identifying additional factors that influence career choice (future work option; availability of residency positions; and social factors, including violence). Rationales behind preference of a specialty appeared to be based on a combination of ambition and prestige, and on personal and altruistic considerations. Most factors that influence primary care career choice are similar to those found in the literature.
There are several factors distinctive to medical students in Honduras-most of them barriers to primary care career choice.
Konstantinou, Nikos; Beal, Eleanor; King, Jean-Remi; Lavie, Nilli
2014-10-01
We establish a new dissociation between the roles of working memory (WM) cognitive control and visual maintenance in selective attention as measured by the efficiency of distractor rejection. The extent to which focused selective attention can prevent distraction has been shown to critically depend on the level and type of load involved in the task. High perceptual load that consumes perceptual capacity leads to reduced distractor processing, whereas high WM load that reduces WM ability to exert priority-based executive cognitive control over the task results in increased distractor processing (e.g., Lavie, Trends in Cognitive Sciences, 9(2), 75-82, 2005). WM also serves to maintain task-relevant visual representations, and such visual maintenance is known to recruit the same sensory cortices as those involved in perception (e.g., Pasternak & Greenlee, Nature Reviews Neuroscience, 6(2), 97-107, 2005). These findings led us to hypothesize that loading WM with visual maintenance would reduce visual capacity involved in perception, thus resulting in reduced distractor processing-similar to perceptual load and opposite to WM cognitive control load. Distractor processing was assessed in a response competition task, presented during the memory interval (or during encoding; Experiment 1a) of a WM task. Loading visual maintenance or encoding by increased set size for a memory sample of shapes, colors, and locations led to reduced distractor response competition effects. In contrast, loading WM cognitive control with verbal rehearsal of a random letter set led to increased distractor effects. These findings confirm load theory predictions and provide a novel functional distinction between the roles of WM maintenance and cognitive control in selective attention.
Physician consideration of patients' out-of-pocket costs in making common clinical decisions.
Pham, Hoangmai H; Alexander, G Caleb; O'Malley, Ann S
2007-04-09
Patients face growing cost-sharing through higher deductibles and other out-of-pocket (OP) expenses, with uncertain effects on clinical decision making. We analyzed data on 6628 respondents to the nationally representative 2004-2005 Community Tracking Study Physician Survey to examine how frequently physicians report considering their insured patients' OP expenses when prescribing drugs, selecting diagnostic tests, and choosing inpatient vs outpatient care settings. Responses were dichotomized as always/usually vs sometimes/rarely/never. In separate multivariate logistic regressions, we examined associations between physicians' reported frequency of considering OP costs for each type of decision and characteristics of individual physicians and their practices. Seventy-eight percent of physicians reported routinely considering OP costs when prescribing drugs, while 51.2% reported doing so when selecting care settings, and 40.2% when selecting diagnostic tests. In adjusted analyses, primary care physicians were more likely than medical specialists to consider patients' OP costs in choosing prescription drugs (85.3% vs 74.5%) (P<.001), care settings (53.9% vs 43.1%) (P<.001), and diagnostic tests (46.3% vs 29.9%) (P<.001). Physicians working in large groups or health maintenance organizations were more likely to consider OP costs in prescribing generic drugs (P<.001 for comparisons with solo and 2-person practices), but those in solo or 2-person practices were more likely to do so in choosing tests and care settings (P<.05 for all comparisons with other practice types). Physicians providing at least 10 hours of charity care a month were more likely than those not providing any to consider OP costs in both diagnostic testing (40.7% vs 35.8%) (P<.001) and care setting decisions (51.4% vs 47.6%) (P<.005). 
Cost-sharing arrangements targeting patients are likely to have limited effects in safely reducing health care spending because physicians do not routinely consider patients' OP costs when making decisions regarding more expensive medical services.
Mbonye, Martin; Nakamanya, Sarah; Nalukenge, Winifred; King, Rachel; Vandepitte, Judith; Seeley, Janet
2013-08-10
Effective interventions among female sex workers require a thorough knowledge of the context of local sex industries. We explore the organisation of female sex work in a low socio-economic setting in Kampala, Uganda. We conducted a qualitative study with 101 participants selected from an epidemiological cohort of 1027 women at high risk of HIV in Kampala. Repeat in-depth life history and work practice interviews were conducted from March 2010 to June 2011. Context specific factors of female sex workers' day-to-day lives were captured. Reported themes were identified and categorised inductively. Of the 101 women, 58 were active self-identified sex workers operating in different locations within the area of study and nine had quit sex work. This paper focuses on these 67 women who gave information about their involvement in sex work. The majority had not gone beyond primary level of education and all had at least one child. Thirty one voluntarily disclosed that they were HIV-positive. Common sex work locations were streets/roadsides, bars and night clubs. Typically sex occurred in lodges near bars/night clubs, dark alleyways or car parking lots. Overall, women experienced sex work-related challenges at their work locations but these were more apparent in outdoor settings. These settings exposed women to violence, visibility to police, a stigmatising public as well as competition for clients, while bars provided some protection from these challenges. Older sex workers tended to prefer bars while the younger ones were mostly based on the streets. Alcohol consumption was a feature in all locations and women said it gave them courage and helped them to withstand the night chill. Condom use was determined by clients' willingness, a woman's level of sobriety or price offered. Sex work operates across a variety of locations in the study area in Kampala, with each presenting different strategies and challenges for those operating there. 
Risky practices are present in all locations although they are higher on the streets compared to other locations. Location specific interventions are required to address the complex challenges in sex work environments.
NASA Astrophysics Data System (ADS)
Escalona, Luis; Díaz-Montiel, Paulina; Venkataraman, Satchi
2016-04-01
Laminated carbon fiber reinforced polymer (CFRP) composite materials are increasingly used in aerospace structures due to their superior mechanical properties and reduced weight. Assessing the health and integrity of these structures requires non-destructive evaluation (NDE) techniques to detect and measure interlaminar delamination and intralaminar matrix cracking damage. The electrical resistance change (ERC) based NDE technique uses the inherent changes in conductive properties of the composite to characterize internal damage. Several works that have explored the ERC technique have been limited to thin cross-ply laminates with simple linear or circular electrode arrangements. This paper investigates a method of optimum selection of electrode configurations for delamination detection in thick cross-ply laminates using ERC. Inverse identification of damage requires numerical optimization of the measured response against a model-predicted response. Here, the electrical voltage field in the CFRP composite laminate is calculated using finite element analysis (FEA) models for different specified delamination sizes and locations, and locations of ground and current electrodes. Reducing the number of sensor locations and measurements is needed to reduce hardware requirements and the computational effort needed for inverse identification. This paper explores the use of the effective independence (EI) measure originally proposed for sensor location optimization in experimental vibration modal analysis. The EI measure is used for selecting the minimum set of resistance measurements among all possible combinations of selecting a pair of electrodes from the n electrodes. To enable the use of EI for ERC, this research proposes a singular value decomposition (SVD) to obtain a spectral representation of the resistance measurements in the laminate.
The effectiveness of EI measure in eliminating redundant electrode pairs is demonstrated by performing inverse identification of damage using the full set of resistance measurements and the reduced set of measurements. The investigation shows that the EI measure is effective for optimally selecting the electrode pairs needed for resistance measurements in ERC based damage detection.
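A generic numpy sketch of the EI ranking loop follows. In the paper the candidate rows would come from the SVD-based spectral representation of the predicted resistance measurements; here `Phi` is simply any candidate-by-mode matrix, and the function name is ours.

```python
import numpy as np

def effective_independence(Phi, n_keep):
    """Iteratively discard candidate measurements (rows of Phi) with the
    smallest effective-independence contribution, keeping n_keep of them.

    Phi: (n_candidates, n_modes) matrix whose rows are candidate
    measurement contributions to the target modes. Each row's EI value is
    its diagonal entry of the projection matrix Phi (Phi^T Phi)^-1 Phi^T,
    i.e. its contribution to the Fisher information. Returns the indices
    of the kept rows.
    """
    idx = np.arange(Phi.shape[0])
    while len(idx) > n_keep:
        A = Phi[idx]
        # diag(A @ (A^T A)^-1 @ A^T), one value per remaining candidate
        Ed = np.einsum('ij,ji->i', A, np.linalg.solve(A.T @ A, A.T))
        idx = np.delete(idx, np.argmin(Ed))
    return idx
```

Because the lowest-contribution row is removed one at a time and the projection recomputed, the retained rows keep the information matrix well conditioned, which is exactly what the inverse identification of delamination needs.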
Validating a large geophysical data set: Experiences with satellite-derived cloud parameters
NASA Technical Reports Server (NTRS)
Kahn, Ralph; Haskins, Robert D.; Knighton, James E.; Pursch, Andrew; Granger-Gallegos, Stephanie
1992-01-01
We are validating the global cloud parameters derived from the satellite-borne HIRS2 and MSU atmospheric sounding instrument measurements, and are using the analysis of these data as one prototype for studying large geophysical data sets in general. The HIRS2/MSU data set contains a total of 40 physical parameters, filling 25 MB/day; raw HIRS2/MSU data are available for a period exceeding 10 years. Validation involves developing a quantitative sense for the physical meaning of the derived parameters over the range of environmental conditions sampled. This is accomplished by comparing the spatial and temporal distributions of the derived quantities with similar measurements made using other techniques, and with model results. The data handling needed for this work is possible only with the help of a suite of interactive graphical and numerical analysis tools. Level 3 (gridded) data is the common form in which large data sets of this type are distributed for scientific analysis. We find that Level 3 data is inadequate for the data comparisons required for validation. Level 2 data (individual measurements in geophysical units) is needed. A sampling problem arises when individual measurements, which are not uniformly distributed in space or time, are used for the comparisons. Standard 'interpolation' methods involve fitting the measurements for each data set to surfaces, which are then compared. We are experimenting with formal criteria for selecting geographical regions, based upon the spatial frequency and variability of measurements, that allow us to quantify the uncertainty due to sampling. As part of this project, we are also dealing with ways to keep track of constraints placed on the output by assumptions made in the computer code. 
The need to work with Level 2 data introduces a number of other data handling issues, such as accessing data files across machine types, meeting large data storage requirements, accessing other validated data sets, processing speed and throughput for interactive graphical work, and problems relating to graphical interfaces.
Clinical placements in mental health: a literature review.
Happell, Brenda; Gaskin, Cadeyrn J; Byrne, Louise; Welch, Anthony; Gellion, Stephen
2015-01-01
Gaining experience in clinical mental health settings is central to the education of health practitioners. To facilitate the ongoing development of knowledge and practice in this area, we performed a review of the literature on clinical placements in mental health settings. Searches in Academic Search Complete, CINAHL, Medline and PsycINFO databases returned 244 records, of which 36 met the selection criteria for this review. Five additional papers were obtained through scanning the reference lists of those papers included from the initial search. The evidence suggests that clinical placements may have multiple benefits (e.g. improving students' skills, knowledge, attitudes towards people with mental health issues and confidence, as well as reducing their fears and anxieties about working in mental health). The location and structure of placements may affect outcomes, with mental health placements in non-mental health settings appearing to have minimal impact on key outcomes. The availability of clinical placements in mental health settings varies considerably among education providers, with some students completing their training without undertaking such structured clinical experiences. Students have generally reported that their placements in mental health settings have been positive and valuable experiences, but have raised concerns about the amount of support they received from education providers and healthcare staff. Several strategies have been shown to enhance clinical placement experiences (e.g. providing students with adequate preparation in the classroom, implementing learning contracts and providing clinical supervision). Educators and healthcare staff need to work together for the betterment of student learning and the healthcare professions.
A Reduced Set of Features for Chronic Kidney Disease Prediction
Misir, Rajesh; Mitra, Malay; Samanta, Ranjit Kumar
2017-01-01
Chronic kidney disease (CKD) is one of the life-threatening diseases. Early detection and proper management are solicited for augmenting survivability. As per the UCI data set, there are 24 attributes for predicting CKD or non-CKD. At least 16 of these attributes require pathological investigations involving more resources, money, time, and uncertainty. The objective of this work is to explore whether we can predict CKD or non-CKD with reasonable accuracy using fewer features. An intelligent system development approach has been used in this study. We applied one important feature selection technique to discover a reduced feature set that explains the data set much better. Two intelligent binary classification techniques were adopted to validate the reduced feature set. Performance was evaluated in terms of four important classification evaluation parameters. As our results suggest, one may concentrate on the reduced features for identifying CKD, thereby reducing uncertainty, saving time, and cutting costs. PMID:28706750
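The evaluation step can be illustrated with four common binary-classification metrics. The abstract does not name its four parameters; accuracy, sensitivity, specificity, and F1 below are typical choices, shown purely as an example, and the function name is ours.

```python
import numpy as np

def binary_metrics(y_true, y_pred):
    """Four common evaluation parameters for a binary (CKD vs non-CKD)
    classifier, computed from the confusion-matrix counts."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    accuracy    = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn)          # recall on the CKD class
    specificity = tn / (tn + fp)          # recall on the non-CKD class
    f1          = 2 * tp / (2 * tp + fp + fn)
    return accuracy, sensitivity, specificity, f1
```

Reporting sensitivity and specificity separately matters here: for a screening task like CKD detection, a model with high accuracy but poor sensitivity would silently miss positive cases.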
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodriguez, Mario E.
An area in earthquake risk reduction that needs an urgent examination is the selection of earthquake records for nonlinear dynamic analysis of structures. An often-mentioned shortcoming of results of nonlinear dynamic analyses of structures is that these results are limited to the type of records that these analyses use as input data. This paper proposes a procedure for selecting earthquake records for nonlinear dynamic analysis of structures. This procedure uses a seismic damage index evaluated using the hysteretic energy dissipated by a Single Degree of Freedom System (SDOF) representing a multi-degree-of-freedom structure responding to an earthquake record, and the plastic work capacity of the system at collapse. The type of structural system is considered using simple parameters. The proposed method is based on the evaluation of the damage index for a suite of earthquake records and a selected type of structural system. A set of 10 strong ground motion records is analyzed to show an application of the proposed procedure for selecting earthquake records for structural design.
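The core quantities of the procedure can be sketched numerically: the hysteretic energy is the path integral of force over displacement for the SDOF response, and the damage index normalizes it by the plastic work capacity at collapse. This is only the general normalization idea; the paper's exact index definition and structural parameters are not reproduced, and the function names are ours.

```python
import numpy as np

def hysteretic_energy(force, disp):
    """Energy dissipated over a force-displacement history: the (signed)
    path integral of force with respect to displacement, evaluated with
    the trapezoidal rule. For a closed loop this equals the loop area."""
    force = np.asarray(force, dtype=float)
    disp = np.asarray(disp, dtype=float)
    return float(np.sum(0.5 * (force[1:] + force[:-1]) * np.diff(disp)))

def damage_index(force, disp, plastic_work_capacity):
    """Dissipated hysteretic energy normalized by the system's plastic
    work capacity at collapse (general idea only, not the paper's exact
    index)."""
    return hysteretic_energy(force, disp) / plastic_work_capacity

# Idealized rectangular hysteresis loop: width 1, height 2 -> area 2
disp  = [0.0, 1.0, 1.0, 0.0, 0.0]
force = [1.0, 1.0, -1.0, -1.0, 1.0]
di = damage_index(force, disp, plastic_work_capacity=4.0)
```

Records would then be ranked or screened by this index for a chosen structural system, which is the selection step the procedure describes.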
Barch, Deanna M.; Carter, Cameron S.; Arnsten, Amy; Buchanan, Robert W.; Cohen, Jonathan D.; Geyer, Mark; Green, Michael F.; Krystal, John H.; Nuechterlein, Keith; Robbins, Trevor; Silverstein, Steven; Smith, Edward E.; Strauss, Milton; Wykes, Til; Heinssen, Robert
2009-01-01
This overview describes the goals and objectives of the third conference conducted as part of the Cognitive Neuroscience Treatment Research to Improve Cognition in Schizophrenia (CNTRICS) initiative. This third conference was focused on selecting specific paradigms from cognitive neuroscience that measured the constructs identified in the first CNTRICS meeting, with the goal of facilitating the translation of these paradigms into use in clinical trials contexts. To identify such paradigms, we had an open nomination process in which the field was asked to nominate potentially relevant paradigms and to provide information on several domains relevant to selecting the most promising tasks for each construct (eg, construct validity, neural bases, psychometrics, availability of animal models). Our goal was to identify 1–2 promising tasks for each of the 11 constructs identified at the first CNTRICS meeting. In this overview article, we describe the on-line survey used to generate nominations for promising tasks, the criteria that were used to select the tasks, the rationale behind the criteria, and the ways in which breakout groups worked together to identify the most promising tasks from among those nominated. This article serves as an introduction to the set of 6 articles included in this special issue that provide information about the specific tasks discussed and selected for the constructs from each of 6 broad domains (working memory, executive control, attention, long-term memory, perception, and social cognition). PMID:19023126
Developing Global Nurse Influencers.
Spies, Lori A
2016-01-01
How can universities create engaged citizens and global leaders? Each year, a select group of advanced practice nursing students at Baylor University Louise Herrington School of Nursing travel to Africa for a month-long clinical mission experience. Students work alongside local and missionary healthcare providers in a comprehensive Christian outreach to the community at a high-volume clinic. Creating rich learning experiences in a global setting in significant and sustainable ways is difficult, but intentionally focusing on what we are called to do and who we serve provides ballast for faculty and students. The success of the trip in preparing students to be global influencers is evident by the work graduates elect to do around the world, following graduation.
Simulating Terrestrial Gamma-ray Flashes using SWORD (Invited)
NASA Astrophysics Data System (ADS)
Gwon, C.; Grove, J.; Dwyer, J. R.; Mattson, K.; Polaski, D.; Jackson, L.
2013-12-01
We report on simulations of the relativistic feedback discharges involved with the production of terrestrial gamma-ray flashes (TGFs). The simulations were conducted with Geant4, using the SoftWare for the Optimization of Radiation Detectors (SWORD) framework. SWORD provides a graphical interface for setting up simulations in select high-energy radiation transport engines. Using Geant4, we determine the avalanche length, the energy spectrum of the electrons and gamma-rays as they leave the field region, and the feedback factor describing the degree to which the production of energetic particles is self-sustaining. We validate our simulations against previous work in order to determine the reliability of our results. This work is funded by the Office of Naval Research.
Proverb interpretation changes in aging.
Uekermann, Jennifer; Thoma, Patrizia; Daum, Irene
2008-06-01
Recent investigations have emphasized the involvement of fronto-subcortical networks to proverb comprehension. Although the prefrontal cortex is thought to be affected by normal aging, relatively little work has been carried out to investigate potential effects of aging on proverb comprehension. In the present investigation participants in three age groups were assessed on a proverb comprehension task and a range of executive function tasks. The older group showed impairment in selecting correct interpretations from alternatives. They also showed executive function deficits, as reflected by reduced working memory and deficient set shifting and inhibition abilities. The findings of the present investigation showed proverb comprehension deficits in normal aging which appeared to be related to reduced executive skills.
Xu, G; Hughes-Oliver, J M; Brooks, J D; Yeatts, J L; Baynes, R E
2013-01-01
Quantitative structure-activity relationship (QSAR) models are being used increasingly in skin permeation studies. The main idea of QSAR modelling is to quantify the relationship between biological activities and chemical properties, and thus to predict the activity of chemical solutes. As a key step, the selection of a representative and structurally diverse training set is critical to the prediction power of a QSAR model. Early QSAR models selected training sets in a subjective way and solutes in the training set were relatively homogeneous. More recently, statistical methods such as D-optimal design or space-filling design have been applied but such methods are not always ideal. This paper describes a comprehensive procedure to select training sets from a large candidate set of 4534 solutes. A newly proposed 'Baynes' rule', which is a modification of Lipinski's 'rule of five', was used to screen out solutes that were not qualified for the study. U-optimality was used as the selection criterion. A principal component analysis showed that the selected training set was representative of the chemical space. Gas chromatograph amenability was verified. A model built using the training set was shown to have greater predictive power than a model built using a previous dataset [1].
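To illustrate the space-filling idea mentioned above, here is a greedy maximin selection over a descriptor matrix. This is a simple stand-in for, not the paper's exact U-optimality criterion, and the function name is ours.

```python
import numpy as np

def maximin_select(X, n_train, seed=0):
    """Greedy maximin (space-filling) selection of a training set from a
    candidate pool of solute descriptors: repeatedly add the candidate
    farthest from all already-chosen points."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    chosen = [int(rng.integers(len(X)))]     # random starting solute
    while len(chosen) < n_train:
        # distance of every candidate to its nearest chosen point
        d = np.min(np.linalg.norm(X[:, None] - X[chosen], axis=2), axis=1)
        d[chosen] = -1.0                     # never re-pick a chosen point
        chosen.append(int(d.argmax()))
    return np.array(chosen)
```

A maximin set spreads the training solutes across the descriptor space, which is the same goal the U-optimality criterion pursues with a uniformity measure instead of a distance one.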
Psychological empowerment and job satisfaction between Baby Boomer and Generation X nurses.
Sparks, Amy M
2012-05-01
This paper is a report of a study of differences in nurses' generational psychological empowerment and job satisfaction. Generations differ in work styles such as autonomy, work ethics, involvement, views on leadership, and primary views on what constitutes innovation, quality, and service. A secondary analysis was conducted from two data sets resulting in a sample of 451 registered nurses employed at five hospitals in West Virginia. One data set was gathered from a convenience sample and one from a randomly selected sample. Data were collected from 2000 to 2004. Baby Boomer nurses reported higher mean total psychological empowerment scores than Generation X nurses. There were no differences in total job satisfaction scores between the generations. There were significant differences among the generations' psychological empowerment scores. Generational differences related to psychological empowerment could provide insight into inconsistent findings related to nurse job satisfaction. Nurse administrators may consider this evidence when working on strategic plans to motivate and entice Generation X nurses and retain Baby Boomers. Although implications based on this study are tentative, the results indicate the need for administrators to consider the differences between Baby Boomer and Generation X nurses. © 2011 Blackwell Publishing Ltd.
Analytic Thermoelectric Couple Modeling: Variable Material Properties and Transient Operation
NASA Technical Reports Server (NTRS)
Mackey, Jonathan A.; Sehirlioglu, Alp; Dynys, Fred
2015-01-01
To gain a deeper understanding of the operation of a thermoelectric couple, a set of analytic solutions has been derived for a variable material property couple and a transient couple. Using an analytic approach, as opposed to the commonly used numerical techniques, results in a set of useful design guidelines. These guidelines can serve as useful starting conditions for further numerical studies, or as design rules for lab-built couples. The analytic modeling considers two cases, accounting for 1) material properties which vary with temperature and 2) transient operation of a couple. The variable material property case was handled by means of an asymptotic expansion, which allows insight into the influence of temperature dependence on different material properties. The variable property work demonstrated the important fact that materials with identical average figures of merit can lead to different conversion efficiencies due to temperature dependence of the properties. The transient couple was investigated through a Green's function approach; several transient boundary conditions were investigated. The transient work introduces several new design considerations which are not captured by the classic steady-state analysis. The work helps in designing couples for optimal performance and also assists in material selection.
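For reference, the classic constant-property result that the variable-property analysis refines: the maximum conversion efficiency of a couple is set by the Carnot factor and the dimensionless figure of merit evaluated at the mean temperature,

```latex
\eta_{\max} \;=\; \frac{T_h - T_c}{T_h}\,
\frac{\sqrt{1 + Z\bar{T}} - 1}{\sqrt{1 + Z\bar{T}} + T_c/T_h},
\qquad \bar{T} = \frac{T_h + T_c}{2}.
```

Because this expression depends on the material only through the average $Z\bar{T}$, two materials with identical average figures of merit score identically here; the asymptotic expansion described in the abstract is what resolves the efficiency differences caused by the temperature dependence this average discards.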
The GS (genetic selection) Principle.
Abel, David L
2009-01-01
The GS (Genetic Selection) Principle states that biological selection must occur at the nucleotide-sequencing molecular-genetic level of 3'5' phosphodiester bond formation. After-the-fact differential survival and reproduction of already-living phenotypic organisms (ordinary natural selection) does not explain polynucleotide prescription and coding. All life depends upon literal genetic algorithms. Even epigenetic and "genomic" factors such as regulation by DNA methylation, histone proteins and microRNAs are ultimately instructed by prior linear digital programming. Biological control requires selection of particular configurable switch-settings to achieve potential function. This occurs largely at the level of nucleotide selection, prior to the realization of any integrated biofunction. Each selection of a nucleotide corresponds to the setting of two formal binary logic gates. The setting of these switches only later determines folding and binding function through minimum-free-energy sinks. These sinks are determined by the primary structure of both the protein itself and the independently prescribed sequencing of chaperones. The GS Principle distinguishes selection of existing function (natural selection) from selection for potential function (formal selection at decision nodes, logic gates and configurable switch-settings).
Nonmathematical concepts of selection, evolutionary energy, and levels of evolution.
Darlington, P J
1972-05-01
The place of mathematics in hypothetico-deductive processes and in biological research is discussed. (Natural) selection is defined and described as differential elimination of preformed sets at any level. Sets and acting sets are groups of units (themselves sets of smaller units) at any level that may or do interact. A pseudomathematical equation describes directional change (evolution) in sets at any level. Selection is the ram of evolution; it cannot generate, but can only direct, evolutionary energy. The energy of evolution is derived from molecular or chemical levels, is transmitted upwards through the increasingly complex sets of sets that form living systems, and is turned in directions determined by the sum of selective processes, at different levels, which may either supplement or oppose each other. All evolutionary processes conform to the pseudomathematical equation referred to above, use energy as described above, and have a P/OE (ratio of programming to open-endedness) that cannot be measured, but can be related to other P/OE values. Phylogeny and ontogeny are compared as processes of directional change with set selection. Stages in the evolution of multicellular individuals are suggested, and are essentially the same as stages in the evolution of some multi-individual insect societies. Thinking is considered as a part of ontogeny involving an irreversible, nonrepetitive process of set selection in the brain.
Wavelet-Based Visible and Infrared Image Fusion: A Comparative Study
Sappa, Angel D.; Carvajal, Juan A.; Aguilera, Cristhian A.; Oliveira, Miguel; Romero, Dennis; Vintimilla, Boris X.
2016-01-01
This paper evaluates different wavelet-based cross-spectral image fusion strategies adopted to merge visible and infrared images. The objective is to find the best setup independently of the evaluation metric used to measure performance. Quantitative performance results are obtained with state-of-the-art approaches together with adaptations proposed in the current work. The options evaluated result from the combination of different setups in the wavelet image decomposition stage with different fusion strategies for the final merging stage that generates the resulting representation. Most existing approaches evaluate results according to the application for which they are intended; sometimes a human observer is selected to judge the quality of the obtained results. In the current work, quantitative values are considered in order to find correlations between setups and the performance of the obtained results; these correlations can be used to define criteria for selecting the best fusion strategy for a given pair of cross-spectral images. The whole procedure is evaluated with a large set of correctly registered visible and infrared image pairs, including both Near InfraRed (NIR) and Long Wave InfraRed (LWIR). PMID:27294938
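As a concrete illustration of the kind of pipeline being compared, here is a minimal sketch assuming a one-level Haar decomposition with one common merging rule (average the approximation band, keep the larger-magnitude detail coefficient). The paper evaluates many such setups; this is an illustrative example, not any specific one of them.

```python
import numpy as np

def haar2d(img):
    """One-level 2D Haar transform: returns (LL, LH, HL, HH) sub-bands."""
    a, b = img[0::2, 0::2], img[0::2, 1::2]
    c, d = img[1::2, 0::2], img[1::2, 1::2]
    return ((a + b + c + d) / 4, (a + b - c - d) / 4,
            (a - b + c - d) / 4, (a - b - c + d) / 4)

def ihaar2d(ll, lh, hl, hh):
    """Exact inverse of haar2d."""
    h, w = ll.shape
    out = np.empty((2 * h, 2 * w))
    out[0::2, 0::2] = ll + lh + hl + hh
    out[0::2, 1::2] = ll + lh - hl - hh
    out[1::2, 0::2] = ll - lh + hl - hh
    out[1::2, 1::2] = ll - lh - hl + hh
    return out

def fuse(visible, infrared):
    """Fuse two registered grayscale images (even dimensions):
    average the approximation bands, max-abs-select the details."""
    v, r = haar2d(visible), haar2d(infrared)
    ll = (v[0] + r[0]) / 2
    details = [np.where(np.abs(dv) >= np.abs(dr), dv, dr)
               for dv, dr in zip(v[1:], r[1:])]
    return ihaar2d(ll, *details)
```

Swapping the wavelet family, decomposition depth, or the per-band merging rule in `fuse` is exactly the space of setups whose correlations with quality metrics the study measures.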
DRUMS: Disk Repository with Update Management and Select option for high throughput sequencing data
2014-01-01
Background New technologies for analyzing biological samples, like next generation sequencing, are producing a growing amount of data together with quality scores. Moreover, software tools (e.g., for mapping sequence reads), calculating transcription factor binding probabilities, estimating epigenetic modification enriched regions or determining single nucleotide polymorphisms increase this amount of position-specific DNA-related data even further. Hence, querying these data becomes challenging and expensive and is often implemented using specialised hardware. In addition, picking specific data as fast as possible becomes increasingly important in many fields of science. The general problem of handling big data sets was addressed by developing specialized databases like HBase, HyperTable or Cassandra. However, these database solutions also require specialized or distributed hardware, leading to expensive investments. To the best of our knowledge, there is no database capable of (i) storing billions of position-specific DNA-related records, (ii) performing fast and resource-saving requests, and (iii) running on single standard computer hardware. Results Here, we present DRUMS (Disk Repository with Update Management and Select option), satisfying demands (i)-(iii). It tackles the weaknesses of traditional databases while handling position-specific DNA-related data in an efficient manner. DRUMS is capable of storing up to billions of records. Moreover, it focuses on optimizing related single lookups as range requests, which are needed constantly for computations in bioinformatics. To validate the power of DRUMS, we compare it to the widely used MySQL database. The test setting considers two biological data sets. We use standard desktop hardware as the test environment. Conclusions DRUMS outperforms MySQL in writing and reading records by a factor of two up to a factor of 10000. Furthermore, it can work with significantly larger data sets.
Our work focuses on mid-sized data sets up to several billion records without requiring cluster technology. Storing position-specific data is a general problem and the concept we present here is a generalized approach. Hence, it can be easily applied to other fields of bioinformatics. PMID:24495746
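The core access pattern DRUMS optimizes, range selects over position-keyed records, can be sketched with a sorted in-memory index. This toy is in the spirit of the system, not its actual disk-based implementation: records sorted by (chromosome, position) make a range request two binary searches plus a contiguous slice.

```python
import bisect

class PositionStore:
    """Toy position-keyed store with fast range selects (a sketch of
    the access pattern, not DRUMS's disk-based design)."""

    def __init__(self):
        self._keys = []    # sorted (chromosome, position) keys
        self._vals = []    # one payload per key

    def insert(self, chrom, pos, value):
        """Insert a record, or update it in place if the key exists."""
        key = (chrom, pos)
        i = bisect.bisect_left(self._keys, key)
        if i < len(self._keys) and self._keys[i] == key:
            self._vals[i] = value
        else:
            self._keys.insert(i, key)
            self._vals.insert(i, value)

    def select_range(self, chrom, start, end):
        """Return all (key, value) pairs with start <= position <= end."""
        lo = bisect.bisect_left(self._keys, (chrom, start))
        hi = bisect.bisect_right(self._keys, (chrom, end))
        return list(zip(self._keys[lo:hi], self._vals[lo:hi]))
```

Keeping records physically ordered by position is what turns many related single lookups into one cheap sequential scan, the case the abstract says DRUMS is tuned for.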
Keleher, Myra P; Stanton, Marietta P
2016-01-01
The purpose of this article is to explore the most important factors an employer considers in selecting an occupational health care provider for employees injured on the job. The primary practice setting is the office of the attending physician who serves as the occupational health care provider. The responding employers deemed "work restrictions given after each office visit" the most important factor in selecting an occupational health care provider, with a score of 43. This was followed, in order within the "very important" category, by communication; appointment availability; employee return to work within nationally recognized guidelines; medical provider professionalism and courtesy, tied with diagnostics ordered in a timely manner; staff professionalism and courtesy; and, tied with 20 responses each, wait time and accurate billing by the provider. The selection of an occupational health care provider in the realm of workers' compensation plays a monumental role in the life of a claim for the employer. Safe and timely return to work is in the best interest of the employer and the injured employee. For the employer, it can represent hard dollars saved in indemnity payments and insurance premiums when the employee can return to some form of work. For the injured employee, it can have a positive impact on their attitude toward going back to work, as they will feel they are a valued asset to their employer. The case managers, who are the "eyes and ears" for the employer in the field of workers' compensation, have a valuable role in a successful outcome of dollars saved and appropriate care rendered for the employee's on-the-job injury. The employers in the study were looking for case managers who could ensure their employees received quality care but that this care is cost-effective.
The case manager can be instrumental in assisting the employer in developing and monitoring a "stay-at-work" program, thereby reducing the financial exposure for the employer.
Bentrup, Ursula
2010-12-01
Several in situ techniques are known which allow investigations of catalysts and catalytic reactions under real reaction conditions using different spectroscopic and X-ray methods. In recent years, specific set-ups have been established which combine two or more in situ methods in order to get a more detailed understanding of catalytic systems. This tutorial review will give a summary of currently available set-ups equipped with multiple techniques for in situ catalyst characterization, catalyst preparation, and reaction monitoring. Besides experimental and technical aspects of method coupling including X-ray techniques, spectroscopic methods (Raman, UV-vis, FTIR), and magnetic resonance spectroscopies (NMR, EPR), essential results will be presented to demonstrate the added value of multitechnique in situ approaches. A special section is focussed on selected examples of use which show new developments and application fields.
Determining open cluster membership. A Bayesian framework for quantitative member classification
NASA Astrophysics Data System (ADS)
Stott, Jonathan J.
2018-01-01
Aims: My goal is to develop a quantitative algorithm for assessing open cluster membership probabilities. The algorithm is designed to work with single-epoch observations. In its simplest form, only one set of program images and one set of reference images are required. Methods: The algorithm is based on a two-stage joint astrometric and photometric assessment of cluster membership probabilities. The probabilities were computed within a Bayesian framework using any available prior information. Where possible, the algorithm emphasizes simplicity over mathematical sophistication. Results: The algorithm was implemented and tested against three observational fields using published survey data. M 67 and NGC 654 were selected as cluster examples while a third, cluster-free, field was used for the final test data set. The algorithm shows good quantitative agreement with the existing surveys and has a false-positive rate significantly lower than the astrometric or photometric methods used individually.
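The two-stage Bayesian combination can be sketched in a few lines. This toy version (not the paper's algorithm) assumes independent astrometric and photometric likelihoods under 'cluster member' and 'field star' hypotheses, combined with a single prior membership fraction.

```python
def membership_posterior(prior, l_ast_member, l_ast_field,
                         l_phot_member, l_phot_field):
    """Toy joint membership probability: Bayes' rule over two
    hypotheses (member vs. field star), with astrometric and
    photometric likelihoods treated as independent."""
    num = prior * l_ast_member * l_phot_member
    den = num + (1.0 - prior) * l_ast_field * l_phot_field
    return num / den
```

A high posterior requires both stages to favor membership, which is the intuition behind the reported result that the joint method has a lower false-positive rate than the astrometric or photometric criterion used alone.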
NASA Astrophysics Data System (ADS)
Paszkiewicz, Zbigniew; Picard, Willy
Performance management (PM) is a key function of virtual organization (VO) management. A large set of PM indicators has been proposed and evaluated within the context of virtual breeding environments (VBEs). However, it is currently difficult to describe and select suitable PM indicators because of the lack of a common vocabulary and taxonomies of PM indicators. Therefore, there is a need for a framework unifying concepts in the domain of VO PM. In this paper, a reference model for VO PM is presented in the context of service-oriented VBEs. The proposed reference model comprises both a set of terms that can be used to describe key performance indicators and a set of taxonomies reflecting various aspects of PM. The reference model is a first attempt and a work in progress, and should not be assumed to be exhaustive.
Investigating Kindergarten Parents' Selection of After-School Art Education Settings in Taiwan
ERIC Educational Resources Information Center
Hsiao, Ching-Yuan; Kuo, Ting-Yin
2013-01-01
The research purpose was to investigate kindergarten parents' selection of after-school art education settings in Taiwan. A review of the literature and interviews with parents were conducted to identify several possible factors that would impact on parents' selection of after-school art education settings for their children. Then, the researcher…
Bayesian Parameter Estimation for Heavy-Duty Vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Eric; Konan, Arnaud; Duran, Adam
2017-03-28
Accurate vehicle parameters are valuable for design, modeling, and reporting. Estimating vehicle parameters can be a very time-consuming process requiring tightly-controlled experimentation. This work describes a method to estimate vehicle parameters such as mass, coefficient of drag/frontal area, and rolling resistance using data logged during standard vehicle operation. The method uses Monte Carlo sampling to generate parameter sets which are fed to a variant of the road load equation. Modeled road load is then compared to measured load to evaluate the probability of the parameter set. Acceptance of a proposed parameter set is determined using the probability ratio to the current state, so that the chain history will give a distribution of parameter sets. Compared to a single value, a distribution of possible values provides information on the quality of estimates and the range of possible parameter values. The method is demonstrated by estimating dynamometer parameters. Results confirm the method's ability to estimate reasonable parameter sets, and indicate an opportunity to increase the certainty of estimates through careful selection or generation of the test drive cycle.
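The propose/compare/accept loop described above is a Metropolis sampler. The sketch below applies it to a single parameter (vehicle mass) against a simplified flat-grade road load equation; the constants, priors, and proposal scale are illustrative assumptions, not values from the study.

```python
import math, random

def road_load(m, v, a, crr=0.008, cda=6.0, rho=1.2, g=9.81):
    """Simplified road load (N) on flat grade: inertial + rolling + aero.
    crr, cda, rho are illustrative constants, assumed known here."""
    return m * a + m * g * crr + 0.5 * rho * cda * v * v

def metropolis_mass(data, n_iter=4000, sigma=100.0, step=100.0, seed=1):
    """Sample the posterior over mass from (speed, accel, measured load)
    triples: propose a new mass, score modeled vs. measured load under a
    Gaussian error model, accept with the Metropolis probability ratio.
    The chain history approximates the mass distribution."""
    rng = random.Random(seed)

    def log_like(m):
        return -sum((f - road_load(m, v, a)) ** 2
                    for v, a, f in data) / (2 * sigma ** 2)

    m = 10000.0                       # initial guess (kg)
    ll = log_like(m)
    chain = []
    for _ in range(n_iter):
        m_new = m + rng.gauss(0.0, step)
        ll_new = log_like(m_new)
        if math.log(rng.random()) < ll_new - ll:   # accept/reject
            m, ll = m_new, ll_new
        chain.append(m)
    return chain
```

Discarding the first half of the chain as burn-in and summarizing the rest gives both a point estimate and the spread that, as the abstract notes, a single fitted value cannot provide.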
Monthly mean global satellite data sets available in CCM history tape format
NASA Technical Reports Server (NTRS)
Hurrell, James W.; Campbell, G. Garrett
1992-01-01
Satellite data for climate monitoring have become increasingly important over the past decade, especially with increasing concern for inadvertent anthropogenic climate change. Although most satellite-based data records are short, satellites can provide the global coverage that traditional meteorological observation networks lack. In addition, satellite data are invaluable for the validation of climate models, and they are useful for many diagnostic studies. Herein, several satellite data sets were processed and transposed into 'history tape' format for use with the Community Climate Model (CCM) modular processor. Only a few of the most widely used and best documented data sets were selected at this point, although future work will expand the number of data sets examined as well as update the archived data sets. An attempt was made to include data of longer record, and only monthly averaged data were processed. For studies using satellite data over an extended period, it is important to recognize the impact of changes in instrumentation, drift in instrument calibration, errors introduced by retrieval algorithms, and other sources of error such as those resulting from insufficient space and/or time sampling.
Some fuzzy techniques for staff selection process: A survey
NASA Astrophysics Data System (ADS)
Md Saad, R.; Ahmad, M. Z.; Abu, M. S.; Jusoh, M. S.
2013-04-01
With the high level of business competition, it is vital to have flexible staff who are able to adapt themselves to work circumstances. However, the staff selection process is not an easy task to solve, even when it is tackled in a simplified version containing only a single criterion and a homogeneous skill. When multiple criteria and various skills are involved, the problem becomes much more complicated. In addition, some information cannot be measured precisely. This is patently obvious when dealing with opinions, thoughts, feelings, beliefs, etc. One possible tool to handle this issue is fuzzy set theory. Therefore, the objective of this paper is to review the existing fuzzy techniques for solving the staff selection process. It classifies several existing research methods and identifies areas where there is a gap and further research is needed. Finally, this paper concludes by suggesting new ideas for future research based on the gaps identified.
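A minimal example of the fuzzy machinery these techniques build on, assuming triangular fuzzy numbers for linguistic ratings and crisp criterion weights (the surveyed methods are generally richer, e.g. fuzzy AHP or fuzzy TOPSIS; the linguistic scale below is illustrative):

```python
def tri_centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def fuzzy_score(ratings, weights):
    """Fuzzy weighted average for one candidate: each criterion rating
    is a triangular fuzzy number; the aggregate is defuzzified to a
    crisp score usable for ranking candidates."""
    total_w = sum(weights)
    agg = tuple(sum(w * r[i] for w, r in zip(weights, ratings)) / total_w
                for i in range(3))
    return tri_centroid(agg)

SCALE = {  # illustrative 0-10 linguistic rating scale
    "poor": (0, 0, 3), "fair": (2, 5, 8), "good": (7, 10, 10),
}
```

The fuzzy numbers let an interviewer say "good communication skills" without pretending that judgment is a precise measurement, which is exactly the imprecision the abstract motivates.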
The response of numerical weather prediction analysis systems to FGGE 2b data
NASA Technical Reports Server (NTRS)
Hollingsworth, A.; Lorenc, A.; Tracton, S.; Arpe, K.; Cats, G.; Uppala, S.; Kallberg, P.
1985-01-01
An intercomparison of analyses of the main FGGE Level IIb data set is presented with three advanced analysis systems. The aims of the work are to estimate the extent and magnitude of the differences between the analyses, to identify the reasons for the differences, and finally to estimate the significance of the differences. Extratropical analyses only are considered. Objective evaluations of analysis quality, such as fit to observations, statistics of analysis differences, and mean fields are discussed. In addition, substantial emphasis is placed on subjective evaluation of a series of case studies that were selected to illustrate the importance of different aspects of the analysis procedures, such as quality control, data selection, resolution, dynamical balance, and the role of the assimilating forecast model. In some cases, the forecast models are used as selective amplifiers of analysis differences to assist in deciding which analysis was more nearly correct in the treatment of particular data.
Procelewska, Joanna; Galilea, Javier Llamas; Clerc, Frederic; Farrusseng, David; Schüth, Ferdi
2007-01-01
The objective of this work is the construction of a correlation between characteristics of heterogeneous catalysts, encoded in a descriptor vector, and their experimentally measured performances in the propene oxidation reaction. In this paper the key issue in the modeling process, namely the selection of adequate input variables, is explored. Several data-driven feature selection strategies were applied in order to obtain an estimate of the differences in variance and information content of various attributes, and furthermore to compare their relative importance. Quantitative property-activity relationship techniques using probabilistic neural networks were used to create various semi-empirical models. Finally, a robust classification model was obtained that assigns selected attributes of solid compounds, given as input, to an appropriate performance class in the model reaction. It is evident that mathematical support for the primary attribute set proposed by chemists can be highly desirable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fishkind, H.H.
1982-04-01
The feasibility of large-scale plantation establishment by various methods was examined, and the following conclusions were reached: seedling plantations are limited in potential yield due to genetic variation among the planting stock and often inadequate supplies of appropriate seed; vegetative propagation by rooted cuttings can provide good genetic uniformity of select hybrid planting stock; however, large-scale production requires establishment and maintenance of extensive cutting orchards, and the collection of shoots and preparation of cuttings, although successfully implemented in the Congo and Brazil, would not be economically feasible in Florida for large-scale plantations; tissue culture propagation of select hybrid eucalypts offers the only opportunity to produce the very large number of trees required to establish the energy plantation. The cost of tissue culture propagation, although higher than seedling production, is more than offset by the increased productivity of vegetative plantations established from select hybrid Eucalyptus.
Design Issues in Video Disc Map Display.
1984-10-01
such items as the equipment used by ETL in its work with discs and selected images from a disc. II. VIDEO DISC TECHNOLOGY AND VOCABULARY. The term video refers to a television image. The standard home television set is equipped with a receiver, which is capable of picking up a signal... plays for one hour per side and is played at a constant linear velocity. The industrially-formatted disc has 54,000 frames per side in concentric tracks
1982-09-01
data as well as administration requirements are available. Video game task: a non-social, task-oriented setting in which the person is working... dominant arm and the subject was instructed to watch a video screen. After the experimenter left the room, videotaped instructions for a video game (similar... with video games to understand the task. The task itself was selected for its general interest across divergent groups of potential subjects and
2014-06-01
Author: Dr. Charles Lin (E-Mail: Charles_lin@dfci.harvard.edu) ...landscape of Multiple Myeloma (MM), this project has endeavored to provide an explanatory mechanism for how treatment with inhibitors of chromatin... category: 1472-1), BRD4 (Epitomics, category: 5716-1) or b-actin (Sigma, clone AC-15, A5441). Data Analysis: All ChIP-seq data sets were aligned using
Dealing with Multiple Solutions in Structural Vector Autoregressive Models.
Beltz, Adriene M; Molenaar, Peter C M
2016-01-01
Structural vector autoregressive models (VARs) hold great potential for psychological science, particularly for time series data analysis. They capture the magnitude, direction of influence, and temporal (lagged and contemporaneous) nature of relations among variables. Unified structural equation modeling (uSEM) is an optimal structural VAR instantiation, according to large-scale simulation studies, and it is implemented within an SEM framework. However, little is known about the uniqueness of uSEM results. Thus, the goal of this study was to investigate whether multiple solutions result from uSEM analysis and, if so, to demonstrate ways to select an optimal solution. This was accomplished with two simulated data sets, an empirical data set concerning children's dyadic play, and modifications to the group iterative multiple model estimation (GIMME) program, which implements uSEMs with group- and individual-level relations in a data-driven manner. Results revealed multiple solutions when there were large contemporaneous relations among variables. Results also verified several ways to select the correct solution when the complete solution set was generated, such as the use of cross-validation, maximum standardized residuals, and information criteria. This work has immediate and direct implications for the analysis of time series data and for the inferences drawn from those data concerning human behavior.
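To make the modeling setting concrete, here is a toy lag-1 VAR fit with an AIC value of the kind used to compare candidate solutions. uSEM itself also estimates contemporaneous paths within an SEM framework, which this sketch omits; it illustrates only the lagged part and the information-criterion comparison.

```python
import numpy as np

def fit_var1(x):
    """Least-squares fit of a lag-1 VAR, x_t ~= x_{t-1} @ A + e_t,
    for an (T, k) data matrix x. Returns the coefficient matrix and
    an AIC value for comparing competing model solutions."""
    y, z = x[1:], x[:-1]
    A, *_ = np.linalg.lstsq(z, y, rcond=None)   # y ~= z @ A
    resid = y - z @ A
    n, k = resid.shape
    sigma = resid.T @ resid / n                 # residual covariance
    aic = n * np.log(np.linalg.det(sigma)) + 2 * k * k
    return A, aic
```

Fitting several candidate structures and keeping the one with the smallest AIC (or the smallest maximum standardized residual) mirrors the solution-selection strategies the study verifies.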
Improved Diagnostic Multimodal Biomarkers for Alzheimer's Disease and Mild Cognitive Impairment
Martínez-Torteya, Antonio; Treviño, Víctor; Tamez-Peña, José G.
2015-01-01
The early diagnosis of Alzheimer's disease (AD) and mild cognitive impairment (MCI) is very important for treatment research and patient care purposes. Few biomarkers are currently considered in clinical settings, and their use is still optional. The objective of this work was to determine whether multimodal and nonpreviously AD associated features could improve the classification accuracy between AD, MCI, and healthy controls, which may impact future AD biomarkers. For this, Alzheimer's Disease Neuroimaging Initiative database was mined for case-control candidates. At least 652 baseline features extracted from MRI and PET analyses, biological samples, and clinical data up to February 2014 were used. A feature selection methodology that includes a genetic algorithm search coupled to a logistic regression classifier and forward and backward selection strategies was used to explore combinations of features. This generated diagnostic models with sizes ranging from 3 to 8, including well documented AD biomarkers, as well as unexplored image, biochemical, and clinical features. Accuracies of 0.85, 0.79, and 0.80 were achieved for HC-AD, HC-MCI, and MCI-AD classifications, respectively, when evaluated using a blind test set. In conclusion, a set of features provided additional and independent information to well-established AD biomarkers, aiding in the classification of MCI and AD. PMID:26106620
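The genetic-algorithm search over feature subsets can be sketched generically. This is a minimal illustration of the search strategy only: individuals are boolean masks over features, and in the study the fitness would be the accuracy of a logistic regression model trained on the masked features; here it is any user-supplied scoring function.

```python
import random

def ga_select(fitness, n_features, pop=20, gens=30, p_mut=0.1, seed=0):
    """Minimal genetic algorithm for feature-subset search: keep the
    top half each generation (elitism), refill with one-point
    crossover children, and flip bits with probability p_mut."""
    rng = random.Random(seed)
    P = [[rng.random() < 0.5 for _ in range(n_features)] for _ in range(pop)]
    for _ in range(gens):
        parents = sorted(P, key=fitness, reverse=True)[:pop // 2]
        children = []
        while len(parents) + len(children) < pop:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)          # one-point crossover
            children.append([bit != (rng.random() < p_mut)  # bit-flip mutation
                             for bit in a[:cut] + b[cut:]])
        P = parents + children
    return max(P, key=fitness)
```

Coupling such a wrapper with forward and backward refinement of the best mask found, as the abstract describes, is a common way to trim the GA's output down to compact diagnostic models.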
Waste in health information systems: a systematic review.
Awang Kalong, Nadia; Yusof, Maryati
2017-05-08
Purpose: The purpose of this paper is to discuss a systematic review on waste identification related to health information systems (HIS) in Lean transformation. Design/methodology/approach: A systematic review was conducted on 19 studies to evaluate Lean transformation and tools used to remove waste related to HIS in clinical settings. Findings: Ten waste categories were identified, along with their relationships and applications of Lean tool types related to HIS. Different Lean tools were used at the early and final stages of Lean transformation; the tool selection depended on the waste characteristic. Nine studies reported a positive impact from Lean transformation in improving daily work processes. The selection of Lean tools should be made based on the timing, purpose and characteristics of the waste to be removed. Research limitations/implications: An overview of waste and its categories within HIS, analysed from socio-technical perspectives, enabled the identification of root causes in a holistic and rigorous manner. Practical implications: Understanding waste types and their root causes, and reviewing Lean tools, could subsequently lead to the identification of mitigation approaches to prevent future error occurrence. Originality/value: Specific waste models for HIS settings are yet to be developed. Hence, the identification of the waste categories could guide future implementations of Lean transformation in HIS settings.
A high-performance seizure detection algorithm based on Discrete Wavelet Transform (DWT) and EEG
Chen, Duo; Wan, Suiren; Xiang, Jing; Bao, Forrest Sheng
2017-01-01
In the past decade, the Discrete Wavelet Transform (DWT), a powerful time-frequency tool, has been widely used in computer-aided signal analysis of epileptic electroencephalography (EEG), such as the detection of seizures. One of the important hurdles in applying DWT is the choice of its settings, which previous works made empirically or arbitrarily. The objective of this study was to develop a framework for automatically searching the optimal DWT settings to improve accuracy and to reduce the computational cost of seizure detection. To address this, we developed a method to decompose EEG data into 7 commonly used wavelet families, to the maximum theoretical level of each mother wavelet. Wavelets and decomposition levels providing the highest accuracy in each wavelet family were then searched in an exhaustive selection of frequency bands, which showed optimal accuracy and low computational cost. The selection of frequency bands and features removed approximately 40% of redundancies. The developed algorithm achieved promising performance on two well-tested EEG datasets (accuracy >90% for both datasets). The experimental results demonstrate that the settings of DWT affect its performance on seizure detection substantially. Compared with existing wavelet-based seizure detection methods, the new approach is more accurate and transferable among datasets. PMID:28278203
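The core decomposition step is easy to sketch. The toy below uses the Haar wavelet (one member of the families typically searched) and returns per-band energies, a common feature for seizure classification; the wavelet choice and level count are exactly the 'settings' such a framework optimizes.

```python
import math

def haar_dwt(signal):
    """One level of the 1D Haar DWT: returns (approximation, detail)."""
    a = [(signal[i] + signal[i + 1]) / math.sqrt(2)
         for i in range(0, len(signal) - 1, 2)]
    d = [(signal[i] - signal[i + 1]) / math.sqrt(2)
         for i in range(0, len(signal) - 1, 2)]
    return a, d

def band_energies(signal, levels):
    """Decompose to 'levels' levels and return the energy of each
    detail band plus the final approximation. These per-band energies
    are the kind of feature a seizure detector feeds to a classifier."""
    feats = []
    a = list(signal)
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats.append(sum(x * x for x in d))
    feats.append(sum(x * x for x in a))
    return feats
```

Because the orthonormal Haar transform preserves energy, the band energies partition the signal's total energy, so dropping low-information bands (the roughly 40% redundancy removal the abstract reports) discards a measurable share of it.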
Dick, Anthony Steven
2012-01-01
Two experiments examined processes underlying cognitive inflexibility in set-shifting tasks typically used to assess the development of executive function in children. Adult participants performed a Flexible Item Selection Task (FIST) that requires shifting from categorizing by one dimension (e.g., color) to categorizing by a second orthogonal dimension (e.g., shape). The experiments showed performance of the FIST involves suppression of the representation of the ignored dimension; response times for selecting a target object in an immediately-following oddity task were slower when the oddity target was the previously-ignored stimulus of the FIST. However, proactive interference from the previously relevant stimulus dimension also impaired responding. The results are discussed with respect to two prominent theories of the source of difficulty for children and adults on dimensional shifting tasks: attentional inertia and negative priming. In contrast to prior work emphasizing one over the other process, the findings indicate that difficulty in the FIST, and by extension other set-shifting tasks, can be attributed to both the need to shift away from the previously attended representation (attentional inertia), and the need to shift to the previously ignored representation (negative priming). Results are discussed in relation to theoretical explanations for cognitive inflexibility in adults and children. PMID:23539267
Yugandhar, K; Gromiha, M Michael
2014-09-01
Protein-protein interactions are intrinsic to virtually every cellular process. Predicting the binding affinity of protein-protein complexes is one of the challenging problems in computational and molecular biology. In this work, we related sequence features of protein-protein complexes with their binding affinities using machine learning approaches. We set up a database of 185 protein-protein complexes for which the interacting pairs are heterodimers and experimental binding affinities are available. We then developed a set of 610 features from the sequences of the protein complexes and used the Ranker search method, a combination of an attribute evaluator and the Ranker method, to select specific features. We analyzed several machine learning algorithms to discriminate protein-protein complexes into high- and low-affinity groups based on their Kd values. Our results showed a 10-fold cross-validation accuracy of 76.1% with a combination of nine features using support vector machines. Further, we observed an accuracy of 83.3% on an independent test set of 30 complexes. We suggest that our method would serve as an effective tool for identifying interacting partners in protein-protein interaction networks and human-pathogen interactions based on the strength of interactions. © 2014 Wiley Periodicals, Inc.
Preconception sex selection for non-medical and intermediate reasons: ethical reflections
de Wert, G.; Dondorp, W.
2010-01-01
Sex selection for non-medical reasons is forbidden in many countries. Focusing on preconception sex selection, the authors first observe that it is unclear what should count as a ‘medical reason’ in this context and argue for the existence of ‘intermediate reasons’ that do not fit well within the rigid distinction between ‘medical’ and ‘non-medical’. The article further provides a critical review of the arguments for the prohibition of sex selection for non-medical reasons and finds that none of these are conclusive. The authors conclude that the ban should be reconsidered, but also that existing societal concerns about possible harmful effects should be taken seriously. Measures to this effect may include limiting the practice to couples who already have at least one child of the sex opposite to that which they now want to select (‘family balancing’). Finally, a difficult set of questions is raised by concerns about the reliability and unproven (long-term) safety of the only technology (flow cytometry) proven to work. PMID:25009714
A selective review of selective attention research from the past century.
Driver, Jon
2001-02-01
Research on attention is concerned with selective processing of incoming sensory information. To some extent, our awareness of the world depends on what we choose to attend to, not merely on the stimulation entering our senses. British psychologists have made substantial contributions to this topic in the past century. Celebrated examples include Donald Broadbent's filter theory of attention, which set the agenda for most subsequent work; and Anne Treisman's revisions of this account, and her later feature-integration theory. More recent contributions include Alan Allport's prescient emphasis on the relevance of neuroscience data, and John Duncan's integration of such data with psychological theory. An idiosyncratic but roughly chronological review of developments is presented, some practical and clinical implications are briefly sketched, and future directions suggested. One of the biggest changes in the field has been the increasing interplay between psychology and neuroscience, which promises much for the future. A related change has been the realization that selective attention is best thought of as a broad topic, encompassing a range of selective issues, rather than as a single explanatory process.
Direct and Absolute Quantification of over 1800 Yeast Proteins via Selected Reaction Monitoring*
Lawless, Craig; Holman, Stephen W.; Brownridge, Philip; Lanthaler, Karin; Harman, Victoria M.; Watkins, Rachel; Hammond, Dean E.; Miller, Rebecca L.; Sims, Paul F. G.; Grant, Christopher M.; Eyers, Claire E.; Beynon, Robert J.
2016-01-01
Defining intracellular protein concentration is critical in molecular systems biology. Although strategies for determining relative protein changes are available, defining robust absolute values in copies per cell has proven significantly more challenging. Here we present a reference data set quantifying over 1800 Saccharomyces cerevisiae proteins by direct means using protein-specific stable-isotope labeled internal standards and selected reaction monitoring (SRM) mass spectrometry, far exceeding any previous study. This was achieved by careful design of over 100 QconCAT recombinant proteins as standards, defining 1167 proteins in terms of copies per cell and upper limits on a further 668, with robust CVs routinely less than 20%. The selected reaction monitoring-derived proteome is compared with existing quantitative data sets, highlighting the disparities between methodologies. Coupled with a quantification of the transcriptome by RNA-seq taken from the same cells, these data support revised estimates of several fundamental molecular parameters: a total protein count of ∼100 million molecules-per-cell, a median of ∼1000 proteins-per-transcript, and a linear model of protein translation explaining 70% of the variance in translation rate. This work contributes a “gold-standard” reference yeast proteome (including 532 values based on high quality, dual peptide quantification) that can be widely used in systems models and for other comparative studies. PMID:26750110
W-band PELDOR with 1 kW microwave power: molecular geometry, flexibility and exchange coupling.
Reginsson, Gunnar W; Hunter, Robert I; Cruickshank, Paul A S; Bolton, David R; Sigurdsson, Snorri Th; Smith, Graham M; Schiemann, Olav
2012-03-01
A technique that is increasingly being used to determine the structure and conformational flexibility of biomacromolecules is Pulsed Electron-Electron Double Resonance (PELDOR or DEER), an Electron Paramagnetic Resonance (EPR) based technique. At X-band frequencies (9.5 GHz), PELDOR is capable of precisely measuring distances in the range of 1.5-8 nm between paramagnetic centres, but the orientation selectivity is weak. In contrast, working at higher frequencies increases the orientation selection, but usually at the expense of decreased microwave power and PELDOR modulation depth. Here it is shown that a home-built high-power pulsed W-band EPR spectrometer (HiPER) with a large instantaneous bandwidth enables the acquisition of PELDOR data with a high degree of orientation selectivity and large modulation depths. We demonstrate a measurement methodology that gives a set of PELDOR time traces that yield highly constrained data sets. Simulating the resulting time traces provides a deeper insight into the conformational flexibility and exchange coupling of three bisnitroxide model systems. These measurements provide strong evidence that W-band PELDOR may prove to be an accurate and quantitative tool in assessing the relative orientations of nitroxide spin labels and to correlate those orientations to the underlying biological structure and dynamics. Copyright © 2012 Elsevier Inc. All rights reserved.
A Compact Optical Instrument with Artificial Neural Network for pH Determination
Capel-Cuevas, Sonia; López-Ruiz, Nuria; Martinez-Olmos, Antonio; Cuéllar, Manuel P.; Pegalajar, Maria del Carmen; Palma, Alberto José; de Orbe-Payá, Ignacio; Capitán-Vallvey, Luis Fermin
2012-01-01
The aim of this work was the determination of pH with a sensor array-based optical portable instrument. This sensor array consists of eleven membranes with selective colour changes at different pH intervals. The method for the pH calculation is based on the implementation of artificial neural networks that use the responses of the membranes to generate a final pH value. A multi-objective algorithm was used to select the minimum number of sensing elements required to achieve an accurate pH determination from the neural network, and also to minimise the network size. This helps to minimise instrument and array development costs and save on microprocessor energy consumption. A set of artificial neural networks that fulfils these requirements is proposed using different combinations of the membranes in the sensor array, and is evaluated in terms of accuracy and reliability. In the end, the network including the response of the eleven membranes in the sensor was selected for validation in the instrument prototype because of its high accuracy. The performance of the instrument was evaluated by measuring the pH of a large set of real samples, showing that high precision can be obtained in the full range. PMID:22778668
The KIT Motion-Language Dataset.
Plappert, Matthias; Mandery, Christian; Asfour, Tamim
2016-12-01
Linking human motion and natural language is of great interest for the generation of semantic representations of human activities as well as for the generation of robot activities based on natural language input. However, although there have been years of research in this area, no standardized and openly available data set exists to support the development and evaluation of such systems. We, therefore, propose the Karlsruhe Institute of Technology (KIT) Motion-Language Dataset, which is large, open, and extensible. We aggregate data from multiple motion capture databases and include them in our data set using a unified representation that is independent of the capture system or marker set, making it easy to work with the data regardless of its origin. To obtain motion annotations in natural language, we apply a crowd-sourcing approach and a web-based tool that was specifically built for this purpose, the Motion Annotation Tool. We thoroughly document the annotation process itself and discuss gamification methods that we used to keep annotators motivated. We further propose a novel method, perplexity-based selection, which systematically selects motions for further annotation that are either under-represented in our data set or that have erroneous annotations. We show that our method mitigates the two aforementioned problems and ensures a systematic annotation process. We provide an in-depth analysis of the structure and contents of our resulting data set, which, as of October 10, 2016, contains 3911 motions with a total duration of 11.23 hours and 6278 annotations in natural language that contain 52,903 words. We believe this makes our data set an excellent choice that enables more transparent and comparable research in this important area.
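The perplexity-based selection idea can be sketched independently of the dataset: score each annotation by its perplexity under a language model and queue the highest-scoring motions for re-annotation. The abstract does not specify the model, so per-token log-probabilities and the names below are illustrative assumptions:

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp of the negative mean log-probability of the tokens."""
    return math.exp(-sum(token_logprobs) / len(token_logprobs))

def select_for_reannotation(annotations, k):
    """Return the ids of the k annotations with the highest perplexity
    (likely erroneous or under-represented phrasing)."""
    ranked = sorted(annotations,
                    key=lambda a: perplexity(a["logprobs"]),
                    reverse=True)
    return [a["id"] for a in ranked[:k]]

# Hypothetical per-token log-probs from some language model
annotations = [
    {"id": "m1", "logprobs": [-0.1, -0.2, -0.1]},  # fluent text, low perplexity
    {"id": "m2", "logprobs": [-3.0, -2.5, -4.0]},  # odd wording, high perplexity
    {"id": "m3", "logprobs": [-1.0, -0.8, -1.2]},
]
flagged = select_for_reannotation(annotations, 1)
```

Under these toy numbers, the surprising annotation "m2" is the one flagged for another annotation pass.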
3D Printed Surgical Instruments: The Design and Fabrication Process.
George, Mitchell; Aroom, Kevin R; Hawes, Harvey G; Gill, Brijesh S; Love, Joseph
2017-01-01
3D printing is an additive manufacturing process allowing the creation of solid objects directly from a digital file. We believe recent advances in additive manufacturing may be applicable to surgical instrument design. This study investigates the feasibility, design and fabrication process of usable 3D printed surgical instruments. The computer-aided design package SolidWorks (Dassault Systemes SolidWorks Corp., Waltham MA) was used to design a surgical set including hemostats, needle driver, scalpel handle, retractors and forceps. These designs were then printed on a selective laser sintering (SLS) Sinterstation HiQ (3D Systems, Rock Hill SC) using DuraForm EX plastic. The final printed products were evaluated by practicing general surgeons for ergonomic functionality and performance; this included simulated surgery and inguinal hernia repairs on human cadavers. Improvements were identified and addressed by adjusting design and build metrics. Repeated manufacturing processes and redesigns led to the creation of multiple functional and fully reproducible surgical sets utilizing the user feedback of surgeons. Iterative cycles including design, production and testing took an average of 3 days. Each surgical set was built using the SLS Sinterstation HiQ with an average build time of 6 h per set. Functional 3D printed surgical instruments are feasible. Advantages compared to traditional manufacturing methods include no increase in cost for increased complexity, accelerated design-to-production times and surgeon-specific modifications.
[Evaluation of using statistical methods in selected national medical journals].
Sych, Z
1996-01-01
The paper evaluates how frequently statistical methods were applied in works published in six selected national medical journals in the years 1988-1992. The following journals were chosen: Klinika Oczna, Medycyna Pracy, Pediatria Polska, Polski Tygodnik Lekarski, Roczniki Państwowego Zakładu Higieny, Zdrowie Publiczne. From the respective volumes of Pol. Tyg. Lek., a number of works matching the average of the remaining journals was randomly selected. The analysis excluded works, both national and international, in which no statistical analysis was implemented; the exemption also extended to review papers, case reports, reviews of books, handbooks and monographs, reports from scientific congresses, and papers on historical topics. The number of works was defined in each volume. Next, the mode of selecting the study sample in the respective works was analyzed, differentiating two categories: random and target selection. Attention was also paid to the presence of a control sample in the individual works, and to the characterization of the sample, distinguishing three categories: complete, partial and lacking. The results of these assessments are presented in tables and figures (Tab. 1, 3). The rate of employing statistical methods was then analyzed across the relevant volumes of the six journals for 1988-1992, simultaneously determining the number of works in which no statistical methods were used, and the frequency with which the individual statistical methods were applied in the scrutinized works.
Prominence was given to fundamental methods of descriptive statistics (measures of position, measures of dispersion) as well as the most important methods of mathematical statistics, such as parametric tests of significance, analysis of variance (in single and dual classifications), non-parametric tests of significance, and correlation and regression. Works that used multiple correlation, multiple regression, or more complex methods of studying the relationship between two or more variables were counted among the works using correlation and regression, as well as other methods, e.g. statistical methods used in epidemiology (coefficients of incidence and morbidity, standardization of coefficients, survival tables), factor analysis by the Jacobi-Hotelling method, taxonomic methods and others. On the basis of the performed studies it was established that the frequency of employing statistical methods in the six selected national medical journals in the years 1988-1992 was 61.1-66.0% of the analyzed works (Tab. 3), generally similar to the frequency reported in English-language medical journals. On the whole, no significant differences were disclosed in the frequency of applied statistical methods (Tab. 4) or in the frequency of random samples (Tab. 3) in the analyzed works across the respective years 1988-1992. The most frequently used statistical methods in the analyzed works for 1988-1992 were measures of position (44.2-55.6%), measures of dispersion (32.5-38.5%) and parametric tests of significance (26.3-33.1% of the works analyzed) (Tab. 4). To increase the frequency and reliability of the statistical methods used, the teaching of biostatistics should be expanded in medical studies and in postgraduate training for physicians and scientific-didactic staff.
Cyber-Attack Methods, Why They Work on Us, and What to Do
NASA Technical Reports Server (NTRS)
Byrne, DJ
2015-01-01
Basic cyber-attack methods are well documented, and even automated with user-friendly GUIs (Graphical User Interfaces). Entire suites of attack tools are legal, conveniently packaged, and freely downloadable by anyone; more polished versions are sold with vendor support. Our team ran some of these against a selected set of projects within our organization to understand what the attacks do so that we can design and validate defenses against them. Some existing defenses were effective against the attacks, some less so. On average, every machine had twelve easily identifiable vulnerabilities, two of them "critical". Roughly 5% of passwords in use were easily crackable. We identified a clear set of recommendations for each project, and some common patterns that emerged among them all.
Penetrator role in Mars sample strategy
NASA Technical Reports Server (NTRS)
Boynton, William; Dwornik, Steve; Eckstrom, William; Roalstad, David A.
1988-01-01
The application of the penetrator to a Mars Return Sample Mission (MRSM) has direct advantages for meeting science objectives and ensuring mission safety. Based on engineering data and work currently conducted at Ball Aerospace Systems Division, the concept of penetrators as scientific instruments is entirely practical. The primary use of a penetrator for MRSM would be to optimize the selection of the sample site location and to help select the actual sample to be returned to Earth. Because the amount of sample to be returned is very limited, the selection of the sample site is critical to the success of the mission. The following mission scenario is proposed. The site selection of a sample to be acquired will be performed by science working groups. A decision will be reached and a set of target priorities established based on data giving geochemical, geophysical and geological information. The first task of a penetrator will be to collect data at up to 4 to 6 possible landing sites. The penetrator can include geophysical, geochemical, geological and engineering instruments to confirm that scientific data requirements at that site will be met. This in situ, near real-time data, collected prior to final targeting of the lander, will ensure that the sample site is both scientifically valuable and reachable within the limits of the lander's capability.
Automatic welding detection by an intelligent tool pipe inspection
NASA Astrophysics Data System (ADS)
Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.
2015-07-01
This work provides a model based on machine learning techniques for weld recognition, based on signals obtained through an in-line inspection tool called a “smart pig” in oil and gas pipelines. The model uses a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets, and its performance was measured with cross-validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
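The ROC analysis used for validation has a compact formulation worth showing: the area under the ROC curve equals the probability that a randomly chosen positive example outscores a randomly chosen negative one (the rank-sum view). The sketch below is a generic illustration of that metric, not the authors' code, and the scores are invented:

```python
def roc_auc(labels, scores):
    """AUC via the Mann-Whitney formulation: fraction of (positive, negative)
    pairs where the positive receives the higher score (ties count 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical weld-vs-background scores from a trained classifier
labels = [1, 1, 1, 0, 0, 0, 1, 0]
scores = [0.9, 0.8, 0.5, 0.3, 0.2, 0.6, 0.95, 0.4]
auc = roc_auc(labels, scores)  # one misranked pair out of 16
```

A library routine (e.g. scikit-learn's `roc_auc_score`) computes the same quantity; the explicit pairwise form just makes the definition visible.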
Vector excitation speech or audio coder for transmission or storage
NASA Technical Reports Server (NTRS)
Davidson, Grant (Inventor); Gersho, Allen (Inventor)
1989-01-01
A vector excitation coder compresses vectors by using an optimum codebook designed off line, using an initial arbitrary codebook and a set of speech training vectors, exploiting codevector sparsity (i.e., by setting to zero all but a selected number of samples, those of lowest amplitude, in each of N codebook vectors). A fast-search method selects a number N_c of good excitation vectors from the codebook, where N_c is much smaller than N. ORIGIN OF THE INVENTION: The invention described herein was made in the performance of work under a NASA contract, and is subject to the provisions of Public Law 96-517 (35 USC 202), under which the inventors were granted a request to retain title.
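The codevector sparsity step (zero each codevector's lowest-amplitude samples, keeping only a selected number) can be sketched as a small function; the names and values below are illustrative, not taken from the patent:

```python
def sparsify(codevector, keep):
    """Zero all but the `keep` largest-magnitude samples of a codevector,
    i.e. discard the lowest-amplitude samples."""
    if keep >= len(codevector):
        return list(codevector)
    # Indices of the `keep` samples with the largest |amplitude|
    top = set(sorted(range(len(codevector)),
                     key=lambda i: abs(codevector[i]),
                     reverse=True)[:keep])
    return [v if i in top else 0.0 for i, v in enumerate(codevector)]

cv = [0.9, -0.05, 0.4, 0.02, -0.7, 0.1]  # toy codebook vector
sparse = sparsify(cv, 3)
```

Sparse codevectors cut the multiply-accumulate cost of the codebook search, which is the point of the fast-search method the abstract describes.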
Simulation of millimeter-wave body images and its application to biometric recognition
NASA Astrophysics Data System (ADS)
Moreno-Moreno, Miriam; Fierrez, Julian; Vera-Rodriguez, Ruben; Parron, Josep
2012-06-01
One of the emerging applications of the millimeter-wave imaging technology is its use in biometric recognition. This is mainly due to some properties of the millimeter-waves such as their ability to penetrate through clothing and other occlusions, their low obtrusiveness when collecting the image and the fact that they are harmless to health. In this work we first describe the generation of a database comprising 1200 synthetic images at 94 GHz obtained from the body of 50 people. Then we extract a small set of distance-based features from each image and select the best feature subsets for person recognition using the SFFS feature selection algorithm. Finally these features are used in body geometry authentication obtaining promising results.
The development of the Project NetWork administrative records database for policy evaluation.
Rupp, K; Driessen, D; Kornfeld, R; Wood, M
1999-01-01
This article describes the development of SSA's administrative records database for the Project NetWork return-to-work experiment targeting persons with disabilities. The article is part of a series of papers on the evaluation of the Project NetWork demonstration. In addition to 8,248 Project NetWork participants randomly assigned to receive case management services and a control group, the simulation identified 138,613 eligible nonparticipants in the demonstration areas. The output data files contain detailed monthly information on Supplemental Security Income (SSI) and Disability Insurance (DI) benefits, annual earnings, and a set of demographic and diagnostic variables. The data allow for the measurement of net outcomes and the analysis of factors affecting participation. The results suggest that it is feasible to simulate complex eligibility rules using administrative records, and create a clean and edited data file for a comprehensive and credible evaluation. The study shows that it is feasible to use administrative records data for selecting control or comparison groups in future demonstration evaluations.
Toward Improved Predictions of Slender Airframe Aerodynamics Using the F-16XL Aircraft
NASA Technical Reports Server (NTRS)
Luckring, James M.; Rizzi, Arthur; Davis, M. Bruce
2016-01-01
A coordinated project has been underway to improve computational fluid dynamics predictions of slender airframe aerodynamics. The work is focused on two flow conditions and leverages a unique flight data set obtained with an F-16XL aircraft. These conditions, a low-speed high angle-of-attack case and a transonic low angle-of-attack case, were selected from a prior prediction campaign wherein the computational fluid dynamics failed to provide acceptable results. In this paper, the background, objectives, and approach to the current project are presented. The work embodies predictions from multiple numerical formulations that are contributed from multiple organizations, and the context of this campaign to other multicode, multi-organizational efforts is included. The relevance of this body of work toward future supersonic commercial transport concepts is also briefly addressed.
Hot-spot heating susceptibility due to reverse bias operating conditions
NASA Technical Reports Server (NTRS)
Gonzalez, C. C.
1985-01-01
Because of field experience (indicating that cell and module degradation could occur as a result of hot spot heating), a laboratory test was developed at JPL to determine the hot spot susceptibility of modules. The initial hot spot testing work at JPL formed a foundation for the test development. Test parameters are selected as follows. For high shunt resistance cells, the applied back bias test current is set equal to the test cell current at maximum power. For low shunt resistance cells, the test current is set equal to the cell short circuit current. The shadow level is selected to conform to that which would lead to maximum back bias voltage under the appropriate test current level. The test voltage is determined by the bypass diode frequency. The test conditions are meant to simulate the thermal boundary conditions for a 100 mW/sq cm, 40 C ambient environment. The test lasts 100 hours. A key assumption made during the development of the test is that no current imbalance results from the connecting of multiparallel cell strings. Therefore, the test as originally developed was applicable for the single string case only.
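The test-current selection rule described above can be written as a small function. The function name, parameters, and the shunt-resistance threshold dividing "high" from "low" are illustrative assumptions; the abstract does not give a numeric cutoff:

```python
def hot_spot_test_current(shunt_resistance_ohm, i_mp, i_sc,
                          threshold_ohm=10.0):
    """Back-bias test current per the rule described: high-shunt-resistance
    cells are driven at the maximum-power current (i_mp), low-shunt-resistance
    cells at the short-circuit current (i_sc). The 10-ohm threshold is an
    illustrative assumption, not from the source."""
    return i_mp if shunt_resistance_ohm >= threshold_ohm else i_sc

# Hypothetical cells: currents in amperes
high_rsh = hot_spot_test_current(100.0, i_mp=2.8, i_sc=3.1)
low_rsh = hot_spot_test_current(1.0, i_mp=2.8, i_sc=3.1)
```

Shadow level and test voltage would then be chosen separately, per the shadow/bypass-diode criteria in the text.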
Environmental geology of ancient Greek cities
NASA Astrophysics Data System (ADS)
Crouch, D. P.
1996-04-01
Man-environment relations in the ancient Greek world, as now, were complex interactions. To understand them, we need to study a range of physical features and man's impact on the setting. The underlying geological reality of this area is karst, which is widely distributed, dominating Greece, the southern half of Turkey, and southern Italy and Sicily, where the Greco-Roman cities that we study were located. Year-round water from karst springs was important because of scarce rainfall, intense evaporation, and infertile soil—none under human control. Examples from the Greek mainland (Corinth), an Aegean island (Rhodes), Turkey (Priene), and Sicily (Syracuse) are selected and described to suggest the way that karst water potential played an important role in site selection and development. A wider look at criteria for urban location and a new classification of urban patterns help to revise conventional understandings of these ancient cities. In conclusion, some modern findings about the interaction between city and setting suggest new research agendas for geologists and engineers, ancient historians and archaeologists, and water policy makers—preferably working together.
Morris, Christopher; Dunkley, Colin; Gibbon, Frances M; Currier, Janet; Roberts, Deborah; Rogers, Morwenna; Crudgington, Holly; Bray, Lucy; Carter, Bernie; Hughes, Dyfrig; Tudur Smith, Catrin; Williamson, Paula R; Gringras, Paul; Pal, Deb K
2017-11-28
There is increasing recognition that establishing a core set of outcomes to be evaluated and reported in trials of interventions for particular conditions will improve the usefulness of health research. There is no established core outcome set for childhood epilepsy. The aim of this work is to select a core outcome set to be used in evaluative research of interventions for children with rolandic epilepsy, as an exemplar of common childhood epilepsy syndromes. First we will identify what outcomes should be measured; then we will decide how to measure those outcomes. We will engage relevant UK charities and health professional societies as partners, and convene advisory panels for young people with epilepsy and parents of children with epilepsy. We will identify candidate outcomes from a search for trials of interventions for childhood epilepsy, statutory guidance and consultation with our advisory panels. Families, charities and health, education and neuropsychology professionals will be invited to participate in a Delphi survey following recommended practices in the development of core outcome sets. Participants will be able to recommend additional outcome domains. Over three rounds of Delphi survey participants will rate the importance of candidate outcome domains and state the rationale for their decisions. Over the three rounds we will seek consensus across and between families and health professionals on the more important outcomes. A face-to-face meeting will be convened to ratify the core outcome set. We will then review and recommend ways to measure the shortlisted outcomes using clinical assessment and/or patient-reported outcome measures. Our methodology is a proportionate and pragmatic approach to expediently produce a core outcome set for evaluative research of interventions aiming to improve the health of children with epilepsy. 
A number of decisions have to be made when designing a study to develop a core outcome set, including defining the scope, choosing which stakeholders to engage, identifying the most effective ways to elicit their views (especially from children), and considering a potential role for qualitative research.
SU-F-T-243: Major Risks in Radiotherapy. A Review Based On Risk Analysis Literature
DOE Office of Scientific and Technical Information (OSTI.GOV)
López-Tarjuelo, J; Guasp-Tortajada, M; Iglesias-Montenegro, N
Purpose: We present a literature review of risk analyses in radiotherapy to highlight the most frequently reported risks and to facilitate the spread of this valuable information, so that professionals can be aware of these major threats before performing their own studies. Methods: We considered studies with at least an estimate of the probability of occurrence of an adverse event (O) and its associated severity (S). They cover external beam radiotherapy, brachytherapy, intraoperative radiotherapy, and stereotactic techniques. We selected only the works containing a detailed ranked series of elements or failure modes and focused on the first fully reported quartile. Afterward, we sorted the risk elements according to the stages of a regular radiotherapy procedure, so that the resulting groups could be tallied across works and ranked accordingly. Results: 29 references published between 2007 and February 2016 were studied. The publication trend has been generally rising. The most commonly employed analysis was failure mode and effects analysis (FMEA). Among the references, we selected 20 works listing 258 ranked risk elements. These were sorted into 31 groups appearing in at least two different works; 11 groups appeared in at least 5 references, and 5 groups in 7 or more papers. These last sets of risks were selecting the wrong set of images or plan for planning or treatment, errors related to contours, errors in patient positioning for treatment, human mistakes when programming treatments, and planning errors. Conclusion: There is a sufficient amount and variety of references for identifying which failure modes or elements should be addressed in a radiotherapy department before attempting a specific analysis. FMEA prevailed, but other approaches such as "risk matrix" or "occurrence × severity" analyses can also guide professionals' efforts. Risks associated with human actions rank very high; therefore, those actions should be automated or at least peer-reviewed.
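The "occurrence × severity" ranking mentioned above can be sketched in a few lines; this is a generic illustration of that style of analysis, not the scoring scheme of any particular reviewed paper (a full FMEA risk priority number would additionally multiply by a detectability score). The failure-mode names below are hypothetical examples.

```python
def rank_failure_modes(modes):
    """Rank failure modes by a simple occurrence x severity criticality
    score. `modes` maps a failure-mode name to its (O, S) scores; the
    highest-criticality modes come first."""
    return sorted(modes, key=lambda m: modes[m][0] * modes[m][1], reverse=True)

# Hypothetical scores on a 1-5 scale for three common failure modes:
ranked = rank_failure_modes({
    "patient positioning error": (4, 5),   # O=4, S=5 -> 20
    "contouring error":          (3, 4),   # -> 12
    "planning error":            (2, 3),   # -> 6
})
```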
Statistical molecular design of building blocks for combinatorial chemistry.
Linusson, A; Gottfries, J; Lindgren, F; Wold, S
2000-04-06
The reduction of the size of a combinatorial library can be made in two ways: either by basing the selection on the building blocks (BBs) or by basing it on the full set of virtually constructed products. In this paper we have investigated the effects of applying statistical designs to BB sets compared to selections based on the final products. The two sets of BBs and the virtually constructed library were described by structural parameters, and the correlation between the two characterizations was investigated. Three different selection approaches were used both for the BB sets and for the products. In the first two, the selection algorithms were applied directly to the data sets (D-optimal design and space-filling design), while for the third a cluster analysis preceded the selection (cluster-based design). The selections were compared using visual inspection, the Tanimoto coefficient, the Euclidean distance, the condition number, and the determinant of the resulting data matrix. No difference in efficiency was found between selections made in the BB space and in the product space. However, it is of critical importance to investigate the BB space carefully and to select an appropriate number of BBs to achieve adequate diversity. An example from the pharmaceutical industry is then presented, where selection via BBs was made using a cluster-based design.
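A space-filling design of the kind named above can be sketched with a greedy max-min rule: pick descriptor-space points that are maximally spread out. This is a generic illustration of the idea, not the specific algorithm used in the paper.

```python
import numpy as np

def maxmin_selection(X, k):
    """Greedy space-filling (max-min) design: pick k rows of the
    descriptor matrix X (building blocks x descriptors) so that the
    selected points are spread out.  Starts from the point farthest
    from the data centroid, then repeatedly adds the candidate whose
    minimum distance to the already-selected set is largest."""
    X = np.asarray(X, dtype=float)
    start = int(np.argmax(np.linalg.norm(X - X.mean(axis=0), axis=1)))
    chosen = [start]
    # distance of every candidate to its nearest selected point
    d = np.linalg.norm(X - X[start], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(d))
        chosen.append(nxt)
        d = np.minimum(d, np.linalg.norm(X - X[nxt], axis=1))
    return chosen
```

A cluster-based design differs only in the first step: cluster the building blocks, then pick one representative (e.g. the point nearest each cluster centroid) instead of maximizing spread directly.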
Inference of Evolutionary Forces Acting on Human Biological Pathways
Daub, Josephine T.; Dupanloup, Isabelle; Robinson-Rechavi, Marc; Excoffier, Laurent
2015-01-01
Because natural selection is likely to act on multiple genes underlying a given phenotypic trait, we study here the potential effect of ongoing and past selection on the genetic diversity of human biological pathways. We first show that genes included in gene sets are generally under stronger selective constraints than other genes and that their evolutionary response is correlated. We then introduce a new procedure to detect selection at the pathway level based on a decomposition of the classical McDonald–Kreitman test extended to multiple genes. This new test, called 2DNS, detects outlier gene sets and takes into account past demographic effects and evolutionary constraints specific to gene sets. Selective forces acting on gene sets can be easily identified by a mere visual inspection of the position of the gene sets relative to their two-dimensional null distribution. We thus find several outlier gene sets that show signals of positive, balancing, or purifying selection but also others showing an ancient relaxation of selective constraints. The principle of the 2DNS test can also be applied to other genomic contrasts. For instance, the comparison of patterns of polymorphisms private to African and non-African populations reveals that most pathways show a higher proportion of nonsynonymous mutations in non-Africans than in Africans, potentially due to different demographic histories and selective pressures. PMID:25971280
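The classical McDonald–Kreitman contrast that 2DNS decomposes can be sketched as follows; this is a minimal illustration of the per-gene (or pooled gene-set) neutrality index only, not of the 2DNS test or its two-dimensional null distribution.

```python
def mcdonald_kreitman(Pn, Ps, Dn, Ds):
    """Classical McDonald-Kreitman contrast for one gene or a pooled
    gene set: counts of nonsynonymous/synonymous polymorphisms
    (Pn, Ps) and fixed differences between species (Dn, Ds).
    Returns the neutrality index NI = (Pn/Ps) / (Dn/Ds):
    NI < 1 suggests an excess of nonsynonymous fixations (positive
    selection), NI > 1 an excess of nonsynonymous polymorphism
    (e.g. weakly deleterious variants), NI ~ 1 neutrality."""
    if Ps == 0 or Dn == 0:
        raise ValueError("Ps and Dn must be nonzero to compute NI")
    return (Pn / Ps) / (Dn / Ds)
```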
Feature Selection for Ridge Regression with Provable Guarantees.
Paul, Saurabh; Drineas, Petros
2016-04-01
We introduce single-set spectral sparsification as a deterministic sampling-based feature selection technique for regularized least-squares classification, which is the classification analog to ridge regression. The method is unsupervised and gives worst-case guarantees on the generalization power of the classification function after feature selection with respect to the classification function obtained using all features. We also introduce leverage-score sampling as an unsupervised randomized feature selection method for ridge regression. We provide risk bounds for both single-set spectral sparsification and leverage-score sampling on ridge regression in the fixed design setting and show that the risk in the sampled space is comparable to the risk in the full-feature space. We perform experiments on synthetic and real-world data sets (a subset of the TechTC-300 data sets) to support our theory. Experimental results indicate that the proposed methods perform better than existing feature selection methods.
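Leverage-score sampling of features can be sketched as below; this is a generic illustration of the idea (scores from the top right singular vectors, features drawn proportionally), not the paper's exact estimator or its rescaling.

```python
import numpy as np

def leverage_score_sample(X, k, n_feats, seed=0):
    """Randomized feature selection via leverage-score sampling.
    X is n x d (samples x features).  The leverage score of feature j
    is the squared Euclidean norm of the j-th row of V_k, the top-k
    right singular vectors of X; features are sampled without
    replacement with probability proportional to their scores."""
    rng = np.random.default_rng(seed)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    scores = np.sum(Vt[:k].T ** 2, axis=1)   # one score per feature
    p = scores / scores.sum()
    return rng.choice(X.shape[1], size=n_feats, replace=False, p=p)
```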
Selection of suitable hand gestures for reliable myoelectric human computer interface.
Castro, Maria Claudia F; Arjunan, Sridhar P; Kumar, Dinesh K
2015-04-09
A myoelectric-controlled prosthetic hand requires machine-based identification of hand gestures using surface electromyogram (sEMG) recorded from the forearm muscles. This study observed that a subset of the hand gestures has to be selected for accurate automated hand gesture recognition, and reports a method to select these gestures to maximize sensitivity and specificity. Experiments were conducted in which sEMG was recorded from the muscles of the forearm while subjects performed hand gestures and then classified off-line. The performance of ten gestures was ranked using the proposed Positive-Negative Performance Measurement Index (PNM), generated from a series of confusion matrices. When using all ten gestures, the sensitivity and specificity were 80.0% and 97.8%. After ranking the gestures using the PNM, six gestures were selected, and these gave sensitivity and specificity greater than 95% (96.5% and 99.3%): Hand open, Hand close, Little finger flexion, Ring finger flexion, Middle finger flexion and Thumb flexion. This work has shown that reliable myoelectric-based human computer interface systems require careful selection of the gestures to be recognized; without such selection, reliability is poor.
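Per-gesture sensitivity and specificity from a confusion matrix can be computed as below; this is a standard one-vs-rest calculation, shown for illustration only, since the exact formula of the paper's PNM index is not given in the abstract.

```python
import numpy as np

def per_class_sens_spec(cm):
    """Per-class sensitivity and specificity from a confusion matrix
    cm (rows = true class, columns = predicted class).  Each class is
    scored one-vs-rest: sensitivity = TP/(TP+FN),
    specificity = TN/(TN+FP)."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)
    fn = cm.sum(axis=1) - tp
    fp = cm.sum(axis=0) - tp
    tn = cm.sum() - tp - fn - fp
    return tp / (tp + fn), tn / (tn + fp)
```

A ranking index in the spirit of the PNM could then combine these two vectors, e.g. to drop the worst-performing gestures first.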
Sankar, A S Kamatchi; Vetrichelvan, Thangarasu; Venkappaya, Devashya
2011-09-01
In the present work, three different spectrophotometric methods for simultaneous estimation of ramipril, aspirin and atorvastatin calcium in raw materials and in formulations are described. Overlapped spectral data were quantitatively resolved by using chemometric methods, viz. inverse least squares (ILS), principal component regression (PCR) and partial least squares (PLS). Calibrations were constructed using the absorption data matrix corresponding to the concentration data matrix. The linearity range was found to be 1-5, 10-50 and 2-10 μg mL-1 for ramipril, aspirin and atorvastatin calcium, respectively. The absorbance matrix was obtained by measuring the zero-order absorbance in the wavelength range between 210 and 320 nm. A training set design of the concentration data corresponding to the ramipril, aspirin and atorvastatin calcium mixtures was organized statistically to maximize the information content from the spectra and to minimize the error of multivariate calibrations. By applying the respective algorithms for PLS1, PCR and ILS to the measured spectra of the calibration set, a suitable model was obtained. This model was selected on the basis of RMSECV and RMSEP values, and was then applied to the prediction set and capsule formulation. Mean recoveries of the commercial formulation set together with the figures of merit (calibration sensitivity, selectivity, limit of detection, limit of quantification and analytical sensitivity) were estimated. Validity of the proposed approaches was successfully assessed for analyses of the drugs in the various prepared physical mixtures and formulations.
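The PCR calibration step can be sketched in numpy as below; this is a generic textbook PCR (mean-centering, scores on the top principal components, least-squares regression of concentrations on scores), shown for illustration rather than as the authors' exact calibration procedure.

```python
import numpy as np

def pcr_fit_predict(A_train, C_train, A_test, n_pc):
    """Principal component regression: A matrices hold absorbances
    (samples x wavelengths), C matrices hold concentrations
    (samples x analytes).  Training data are mean-centered, scores on
    the top n_pc principal components are regressed on C by least
    squares, and the fitted model is applied to the test spectra."""
    A_mean, C_mean = A_train.mean(axis=0), C_train.mean(axis=0)
    Ac, Cc = A_train - A_mean, C_train - C_mean
    _, _, Vt = np.linalg.svd(Ac, full_matrices=False)
    V = Vt[:n_pc].T                       # spectral loadings
    T = Ac @ V                            # sample scores
    B, *_ = np.linalg.lstsq(T, Cc, rcond=None)
    return (A_test - A_mean) @ V @ B + C_mean
```

For an ideal noise-free two-analyte mixture obeying Beer's law, two principal components recover the concentrations exactly.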
A Versatile Panel of Reference Gene Assays for the Measurement of Chicken mRNA by Quantitative PCR
Maier, Helena J.; Van Borm, Steven; Young, John R.; Fife, Mark
2016-01-01
Quantitative real-time PCR assays are widely used for the quantification of mRNA within avian experimental samples. Multiple stably-expressed reference genes, selected for the lowest variation in representative samples, can be used to control random technical variation. Reference gene assays must be reliable, have high amplification specificity and efficiency, and not produce signals from contaminating DNA. Whilst recent research papers identify specific genes that are stable in particular tissues and experimental treatments, here we describe a panel of ten avian gene primer and probe sets that can be used to identify suitable reference genes in many experimental contexts. The panel was tested with TaqMan and SYBR Green systems in two experimental scenarios: a tissue collection and virus infection of cultured fibroblasts. GeNorm and NormFinder algorithms were able to select appropriate reference gene sets in each case. We show the effects of using the selected genes on the detection of statistically significant differences in expression. The results are compared with those obtained using 28S ribosomal RNA, currently the most widely accepted reference gene in chicken work, identifying circumstances where its use might provide misleading results. Methods for eliminating DNA contamination of RNA reduced, but did not completely remove, detectable DNA. We therefore attached special importance to testing each qPCR assay for absence of signal using DNA template. The assays and analyses developed here provide a useful resource for selecting reference genes for investigations of avian biology. PMID:27537060
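The geNorm stability measure M used by such tools can be sketched as below; this follows the published geNorm idea (mean pairwise variation of log expression ratios), but is a simplified illustration, not the implementation used in the paper.

```python
import numpy as np

def genorm_m_values(expr):
    """geNorm-style stability measure M.  expr is samples x genes on a
    linear scale.  For each gene pair, take the log2 expression ratio
    across samples; M_j is the mean standard deviation of those
    ratios over all partner genes k != j.  Lower M = more stably
    expressed reference gene."""
    logs = np.log2(np.asarray(expr, dtype=float))
    n = logs.shape[1]
    M = np.empty(n)
    for j in range(n):
        sds = [np.std(logs[:, j] - logs[:, k], ddof=1)
               for k in range(n) if k != j]
        M[j] = np.mean(sds)
    return M
```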
Evaluating biomechanics of user-selected sitting and standing computer workstation.
Lin, Michael Y; Barbir, Ana; Dennerlein, Jack T
2017-11-01
A standing computer workstation has now become a popular modern workplace intervention to reduce sedentary behavior at work. However, users' interactions with a standing computer workstation, and how these differ from a sitting workstation, need to be understood to assist in developing recommendations for use and setup. The study compared the differences in upper extremity posture and muscle activity between user-selected sitting and standing workstation setups. Twenty participants (10 females, 10 males) volunteered for the study. 3-D posture, surface electromyography, and user-reported discomfort were measured while completing simulated tasks with each participant's self-selected workstation setups. The sitting computer workstation was associated with more non-neutral shoulder postures and greater shoulder muscle activity, while the standing computer workstation induced greater wrist adduction angles and greater extensor carpi radialis muscle activity. The sitting workstation was also associated with greater shoulder abduction postural variation (90th-10th percentile), while the standing workstation was associated with greater variation in shoulder rotation and wrist extension. Users reported similar overall discomfort levels within the first 10 min of work but had more than twice as much discomfort while standing than sitting after 45 min, with most discomfort reported in the low back for standing and the shoulder for sitting. These measures provide insight into users' different interactions with sitting and standing workstations; alternating between the two configurations in short bouts may be a way of changing the loading pattern on the upper extremity. Copyright © 2017 Elsevier Ltd. All rights reserved.
How Many Loci Does it Take to DNA Barcode a Crocus?
Seberg, Ole; Petersen, Gitte
2009-01-01
Background DNA barcoding promises to revolutionize the way taxonomists work, facilitating species identification by using small, standardized portions of the genome as substitutes for morphology. The concept has gained considerable momentum in many animal groups, but the higher plant world has been largely recalcitrant to the effort. In plants, efforts are concentrated on various regions of the plastid genome, but no agreement exists as to what kinds of regions are ideal, though most researchers agree that more than one region is necessary. One reason for this discrepancy is differences in the tests that are used to evaluate the performance of the proposed regions. Most tests have been made in a floristic setting, where the genetic distance and therefore the level of variation of the regions between taxa is large, or in a limited set of congeneric species. Methodology and Principal Findings Here we present the first in-depth coverage of a large taxonomic group, all 86 known species (except two doubtful ones) of crocus. Even six average-sized barcode regions do not identify all crocus species. This is currently an unrealistic burden in a barcode context. Whereas most proposed regions work well in a floristic context, the majority will – as is the case in crocus – undoubtedly be less efficient in a taxonomic setting. However, a reasonable but less than perfect level of identification may be reached – even in a taxonomic context. Conclusions/Significance The time is ripe for selecting barcode regions in plants, and for prudent examination of their utility. Thus, there is no reason for the plant community to hold back the barcoding effort by continued search for the Holy Grail. We must acknowledge that an emerging system will be far from perfect, fraught with problems and work best in a floristic setting. PMID:19240801
Alaska digital aeromagnetic database description
Connard, G.G.; Saltus, R.W.; Hill, P.L.; Carlson, L.; Milicevic, B.
1999-01-01
Northwest Geophysical Associates, Inc. (NGA) was contracted by the U.S. Geological Survey (USGS) to construct a database containing original aeromagnetic data (in digital form) from surveys, maps and grids for the State of Alaska from existing public-domain magnetic data. This database facilitates the detailed study and interpretation of aeromagnetic data along flightline profiles and allows construction of custom grids for selected regions of Alaska. The database is linked to and reflects the work from the statewide gridded compilation completed under a prior contract. The statewide gridded compilation is also described in Saltus and Simmons (1997) and in Saltus and others (1999). The database area generally covers the onshore portion of the State of Alaska and the northern Gulf of Alaska, excluding the Aleutian Islands. The area extends from 54°N to 72°N latitude and 129°W to 169°W longitude. The database includes the 85 surveys that were included in the previous statewide gridded compilation. Figure (1) shows the extents of the 85 individual data sets included in the statewide grids. NGA subcontracted a significant portion of the work described in this report to Paterson, Grant, and Watson Limited (PGW). Prior work by PGW (described in Meyer and Saltus, 1995 and Meyer and others, 1998) for the interior portion of Alaska (INTAK) is included in this present study. The previous PGW project compiled 25 of the 85 surveys included in the statewide grids. PGW also contributed 10 additional data sets that were not included in either of the prior contracts or the statewide grids. These additional data sets are included in the current project in the interest of making the database as complete as possible. Figure (2) shows the location of the additional data sets.
Kamuzora, Peter; Maluka, Stephen; Ndawi, Benedict; Byskov, Jens; Hurtig, Anna-Karin
2013-01-01
Background Community participation in priority setting in health systems has gained importance all over the world, particularly in resource-poor settings where governments have often failed to provide adequate public-sector services for their citizens. Incorporation of public views into priority setting is perceived as a means to restore trust, improve accountability, and secure cost-effective priorities within healthcare. However, few studies have reported empirical experiences of involving communities in priority setting in developing countries. The aim of this article is to describe the experience of implementing community participation and the challenges of promoting it in the context of resource-poor settings, weak organizations, and fragile democratic institutions. Design Key informant interviews were conducted with the Council Health Management Team (CHMT), community representatives, namely women, youth, the elderly, the disabled, and people living with HIV/AIDS, and other stakeholders who participated in the preparation of the district annual budget and health plans. Additionally, minutes from the Action Research Team and planning and priority-setting meeting reports were analyzed. Results A number of benefits were reported: better identification of community needs and priorities, increased knowledge of the community representatives about priority setting, increased transparency and accountability, promoted trust between health systems and communities, and perceived improved quality and accessibility of health services. However, lack of funds to support the work of the selected community representatives, limited time for deliberations, short notice for the meetings, and lack of feedback on the approved priorities constrained the performance of the community representatives. Furthermore, the findings show the importance of external facilitation and support in enabling health professionals and community representatives to arrive at an effective working arrangement.
Conclusion Community participation in priority setting in developing countries, characterized by weak democratic institutions and low public awareness, requires effective mobilization of both communities and health systems. In addition, this study confirms that community participation is an important element in strengthening health systems. PMID:24280341
Burgess, Gregory C; Depue, Brendan E; Ruzic, Luka; Willcutt, Erik G; Du, Yiping P; Banich, Marie T
2010-04-01
Attentional control difficulties in individuals with attention-deficit/hyperactivity disorder (ADHD) might reflect poor working memory (WM) ability, especially because WM ability and attentional control rely on similar brain regions. The current study examined whether WM ability might explain group differences in brain activation between adults with ADHD and normal control subjects during attentional demand. Participants were 20 adults with ADHD combined subtype with no comorbid psychiatric or learning disorders and 23 control subjects similar in age, IQ, and gender. The WM measures were obtained from the Wechsler Adult Intelligence Scale-III and Wechsler Memory Scale-Revised. Brain activation was assessed with functional magnetic resonance imaging (fMRI) while performing a Color-Word Stroop task. Group differences in WM ability explained a portion of the activation in left dorsolateral prefrontal cortex (DLPFC), which has been related to the creation and maintenance of an attentional set for task-relevant information. In addition, greater WM ability predicted increased activation of brain regions related to stimulus-driven attention and response selection processes in the ADHD group but not in the control group. The inability to maintain an appropriate task set in young adults with combined type ADHD, associated with decreased activity in left DLPFC, might in part be due to poor WM ability. Furthermore, in individuals with ADHD, higher WM ability might relate to increased recruitment of stimulus-driven attention and response selection processes, perhaps as a compensatory strategy. Copyright 2010 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Šafka, J.; Ackermann, M.; Voleský, L.
2016-04-01
This paper deals with establishing building parameters for 1.2344 (H13) tool steel processed using Selective Laser Melting (SLM) technology with a layer thickness of 50 µm. In the first part of the work, a test matrix of models in the form of a cube with a chamfered edge was built under various building parameters, such as laser scanning speed and laser power. The resulting models were subjected to a set of tests including measurement of surface roughness, inspection of the inner structure with the aid of light optical microscopy and scanning electron microscopy, and evaluation of micro-hardness. These tests helped us evaluate the influence of changes in building strategy on the properties of the resulting model. In the second part of the work, the mechanical properties of the H13 steel were examined. For this purpose, a set of samples in the form of a "dog bone" was printed under three different alignments with respect to the building plate and tested on a universal testing machine. Mechanical testing of the samples should then reveal whether the different orientation, and thus different layering of the material, influences its mechanical properties. For this type of material, the producer provides parameters for a layer thickness of 30 µm only. Thus, our 50 µm building strategy shortens the building time, which is valuable especially for large models. Results of the mechanical tests show slight variation in mechanical properties for the various alignments of the sample.
Cox regression analysis with missing covariates via nonparametric multiple imputation.
Hsu, Chiu-Hsieh; Yu, Mandi
2018-01-01
We consider the situation of estimating Cox regression in which some covariates are subject to missingness, and there exists additional information (including observed event time, censoring indicator and fully observed covariates) which may be predictive of the missing covariates. We propose to use two working regression models: one for predicting the missing covariates and the other for predicting the missingness probabilities. For each missing covariate observation, these two working models are used to define a nearest neighbor imputing set. This set is then used to non-parametrically impute covariate values for the missing observation. Upon completion of imputation, Cox regression is performed on the multiply imputed datasets to estimate the regression coefficients. In a simulation study, we compare the nonparametric multiple imputation approach with the augmented inverse probability weighted (AIPW) method, which directly incorporates the two working models into estimation of Cox regression, and the predictive mean matching imputation (PMM) method. We show that all approaches can reduce bias due to a non-ignorable missingness mechanism. The proposed nonparametric imputation method is robust to mis-specification of either one of the two working models and robust to mis-specification of the link function of the two working models. In contrast, the PMM method is sensitive to misspecification of the covariates included in imputation, and the AIPW method is sensitive to the selection probability. We apply the approaches to a breast cancer dataset from the Surveillance, Epidemiology and End Results (SEER) Program.
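The nearest-neighbor imputation step can be sketched as below. This is a deliberately simplified illustration: the paper's method uses two working-model scores per subject to define the neighborhood, whereas this sketch collapses them into a single predictive score, and the function name and arguments are hypothetical.

```python
import numpy as np

def nn_impute(z_obs, score_obs, score_miss, n_neighbors=5, n_imp=10, seed=0):
    """Nearest-neighbor multiple imputation (simplified to one working
    score per subject).  z_obs holds observed covariate values;
    score_obs/score_miss are working-model scores for subjects with
    observed / missing covariate.  For each missing subject, the
    n_neighbors observed subjects closest in score form the imputing
    set, from which n_imp values are drawn with replacement."""
    rng = np.random.default_rng(seed)
    z_obs = np.asarray(z_obs, dtype=float)
    score_obs = np.asarray(score_obs, dtype=float)
    out = []
    for s in np.atleast_1d(score_miss):
        nbrs = np.argsort(np.abs(score_obs - s))[:n_neighbors]
        out.append(rng.choice(z_obs[nbrs], size=n_imp, replace=True))
    return np.array(out)   # one row of n_imp draws per missing subject
```

Cox regression would then be fit on each of the n_imp completed datasets and the coefficient estimates combined across imputations.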
The career goals of nurses in some health care settings in Gauteng.
Jooste, K
2005-08-01
In nursing, purposeful career planning is essential if nurse practitioners want to make the right decisions about their work in order to strive towards and accomplish a meaningful quality of working life. Nurses should identify their career goals to be able to investigate their different career opportunities in their field of interest and direct their work according to a work strategy for years ahead. The purpose of this study was to explore and describe the career goals of post-basic nursing students with the aim of describing management strategies to guide the future career of post-basic nursing students in climbing the career ladder effectively and obtaining their set career goals. An explorative, descriptive, qualitative design was selected where the researcher worked inductively to explore and describe the needs (goals) and future planned actions of the participants regarding their career management as viewed for a period of five years. The researcher purposively and conveniently identified the sample as all the post-basic nursing students, namely 250 students, who were registered for the first, second and third year of nursing management courses in that period at a South African residential university. Two structured, open questions were developed. Each participant received the questions in writing and was asked to answer them. The QSR NUD*IST program was used for the qualitative management (categorization) of data. The results of the research questions related to five categories, namely becoming empowered, being promoted, being educated and professionally developed, partaking in research and taking up new projects.
Kinno, Ryuta; Shiromaru, Azusa; Mori, Yukiko; Futamura, Akinori; Kuroda, Takeshi; Yano, Satoshi; Murakami, Hidetomo; Ono, Kenjiro
2017-01-01
The Wechsler Memory Scale-Revised (WMS-R) is one of the internationally well-known batteries for memory assessment in a general memory clinic setting. Several factor structures of the WMS-R for patients aged under 74 have been proposed. However, little is known about the factor structure of the WMS-R for patients aged over 75 years and its neurological significance. Thus, we conducted exploratory factor analysis to determine the factor structure of the WMS-R for patients aged over 75 years in a memory clinic setting. Regional cerebral blood flow (rCBF) was calculated from single-photon emission computed tomography data. Cortical thickness and cortical fractal dimension, as a marker of cortical complexity, were calculated from high-resolution magnetic resonance imaging data. We found that a four-factor solution appeared to be the most appropriate for the model, comprising recognition memory, paired associate memory, visual-and-working memory, and attention. Patients with mild cognitive impairment showed significantly higher factor scores for paired associate memory, visual-and-working memory, and attention than patients with Alzheimer's disease. Regarding the neuroimaging data, the factor scores for paired associate memory positively correlated with rCBF in the left pericallosal and hippocampal regions. Moreover, the factor score for paired associate memory showed the most robust correlations with cortical thickness in the limbic system, whereas the factor score for attention correlated with cortical thickness in the bilateral precuneus. Furthermore, each factor score correlated with the cortical fractal dimension in the bilateral frontotemporal regions. Interestingly, the factor scores for visual-and-working memory and attention selectively correlated with the cortical fractal dimension in the right posterior cingulate cortex and right precuneus cortex, respectively.
These findings demonstrate that recognition memory, paired associate memory, visual-and-working memory, and attention can be crucial factors for interpreting the WMS-R results of elderly patients aged over 75 years in a memory clinic setting. Considering these findings, the results of WMS-R in elderly patients aged over 75 years in a memory clinic setting should be cautiously interpreted.
Stewart, Eugene L; Brown, Peter J; Bentley, James A; Willson, Timothy M
2004-08-01
A methodology for the selection and validation of nuclear receptor ligand chemical descriptors is described. After descriptors for a targeted chemical space were selected, a virtual screening methodology utilizing this space was formulated for the identification of potential NR ligands from our corporate collection. Using simple descriptors and our virtual screening method, we are able to quickly identify potential NR ligands from a large collection of compounds. As validation of the virtual screening procedure, an 8,000-member NR targeted set and a 24,000-member diverse control set of compounds were selected from our in-house general screening collection and screened in parallel across a number of orphan NR FRET assays. For the two assays that provided at least one hit per set by the established minimum pEC50 for activity, the results showed a 2-fold increase in the hit-rate of the targeted compound set over the diverse set.
A new approach to evaluating the well-being of police.
Juniper, B; White, N; Bellamy, P
2010-10-01
There is a growing body of evidence that links employee well-being to organizational performance. Although police forces are under increasing pressure to improve efficiency and productivity, the evaluation of well-being in law enforcement is mostly restricted to self-report stress questionnaires that are based on questionable construction methodologies. No instrument to specifically determine the well-being of police force employees currently exists. To construct an instrument that measures the work-related well-being of officers and staff within a police force, the approach is drawn from well-established clinical models used to evaluate the well-being of patients. Potential variables were confirmed using an item selection method known as impact analysis that places keen emphasis on frequency and importance as perceived by the respondents themselves. Analyses of 822 completed response sets showed that nine separate dimensions of police work can adversely affect well-being (advancement, facilities, home-work interface, job, physical health, psychological health, relationships, organizational and workload). Overall, officers showed inferior well-being compared with their colleagues. Content validity and adequate internal reliability were confirmed. This study presented a new, robust approach to evaluating the well-being of all those working in law enforcement. The nine dimensions extend beyond conventional stress measures and may offer a practical alternative way of assessing the overall well-being status of an entire force using a systematic item selection framework.
Experiences of Physical Therapists Working in the Acute Hospital Setting: Systematic Review.
Lau, Bonnie; Skinner, Elizabeth H; Lo, Kristin; Bearman, Margaret
2016-09-01
Physical therapists working in acute care hospitals require unique skills to adapt to the challenging environment and short patient length of stay. Previous literature has reported burnout of clinicians and difficulty with staff retention; however, no systematic reviews have investigated qualitative literature in the area. The purpose of this study was to investigate the experiences of physical therapists working in acute hospitals. Six databases (MEDLINE, CINAHL Plus, EMBASE, AMED, PsycINFO, and Sociological Abstracts) were searched up to and including September 30, 2015, using relevant terms. Studies in English were selected if they included physical therapists working in an acute hospital setting, used qualitative methods, and contained themes or descriptive data relating to physical therapists' experiences. Data extraction included the study authors and year, settings, participant characteristics, aims, and methods. Key themes, explanatory models/theories, and implications for policy and practice were extracted, and quality assessment was conducted. Thematic analysis was used to conduct qualitative synthesis. Eight articles were included. Overall, study quality was high. Four main themes were identified describing factors that influence physical therapists' experience and clinical decision making: environmental/contextual factors, communication/relationships, the physical therapist as a person, and professional identity/role. Qualitative synthesis may be difficult to replicate. The majority of articles were from North America and Australia, limiting transferability of the findings. The identified factors, which interact to influence the experiences of acute care physical therapists, should be considered by therapists and their managers to optimize the physical therapy role in acute care. 
Potential strategies include promotion of interprofessional and collegial relationships, clear delineation of the physical therapy role, multidisciplinary team member education, additional support staff, and innovative models of care to address funding and staff shortages. © 2016 American Physical Therapy Association.
Parallel In Situ Indexing for Data-intensive Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Jinoh; Abbasi, Hasan; Chacon, Luis
2011-09-09
As computing power increases exponentially, vast amounts of data are created by many scientific research activities. However, the bandwidth for storing data to disks and reading it back has been improving at a much slower pace. These two trends produce an ever-widening data access gap. Our work brings together two distinct technologies to address this data access issue: indexing and in situ processing. From decades of database research literature, we know that indexing is an effective way to address the data access issue, particularly for accessing a relatively small fraction of data records. As data sets increase in size, more and more analysts need to use selective data access, which makes indexing even more important for improving data access. The challenge is that most implementations of indexing technology are embedded in large database management systems (DBMS), but most scientific datasets are not managed by any DBMS. In this work, we choose to include indexes with the scientific data instead of requiring the data to be loaded into a DBMS. We use compressed bitmap indexes from the FastBit software, which are known to be highly effective for query-intensive workloads common to scientific data analysis. To use the indexes, we need to build them first. The index building procedure needs to access the whole data set and may also require a significant amount of compute time. In this work, we adapt in situ processing technology to generate the indexes, thus removing the need to read data from disks and allowing the indexes to be built in parallel. The in situ data processing system used is ADIOS, a middleware for high-performance I/O.
Our experimental results show that the indexes can improve data access time by up to 200 times depending on the fraction of data selected, and that using the in situ data processing system can effectively reduce the time needed to create the indexes, by up to 10 times with our in situ technique under identical parallel settings.
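The core bitmap-index idea behind this approach can be sketched in a few lines: one bit-vector per value bin, with a range query answered by OR-ing bin bitmaps instead of scanning the records. This is a toy illustration only; FastBit's actual indexes use compressed (word-aligned hybrid) bitmaps and more careful bin-edge handling, which the sketch below omits.

```python
# Toy bitmap index: one bit-vector (stored as a Python int) per value bin.
# Queries are answered with bitwise OR instead of scanning the raw records.
# Illustrative sketch only, not the FastBit implementation.

def build_bitmap_index(values, bins):
    """For each half-open bin [lo, hi), set bit i when values[i] falls in it."""
    index = {}
    for b_lo, b_hi in bins:
        bits = 0
        for i, v in enumerate(values):
            if b_lo <= v < b_hi:
                bits |= 1 << i
        index[(b_lo, b_hi)] = bits
    return index

def query(index, lo, hi):
    """Return row ids whose value lies in [lo, hi), OR-ing fully covered bins."""
    hits = 0
    for (b_lo, b_hi), bits in index.items():
        if b_lo >= lo and b_hi <= hi:   # bin fully inside the query range
            hits |= bits
    return [i for i in range(hits.bit_length()) if (hits >> i) & 1]

data = [0.1, 3.7, 2.2, 9.5, 4.4, 2.9]
bins = [(0, 2), (2, 4), (4, 6), (6, 8), (8, 10)]
idx = build_bitmap_index(data, bins)
print(query(idx, 2, 6))   # rows whose value lies in [2, 6)
```

Bins that only partially overlap the query range would need a candidate check against the raw values; the sketch handles only bins fully contained in the range.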
An intervention approach for children with teacher- and parent-identified attentional difficulties.
Semrud-Clikeman, M; Nielsen, K H; Clinton, A; Sylvester, L; Parle, N; Connor, R T
1999-01-01
Using a multimodal and multi-informant method for diagnosis, we selected 33 children by teacher and parent nomination for attention and work completion problems that met DSM-IV criteria for attention-deficit/hyperactivity disorder (ADHD). Of the 33 children in this group, 21 participated in the initial intervention, and 12 were placed in an ADHD control group and received the intervention after pre- and posttesting. A similarly selected group of 21 children without difficulties in attention and work completion served as a control group. Each child was assessed on pre- and posttest measures of visual and auditory attention. After an 18-week intervention period that included attention and problem-solving training, all children in the intervention and control groups were retested on visual and auditory tasks. Children in both ADHD groups showed significantly poorer initial performance on the visual attention task. Whereas the ADHD intervention group showed commensurate performance to the nondisabled control group after training, the ADHD control group did not show significant improvement over the same period. Auditory attention was poorer compared to the control group for both ADHD groups initially and improved only for the ADHD intervention group. These findings are discussed as a possible intervention for children with difficulties in strategy selection in a classroom setting.
A Selectivity based approach to Continuous Pattern Detection in Streaming Graphs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choudhury, Sutanay; Holder, Larry; Chin, George
2015-02-02
Cyber security is one of the most significant technical challenges in current times. Detecting adversarial activities and preventing theft of intellectual property and customer data is a high priority for corporations and government agencies around the world. Cyber defenders need to analyze massive-scale, high-resolution network flows to identify, categorize, and mitigate attacks involving networks spanning institutional and national boundaries. Many cyber attacks can be described as subgraph patterns, with prominent examples being insider infiltrations (path queries), denial of service (parallel paths), and malicious spreads (tree queries). This motivates us to explore subgraph matching on streaming graphs in a continuous setting. The novelty of our work lies in using subgraph distributional statistics collected from the streaming graph to determine the query processing strategy. We introduce a "Lazy Search" algorithm where the search strategy is decided on a vertex-to-vertex basis depending on the likelihood of a match in the vertex neighborhood. We also propose a metric named "Relative Selectivity" that is used to select between different query processing strategies. Our experiments, performed on real online news and network traffic streams and a synthetic social network benchmark, demonstrate 10-100x speedups over selectivity-agnostic approaches.
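The selectivity-driven idea can be sketched simply: use frequency statistics of edge types in the stream to start the search from the query edge expected to match the fewest stream edges. The function and data names below are illustrative assumptions, not the paper's actual algorithm or API.

```python
# Sketch of selectivity-driven search ordering: rare edge types in the
# stream make cheaper starting points for subgraph matching.

def edge_selectivity(edge_counts, total_edges):
    """Fraction of stream edges carrying each edge type."""
    return {e: c / total_edges for e, c in edge_counts.items()}

def pick_start_edge(pattern_edges, edge_counts, total_edges):
    """Choose the query edge expected to match fewest stream edges."""
    sel = edge_selectivity(edge_counts, total_edges)
    # Unseen edge types default to selectivity 0.0 (assumed rarest).
    return min(pattern_edges, key=lambda e: sel.get(e, 0.0))

counts = {("login", "server"): 9000, ("scan", "server"): 40, ("exfil", "host"): 3}
pattern = [("login", "server"), ("scan", "server"), ("exfil", "host")]
print(pick_start_edge(pattern, counts, 10000))   # rarest edge type first
```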
Effect of Bearings on Vibration in Rotating Machinery
NASA Astrophysics Data System (ADS)
Daniel, Rudrapati Victor; Amit Siddhappa, Savale; Bhushan Gajanan, Savale; Vipin Philip, S.; Paul, P. Sam
2017-08-01
In rotary machines, vibration is an inherent phenomenon that tends to degrade the required performance. Among the parameters that affect vibration, selection of an appropriate bearing is the most critical. In this work, the effect of different types of bearing on vibration in rotary machines was studied, and the magnitude of vibration produced by different sets of bearings under the same loads and rotational speeds was investigated. The bearings considered were a ball bearing, a tapered roller bearing, and a thrust bearing; the shaft material was mild steel. The experimental results showed that the tapered roller bearing gives the highest amplitude of vibration among the three bearings, whereas the ball bearing gives the least amplitude under similar operating conditions.
Patients with chronic insomnia have selective impairments in memory that are modulated by cortisol.
Chen, Gui-Hai; Xia, Lan; Wang, Fang; Li, Xue-Wei; Jiao, Chuan-An
2016-10-01
Memory impairment is a frequent complaint in insomniacs; however, it is not consistently demonstrated. It is unknown whether memory impairment in insomniacs involves neuroendocrine dysfunction. The participants in this study were selected from the clinical setting and included 21 patients with chronic insomnia disorder (CID), 25 patients with insomnia and comorbid depressive disorder (CDD), and 20 control participants without insomnia. We evaluated spatial working and reference memory, object working and reference memory, and object recognition memory using the Nine Box Maze Test. We also evaluated serum neuroendocrine hormone levels. Compared to the controls, the CID patients made significantly more errors in spatial working and object recognition memory (p < .05), whereas the CDD patients performed poorly in all the assessed memory types (p < .05). In addition, the CID patients had higher levels (mean difference [95% CI]) of corticotrophin-releasing hormone, cortisol (31.98 [23.97, 39.98] μg/l), total triiodothyronine (667.58 [505.71, 829.45] μg/l), and total thyroxine (41.49 [33.23, 49.74] μg/l) (p < .05), and lower levels of thyrotropin-releasing hormone (-35.93 [-38.83, -33.02] ng/l), gonadotropin-releasing hormone (-4.50 [-5.02, -3.98] ng/l) (p < .05), and adrenocorticotropic hormone compared to the CDD patients. After controlling for confounding variables, the partial correlation analysis revealed that the levels of cortisol positively correlated with the errors in object working memory (r = .534, p = .033) and negatively correlated with the errors in object recognition memory (r = -.659, p = .006) in the CID patients. The results suggest that the CID patients had selective memory impairment, which may be mediated by increased cortisol levels. © 2016 Society for Psychophysiological Research.
2015-01-01
Retinal fundus images are widely used in diagnosing and treating several eye diseases. Prior works using retinal fundus images detected the presence of exudation with the aid of publicly available datasets using an extensive segmentation process. Though computationally efficient, they failed to create a diabetic retinopathy feature selection system for transparently diagnosing the disease state, and the diagnosis did not employ machine learning methods to categorize candidate fundus images by true positive and true negative ratio. Prior work also lacked a more detailed feature selection technique for diabetic retinopathy. To apply machine learning methods and classify candidate fundus images on the basis of a sliding window, a method called Diabetic Fundus Image Recuperation (DFIR) is designed in this paper. The initial phase of the DFIR method selects the features of the optic cup in digital retinal fundus images using a sliding window approach, with which the disease state for diabetic retinopathy is assessed. Feature selection in the DFIR method uses a collection of sliding windows to obtain features based on histogram values; histogram-based feature selection with the aid of a group sparsity non-overlapping function provides more detailed feature information. In the second phase, the DFIR method uses a support vector model based on a spiral basis function to effectively rank the diabetic retinopathy disease levels. The ranking of disease level for each candidate set provides a promising basis for developing a practically automated diabetic retinopathy diagnosis system. Experimental work on digital fundus images using the DFIR method evaluates factors such as sensitivity, specificity rate, ranking efficiency, and feature selection time. PMID:25974230
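The sliding-window histogram step can be illustrated with a minimal sketch; the window size, stride, and bin count below are arbitrary assumptions, not the DFIR parameters.

```python
# Minimal sketch of sliding-window histogram features over a row-major
# grayscale image, in the spirit of the feature-selection phase above.

def window_histograms(pixels, width, win=3, step=3, bins=4, max_val=256):
    """Slide a win x win window over the image and return one small
    intensity histogram per window position."""
    feats = []
    rows = len(pixels) // width
    for r in range(0, rows - win + 1, step):
        for c in range(0, width - win + 1, step):
            hist = [0] * bins
            for dr in range(win):
                for dc in range(win):
                    v = pixels[(r + dr) * width + (c + dc)]
                    hist[v * bins // max_val] += 1   # map intensity to a bin
            feats.append(hist)
    return feats

img = [0, 64, 128, 192] * 9          # a toy 6x6 "image", 36 pixels
feats = window_histograms(img, width=6)
print(len(feats), feats[0])
```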
How Well Does Medicaid Work in Improving Access to Care?
Long, Sharon K; Coughlin, Teresa; King, Jennifer
2005-01-01
Objective: To provide an assessment of how well the Medicaid program is working at improving access to and use of health care for low-income mothers. Data Source/Study Setting: The 1997 and 1999 National Survey of America's Families, with state and county information drawn from the Area Resource File and other sources. Study Design: Estimate the effects of Medicaid on access and use relative to private coverage and being uninsured, using instrumental variables methods to control for selection into insurance status. Data Collection/Extraction Method: This study combines data from 1997 and 1999 for mothers in families with incomes below 200 percent of the federal poverty level. Principal Findings: We find that Medicaid beneficiaries' access and use are significantly better than those obtained by the uninsured. Analysis that controls for insurance selection shows that the benefits of having Medicaid coverage versus being uninsured are substantially larger than what is estimated when selection is not accounted for. Our results also indicate that Medicaid beneficiaries' access and use are comparable to that of the low-income privately insured. Once insurance selection is controlled for, access and use under Medicaid is not significantly different from access and use under private insurance. Without controls for insurance selection, access and use for Medicaid beneficiaries is found to be significantly worse than for the low-income privately insured. Conclusions: Our results show that the Medicaid program improved access to care relative to uninsurance for low-income mothers, achieving access and use levels comparable to those of the privately insured. Our results also indicate that prior research, which generally has not controlled for selection into insurance coverage, has likely understated the gains of Medicaid relative to uninsurance and overstated the gains of private coverage relative to Medicaid. PMID:15663701
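The instrumental-variables logic can be illustrated with the simplest one-instrument Wald-style estimator, beta_IV = cov(z, y) / cov(z, x); the study itself uses richer instrumental variables methods, and the data below are synthetic stand-ins.

```python
# One-instrument IV slope: the instrument z shifts the endogenous regressor
# x but affects the outcome y only through x, so cov(z, y) / cov(z, x)
# recovers the causal slope. Toy data, not from the study.

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

def iv_slope(z, x, y):
    return cov(z, y) / cov(z, x)

z = [0, 0, 1, 1, 0, 1]          # instrument
x = [1, 2, 3, 4, 1, 3]          # endogenous regressor
y = [2, 4, 6, 8, 2, 6]          # here exactly y = 2x, so the slope is 2
print(iv_slope(z, x, y))
```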
Occupational health issues in small-scale industries in Sri Lanka: An underreported burden.
Suraweera, Inoka K; Wijesinghe, Supun D; Senanayake, Sameera J; Herath, Hema D B; Jayalal, T B Ananda
2016-10-17
Work-related diseases and occupational accidents affect a significant number of workers globally. The majority of these diseases and accidents are reported from developing countries, and a large percentage of the workforce in developing countries is estimated to be employed in small-scale industries. Sri Lanka is no exception. These workers are exposed to occupational hazards and are at great risk of developing work-related diseases and injuries. To identify occupational health issues faced by small-scale industry workers in Sri Lanka, a cross-sectional study was conducted among workers in four selected small-scale industry categories in two districts of Sri Lanka. A small-scale industry was defined as a work setting with fewer than 20 workers. Cluster sampling using probability proportionate to size of workers was used. Eighty clusters with a cluster size of eight were selected from each district. Data were collected using a pre-tested interviewer-administered questionnaire. Our study surveyed 198 industries. Headache (2.2%, 95% CI 1.5-3.1) and eye problems (2.1%, 95% CI 1.4-2.9) were the commonest general health issues detected. Back pain (4.8%, 95% CI 3.8-6.1) was the most prevalent work-related musculoskeletal pain reported; knee pain was the second highest (4.4%, 95% CI 3.4-5.6). Most of the work-related musculoskeletal pain was either of short duration or long lasting. Work-related musculoskeletal pain was much more common than the general health issues reported. Health promotion programs at workplaces focusing on ergonomics will benefit workers in small-scale industries in Sri Lanka.
Harrison, Luke B; Larsson, Hans C E
2015-03-01
Likelihood-based methods are commonplace in phylogenetic systematics. Although much effort has been directed toward likelihood-based models for molecular data, comparatively less work has addressed models for discrete morphological character (DMC) data. Among-character rate variation (ACRV) may confound phylogenetic analysis, but there have been few analyses of the magnitude and distribution of rate heterogeneity among DMCs. Using 76 data sets covering a range of plants and invertebrate and vertebrate animals, we used a modified version of MrBayes to test equal, gamma-distributed, and lognormally distributed models of ACRV, integrating across phylogenetic uncertainty using Bayesian model selection. We found that in approximately 80% of data sets, unequal-rates models outperformed equal-rates models, especially among larger data sets. Moreover, although most data sets were equivocal, more data sets favored the lognormal rate distribution relative to the gamma rate distribution, lending some support for more complex character correlations than in molecular data. Parsimony estimation of the underlying rate distributions in several data sets suggests that the lognormal distribution is preferred when there are many slowly evolving characters and fewer quickly evolving characters. The commonly adopted four-rate-category discrete approximation used for molecular data was found to be sufficient to approximate a gamma rate distribution with discrete characters. However, among the two data sets tested that favored a lognormal rate distribution, the continuous distribution was better approximated with at least eight discrete rate categories. Although the effect of the rate model on the estimation of topology was difficult to assess across all data sets, it appeared relatively minor between the unequal-rates models for the one data set examined carefully.
As in molecular analyses, we argue that researchers should test and adopt the most appropriate model of rate variation for the data set in question. As discrete characters are increasingly used in more sophisticated likelihood-based phylogenetic analyses, it is important that these studies be built on the most appropriate and carefully selected underlying models of evolution. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
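The discrete approximation discussed above can be sketched for a lognormal rate distribution: split it into k equal-probability categories and represent each category by its median rate. The mu and sigma values below are arbitrary, not fitted to any data set.

```python
# Discretizing a lognormal among-character rate distribution into k
# equal-probability categories, each represented by its median rate.
from math import exp
from statistics import NormalDist

def lognormal_rate_categories(k, mu=0.0, sigma=1.0):
    """Median rate of each of k equal-probability slices of a lognormal."""
    nd = NormalDist(mu, sigma)
    return [exp(nd.inv_cdf((i + 0.5) / k)) for i in range(k)]

rates = lognormal_rate_categories(8)
print([round(r, 3) for r in rates])   # many slow categories, a long fast tail
```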
Unbiased feature selection in learning random forests for high-dimensional data.
Nguyen, Thanh-Tung; Huang, Joshua Zhexue; Nguyen, Thuy Thi
2015-01-01
Random forests (RFs) have been widely used as a powerful classification method. However, with the randomization in both bagging samples and feature selection, the trees in the forest tend to select uninformative features for node splitting, which gives RFs poor accuracy when working with high-dimensional data. In addition, RFs are biased in the feature selection process, favoring multivalued features. Aiming at debiasing feature selection in RFs, we propose a new RF algorithm, called xRF, to select good features in learning RFs for high-dimensional data. We first remove the uninformative features using p-value assessment, and the subset of unbiased features is then selected based on some statistical measures. This feature subset is then partitioned into two subsets. A feature weighting sampling technique is used to sample features from these two subsets for building trees. This approach enables one to generate more accurate trees while reducing dimensionality and the amount of data needed for learning RFs. An extensive set of experiments has been conducted on 47 high-dimensional real-world datasets, including image datasets. The experimental results show that RFs with the proposed approach outperformed the existing random forests on both accuracy and AUC measures.
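The first step, screening out uninformative features by p-value, can be sketched with a simple permutation test of each feature's between-class mean gap; the actual statistical tests used by xRF may differ, and the 0.05 threshold below is illustrative.

```python
# Permutation-test sketch of a p-value screen for uninformative features:
# shuffle the labels many times and ask how often a random labelling
# produces a class-mean gap at least as large as the observed one.
import random

def perm_pvalue(feature, labels, n_perm=2000, seed=0):
    rng = random.Random(seed)
    def gap(lab):
        a = [f for f, l in zip(feature, lab) if l == 1]
        b = [f for f, l in zip(feature, lab) if l == 0]
        return abs(sum(a) / len(a) - sum(b) / len(b))
    observed = gap(labels)
    shuffled = list(labels)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)
        if gap(shuffled) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)   # add-one smoothing

labels = [1, 1, 1, 1, 0, 0, 0, 0]
signal = [5.1, 4.9, 5.2, 5.0, 1.1, 0.9, 1.0, 1.2]   # informative feature
noise  = [0.3, 0.9, 0.1, 0.7, 0.5, 0.6, 0.2, 0.8]   # uninformative feature
print(perm_pvalue(signal, labels), perm_pvalue(noise, labels))
```

A feature would be kept when its p-value falls below the chosen threshold and dropped otherwise.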
Evans, Karla K; Horowitz, Todd S; Howe, Piers; Pedersini, Roccardo; Reijnen, Ester; Pinto, Yair; Kuzmova, Yoana; Wolfe, Jeremy M
2011-09-01
A typical visual scene we encounter in everyday life is complex and filled with a huge amount of perceptual information. The term 'visual attention' describes a set of mechanisms that limit some processing to a subset of incoming stimuli. Attentional mechanisms shape what we see and what we can act upon. They allow for concurrent selection of some (preferably, relevant) information and inhibition of other information. This selection permits the reduction of complexity and informational overload. Selection can be determined both by the 'bottom-up' saliency of information from the environment and by the 'top-down' state and goals of the perceiver. Attentional effects can take the form of modulating or enhancing the selected information. A central role for selective attention is to enable the 'binding' of selected information into unified and coherent representations of objects in the outside world. In the overview of visual attention presented here, we review the mechanisms and consequences of selection and inhibition over space and time. We examine theoretical, behavioral, and neurophysiological work done on visual attention. We also discuss the relations between attention and other cognitive processes such as automaticity and awareness. WIREs Cogn Sci 2011 2 503-514 DOI: 10.1002/wcs.127 For further resources related to this article, please visit the WIREs website. Copyright © 2011 John Wiley & Sons, Ltd.
Performance Analysis of Relay Subset Selection for Amplify-and-Forward Cognitive Relay Networks
Qureshi, Ijaz Mansoor; Malik, Aqdas Naveed; Zubair, Muhammad
2014-01-01
Cooperative communication is regarded as a key technology in wireless networks, including cognitive radio networks (CRNs); it increases the diversity order of the signal to combat the unfavorable effects of fading channels by allowing distributed terminals to collaborate through sophisticated signal processing. Underlay CRNs place strict interference constraints on the secondary users (SUs) active in the frequency band of the primary users (PUs), which limits their transmit power and coverage area. Relay selection offers a potential solution to the challenges faced by underlay networks, by selecting either a single best relay or a subset of the potential relay set under different design requirements and assumptions. The best-relay selection schemes proposed in the literature for amplify-and-forward (AF) based underlay cognitive relay networks have been well studied in terms of outage probability (OP) and bit error rate (BER), whereas such analysis is lacking for multiple relay selection schemes. The novelty of this work is to study the outage behavior of multiple relay selection in the underlay CRN and derive closed-form expressions for the OP and BER through the cumulative distribution function (CDF) of the SNR received at the destination. The effectiveness of relay subset selection is shown through simulation results. PMID:24737980
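While the paper derives closed-form OP and BER expressions, the outage behavior of relay subset selection can be sketched by Monte Carlo simulation under an assumed Rayleigh-fading model; all parameters and the combining rule below are illustrative, not the paper's system model.

```python
# Monte Carlo sketch of outage probability for relay subset selection:
# pick the k best of N relays and declare outage when their combined SNR
# falls below a threshold. Rayleigh fading gives exponential per-relay SNR.
import random

def outage_probability(n_relays=5, k=2, snr_db=10.0, threshold=4.0,
                       trials=20000, seed=1):
    rng = random.Random(seed)
    mean_snr = 10 ** (snr_db / 10)
    outages = 0
    for _ in range(trials):
        snrs = sorted((rng.expovariate(1 / mean_snr) for _ in range(n_relays)),
                      reverse=True)
        if sum(snrs[:k]) < threshold:   # combine the k strongest relays
            outages += 1
    return outages / trials

p1 = outage_probability(k=1)
p2 = outage_probability(k=2)
print(p1, p2)   # selecting more relays should not raise the outage rate
```

With the same seed, both runs see identical channel draws, so adding a second relay can only lower the per-trial outage indicator.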
Selection of multiple cued items is possible during visual short-term memory maintenance.
Matsukura, Michi; Vecera, Shaun P
2015-07-01
Recent neuroimaging studies suggest that maintenance of a selected object feature held in visual short-term/working memory (VSTM/VWM) is supported by the same neural mechanisms that encode the sensory information. If VSTM operates by retaining "reasonable copies" of scenes constructed during sensory processing (Serences, Ester, Vogel, & Awh, 2009, p. 207, the sensory recruitment hypothesis), then attention should be able to select multiple items represented in VSTM as long as the number of these attended items does not exceed the typical VSTM capacity. It is well known that attention can select at least two noncontiguous locations at the same time during sensory processing. However, empirical reports from the studies that examined this possibility are inconsistent. In the present study, we demonstrate that (1) attention can indeed select more than a single item during VSTM maintenance when observers are asked to recognize a set of items in the manner that these items were originally attended, and (2) attention can select multiple cued items regardless of whether these items are perceptually organized into a single group (contiguous locations) or not (noncontiguous locations). The results also replicate and extend the recent finding that selective attention that operates during VSTM maintenance is sensitive to the observers' goal and motivation to use the cueing information.
Practice settings and dentists' job satisfaction.
Lo Sasso, Anthony T; Starkel, Rebecca L; Warren, Matthew N; Guay, Albert H; Vujicic, Marko
2015-08-01
The nature and organization of dental practice is changing. The aim of this study was to explore how job satisfaction among dentists is associated with dental practice setting. A survey measured satisfaction with income, benefits, hours worked, clinical autonomy, work-life balance, emotional exhaustion, and overall satisfaction among dentists working in large group, small group, and solo practice settings; 2,171 dentists responded. The authors used logistic regression to measure differences in reported levels of satisfaction across practice settings. Dentists working in small group settings reported the most satisfaction overall. Dentists working in large group settings reported more satisfaction with income and benefits than dentists in solo practice, as well as having the least stress. Findings suggest possible advantages and disadvantages of working in different types of practice settings. Dentists working in different practice settings reported differences in satisfaction. These results may help dentists decide which practice setting is best for them. Copyright © 2015 American Dental Association. Published by Elsevier Inc. All rights reserved.
Building an ACT-R Reader for Eye-Tracking Corpus Data.
Dotlačil, Jakub
2018-01-01
Cognitive architectures have often been applied to data from individual experiments. In this paper, I develop an ACT-R reader that can model a much larger set of data, eye-tracking corpus data. It is shown that the resulting model has a good fit to the data for the considered low-level processes. Unlike previous related work (most prominently, Engelmann, Vasishth, Engbert, & Kliegl), the model achieves the fit by estimating free parameters of ACT-R using Bayesian estimation and Markov chain Monte Carlo (MCMC) techniques, rather than by relying on a mix of manual selection and default values. The method used in the paper is generalizable beyond this particular model and data set and could be used on other ACT-R models. Copyright © 2017 Cognitive Science Society, Inc.
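The Bayesian estimation step can be sketched with a minimal random-walk Metropolis sampler for a single free parameter; the toy posterior below stands in for the actual ACT-R likelihood, which is far more involved.

```python
# Minimal random-walk Metropolis sampler for one free parameter.
import math
import random

def metropolis(log_post, start, n=5000, step=0.5, seed=0):
    """Draw n correlated samples from the density exp(log_post)."""
    rng = random.Random(seed)
    x = start
    lp = log_post(x)
    samples = []
    for _ in range(n):
        prop = x + rng.gauss(0, step)
        lp_prop = log_post(prop)
        # Accept with probability min(1, posterior ratio).
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy posterior: a single parameter with a N(1.5, 0.2) posterior density,
# standing in for e.g. an ACT-R latency factor.
log_post = lambda x: -((x - 1.5) ** 2) / (2 * 0.2 ** 2)
draws = metropolis(log_post, start=0.0)
est = sum(draws[1000:]) / len(draws[1000:])   # discard burn-in
print(round(est, 2))   # posterior mean near 1.5
```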
Managing the Perception of Advanced Technology Risks in Mission Proposals
NASA Technical Reports Server (NTRS)
Bellisario, Sebastian Nickolai
2012-01-01
Through my work in the project proposal office, I became interested in how technology advancement efforts affect competitive mission proposals. Technology development allows for new instruments and functionality; however, including technology advancement in a mission proposal often increases perceived risk, and risk mitigation has a major impact on the overall evaluation of the proposal and whether the mission is selected. In order to evaluate the different approaches proposals took, I compared the proposals' claims of heritage and technology advancement to the sponsor feedback provided in the NASA debriefs. I examined a set of Discovery 2010 mission proposals to draw out patterns in how they were evaluated, and developed a set of recommendations for how future mission proposals should approach technology advancement to reduce perceived risk.
Flumignan, Danilo Luiz; Boralle, Nivaldo; Oliveira, José Eduardo de
2010-06-30
In this work, the combination of carbon nuclear magnetic resonance ((13)C NMR) fingerprinting with pattern-recognition analyses provides an original and alternative approach to screening commercial gasoline quality. Soft Independent Modelling of Class Analogy (SIMCA) was performed on spectroscopic fingerprints to classify representative commercial gasoline samples, selected by Hierarchical Cluster Analysis (HCA) over several months from retail gas stations, into previously quality-defined classes. Following the optimized (13)C NMR-SIMCA algorithm, sensitivity values were obtained in the training set (99.0%), with leave-one-out cross-validation, and in the external prediction set (92.0%). Governmental laboratories could employ this method as a rapid screening analysis to discourage adulteration practices. Copyright 2010 Elsevier B.V. All rights reserved.
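As a greatly simplified sketch of the class-modelling idea: SIMCA proper fits a separate PCA model per class and accepts or rejects new samples by their distance to each class model; the toy version below models a class only by per-feature means and standard deviations to show the accept/reject logic.

```python
# Simplified class-modelling sketch (not full SIMCA): a class is modelled
# by per-feature mean and standard deviation; samples are compared by
# standardized distance to the class model.
from statistics import mean, stdev

def fit_class(samples):
    """Per-feature (mean, stdev) over the training samples of one class."""
    cols = list(zip(*samples))
    return [(mean(c), stdev(c)) for c in cols]

def distance(model, x):
    """Standardized Euclidean distance of sample x to the class model."""
    return sum(((v - m) / s) ** 2 for (m, s), v in zip(model, x)) ** 0.5

conforming = [[1.0, 5.0], [1.1, 5.2], [0.9, 4.9], [1.0, 5.1]]
model = fit_class(conforming)
print(distance(model, [1.05, 5.05]) < distance(model, [3.0, 2.0]))  # True
```

A full SIMCA implementation would replace the mean/stdev model with a per-class PCA and an F-test-based acceptance threshold.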
NASA Astrophysics Data System (ADS)
Toropov, V. S.
2018-05-01
The paper suggests a set of measures for selecting equipment and its components in order to reduce energy costs when pulling a pipeline into a well during the construction of trenchless pipeline crossings of various materials using horizontal directional drilling technology. A methodology for reducing energy costs has been developed by regulating the operating modes of equipment during the process of pulling the working pipeline into a drilled and pre-expanded well. Since the power of the drilling rig is the most important criterion in the selection of equipment for the construction of a trenchless crossing, an algorithm is proposed for calculating the required capacity of the rig when operating in different modes in the process of pulling the pipeline into the well.
NASA Astrophysics Data System (ADS)
Gao, Yi; Zhu, Liangjia; Norton, Isaiah; Agar, Nathalie Y. R.; Tannenbaum, Allen
2014-03-01
Desorption electrospray ionization mass spectrometry (DESI-MS) provides a highly sensitive imaging technique for differentiating normal and cancerous tissue at the molecular level. This can be very useful, especially under intra-operative conditions where the surgeon has to make crucial decisions about the tumor boundary. In such situations, the time it takes for imaging and data analysis becomes a critical factor. Therefore, in this work we utilize compressive sensing to perform sparse sampling of the tissue, which halves the scanning time. Furthermore, sparse feature selection is performed, which not only reduces the dimension of the data from about 10,000 to fewer than 50, significantly shortening the analysis time, but also identifies biochemically important molecules for further pathological analysis. The methods are validated on brain and breast tumor data sets.
NASA Astrophysics Data System (ADS)
Giana, Fabián Eduardo; Bonetto, Fabián José; Bellotti, Mariela Inés
2018-03-01
In this work we present an assay to discriminate between normal and cancerous cells. The method is based on the measurement of electrical impedance spectra of in vitro cell cultures. We developed a protocol consisting of four consecutive measurement phases, each designed to obtain different information about the cell cultures. Through analysis of the measured data, 26 characteristic features were obtained for both cell types. From the complete set of features, we selected the most relevant in terms of their discriminant capacity by means of conventional statistical tests. A linear discriminant analysis was then carried out on the selected features, allowing the classification of the samples as normal or cancerous with 4.5% false positives and no false negatives.
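The final classification step can be sketched with a two-class Fisher linear discriminant on two features; the feature values below are synthetic stand-ins, not the measured impedance features.

```python
# Two-class Fisher linear discriminant on two features: project onto
# w = Sw^-1 (m1 - m0) and threshold at the midpoint of the class means.

def fisher_lda(class0, class1):
    def mean_vec(xs):
        return [sum(c) / len(c) for c in zip(*xs)]
    def scatter(xs, m):
        s = [[0.0, 0.0], [0.0, 0.0]]
        for x in xs:
            d = [x[0] - m[0], x[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
        return s
    m0, m1 = mean_vec(class0), mean_vec(class1)
    sw = scatter(class0, m0)
    s1 = scatter(class1, m1)
    for i in range(2):
        for j in range(2):
            sw[i][j] += s1[i][j]            # pooled within-class scatter
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    inv = [[sw[1][1] / det, -sw[0][1] / det],
           [-sw[1][0] / det, sw[0][0] / det]]
    dm = [m1[0] - m0[0], m1[1] - m0[1]]
    w = [inv[0][0] * dm[0] + inv[0][1] * dm[1],
         inv[1][0] * dm[0] + inv[1][1] * dm[1]]
    thresh = sum(w[i] * (m0[i] + m1[i]) / 2 for i in range(2))
    return w, thresh

normal    = [[1.0, 2.0], [1.2, 2.1], [0.9, 1.9], [1.1, 2.2]]
cancerous = [[3.0, 4.0], [3.1, 4.2], [2.9, 3.9], [3.2, 4.1]]
w, t = fisher_lda(normal, cancerous)
score = lambda x: w[0] * x[0] + w[1] * x[1]
print(score(cancerous[0]) > t > score(normal[0]))   # True: classes separate
```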
A fractured rock geophysical toolbox method selection tool
Day-Lewis, F. D.; Johnson, C.D.; Slater, L.D.; Robinson, J.L.; Williams, J.H.; Boyden, C.L.; Werkema, D.D.; Lane, J.W.
2016-01-01
Geophysical technologies have the potential to improve site characterization and monitoring in fractured rock, but the appropriate and effective application of geophysics at a particular site strongly depends on project goals (e.g., identifying discrete fractures) and site characteristics (e.g., lithology). No method works at every site or for every goal. New approaches are needed to identify a set of geophysical methods appropriate to specific project goals and site conditions while considering budget constraints. To this end, we present the Excel-based Fractured-Rock Geophysical Toolbox Method Selection Tool (FRGT-MST). We envision the FRGT-MST (1) equipping remediation professionals with a tool to understand what is likely to be realistic and cost-effective when contracting geophysical services, and (2) reducing applications of geophysics with unrealistic objectives or where methods are likely to fail.
Benkert, Pascal; Schwede, Torsten; Tosatto, Silvio Ce
2009-05-20
The selection of the most accurate protein model from a set of alternatives is a crucial step in protein structure prediction, both in template-based and ab initio approaches. Scoring functions have been developed that can either return a quality estimate for a single model or derive a score from the information contained in the ensemble of models for a given sequence. Local structural features occurring more frequently in the ensemble have a greater probability of being correct. Within the context of the CASP experiment, these so-called consensus methods have been shown to perform considerably better in selecting good candidate models, but they tend to fail if the best models are far from the dominant structural cluster. In this paper we show that model selection can be improved if both approaches are combined by pre-filtering the models used during the calculation of the structural consensus. Our recently published QMEAN composite scoring function has been improved by including an all-atom interaction potential term. The preliminary model ranking based on the new QMEAN score is used to select a subset of reliable models against which the structural consensus score is calculated. This scoring function, called QMEANclust, achieves a correlation coefficient of 0.9 between predicted quality score and GDT_TS averaged over the 98 CASP7 targets, and performs significantly better in selecting good models from the ensemble of server models than any other group participating in the quality estimation category of CASP7. Both scoring functions are also benchmarked on the MOULDER test set, consisting of 20 target proteins each with 300 alternative models generated by MODELLER. QMEAN outperforms all other tested scoring functions operating on individual models, while the consensus method QMEANclust only works properly on decoy sets containing a certain fraction of near-native conformations.
We also present a local version of QMEAN for per-residue estimation of model quality (QMEANlocal) and compare it to a new local consensus-based approach. Improved model selection is obtained by using a composite scoring function operating on single models to enrich higher-quality models, which are subsequently used to calculate the structural consensus. The performance of consensus-based methods such as QMEANclust depends strongly on the composition and quality of the model ensemble to be analysed. Therefore, performance estimates for consensus methods based on large meta-datasets (e.g., CASP) might overrate their applicability in more realistic modelling situations with smaller sets of models produced by individual methods.
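The pre-filtered consensus idea can be sketched on toy data: score each model by its mean similarity to a reference subset that a single-model score would have selected. Plain coordinate RMSD stands in for the structural similarity measure, and the random "structures" are illustrative; this is not the QMEANclust implementation:

```python
import numpy as np

def consensus_scores(models, prefilter_idx):
    """Score each model by mean similarity (negative RMSD) to a pre-filtered
    reference subset -- the subset QMEAN-style single-model scoring would pick."""
    ref = [models[i] for i in prefilter_idx]
    scores = []
    for m in models:
        # per-atom squared distance, averaged over atoms, then sqrt -> RMSD
        rmsds = [np.sqrt(((m - r) ** 2).sum(1).mean()) for r in ref]
        scores.append(-np.mean(rmsds))
    return np.array(scores)

rng = np.random.default_rng(2)
native = rng.normal(size=(50, 3))                       # toy "native" structure
good = [native + rng.normal(0, 0.2, native.shape) for _ in range(8)]
bad = [native + rng.normal(0, 3.0, native.shape) for _ in range(4)]
models = good + bad
s = consensus_scores(models, prefilter_idx=range(8))    # pre-filter = good subset
```

Because the consensus is computed only against the pre-filtered subset, far-from-native decoys cannot drag the dominant cluster away from the good models.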
Moving from Efficacy to Effectiveness Trials in Prevention Research
Marchand, Erica; Stice, Eric; Rohde, Paul; Becker, Carolyn Black
2013-01-01
Efficacy trials test whether interventions work under optimal, highly controlled conditions, whereas effectiveness trials test whether interventions work with typical clients and providers in real-world settings. Researchers, providers, and funding bodies have called for more effectiveness trials to understand whether interventions produce effects under ecologically valid conditions, which factors predict program effectiveness, and what strategies are needed to successfully implement programs in practice settings. The transition from efficacy to effectiveness with preventive interventions involves unique considerations, some of which are not shared by treatment research. The purpose of this article is to discuss conceptual and methodological issues that arise when making the transition from efficacy to effectiveness research in primary, secondary, and tertiary prevention, drawing on the experiences of two complementary research groups as well as the existing literature. We address (a) program of research, (b) intervention design and conceptualization, (c) participant selection and characteristics, (d) providers, (e) context, (f) measurement and methodology, (g) outcomes, (h) cost, and (i) sustainability. We present examples of research in eating disorder prevention that demonstrate the progression from efficacy to effectiveness trials. PMID:21092935
From guideline modeling to guideline execution: defining guideline-based decision-support services.
Tu, S. W.; Musen, M. A.
2000-01-01
We describe our task-based approach to defining the guideline-based decision-support services that the EON system provides. We categorize uses of guidelines in patient-specific decision support into a set of generic tasks--making decisions, specifying work to be performed, interpreting data, setting goals, and issuing alerts and reminders--that can be solved using various techniques. Our model includes constructs required for representing the knowledge used by these techniques. These constructs form a toolkit from which developers can select modeling solutions for guideline tasks. Based on the tasks and the guideline model, we define a guideline-execution architecture and a model of interactions between a decision-support server and the clients that invoke services provided by the server. These services use generic interfaces derived from guideline tasks and their associated modeling constructs. We describe two implementations of these decision-support services and discuss how this work can be generalized. We argue that a well-defined specification of guideline-based decision-support services will facilitate the sharing of tools that implement computable clinical guidelines. PMID:11080007
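A hypothetical sketch of the task-based dispatch idea, with illustrative names and clinical content that are not EON's actual interfaces: generic services are keyed by guideline task, and a client invokes them through one entry point:

```python
# Illustrative handlers for two of the generic guideline tasks.
def make_decision(patient, knowledge):
    """'Making decisions' task: return a recommendation if the patient qualifies."""
    return knowledge["preferred"] if patient["eligible"] else None

def issue_alerts(patient, knowledge):
    """'Issuing alerts and reminders' task: fire alerts whose condition holds."""
    return [a for a in knowledge["alerts"] if patient[a["if"]]]

# The server's generic interface: a toolkit of task -> technique bindings.
SERVICES = {"decision": make_decision, "alerts": issue_alerts}

def invoke(task, patient, knowledge):
    """Client-side call into the guideline-execution server."""
    return SERVICES[task](patient, knowledge)

patient = {"eligible": True, "hyperglycemia": True}
knowledge = {"preferred": "metformin",
             "alerts": [{"if": "hyperglycemia", "msg": "check HbA1c"}]}
decision = invoke("decision", patient, knowledge)
```

The point of the dispatch table is that a new guideline task only requires registering another handler, not changing the client-server interaction model.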
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holden, Zachary C.; Richard, Ryan M.; Herbert, John M., E-mail: herbert@chemistry.ohio-state.edu
2013-12-28
An implementation of Ewald summation for use in mixed quantum mechanics/molecular mechanics (QM/MM) calculations is presented, which builds upon previous work by others that was limited to semi-empirical electronic structure for the QM region. Unlike previous work, our implementation describes the wave function's periodic images using “ChElPG” atomic charges, which are determined by fitting to the QM electrostatic potential evaluated on a real-space grid. This implementation is stable even for large Gaussian basis sets with diffuse exponents, and is thus appropriate when the QM region is described by a correlated wave function. Derivatives of the ChElPG charges with respect to the QM density matrix are a potentially serious bottleneck in this approach, so we introduce a ChElPG algorithm based on atom-centered Lebedev grids. The ChElPG charges thus obtained exhibit good rotational invariance even for sparse grids, enabling significant cost savings. Detailed analysis of the optimal choice of user-selected Ewald parameters, as well as timing breakdowns, is presented.
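The ESP-fitting step behind ChElPG-style charges can be sketched as a constrained least-squares problem. This is a toy reconstruction in atomic units with synthetic point charges, not the implementation described above; exclusion radii and Lebedev grids are omitted:

```python
import numpy as np

def fit_esp_charges(grid_pts, atom_pos, esp_vals, total_charge=0.0):
    """Least-squares fit of atomic point charges to an electrostatic potential
    sampled on grid points, with the total-charge constraint enforced by a
    heavily weighted extra equation (a simple penalty-style constraint)."""
    # Design matrix: A[i, a] = 1 / |r_i - R_a|  (atomic units)
    A = 1.0 / np.linalg.norm(grid_pts[:, None, :] - atom_pos[None, :, :], axis=2)
    w = 1e6                                    # weight for sum(q) = total_charge
    A_aug = np.vstack([A, w * np.ones((1, len(atom_pos)))])
    b_aug = np.concatenate([esp_vals, [w * total_charge]])
    q, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
    return q

rng = np.random.default_rng(3)
atoms = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0]])   # two-atom "molecule"
true_q = np.array([0.4, -0.4])
grid = rng.normal(0, 5, size=(500, 3)) + 1.0           # points around it
d = np.linalg.norm(grid[:, None] - atoms[None], axis=2).min(1)
grid = grid[d > 1.5]                                   # keep points off the atoms
esp = (true_q / np.linalg.norm(grid[:, None] - atoms[None], axis=2)).sum(1)
q = fit_esp_charges(grid, atoms, esp, total_charge=0.0)
```

Because the synthetic ESP comes from exact point charges, the fit recovers them; real QM potentials would only be approximated, which is where grid choice (rectangular vs. Lebedev) starts to matter.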
Governance of professional nursing practice in a hospital setting: a mixed methods study.
dos Santos, José Luís Guedes; Erdmann, Alacoque Lorenzini
2015-01-01
To elaborate an interpretative model for the governance of professional nursing practice in a hospital setting. A mixed methods study with concurrent triangulation strategy, using data from a cross-sectional study with 106 nurses and a Grounded Theory study with 63 participants. The quantitative data were collected through the Brazilian Nursing Work Index - Revised and underwent descriptive statistical analysis. Qualitative data were obtained from interviews and analyzed through initial, selective and focused coding. Based on the results obtained with the Brazilian Nursing Work Index - Revised, it is possible to state that nurses perceived that they had autonomy, control over the environment, good relationships with physicians and organizational support for nursing governance. The governance of the professional nursing practice is based on the management of nursing care and services carried out by the nurses. To perform these tasks, nurses aim to get around the constraints of the organizational support and develop management knowledge and skills. It is important to reorganize the structures and processes of nursing governance, especially the support provided by the organization for the management practices of nurses.
What you say matters: exploring visual-verbal interactions in visual working memory.
Mate, Judit; Allen, Richard J; Baqués, Josep
2012-01-01
The aim of this study was to explore whether the content of a simple concurrent verbal load task determines the extent of its interference on memory for coloured shapes. The task consisted of remembering four visual items while repeating aloud a pair of words that varied in terms of imageability and relatedness to the task set. At test, a cue appeared that was either the colour or the shape of one of the previously seen objects, with participants required to select the object's other feature from a visual array. During encoding and retention, there were four verbal load conditions: (a) a related, shape-colour pair (from outside the experimental set, i.e., "pink square"); (b) a pair of unrelated but visually imageable, concrete, words (i.e., "big elephant"); (c) a pair of unrelated and abstract words (i.e., "critical event"); and (d) no verbal load. Results showed differential effects of these verbal load conditions. In particular, imageable words (concrete and related conditions) interfered to a greater degree than abstract words. Possible implications for how visual working memory interacts with verbal memory and long-term memory are discussed.
NASA Astrophysics Data System (ADS)
Standvoss, K.; Crijns, T.; Goerke, L.; Janssen, D.; Kern, S.; van Niedek, T.; van Vugt, J.; Alfonso Burgos, N.; Gerritse, E. J.; Mol, J.; van de Vooren, D.; Ghafoorian, M.; van den Heuvel, T. L. A.; Manniesing, R.
2018-02-01
The number and location of cerebral microbleeds (CMBs) in patients with traumatic brain injury (TBI) is important to determine the severity of trauma and may hold prognostic value for patient outcome. However, manual assessment is subjective and time-consuming due to the resemblance of CMBs to blood vessels, the possible presence of imaging artifacts, and the typical heterogeneity of trauma imaging data. In this work, we present a computer aided detection system based on 3D convolutional neural networks for detecting CMBs in 3D susceptibility weighted images. Network architectures with varying depth were evaluated. Data augmentation techniques were employed to improve the networks' generalization ability and selective sampling was implemented to handle class imbalance. The predictions of the models were clustered using a connected component analysis. The system was trained on ten annotated scans and evaluated on an independent test set of eight scans. Despite this limited data set, the system reached a sensitivity of 0.87 at 16.75 false positives per scan (2.5 false positives per CMB), outperforming related work on CMB detection in TBI patients.
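The connected-component clustering of the network's predictions can be sketched in pure Python on a synthetic probability map (scipy.ndimage.label does the same thing directly; the detection network itself is out of scope here):

```python
import numpy as np
from collections import deque

def cluster_detections(prob, thr=0.5):
    """Group above-threshold voxels into 6-connected 3D components; each
    component is one candidate CMB detection (list of voxel indices)."""
    mask = prob > thr
    seen = np.zeros_like(mask, dtype=bool)
    comps = []
    nbrs = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    for idx in zip(*np.nonzero(mask)):
        if seen[idx]:
            continue
        comp, q = [], deque([idx])
        seen[idx] = True
        while q:                          # breadth-first flood fill
            z, y, x = q.popleft()
            comp.append((z, y, x))
            for dz, dy, dx in nbrs:
                n = (z + dz, y + dy, x + dx)
                if all(0 <= n[i] < mask.shape[i] for i in range(3)) \
                        and mask[n] and not seen[n]:
                    seen[n] = True
                    q.append(n)
        comps.append(comp)
    return comps

prob = np.zeros((10, 10, 10))             # synthetic CNN output volume
prob[1:3, 1:3, 1] = 0.9                   # one 4-voxel candidate
prob[7, 7, 7] = 0.8                       # one isolated candidate
comps = cluster_detections(prob)
```

Per-component statistics (size, peak probability) can then feed false-positive reduction before reporting candidates.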
Song, Sutao; Zhan, Zhichao; Long, Zhiying; Zhang, Jiacai; Yao, Li
2011-01-01
Background Support vector machine (SVM) has been widely used as an accurate and reliable method to decipher brain patterns from functional MRI (fMRI) data. Previous studies have not found a clear benefit for non-linear (polynomial kernel) SVM versus a linear one. Here, a more effective non-linear SVM using a radial basis function (RBF) kernel is compared with linear SVM. Unlike traditional studies, which focused either merely on the evaluation of different types of SVM or on voxel selection methods, we aimed to investigate the overall performance of linear and RBF SVM for fMRI classification, together with voxel selection schemes, in terms of classification accuracy and computation time. Methodology/Principal Findings Six different voxel selection methods were employed to decide which voxels of fMRI data would be included in SVM classifiers with linear and RBF kernels in classifying 4-category objects. The overall performances of the voxel selection and classification methods were then compared. Results showed that: (1) Voxel selection had an important impact on the classification accuracy of the classifiers: in a relatively low-dimensional feature space, RBF SVM significantly outperformed linear SVM; in a relatively high-dimensional space, linear SVM performed better than its counterpart; (2) Considering classification accuracy and computation time holistically, linear SVM with relatively more voxels as features and RBF SVM with a small set of voxels (after PCA) achieved better accuracy in less time. Conclusions/Significance The present work provides the first empirical result on linear and RBF SVM in classification of fMRI data combined with voxel selection methods.
Based on the findings, if only classification accuracy is of concern, RBF SVM with an appropriately small set of voxels and linear SVM with relatively more voxels are the two suggested solutions; if users are more concerned about computational time, RBF SVM with a relatively small set of voxels, keeping part of the principal components as features, is the better choice. PMID:21359184
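A minimal synthetic sketch of the linear-vs-RBF comparison, assuming scikit-learn is available; a univariate F-score filter stands in for the paper's six voxel selection methods, and selection is done outside cross-validation for brevity (a real pipeline would nest it):

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(4)
X = rng.normal(size=(80, 2000))        # 80 "scans" x 2000 "voxels"
y = np.tile(np.arange(4), 20)          # 4-category object labels
for c in range(4):                     # 10 informative voxels per category
    X[y == c, c * 10:(c + 1) * 10] += 2.0

accs = {}
for k in (50, 1000):                   # low- vs high-dimensional feature space
    Xk = SelectKBest(f_classif, k=k).fit_transform(X, y)
    for kernel in ("linear", "rbf"):
        accs[(kernel, k)] = cross_val_score(SVC(kernel=kernel), Xk, y, cv=5).mean()
```

Sweeping `k` in this fashion is one way to reproduce the study's observation that the kernel ranking flips between low- and high-dimensional feature spaces.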
Rapp, M; Lein, V; Lacoudre, F; Lafferty, J; Müller, E; Vida, G; Bozhanova, V; Ibraliu, A; Thorwarth, P; Piepho, H P; Leiser, W L; Würschum, T; Longin, C F H
2018-06-01
Simultaneous improvement of protein content and grain yield by index selection is possible, but its efficiency largely depends on the weighting of the single traits. The genetic architecture of these indices is similar to that of the primary traits. Grain yield and protein content are of major importance in durum wheat breeding, but their negative correlation has hampered their simultaneous improvement. To account for this in wheat breeding, the grain protein deviation (GPD) and the protein yield were proposed as targets for selection. The aim of this work was to investigate the potential of different indices to simultaneously improve grain yield and protein content in durum wheat and to evaluate their genetic architecture towards genomics-assisted breeding. To this end, we investigated two durum wheat panels comprising 159 and 189 genotypes, which were tested in multiple field locations across Europe and genotyped by a genotyping-by-sequencing approach. The phenotypic analyses revealed significant genetic variances for all traits, and heritabilities of the phenotypic indices were in a similar range to those of grain yield and protein content. The GPD showed a high, positive correlation with protein content, whereas protein yield was highly and positively correlated with grain yield. Thus, selecting for a high GPD would mainly increase protein content, whereas selection based on protein yield would mainly improve grain yield, but a combination of both indices makes it possible to balance the selection. Genome-wide association mapping revealed a complex genetic architecture for all traits, with most QTL having small effects and being detected in only one germplasm set, thus limiting the potential of marker-assisted selection for trait improvement. By contrast, genome-wide prediction appeared promising, but its performance strongly depends on the relatedness between training and prediction sets.
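Index selection on standardized traits can be sketched as follows; the weights, trait distributions, negative correlation, and selected fraction are illustrative assumptions, not values from the study:

```python
import numpy as np

def index_selection(yield_, protein, w_yield=1.0, w_protein=1.0, frac=0.1):
    """Rank genotypes by a weighted index of standardized grain yield and
    protein content and return the indices of the selected fraction."""
    z = lambda x: (x - x.mean()) / x.std()
    idx_score = w_yield * z(yield_) + w_protein * z(protein)
    n_sel = max(1, int(len(yield_) * frac))
    return np.argsort(idx_score)[::-1][:n_sel]

rng = np.random.default_rng(5)
n = 160
gy = rng.normal(60, 5, n)                         # grain yield (arbitrary units)
pc = 16 - 0.08 * (gy - 60) + rng.normal(0, 1, n)  # negatively correlated protein %
sel = index_selection(gy, pc, w_yield=1.0, w_protein=1.0, frac=0.1)
```

Despite the negative trait correlation, the selected fraction improves both traits on average; shifting the weights toward one trait reproduces the GPD-like or protein-yield-like behaviour described above.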
Identifying failure in a tree network of a parallel computer
Archer, Charles J.; Pinnow, Kurt W.; Wallenfelt, Brian P.
2010-08-24
Methods, parallel computers, and products are provided for identifying failure in a tree network of a parallel computer. The parallel computer includes one or more processing sets including an I/O node and a plurality of compute nodes. For each processing set embodiments include selecting a set of test compute nodes, the test compute nodes being a subset of the compute nodes of the processing set; measuring the performance of the I/O node of the processing set; measuring the performance of the selected set of test compute nodes; calculating a current test value in dependence upon the measured performance of the I/O node of the processing set, the measured performance of the set of test compute nodes, and a predetermined value for I/O node performance; and comparing the current test value with a predetermined tree performance threshold. If the current test value is below the predetermined tree performance threshold, embodiments include selecting another set of test compute nodes. If the current test value is not below the predetermined tree performance threshold, embodiments include selecting from the test compute nodes one or more potential problem nodes and testing individually potential problem nodes and links to potential problem nodes.
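The patent abstract does not give the formula for the current test value; one plausible reading, shown purely as an illustration, combines the measured I/O-node and test-compute-node performance against the predetermined baseline:

```python
def current_test_value(io_perf, test_node_perfs, io_baseline):
    """Illustrative test value: the weaker of the I/O node and the average
    test-node performance, normalized by the predetermined I/O baseline.
    (The patent does not specify this formula; it is an assumption.)"""
    avg_nodes = sum(test_node_perfs) / len(test_node_perfs)
    return min(io_perf, avg_nodes) / io_baseline

def check_processing_set(io_perf, test_node_perfs, io_baseline, threshold):
    """Apply the decision rule from the abstract to one processing set."""
    value = current_test_value(io_perf, test_node_perfs, io_baseline)
    if value < threshold:
        return "select another set of test compute nodes"
    return "test potential problem nodes individually"
```

The branch structure mirrors the abstract: a below-threshold value triggers another round of test-node sampling, otherwise candidate problem nodes and their links are tested individually.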
Traffic engineering and regenerator placement in GMPLS networks with restoration
NASA Astrophysics Data System (ADS)
Yetginer, Emre; Karasan, Ezhan
2002-07-01
In this paper we study regenerator placement and traffic engineering of restorable paths in Generalized Multiprotocol Label Switching (GMPLS) networks. Regenerators are necessary in optical networks due to transmission impairments. We study a network architecture in which there are regenerators at selected nodes, and we propose two heuristic algorithms for the regenerator placement problem. The performance of these algorithms in terms of the required number of regenerators and computational complexity is evaluated. In this network architecture with sparse regeneration, offline computation of working and restoration paths is studied, with bandwidth reservation and path rerouting as the restoration scheme. We study two approaches for selecting working and restoration paths from a set of candidate paths and formulate each method as an Integer Linear Programming (ILP) problem. A traffic uncertainty model is developed in order to compare these methods based on their robustness with respect to changing traffic patterns. The traffic engineering methods are compared based on the number of additional demands due to traffic uncertainty that can be carried. The regenerator placement algorithms are also evaluated from a traffic engineering point of view.
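The abstract does not describe the two heuristics; a generic greedy sketch of regenerator placement under an assumed optical reach (path and span data are illustrative) might look like:

```python
def place_regenerators(paths, reach):
    """Greedy sketch: walk each candidate path and place a regenerator at the
    first node where the accumulated span length exceeds the optical reach,
    after which the signal is considered fresh again. Not the paper's
    heuristics -- just the basic reach-driven placement idea."""
    regens = set()
    for path in paths:                  # path: list of (node, incoming_span_km)
        travelled = 0.0
        for node, span in path:
            travelled += span
            if travelled > reach:
                regens.add(node)        # regenerate at this node
                travelled = 0.0
    return regens

paths = [[("A", 300), ("B", 400), ("C", 350)],
         [("D", 600), ("B", 300)]]
regens = place_regenerators(paths, reach=800)
```

Sharing regenerator nodes across working and restoration paths, as in the sparse-regeneration architecture above, is what makes the placement problem worth optimizing rather than solving path by path.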
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The Weldon Spring DOE grantee, St. Charles County, is seeking an early renewal of the Weldon Spring Grant in order to match the grant's reporting calendar with the County's fiscal calendar, which is January through December. Therefore, this renewal application will cover five months' time instead of 12 months. This annual overview bridges a two-month period that precedes the appointment and activation of the Weldon Spring Citizens Commission in February 1995. In the original grant application the County described its intent to select a volunteer Citizens Oversight Commission to monitor the cleanup activities at the DOE's Weldon Spring Site. This commission would serve as the County's watchdog group by monitoring Weldon Spring Site activities and providing ongoing communication to the County's residents through publications and forums. The first eight months of the project involved setting up the project office and working with a three-member "Section Panel/Work Group" to select the Citizens Commission. These activities were coordinated by a Project Director hired by the County and funded from the initial grant funds.
Mental health problems among young doctors: an updated review of prospective studies.
Tyssen, Reidar; Vaglum, Per
2002-01-01
Previous studies have shown the medical community to exhibit a relatively high level of certain mental health problems, particularly depression, which may lead to drug abuse and suicide. We reviewed prospective studies published over the past 20 years to investigate the prevalence and predictors of mental health problems in doctors during their first postgraduate years. We selected clinically relevant mental health problems as the outcome measure. We found nine cohort studies that met our selection criteria. Each of them had limitations, notably low response rate at follow-up, small sample size, and/or short observation period. Most studies showed that symptoms of mental health problems, particularly of depression, were highest during the first postgraduate year. They found that individual factors, such as family background, personality traits (neuroticism and self-criticism), and coping by wishful thinking, as well as contextual factors including perceived medical-school stress, perceived overwork, emotional pressure, working in an intensive-care setting, and stress outside of work, were often predictive of mental health problems. The studies revealed somewhat discrepant findings with respect to gender. The implications of these findings are discussed.
Social work and adverse childhood experiences research: implications for practice and health policy.
Larkin, Heather; Felitti, Vincent J; Anda, Robert F
2014-01-01
Medical research on "adverse childhood experiences" (ACEs) reveals a compelling relationship between the extent of childhood adversity, adult health risk behaviors, and principal causes of death in the United States. This article provides a selective review of the ACE Study and related social science research to describe how effective social work practice that prevents ACEs and mobilizes resilience and recovery from childhood adversity could support the achievement of national health policy goals. This article applies a biopsychosocial perspective, with an emphasis on mind-body coping processes to demonstrate that social work responses to adverse childhood experiences may contribute to improvement in overall health. Consistent with this framework, the article sets forth prevention and intervention response strategies with individuals, families, communities, and the larger society. Economic research on human capital development is reviewed that suggests significant cost savings may result from effective implementation of these strategies.
Schwark, Jeremy D; Dolgov, Igor; Sandry, Joshua; Volkman, C Brooks
2013-10-01
Recent theories of attention have proposed that selection history is a separate, dissociable source of information that influences attention. The current study sought to investigate the simultaneous involvement of selection history and working-memory on attention during visual search. Experiments 1 and 2 used target feature probability to manipulate selection history and found significant effects of both working-memory and selection history, although working-memory dominated selection history when they cued different locations. Experiment 3 eliminated the contribution of voluntary refreshing of working-memory and replicated the main effects, although selection history became dominant. Using the same methodology, but with reduced probability cue validity, both effects were present in Experiment 4 and did not significantly differ in their contribution to attention. Effects of selection history and working-memory never interacted. These results suggest that selection history and working-memory are separate influences on attention and have little impact on each other. Theoretical implications for models of attention are discussed.
Soeker, Mohammed Shaheed; De Jongh, Jo Celene; Diedericks, Amy; Matthys, Kelly; Swart, Nicole; van der Pol, Petra
2018-01-01
Protective workshops and sheltered employment settings have been instrumental in developing the work skills of people with disabilities; however, there has been a void in the literature about their influence on the ability of individuals to find employment in the open labor market. The aim of the study is to explore the experiences and perceptions of people with disabilities about the development of their work skills for transitioning into the open labor market. Five individuals with various types of disabilities and two key informants participated in the study. The research study was positioned within the qualitative paradigm, specifically utilizing an exploratory and descriptive research design. In order to gather data from the participants, semi-structured interviews were used. Three themes emerged from the findings of the study. Theme one, designated as "Reaching a ceiling", reflected the barriers that the participants experienced regarding work skills development. Theme two, designated as "Enablers for growth within the workplace", related to factors enabling the development of the work skills of persons with a disability (PWD). The final theme related to the meaning that PWD attached to their worker role and was designated as "A sense of universality". The participants highlighted that they felt their coworkers in the workshops were "like family" to them and thoroughly enjoyed the work tasks and work environment, expressing specific support from their fellow workers. Through reaching their goals, engaging in their work tasks and having a sense of universality in the workplace, the workers felt that the work they participated in gave meaning to their life. The findings of the study indicated that managers of protective workshops and sheltered employment settings should consider selecting work tasks that enable the development of skills needed in the open labor market.
A work skills development system whereby PWD in these workshops can determine their own career progression is advocated.
Neural bases of orthographic long-term memory and working memory in dysgraphia.
Rapp, Brenda; Purcell, Jeremy; Hillis, Argye E; Capasso, Rita; Miceli, Gabriele
2016-02-01
Spelling a word involves the retrieval of information about the word's letters and their order from long-term memory as well as the maintenance and processing of this information by working memory in preparation for serial production by the motor system. While it is known that brain lesions may selectively affect orthographic long-term memory and working memory processes, relatively little is known about the neurotopographic distribution of the substrates that support these cognitive processes, or the lesions that give rise to the distinct forms of dysgraphia that affect these cognitive processes. To examine these issues, this study uses a voxel-based mapping approach to analyse the lesion distribution of 27 individuals with dysgraphia subsequent to stroke, who were identified on the basis of their behavioural profiles alone, as suffering from deficits only affecting either orthographic long-term or working memory, as well as six other individuals with deficits affecting both sets of processes. The findings provide, for the first time, clear evidence of substrates that selectively support orthographic long-term and working memory processes, with orthographic long-term memory deficits centred in either the left posterior inferior frontal region or left ventral temporal cortex, and orthographic working memory deficits primarily arising from lesions of the left parietal cortex centred on the intraparietal sulcus. These findings also contribute to our understanding of the relationship between the neural instantiation of written language processes and spoken language, working memory and other cognitive skills.
ERIC Educational Resources Information Center
Eimer, Martin; Kiss, Monika; Nicholas, Susan
2011-01-01
When target-defining features are specified in advance, attentional target selection in visual search is controlled by preparatory top-down task sets. We used ERP measures to study voluntary target selection in the absence of such feature-specific task sets, and to compare it to selection that is guided by advance knowledge about target features.…
Predicting distant failure in early stage NSCLC treated with SBRT using clinical parameters.
Zhou, Zhiguo; Folkert, Michael; Cannon, Nathan; Iyengar, Puneeth; Westover, Kenneth; Zhang, Yuanyuan; Choy, Hak; Timmerman, Robert; Yan, Jingsheng; Xie, Xian-J; Jiang, Steve; Wang, Jing
2016-06-01
The aim of this study is to predict early distant failure in early stage non-small cell lung cancer (NSCLC) treated with stereotactic body radiation therapy (SBRT) from clinical parameters using machine learning algorithms. The dataset used in this work includes 81 early stage NSCLC patients with at least 6 months of follow-up who underwent SBRT between 2006 and 2012 at a single institution. The clinical parameters (n=18) for each patient include demographic parameters, tumor characteristics, treatment fraction schemes, and pretreatment medications. Three predictive models were constructed based on different machine learning algorithms: (1) artificial neural network (ANN), (2) logistic regression (LR), and (3) support vector machine (SVM). Furthermore, to select an optimal clinical parameter set for model construction, three strategies were adopted: (1) a clonal selection algorithm (CSA) based selection strategy; (2) the sequential forward selection (SFS) method; and (3) a statistical analysis (SA) based strategy. Five-fold cross-validation was used to validate the performance of each predictive model. Accuracy was assessed by the area under the receiver operating characteristic (ROC) curve (AUC); sensitivity and specificity were also evaluated. The AUCs for ANN, LR and SVM were 0.75, 0.73, and 0.80, respectively. The sensitivity values for ANN, LR and SVM were 71.2%, 72.9% and 83.1%, while the specificity values were 59.1%, 63.6% and 63.6%, respectively. Meanwhile, the CSA based strategy outperformed SFS and SA in terms of AUC, sensitivity and specificity. Based on clinical parameters, the SVM with the CSA optimal parameter set selection strategy achieves better performance than the other strategies for predicting distant failure in lung SBRT patients. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
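The study's evaluation protocol, an SVM scored by AUC under 5-fold cross-validation, can be sketched as follows. The data here are a synthetic stand-in generated with scikit-learn (the 81-patient cohort is not public), and the class balance, kernel, and C value are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for the cohort: 81 patients, 18 clinical parameters.
X, y = make_classification(n_samples=81, n_features=18, n_informative=6,
                           weights=[0.7, 0.3], random_state=0)

# SVM evaluated with 5-fold cross-validation, scored by ROC AUC.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_predict(model, X, y, cv=cv, method="decision_function")
print(f"cross-validated AUC: {roc_auc_score(y, scores):.3f}")
```

Pooling the out-of-fold decision scores before computing a single AUC, as done here, is one common convention; averaging per-fold AUCs is another.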
Wesolowski, Edwin A.
2000-01-01
This report presents a proposal for conducting a water-quality modeling study at drought streamflow, a detailed comprehensive plan for collecting the data, and an annual drought-formation monitoring plan. A 30.8-mile reach of the Red River of the North receives treated wastewater from plants at Fargo, North Dakota, and Moorhead, Minnesota, and streamflow from the Sheyenne River. The water-quality modeling study will evaluate the effects of continuous treated-wastewater discharges to the study reach at drought streamflow. The study will define hydraulic characteristics and reaeration and selected reaction coefficients and will calibrate and verify a model. The study includes collecting synoptic water-quality samples for various types of analyses at a number of sites in the study reach. Dye and gas samples will be collected for traveltime and reaeration measurements. Using the Lagrangian reference frame, synoptic water-quality samples will be collected for analysis of nutrients, chlorophyll a, alkalinity, and carbonaceous biochemical oxygen demand. Field measurements will be made of specific conductance, pH, air and water temperature, dissolved oxygen, and sediment oxygen demand. Two sets of water-quality data will be collected. One data set will be used to calibrate the model, and the other data set will be used to verify the model. The DAFLOW/BLTM models will be used to evaluate the effects of the treated wastewater on the water quality of the river. The model will simulate specific conductance, temperature, dissolved oxygen, carbonaceous biochemical oxygen demand, total nitrogen (organic, ammonia, nitrite, nitrate), total orthophosphorus, total phosphorus, and phytoplankton as chlorophyll a. The work plan identifies and discusses the work elements needed for accomplishing the data collection for the study. The work elements specify who will provide personnel, vehicles, instruments, and supplies needed during data collection.
The work plan contains instructions for data collection; inventory lists of needed personnel, vehicles, instruments, and supplies; and examples of computations for determining quantities of tracer to be injected into the stream. The work plan also contains an annual drought-formation monitoring plan that includes a 9-month time line that specifies when essential planning actions must occur before actual project start up. Drought streamflows are rare. The annual drought-formation monitoring plan is presented to assist project planning by providing early warning that conditions are favorable to produce drought streamflow. The plan to monitor drought-forming conditions discusses the drought indices to be monitored. To establish a baseline, historic values for some of the drought indices for selected years were reviewed. An annual review of the drought indices is recommended.
NASA Astrophysics Data System (ADS)
McKean, John R.; Johnson, Donn; Taylor, R. Garth
2003-04-01
An alternate travel cost model is applied to an on-site sample to estimate the value of flat water recreation on the impounded lower Snake River. Four contiguous reservoirs would be eliminated if the dams are breached to protect endangered Pacific salmon and steelhead trout. The empirical method applies truncated negative binomial regression with adjustment for endogenous stratification. The two-stage decision model assumes that recreationists allocate their time among work and leisure prior to deciding among consumer goods. The allocation of time and money among goods in the second stage is conditional on the predetermined work time and income. The second stage is a disequilibrium labor market which also applies if employers set work hours or if recreationists are not in the labor force. When work time is either predetermined, fixed by contract, or nonexistent, recreationists must consider separate prices and budgets for time and money.
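The paper's estimator is a truncated negative binomial with an adjustment for endogenous stratification; a minimal sketch of the simpler Poisson special case conveys the idea. For a Poisson visitation model, the size-biased, truncated on-site distribution is a shifted Poisson, so maximum likelihood on trips minus one recovers the demand parameters. The travel-cost regressor and coefficients below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate an on-site sample: visitation is Poisson in travel cost, but
# sampling on site over-represents frequent visitors (endogenous
# stratification).  For the Poisson case the size-biased, truncated count
# is a shifted Poisson: y_obs = 1 + Poisson(lambda).
n = 5000
cost = rng.uniform(0.0, 2.0, n)           # hypothetical travel-cost regressor
beta_true = np.array([1.0, -0.8])         # intercept, cost coefficient
lam = np.exp(beta_true[0] + beta_true[1] * cost)
y_obs = 1 + rng.poisson(lam)

# The standard correction: run Poisson maximum likelihood on (y_obs - 1).
X = np.column_stack([np.ones(n), cost])

def negloglik(beta):
    mu = np.exp(X @ beta)
    y = y_obs - 1
    return np.mean(mu - y * np.log(mu))   # log(y!) dropped (constant in beta)

fit = minimize(negloglik, x0=np.zeros(2), method="BFGS")
print("estimated coefficients:", fit.x)   # should land near [1.0, -0.8]
```

The negative binomial version the authors use adds an overdispersion parameter and a more involved likelihood, but the shifted-count logic is the same.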
Evaluation of knee joint forces during kneeling work with different kneepads.
Xu, Hang; Jampala, Sree; Bloswick, Donald; Zhao, Jie; Merryweather, Andrew
2017-01-01
The main purpose of this study is to determine the knee joint forces resulting from kneeling work with and without kneepads, to quantify how different kneepads redistribute force. Eleven healthy males simulated a tile setting task at different locations under six kneepad states (five different kneepad types and no kneepad). Peak and average forces on the anatomical landmarks of both knees were obtained by custom force sensors. The results revealed that kneepad design can significantly modify the forces on the knee joint through redistribution. Of the five tested kneepads, the Professional Gel design was preferred, a finding confirmed by both the force measurements and participants' responses. The extreme reaching locations induced significantly higher joint forces on the left or right knee, depending on the task. The conclusion of this study is that a properly selected kneepad for specific tasks and a more neutral working posture can modify the force distribution on the knees and likely decrease the risk of knee disorders from kneeling work. Copyright © 2016 Elsevier Ltd. All rights reserved.
Work-related activities associated with injury in occupational and physical therapists.
Darragh, Amy R; Campo, Marc; King, Phyllis
2012-01-01
The purpose of this study was to examine work activities associated with work-related injury (WRI) in occupational and physical therapy. 1,158 occupational and physical therapists in Wisconsin responded to a mailed survey, from a total of 3,297 OTs and PTs randomly selected from the State licensure list. The study used a cross-sectional, survey design. Participants reported information about WRI they sustained between 2004 and 2006, including the activities they were performing when injured. Investigators analyzed 248 injury incidents using qualitative and quantitative analysis. Data were examined across OT and PT practice in general, and also by practice area. Manual therapy and transfers/lifts were associated with 54% of all injuries. Other activities associated with injury were distinct to practice area, for example: floor work in pediatrics; functional activities in acute care; patient falls in skilled nursing facilities; and motor vehicle activities in home care. Injury prevention activities must address transfers and manual therapy, but also must examine setting-specific activities influenced by environment and patient population.
[Evaluation standards and application for photography of schistosomiasis control theme].
Chun-Li, Cao; Qing-Biao, Hong; Jing-Ping, Guo; Fang, Liu; Tian-Ping, Wang; Jian-Bin, Liu; Lin, Chen; Hao, Wang; You-Sheng, Liang; Jia-Gang, Guo
2018-02-26
To establish and apply evaluation standards for schistosomiasis control themed photography, so as to offer scientific advice for enriching the health information carriers of schistosomiasis control. The evaluation standard for schistosomiasis control themed photography was formulated through literature review and expert consultation. The themes were divided into 4 categories: new construction, natural scenery, working scenes, and control achievements. The evaluation criteria were weighted as theme (60%), photographic composition (15%), focus and exposure (15%), and color saturation (10%). A total of 495 pictures (sets) by 77 authors from 59 units were collected from schistosomiasis endemic areas nationwide. After first-step screening and second-step evaluation, prizes were awarded in 3 theme groups (control achievements and new construction, working scenes, and natural scenery): 6 first prizes, 12 second prizes, 18 third prizes, and 20 honorable mentions. Evaluation standards for theme photography should take into consideration both the technical elements of photography and the work specifications of schistosomiasis prevention and control. To improve the ability to document schistosomiasis control work for publicity purposes and to better guide correct publicity, training and guidance in photography should be provided to professionals.
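The stated judging weights reduce to a simple weighted sum. A minimal sketch follows; the 0-100 marking scale and the example marks are hypothetical choices for illustration, not taken from the article.

```python
# Weighted score for the four judging criteria and their stated weights
# (theme 60%, composition 15%, focus/exposure 15%, color saturation 10%).
weights = {"theme": 0.60, "composition": 0.15,
           "focus_exposure": 0.15, "color_saturation": 0.10}

def photo_score(marks):
    """marks: criterion -> mark on a 0-100 scale (hypothetical scale)."""
    return sum(weights[c] * marks[c] for c in weights)

example = {"theme": 90, "composition": 80,
           "focus_exposure": 70, "color_saturation": 85}
# 0.60*90 + 0.15*80 + 0.15*70 + 0.10*85 = 85.0
print(photo_score(example))
```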
Promising technological innovations in cognitive training to treat eating-related behavior.
Forman, Evan M; Goldstein, Stephanie P; Flack, Daniel; Evans, Brittney C; Manasse, Stephanie M; Dochat, Cara
2018-05-01
One potential reason for the suboptimal outcomes of treatments targeting appetitive behavior, such as eating and alcohol consumption, is that they do not target the implicit cognitive processes that may be driving these behaviors. Two groups of related neurocognitive processes that are robustly associated with dysregulated eating and drinking are attention bias (AB; selective attention to specific stimuli) and executive function (EF; a set of cognitive control processes, such as inhibitory control, working memory, and set shifting, that govern goal-directed behaviors). An increasing body of work suggests that EF and AB training programs improve regulation of appetitive behaviors, especially if trainings are frequent and sustained. However, several key challenges, such as adherence to the trainings in the long term, and overall potency of the training, remain. The current manuscript describes five technological innovations that have the potential to address difficulties related to the effectiveness and feasibility of EF and AB trainings: (1) deployment of training in the home, (2) training via smartphone, (3) gamification, (4) virtual reality, and (5) personalization. The drawbacks of these innovations, as well as areas for future research, are also discussed. The above-mentioned innovations are likely to be instrumental in future empirical work to develop and evaluate effective EF and AB trainings for appetitive behaviors. Copyright © 2017 Elsevier Ltd. All rights reserved.
Demystifying Multitask Deep Neural Networks for Quantitative Structure-Activity Relationships.
Xu, Yuting; Ma, Junshui; Liaw, Andy; Sheridan, Robert P; Svetnik, Vladimir
2017-10-23
Deep neural networks (DNNs) are complex computational models that have found great success in many artificial intelligence applications, such as computer vision [1, 2] and natural language processing [3, 4]. In the past four years, DNNs have also generated promising results for quantitative structure-activity relationship (QSAR) tasks [5, 6]. Previous work showed that DNNs can routinely make better predictions than traditional methods, such as random forests, on a diverse collection of QSAR data sets. It was also found that multitask DNN models (those trained on and predicting multiple QSAR properties simultaneously) outperform DNNs trained separately on the individual data sets in many, but not all, tasks. To date there has been no satisfactory explanation of why the QSAR of one task embedded in a multitask DNN can borrow information from other unrelated QSAR tasks. Thus, using multitask DNNs in a way that consistently provides a predictive advantage becomes a challenge. In this work, we explored why multitask DNNs make a difference in predictive performance. Our results show that during prediction a multitask DNN does borrow "signal" from molecules with similar structures in the training sets of the other tasks. However, whether this borrowing leads to better or worse predictive performance depends on whether the activities are correlated. On the basis of this, we have developed a strategy to use multitask DNNs that incorporate prior domain knowledge to select training sets with correlated activities, and we demonstrate its effectiveness on several examples.
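The borrowing mechanism described above, a shared representation trained by gradients from several tasks at once, can be illustrated with a small numpy sketch. The two-task synthetic data, layer sizes, and learning rate are all illustrative assumptions, not the paper's DNN setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for multitask QSAR: two tasks whose "activities" are
# correlated functions of a shared molecular-descriptor vector.
n, d, h = 200, 16, 8
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y_tasks = [X @ w_true + 0.1 * rng.normal(size=n),           # task 1
           0.9 * (X @ w_true) + 0.1 * rng.normal(size=n)]   # correlated task 2

# Shared tanh trunk with one linear head per task: the trunk weights W
# receive gradients from both tasks, which is where "borrowing" happens.
W = 0.1 * rng.normal(size=(d, h))
heads = [0.1 * rng.normal(size=h), 0.1 * rng.normal(size=h)]
lr = 0.01

for step in range(500):
    H = np.tanh(X @ W)                    # shared representation
    dW = np.zeros_like(W)
    for t, y in enumerate(y_tasks):
        err = H @ heads[t] - y            # per-task residual
        # backprop through the shared trunk: every task contributes to dW
        dW += X.T @ ((err[:, None] * heads[t][None, :]) * (1 - H**2)) / n
        heads[t] = heads[t] - lr * H.T @ err / n
    W -= lr * dW

H = np.tanh(X @ W)
mse1 = np.mean((H @ heads[0] - y_tasks[0]) ** 2)
print(f"task-1 training MSE after joint training: {mse1:.3f}")
```

Because the two synthetic activities are correlated, the second task's gradients push the shared trunk in a direction that also helps task 1; flipping the sign of the second target would illustrate the harmful case the authors describe.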
NASA Astrophysics Data System (ADS)
Vollmer, B.; Ostrenga, D.; Johnson, J. E.; Savtchenko, A. K.; Shen, S.; Teng, W. L.; Wei, J. C.
2013-12-01
Digital Object Identifiers (DOIs) are applied to selected data sets at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC). The DOI system provides an Internet resolution service for unique and persistent identifiers of digital objects. Products assigned DOIs include data from the NASA MEaSUREs Program, the Earth Observing System (EOS) Aqua Atmospheric Infrared Sounder (AIRS) and EOS Aura High Resolution Dynamics Limb Sounder (HIRDLS). DOIs are acquired and registered through EZID, California Digital Library and DataCite. GES DISC hosts a data set landing page associated with each DOI containing information on and access to the data including a recommended data citation when using the product in research or applications. This work includes participation with the earth science community (e.g., Earth Science Information Partners (ESIP) Federation) and the NASA Earth Science Data and Information System (ESDIS) Project to identify, establish and implement best practices for assigning DOIs and managing supporting information, including metadata, for earth science data sets. Future work includes (1) coordination with NASA mission Science Teams and other data providers on the assignment of DOIs for other GES DISC data holdings, particularly for future missions such as Orbiting Carbon Observatory -2 and -3 (OCO-2, OCO-3) and projects (MEaSUREs 2012), (2) construction of landing pages that are both human and machine readable, and (3) pursuing the linking of data and publications with tools such as the Thomson Reuters Data Citation Index.
Natural selection and self-organization in complex adaptive systems.
Di Bernardo, Mirko
2010-01-01
The central theme of this work is self-organization, interpreted both from the point of view of theoretical biology and from a philosophical point of view. By analysing, on the one hand, what are now considered (not only in physics) some of the most important discoveries, namely complex systems and deterministic chaos, and, on the other hand, the new frontiers of systems biology, this work highlights how large open thermodynamic systems can spontaneously stay in an orderly regime. Such systems can represent the natural source of the order required for stable self-organization, for homoeostasis, and for hereditary variation. The order emerging in enormous randomly interconnected networks of binary variables is almost certainly only the precursor of similar orders emerging in all varieties of complex systems. Hence, by finding new foundations for the order pervading the living world, this work advances the daring hypothesis that Darwinian natural selection is not the only source of order in the biosphere. Thus, by examining the passage from Prigogine's dissipative structures theory to the contemporary theory of biological complexity, the article highlights the development of a coherent and continuous line of research that seeks to identify the general principles underlying the profound reality of the mysterious self-organization characterizing the complexity of life.
2013-01-01
Background Effective interventions among female sex workers require a thorough knowledge of the context of local sex industries. We explore the organisation of female sex work in a low socio-economic setting in Kampala, Uganda. Methods We conducted a qualitative study with 101 participants selected from an epidemiological cohort of 1027 women at high risk of HIV in Kampala. Repeat in-depth life history and work practice interviews were conducted from March 2010 to June 2011. Context specific factors of female sex workers’ day-to-day lives were captured. Reported themes were identified and categorised inductively. Results Of the 101 women, 58 were active self-identified sex workers operating in different locations within the area of study and nine had quit sex work. This paper focuses on these 67 women who gave information about their involvement in sex work. The majority had not gone beyond primary level of education and all had at least one child. Thirty one voluntarily disclosed that they were HIV-positive. Common sex work locations were streets/roadsides, bars and night clubs. Typically sex occurred in lodges near bars/night clubs, dark alleyways or car parking lots. Overall, women experienced sex work-related challenges at their work locations but these were more apparent in outdoor settings. These settings exposed women to violence, visibility to police, a stigmatising public as well as competition for clients, while bars provided some protection from these challenges. Older sex workers tended to prefer bars while the younger ones were mostly based on the streets. Alcohol consumption was a feature in all locations and women said it gave them courage and helped them to withstand the night chill. Condom use was determined by clients’ willingness, a woman’s level of sobriety or price offered. Conclusions Sex work operates across a variety of locations in the study area in Kampala, with each presenting different strategies and challenges for those operating there. 
Risky practices are present in all locations although they are higher on the streets compared to other locations. Location specific interventions are required to address the complex challenges in sex work environments. PMID:23938037
NASA Astrophysics Data System (ADS)
Pagnuco, Inti A.; Pastore, Juan I.; Abras, Guillermo; Brun, Marcel; Ballarin, Virginia L.
2016-04-01
It is usually assumed that co-expressed genes suggest co-regulation in the underlying regulatory network. Determining sets of co-expressed genes is an important task, in which significant groups of genes are defined based on some criteria. This task is usually performed by clustering algorithms, where the whole family of genes, or a subset of them, is clustered into meaningful groups based on their expression values in a set of experiments. In this work we used a methodology based on the Silhouette index as a measure of cluster quality for individual gene groups, with a combination of several variants of hierarchical clustering to generate the candidate groups, to obtain sets of co-expressed genes for two real data examples. We analyzed the quality of the best-ranked groups obtained by the algorithm using an online bioinformatics tool that provides network information for the selected genes. Moreover, to verify the performance of the algorithm, given that it does not search all possible subsets, we compared its results against a full search to determine the number of good co-regulated sets not detected.
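The described pipeline, candidate groupings from several hierarchical-clustering variants scored by the Silhouette index, can be sketched as follows. This assumes SciPy and scikit-learn, and a synthetic expression matrix stands in for the paper's real data; the linkage methods and cluster counts searched are illustrative.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.metrics import silhouette_samples, silhouette_score

rng = np.random.default_rng(0)

# Synthetic "expression matrix": 60 genes x 10 experiments, built from
# three groups of co-expressed genes.
base = rng.normal(size=(3, 10))
expr = np.vstack([base[i] + 0.2 * rng.normal(size=(20, 10)) for i in range(3)])

best = None
# Candidate groupings from several hierarchical-clustering variants.
for method in ("average", "complete", "ward"):
    Z = linkage(expr, method=method)
    for k in (2, 3, 4, 5):
        labels = fcluster(Z, t=k, criterion="maxclust")
        overall = silhouette_score(expr, labels)
        # Per-gene silhouettes let each candidate group be scored on its own.
        per_gene = silhouette_samples(expr, labels)
        group_scores = {g: per_gene[labels == g].mean() for g in set(labels)}
        if best is None or overall > best[0]:
            best = (overall, method, k, group_scores)

print(f"best: method={best[1]}, k={best[2]}, silhouette={best[0]:.2f}")
```

Ranking individual groups by their mean per-gene silhouette, rather than only the overall score, mirrors the paper's emphasis on cluster quality for individual gene groups.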
PubMed Phrases, an open set of coherent phrases for searching biomedical literature
Kim, Sun; Yeganova, Lana; Comeau, Donald C.; Wilbur, W. John; Lu, Zhiyong
2018-01-01
In biomedicine, key concepts are often expressed by multiple words (e.g., ‘zinc finger protein’). Previous work has shown treating a sequence of words as a meaningful unit, where applicable, is not only important for human understanding but also beneficial for automatic information seeking. Here we present a collection of PubMed® Phrases that are beneficial for information retrieval and human comprehension. We define these phrases as coherent chunks that are logically connected. To collect the phrase set, we apply the hypergeometric test to detect segments of consecutive terms that are likely to appear together in PubMed. These text segments are then filtered using the BM25 ranking function to ensure that they are beneficial from an information retrieval perspective. Thus, we obtain a set of 705,915 PubMed Phrases. We evaluate the quality of the set by investigating PubMed user click data and manually annotating a sample of 500 randomly selected noun phrases. We also analyze and discuss the usage of these PubMed Phrases in literature search. PMID:29893755
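The hypergeometric test for term co-occurrence can be sketched as follows. The counts are invented for illustration, and the document model is deliberately simplified (one slot per document); this is not NCBI's actual segmentation procedure, which works over consecutive terms in PubMed text.

```python
from scipy.stats import hypergeom

def phrase_pvalue(n_docs, n_w1, n_w2, n_both):
    """P(seeing >= n_both co-occurrences by chance) under a hypergeometric
    null: the n_w1 documents containing word 1 are drawn at random from
    n_docs documents, n_w2 of which contain word 2."""
    return hypergeom.sf(n_both - 1, n_docs, n_w2, n_w1)

# Hypothetical counts: a pair like 'zinc finger' co-occurring far more
# often than chance scores as a coherent phrase; expected co-occurrence
# under the null here is 5000 * 4000 / 1e6 = 20.
p_phrase = phrase_pvalue(n_docs=1_000_000, n_w1=5_000, n_w2=4_000, n_both=3_000)
p_random = phrase_pvalue(n_docs=1_000_000, n_w1=5_000, n_w2=4_000, n_both=20)
print(f"phrase-like pair p = {p_phrase:.3g}, chance-level pair p = {p_random:.3g}")
```

A tiny p-value flags the pair as appearing together far more often than the unigram counts predict, which is the signal the authors then filter with BM25.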
Messier, S P; Callahan, L F; Golightly, Y M; Keefe, F J
2015-05-01
The objective was to develop a set of "best practices" for use as a primer for those interested in entering the clinical trials field for lifestyle diet and/or exercise interventions in osteoarthritis (OA), and as a set of recommendations for experienced clinical trials investigators. A subcommittee of the non-pharmacologic therapies committee of the OARSI Clinical Trials Working Group was selected by the Steering Committee to develop a set of recommended principles for non-pharmacologic diet/exercise OA randomized clinical trials. Topics were identified for inclusion by co-authors and reviewed by the subcommittee. Resources included authors' expert opinions, traditional search methods including MEDLINE (via PubMed), and previously published guidelines. Suggested steps and considerations for study methods (e.g., recruitment and enrollment of participants, study design, intervention and assessment methods) were recommended. The recommendations set forth in this paper provide a guide from which a research group can design a lifestyle diet/exercise randomized clinical trial in patients with OA. Copyright © 2015 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
Knacker, T; Schallnaß, H J; Klaschka, U; Ahlers, J
1995-11-01
The criteria for classification and labelling of substances as "dangerous for the environment" agreed upon within the European Union (EU) were applied to two sets of existing chemicals. One set (sample A) consisted of 41 randomly selected compounds listed in the European Inventory of Existing Chemical Substances (EINECS). The other set (sample B) comprised 115 substances listed in Annex I of Directive 67/548/EEC which were classified by the EU Working Group on Classification and Labelling of Existing Chemicals. The aquatic toxicity (fish mortality, Daphnia immobilisation, algal growth inhibition), ready biodegradability and n-octanol/water partition coefficient were measured for sample A by one and the same laboratory. For sample B, the available ecotoxicological data originated from many different sources and therefore was rather heterogeneous. In both samples, algal toxicity was the most sensitive effect parameter for most substances. Furthermore, it was found that classification based on a single aquatic test result differs in many cases from classification based on a complete data set, although a correlation exists between the biological end-points of the aquatic toxicity test systems.
Developing core outcome sets for clinical trials: issues to consider
2012-01-01
The selection of appropriate outcomes or domains is crucial when designing clinical trials in order to compare directly the effects of different interventions in ways that minimize bias. If the findings are to influence policy and practice then the chosen outcomes need to be relevant and important to key stakeholders including patients and the public, health care professionals and others making decisions about health care. There is a growing recognition that insufficient attention has been paid to the outcomes measured in clinical trials. These issues could be addressed through the development and use of an agreed standardized collection of outcomes, known as a core outcome set, which should be measured and reported, as a minimum, in all trials for a specific clinical area. Accumulating work in this area has identified the need for general guidance on the development of core outcome sets. Key issues to consider in the development of a core outcome set include its scope, the stakeholder groups to involve, choice of consensus method and the achievement of a consensus. PMID:22867278
Finite Nuclei in the Quark-Meson Coupling Model.
Stone, J R; Guichon, P A M; Reinhard, P G; Thomas, A W
2016-03-04
We report the first use of the effective quark-meson coupling (QMC) energy density functional (EDF), derived from a quark model of hadron structure, to study a broad range of ground state properties of even-even nuclei across the periodic table in the nonrelativistic Hartree-Fock+BCS framework. The novelty of the QMC model is that the nuclear medium effects are treated through modification of the internal structure of the nucleon. The density dependence is microscopically derived and the spin-orbit term arises naturally. The QMC EDF depends on a single set of four adjustable parameters having a clear physics basis. When applied to diverse ground state data, the QMC EDF already produces, in its present simple form, overall agreement with experiment of a quality comparable to a representative Skyrme EDF. There exist, however, multiple Skyrme parameter sets, frequently tailored to describe selected nuclear phenomena. The smaller QMC EDF parameter set derived in this work is not open to such variation, the chosen set being applied, without adjustment, to both the properties of finite nuclei and nuclear matter.
Nondestructive evaluation of soluble solid content in strawberry by near infrared spectroscopy
NASA Astrophysics Data System (ADS)
Guo, Zhiming; Huang, Wenqian; Chen, Liping; Wang, Xiu; Peng, Yankun
This paper demonstrates the feasibility of using near infrared (NIR) spectroscopy combined with synergy interval partial least squares (siPLS) algorithms as a rapid nondestructive method to estimate the soluble solid content (SSC) of strawberry. Spectral preprocessing methods were selected by cross-validation during model calibration. The partial least squares (PLS) algorithm was used to calibrate the regression model. The performance of the final model was evaluated by the root mean square error of calibration (RMSEC) and correlation coefficient (R^2_c) on the calibration set, and tested by the root mean square error of prediction (RMSEP) and correlation coefficient (R^2_p) on the prediction set. The optimal siPLS model was obtained after first-derivative spectral preprocessing. The best model achieved RMSEC = 0.2259 and R^2_c = 0.9590 on the calibration set, and RMSEP = 0.2892 and R^2_p = 0.9390 on the prediction set. This work demonstrates that NIR spectroscopy and siPLS with efficient spectral preprocessing are a useful tool for nondestructive evaluation of SSC in strawberry.
Machine-assisted discovery of relationships in astronomy
NASA Astrophysics Data System (ADS)
Graham, Matthew J.; Djorgovski, S. G.; Mahabal, Ashish A.; Donalek, Ciro; Drake, Andrew J.
2013-05-01
High-volume feature-rich data sets are becoming the bread-and-butter of 21st century astronomy but present significant challenges to scientific discovery. In particular, identifying scientifically significant relationships between sets of parameters is non-trivial. Similar problems in biological and geosciences have led to the development of systems which can explore large parameter spaces and identify potentially interesting sets of associations. In this paper, we describe the application of automated discovery systems of relationships to astronomical data sets, focusing on an evolutionary programming technique and an information-theory technique. We demonstrate their use with classical astronomical relationships - the Hertzsprung-Russell diagram and the Fundamental Plane of elliptical galaxies. We also show how they work with the issue of binary classification which is relevant to the next generation of large synoptic sky surveys, such as the Large Synoptic Survey Telescope (LSST). We find that comparable results to more familiar techniques, such as decision trees, are achievable. Finally, we consider the reality of the relationships discovered and how this can be used for feature selection and extraction.
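One of the information-theoretic ideas mentioned, scoring a candidate relationship by how much information one parameter carries about another, can be sketched with a histogram estimate of mutual information. The tight linear "relation" below is synthetic, loosely evoking a Hertzsprung-Russell-like locus; the bin count and sample size are arbitrary choices.

```python
import numpy as np

def mutual_info(x, y, bins=20):
    """Histogram estimate of mutual information I(X;Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
n = 20000
temp = rng.uniform(3.0, 4.5, n)                # a log-temperature-like axis
lum = -5.0 * temp + 0.3 * rng.normal(size=n)   # tight linear relation
noise = rng.normal(size=n)                     # unrelated parameter

print(f"I(temp, lum)   = {mutual_info(temp, lum):.2f} nats")
print(f"I(temp, noise) = {mutual_info(temp, noise):.2f} nats")
```

Because mutual information is model-free, the same score would flag a curved or multi-branch locus just as well as this linear one, which is why such measures suit automated relationship discovery.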
NASA Astrophysics Data System (ADS)
Kiran Kumar, Kalla; Nagaraju, Dega; Gayathri, S.; Narayanan, S.
2017-05-01
Priority sequencing rules provide guidance on the order in which jobs are to be processed at a workstation. Applying different priority rules in job shop scheduling yields different schedules, so considerable experimentation is needed before the best priority sequencing rule can be chosen; a comprehensive selection method is therefore essential from a managerial decision-making perspective. This paper considers seven different priority sequencing rules in job shop scheduling. For evaluation and selection of the best priority sequencing rule, a set of eight criteria is considered. The aim of this work is to demonstrate the methodology of evaluating and selecting the best priority sequencing rule using a hybrid multi-criteria decision making (MCDM) technique: the analytic hierarchy process (AHP) combined with the technique for order preference by similarity to ideal solution (TOPSIS). The criteria weights are calculated using AHP, whereas the relative closeness values of all priority sequencing rules are computed with TOPSIS, using data acquired from the shop floor of a manufacturing firm. Finally, from the findings of this work, the priority sequencing rules are ranked from most to least important. The methodology presented in this paper can help the management of a workstation choose the best priority sequencing rule among the available alternatives for processing jobs with maximum benefit.
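The TOPSIS stage of the described hybrid can be sketched directly. The scores, weights, and criterion directions below are invented for illustration (three rules on four criteria rather than the paper's seven rules on eight), and the AHP pairwise-comparison stage that would produce the weights is omitted.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Relative closeness of each alternative (rows) to the ideal solution.
    matrix: alternatives x criteria; weights sum to 1; benefit[j] marks
    'larger is better' criteria (False for cost criteria)."""
    M = matrix / np.linalg.norm(matrix, axis=0)   # vector normalisation
    V = M * weights                               # weighted normalised matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_best = np.linalg.norm(V - ideal, axis=1)
    d_worst = np.linalg.norm(V - worst, axis=1)
    return d_worst / (d_best + d_worst)

# Hypothetical scores for 3 sequencing rules on 4 criteria; weights as if
# produced by an AHP pairwise-comparison stage.
scores = np.array([[7.0, 5.0, 8.0, 6.0],
                   [6.0, 8.0, 5.0, 7.0],
                   [4.0, 4.0, 6.0, 5.0]])
weights = np.array([0.4, 0.3, 0.2, 0.1])
benefit = np.array([True, True, False, True])   # 3rd criterion is a cost
closeness = topsis(scores, weights, benefit)
ranking = np.argsort(-closeness)
print("closeness:", np.round(closeness, 3), "ranking:", ranking)
```

The rule with the largest relative closeness is recommended; note that the third rule here is dominated by the second on every criterion, so TOPSIS necessarily ranks it lower.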
2014-01-01
Optimized extreme learning machine for urban land cover classification using hyperspectral imagery
NASA Astrophysics Data System (ADS)
Su, Hongjun; Tian, Shufang; Cai, Yue; Sheng, Yehua; Chen, Chen; Najafian, Maryam
2017-12-01
This work presents a new urban land cover classification framework using a firefly algorithm (FA) optimized extreme learning machine (ELM). FA is adopted to optimize the regularization coefficient C and the Gaussian kernel parameter σ for kernel ELM. Additionally, the effectiveness of spectral features derived from an FA-based band selection algorithm is studied for the proposed classification task. Three hyperspectral data sets recorded by different sensors, namely HYDICE, HyMap, and AVIRIS, were used. Our study shows that the proposed method outperforms traditional classification algorithms such as SVM and reduces computational cost significantly.
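A minimal ELM sketch shows the role of the regularization coefficient C that the firefly algorithm tunes; the FA search itself and the kernel variant are omitted, and the three-class "land cover" data with eight spectral bands are synthetic stand-ins for hyperspectral imagery.

```python
import numpy as np

rng = np.random.default_rng(0)

# Basic (kernel-free) ELM: a random hidden layer, then a regularised
# least-squares solve for the output weights.
def elm_fit(X, y_onehot, n_hidden=100, C=1.0):
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                        # hidden activations
    # Ridge solution: beta = (H'H + I/C)^-1 H'Y; C is what FA would tune.
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ y_onehot)
    return W, b, beta

def elm_predict(X, model):
    W, b, beta = model
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

# Toy 3-class "land cover" problem with 8 spectral bands (synthetic).
n, d, k = 300, 8, 3
centers = 2.0 * rng.normal(size=(k, d))
labels = rng.integers(0, k, n)
X = centers[labels] + 0.5 * rng.normal(size=(n, d))
Y = np.eye(k)[labels]

model = elm_fit(X, Y, n_hidden=100, C=10.0)
acc = np.mean(elm_predict(X, model) == labels)
print(f"training accuracy: {acc:.2f}")
```

An FA (or any global optimizer) would wrap `elm_fit` in a loop, evaluating candidate (C, σ) pairs by validation accuracy; only the inner model is shown here.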
Peace through health II: a framework for medical student education.
Arya, Neil
2004-01-01
The world's first university course in Peace through Health (PtH) recently finished at McMaster University, Hamilton, Canada. Medical students and academic staff in Canada and Europe have expressed interest in developing this course for other medical schools. Seven medical students were selected to do an unofficial 'audit' in return for 'in kind' work, developing the course materials for the web and adapting them to the medical curriculum. This article sets out the goals and structure of the course as a guide for similar teaching models.
2006-03-01
… represented by the set of tiles that it lies in. A variation on tile coding is Berenji and Vengerov's [4, 5] use of fuzzy state aggregation (FSA) as a means … approximation with Q-learning is not a new or unusual concept [1, 3]. Berenji and Vengerov [4, 5] advanced this work in their application of Q-learning and … Berenji and Vengerov [4, 5]. The simplified Tileworld consists of agents, reward spikes, and deformations. The agent must select which reward to pursue …
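The tile-coding representation the fragment refers to can be sketched as follows; the tiling counts and state ranges are illustrative assumptions, and the cited work layers fuzzy state aggregation on top of a similar idea. Each of several offset tilings maps a continuous state to one active tile, and a linear Q-value is the sum of the weights of the active tiles.

```python
import numpy as np

def active_tiles(state, n_tilings=4, tiles_per_dim=8, low=0.0, high=1.0):
    """Return one active tile index per tiling for a 2-D state in [low, high]^2."""
    scaled = (np.asarray(state, dtype=float) - low) / (high - low) * tiles_per_dim
    tiles = []
    for t in range(n_tilings):
        offset = t / n_tilings  # each tiling is shifted slightly
        ix = int(min(scaled[0] + offset, tiles_per_dim - 1e-9))
        iy = int(min(scaled[1] + offset, tiles_per_dim - 1e-9))
        tiles.append(t * tiles_per_dim ** 2 + ix * tiles_per_dim + iy)
    return tiles

# One weight per tile across all tilings; Q-learning would update these
weights = np.zeros(4 * 8 * 8)

def q_value(state):
    """Linear value estimate: sum of the weights of the active tiles."""
    return sum(weights[i] for i in active_tiles(state))
```

A Q-learning update would then adjust only the handful of weights indexed by `active_tiles`, which is what makes the representation cheap.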
Women's experiences in the engineering laboratory in Japan
NASA Astrophysics Data System (ADS)
Hosaka, Masako
2014-07-01
This qualitative study examines Japanese women undergraduate engineering students' experiences of interacting with same-year departmental peers in the laboratory setting, using interview data from 32 final-year students at two modestly selective national universities in Japan. Expectation states theory, which explains unequal status relationships between men and women, is used as a framework. Findings suggest that women generally had discouraging experiences while working with their male peers. Specifically, women participated less and lost confidence when comparing themselves with men who appeared confident and competent.
CrowdPhase: crowdsourcing the phase problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jorda, Julien; Sawaya, Michael R.; Yeates, Todd O., E-mail: yeates@mbi.ucla.edu
The idea of attacking the phase problem by crowdsourcing is introduced. Using an interactive, multi-player, web-based system, participants work simultaneously to select phase sets that correspond to better electron-density maps in order to solve low-resolution phasing problems. The human mind innately excels at some complex tasks that are difficult to solve using computers alone. For complex problems amenable to parallelization, strategies can be developed to exploit human intelligence in a collective form: such approaches are sometimes referred to as ‘crowdsourcing’. Here, a first attempt at a crowdsourced approach for low-resolution ab initio phasing in macromolecular crystallography is proposed. A collaborative online game named CrowdPhase was designed, which relies on a human-powered genetic algorithm, where players control the selection mechanism during the evolutionary process. The algorithm starts from a population of ‘individuals’, each with a random genetic makeup, in this case a map prepared from a random set of phases, and tries to cause the population to evolve towards individuals with better phases based on Darwinian survival of the fittest. Players apply their pattern-recognition capabilities to evaluate the electron-density maps generated from these sets of phases and to select the fittest individuals. A user-friendly interface, a training stage and a competitive scoring system foster a network of well trained players who can guide the genetic algorithm towards better solutions from generation to generation via gameplay. CrowdPhase was applied to two synthetic low-resolution phasing puzzles and it was shown that players could successfully obtain phase sets in the 30° phase error range and corresponding molecular envelopes showing agreement with the low-resolution models.
The successful preliminary studies suggest that with further development the crowdsourcing approach could fill a gap in current crystallographic methods by making it possible to extract meaningful information in cases where limited resolution might otherwise prevent initial phasing.
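The human-powered genetic algorithm behind CrowdPhase can be sketched as follows. In the real game the players judge the electron-density maps; here a synthetic scoring function against a hidden target phase set stands in for that human judgment, and the population size, mutation scale, and generation count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N_PHASES, POP = 64, 12

# Each individual is a set of phases in [0, 2*pi); the hidden "true" phase
# set below stands in for the players' map evaluations (demo assumption).
true_phases = rng.uniform(0, 2 * np.pi, N_PHASES)
population = rng.uniform(0, 2 * np.pi, (POP, N_PHASES))

def mean_phase_error(ind):
    """Circular mean absolute phase error vs. the hidden target, in degrees."""
    d = np.angle(np.exp(1j * (ind - true_phases)))
    return np.degrees(np.abs(d)).mean()

initial_error = min(mean_phase_error(ind) for ind in population)

for generation in range(200):
    # Selection: keep the fitter half (the players' role in the real game)
    order = np.argsort([mean_phase_error(ind) for ind in population])
    parents = population[order[: POP // 2]]
    children = []
    for _ in range(POP - len(parents)):
        # Crossover: each child mixes phases from two random parents
        a, b = parents[rng.integers(len(parents), size=2)]
        child = np.where(rng.random(N_PHASES) < 0.5, a, b)
        # Mutation: small random phase perturbations, wrapped to [0, 2*pi)
        child = (child + rng.normal(0, 0.1, N_PHASES)) % (2 * np.pi)
        children.append(child)
    population = np.vstack([parents, children])

best_error = min(mean_phase_error(ind) for ind in population)
```

Because the fitter half survives unchanged each generation, the best phase error can only decrease; swapping the scoring function for interactive player votes recovers the human-in-the-loop design described above.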