Sample records for model-based assignment tests

  1. Comparisons of node-based and element-based approaches of assigning bone material properties onto subject-specific finite element models.

    PubMed

    Chen, G; Wu, F Y; Liu, Z C; Yang, K; Cui, F

    2015-08-01

    Subject-specific finite element (FE) models can be generated from computed tomography (CT) datasets of a bone. A key step is assigning material properties automatically onto finite element models, which remains a great challenge. This paper proposes a node-based assignment approach and compares it with the element-based approach in the literature. Both approaches were implemented using ABAQUS. The assignment procedure is divided into two steps: generating the data file of the image intensity of a bone in a MATLAB program and reading the data file into ABAQUS via user subroutines. The node-based approach assigns the material properties to each node of the finite element mesh, while the element-based approach assigns the material properties directly to each integration point of an element. Both approaches are independent of the element type. A number of FE meshes were tested and both give accurate solutions; comparatively, the node-based approach involves less programming effort. The node-based approach is also independent of the analysis type; it has been tested on the nonlinear analysis of a Sawbone femur. The node-based approach substantially improves the level of automation of the assignment procedure for bone material properties. It is the simplest and most powerful approach and is applicable to many types of analyses and elements. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
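The intensity-to-property mapping described in this record can be sketched as follows; the calibration constants and power-law coefficients are illustrative assumptions, not the paper's values, and the actual implementation runs inside ABAQUS user subroutines.

```python
# Sketch of node-based material-property assignment (illustrative constants;
# the paper's actual calibration and ABAQUS subroutine interface differ).

def hu_to_density(hu):
    """Map a CT Hounsfield value to apparent bone density (g/cm^3).
    The linear calibration coefficients here are hypothetical."""
    return max(0.0, 0.001 * hu + 0.1)

def density_to_modulus(rho):
    """Power-law density-to-Young's-modulus relation (MPa); the exponent and
    coefficient are typical literature values, not the paper's."""
    return 6850.0 * rho ** 1.49

def assign_node_properties(node_hu):
    """Assign a modulus to every node of the FE mesh from its image intensity."""
    return {node: density_to_modulus(hu_to_density(hu))
            for node, hu in node_hu.items()}

props = assign_node_properties({1: 1200, 2: 800, 3: -50})
```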

  2. Nurse-patient assignment models considering patient acuity metrics and nurses' perceived workload.

    PubMed

    Sir, Mustafa Y; Dundar, Bayram; Barker Steege, Linsey M; Pasupathy, Kalyan S

    2015-06-01

    Patient classification systems (PCSs) are commonly used in nursing units to assess how many nursing care hours are needed to care for patients. These systems then provide staffing and nurse-patient assignment recommendations for a given patient census based on these acuity scores. Our hypothesis is that such systems do not accurately capture workload, and we conducted an experiment to test this hypothesis. Specifically, we conducted a survey study to capture nurses' perception of workload in an inpatient unit. Forty-five nurses from oncology and surgery units completed the survey and rated the impact of patient acuity indicators on their perceived workload using a six-point Likert scale. These ratings were used to calculate a workload score for an individual nurse given a set of patient acuity indicators. The approach offers optimization models (prescriptive analytics), which use patient acuity indicators from a commercial PCS as well as a survey-based nurse workload score. The models assign patients to nurses in a balanced manner by distributing acuity scores from the PCS and survey-based perceived workload. Numerical results suggest that the proposed nurse-patient assignment models achieve a balanced assignment and lower overall survey-based perceived workload compared to the assignment based solely on acuity scores from the PCS. This results in an improvement in perceived workload of upwards of five percent. Copyright © 2015 Elsevier Inc. All rights reserved.
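The balanced-assignment idea in this record can be illustrated with a minimal greedy heuristic; the paper itself uses optimization models, and the patient scores below are invented.

```python
# Greedy sketch of balanced nurse-patient assignment: each patient (with an
# acuity/workload score) goes to the currently least-loaded nurse. Only the
# balancing idea is shown, not the paper's optimization models.

def balanced_assignment(patient_scores, n_nurses):
    """Return per-nurse patient lists, roughly balancing total workload score."""
    loads = [0.0] * n_nurses
    assignment = [[] for _ in range(n_nurses)]
    # Assigning heavier patients first improves balance for a greedy rule.
    for pid, score in sorted(patient_scores.items(), key=lambda kv: -kv[1]):
        k = min(range(n_nurses), key=lambda i: loads[i])
        loads[k] += score
        assignment[k].append(pid)
    return assignment, loads

assignment, loads = balanced_assignment(
    {"p1": 8, "p2": 6, "p3": 5, "p4": 4, "p5": 3, "p6": 2}, 3)
```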

  3. Towards Automated Structure-Based NMR Resonance Assignment

    NASA Astrophysics Data System (ADS)

    Jang, Richard; Gao, Xin; Li, Ming

    We propose a general framework for solving the structure-based NMR backbone resonance assignment problem. The core is a novel 0-1 integer programming model that can start from a complete or partial assignment, generate multiple assignments, and model not only the assignment of spins to residues, but also pairwise dependencies consisting of pairs of spins to pairs of residues. It is still a challenge for automated resonance assignment systems to perform the assignment directly from spectra without any manual intervention. To test the feasibility of this for structure-based assignment, we integrated our system with our automated peak picking and sequence-based resonance assignment system to obtain an assignment for the protein TM1112 with 91% recall and 99% precision without manual intervention. Since using a known structure has the potential to allow one to use only N-labeled NMR data and avoid the added expense of using C-labeled data, we work towards the goal of automated structure-based assignment using only such labeled data. Our system reduced the assignment error of Xiong-Pandurangan-Bailey-Kellogg's contact replacement (CR) method, which to our knowledge is the most error-tolerant method for this problem, fivefold on average. By using an iterative algorithm, our system has the added capability of using the NOESY data to correct assignment errors due to errors in predicting the amino acid and secondary structure type of each spin system. On a publicly available data set for Ubiquitin, where the type prediction accuracy is 83%, we achieved 91% assignment accuracy, compared to the 59% accuracy that was obtained without correcting for typing errors.

  4. Ab Initio structure prediction for Escherichia coli: towards genome-wide protein structure modeling and fold assignment

    PubMed Central

    Xu, Dong; Zhang, Yang

    2013-01-01

    Genome-wide protein structure prediction and structure-based function annotation have been a long-term goal in molecular biology but have not yet become possible due to difficulties in modeling distant-homology targets. We developed a hybrid pipeline combining ab initio folding and template-based modeling for genome-wide structure prediction, applied to the Escherichia coli genome. The pipeline was tested on 43 known sequences, where QUARK-based ab initio folding simulations generated models with TM-scores 17% higher than those from traditional comparative modeling methods. For 495 unknown hard sequences, 72 are predicted to have a correct fold (TM-score > 0.5) and 321 have a substantial portion of structure correctly modeled (TM-score > 0.35). 317 sequences can be reliably assigned to a SCOP fold family based on structural analogy to existing proteins in the PDB. The presented results, as a case study of E. coli, represent promising progress towards genome-wide structure modeling and fold family assignment using state-of-the-art ab initio folding algorithms. PMID:23719418

  5. Example-based learning: effects of model expertise in relation to student expertise.

    PubMed

    Boekhout, Paul; van Gog, Tamara; van de Wiel, Margje W J; Gerards-Last, Dorien; Geraets, Jacques

    2010-12-01

    Worked examples are very effective for novice learners. They typically present a written-out ideal (didactical) solution for learners to study. This study used worked examples of patient history taking in physiotherapy that presented a non-didactical solution (i.e., one based on actual performance). The effects of model expertise (i.e., worked examples based on an advanced, third-year student model or an expert physiotherapist model) in relation to students' expertise (i.e., first- or second-year) were investigated. Participants were 134 physiotherapy students (61 first-year and 73 second-year). The design was a 2 × 2 factorial with factors 'Student Expertise' (first-year vs. second-year) and 'Model Expertise' (expert vs. advanced student). Within expertise levels, students were randomly assigned to the Expert Example or the Advanced Student Example condition. All students studied two examples (content depending on their assigned condition) and then completed retention and transfer tasks. They rated their invested mental effort after each example and test task. Second-year students invested less mental effort in studying the examples, and in performing the retention and transfer tasks, than first-year students. They also performed better on the retention test, but not on the transfer test. In contrast to our hypothesis, there was no interaction between student expertise and model expertise: all students who had studied the Expert Examples performed better on the transfer test than students who had studied the Advanced Student Examples. This study suggests that when worked examples are based on actual performance, rather than an ideal procedure, expert models are to be preferred over advanced student models.

  6. Optimum use of air tankers in initial attack: selection, basing, and transfer rules

    Treesearch

    Francis E. Greulich; William G. O'Regan

    1982-01-01

    Fire managers face two interrelated problems in deciding the most efficient use of air tankers: where best to base them, and how best to reallocate them each day in anticipation of fire occurrence. A computerized model based on a mixed integer linear program can help in assigning air tankers throughout the fire season. The model was tested using information from...

  7. Integer Linear Programming for Constrained Multi-Aspect Committee Review Assignment

    PubMed Central

    Karimzadehgan, Maryam; Zhai, ChengXiang

    2011-01-01

    Automatic review assignment can significantly improve the productivity of many people such as conference organizers, journal editors and grant administrators. A general setup of the review assignment problem involves assigning a set of reviewers on a committee to a set of documents to be reviewed under the constraint of review quota so that the reviewers assigned to a document can collectively cover multiple topic aspects of the document. No previous work has addressed such a setup of committee review assignments while also considering matching multiple aspects of topics and expertise. In this paper, we tackle the problem of committee review assignment with multi-aspect expertise matching by casting it as an integer linear programming problem. The proposed algorithm can naturally accommodate any probabilistic or deterministic method for modeling multiple aspects to automate committee review assignments. Evaluation using a multi-aspect review assignment test set constructed using ACM SIGIR publications shows that the proposed algorithm is effective and efficient for committee review assignments based on multi-aspect expertise matching. PMID:22711970
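A hedged sketch of the quota-constrained matching this record describes; the paper solves the problem exactly as an integer linear program, whereas this greedy version, and its expertise scores, are only illustrative.

```python
# Sketch of quota-constrained review assignment: for each paper, greedily pick
# its best-matching reviewers subject to a per-reviewer quota. The paper casts
# this as integer linear programming; the greedy version is a stand-in.

def assign_reviewers(match, quota, per_paper):
    """match[paper][reviewer] = expertise score; returns paper -> reviewer list."""
    load = {r: 0 for p in match for r in match[p]}
    result = {}
    for paper, scores in match.items():
        eligible = sorted((r for r in scores if load[r] < quota),
                          key=lambda r: -scores[r])
        chosen = eligible[:per_paper]
        for r in chosen:
            load[r] += 1
        result[paper] = chosen
    return result

assignment = assign_reviewers(
    {"paperA": {"r1": 0.9, "r2": 0.4, "r3": 0.2},
     "paperB": {"r1": 0.8, "r2": 0.7, "r3": 0.1}},
    quota=1, per_paper=2)
```

Note the greedy order matters: once r1 and r2 hit their quota on paperA, only r3 remains for paperB, which an ILP with a global objective could trade off differently.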

  8. The importance of explicitly mapping instructional analogies in science education

    NASA Astrophysics Data System (ADS)

    Asay, Loretta Johnson

    Analogies are ubiquitous during instruction in science classrooms, yet research about the effectiveness of using analogies has produced mixed results. An aspect seldom studied is a model of instruction when using analogies. The few existing models for instruction with analogies have not often been examined quantitatively. The Teaching With Analogies (TWA) model (Glynn, 1991) is one of the models frequently cited in the variety of research about analogies. The TWA model outlines steps for instruction, including the step of explicitly mapping the features of the source to the target. An experimental study was conducted to examine the effects of explicitly mapping the features of the source and target in an analogy during computer-based instruction about electrical circuits. Explicit mapping was compared to no mapping and to a control with no analogy. Participants were ninth- and tenth-grade biology students who were each randomly assigned to one of three conditions (no analogy module, analogy module, or explicitly mapped analogy module) for computer-based instruction. Subjects took a pre-test before the instruction, which was used to assign them to a level of previous knowledge about electrical circuits for analysis of any differential effects. After the instruction modules, students took a post-test about electrical circuits. Two weeks later, they took a delayed post-test. No advantage was found for explicitly mapping the analogy. Learning patterns were the same, regardless of the type of instruction. Those who knew the least about electrical circuits, based on the pre-test, made the most gains. After the two-week delay, this group maintained the largest amount of their gain. Implications exist for science education classrooms, as analogy use should be based on research about effective practices. Further studies are suggested to foster the building of research-based models for classroom instruction with analogies.

  9. Understanding Test-Type Assignment: Why Do Special Educators Make Unexpected Test-Type Assignments?

    ERIC Educational Resources Information Center

    Cho, Hyun-Jeong; Kingston, Neal

    2014-01-01

    We interviewed special educators (a) whose students with disabilities (SWDs) were proficient on the 2008 general education assessment but were assigned to the 2009 alternate assessment based on modified achievement standards (AA-MAS), and (b) whose students with mild disabilities took the 2008 alternate assessment based on alternate achievement…

  10. Task Assignment Heuristics for Parallel and Distributed CFD Applications

    NASA Technical Reports Server (NTRS)

    Lopez-Benitez, Noe; Djomehri, M. Jahed; Biswas, Rupak

    2003-01-01

    This paper proposes a task graph (TG) model to represent a single discrete step of multi-block overset grid computational fluid dynamics (CFD) applications. The TG model is then used not only to balance the computational workload across the overset grids but also to reduce inter-grid communication costs. We have developed a set of task assignment heuristics based on the constraints inherent in this class of CFD problems. Two basic assignments, the smallest task first (STF) and the largest task first (LTF), are first presented. They are then systematically extended to account for communication costs. To predict the performance of the proposed task assignment heuristics, extensive performance evaluations are conducted on a synthetic TG with tasks defined in terms of the number of grid points in predetermined overlapping grids. A TG derived from a realistic problem with eight million grid points is also used as a test case.
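The STF and LTF heuristics named in this record can be sketched as follows, ignoring the communication-cost extensions; the task sizes are invented.

```python
# Sketch of the smallest-task-first (STF) and largest-task-first (LTF)
# heuristics: tasks (e.g., grid-point counts) are placed, in sorted order,
# on the currently least-loaded processor.

def assign_tasks(sizes, n_procs, largest_first):
    """Return the per-processor load after greedy placement."""
    loads = [0] * n_procs
    for s in sorted(sizes, reverse=largest_first):
        k = min(range(n_procs), key=lambda i: loads[i])
        loads[k] += s
    return loads

tasks = [70, 50, 40, 30, 20, 10]
ltf = assign_tasks(tasks, 2, largest_first=True)   # -> [110, 110]
stf = assign_tasks(tasks, 2, largest_first=False)  # -> [90, 130]
```

LTF is the classic longest-processing-time rule and typically yields the tighter balance, which is why placing large tasks first is the usual default.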

  11. Visual Persons Behavior Diary Generation Model based on Trajectories and Pose Estimation

    NASA Astrophysics Data System (ADS)

    Gang, Chen; Bin, Chen; Yuming, Liu; Hui, Li

    2018-03-01

    Behavior patterns of persons are an important output of surveillance analysis. This paper focuses on a generation model for visual person behavior diaries. The pipeline includes person detection, tracking, and behavior classification. The deep convolutional neural model YOLO (You Only Look Once) v2 is adopted for the person detection module, and multi-person tracking is built on this detection framework. The person appearance model combines an HSV color model with a hash code model, and object motion is estimated with a Kalman filter. Detections are matched to existing tracklets through appearance and motion-location distances using the Hungarian assignment algorithm. A long continuous trajectory for each person is obtained with a spatial-temporal continual linking algorithm, and face recognition information is used to identify the trajectory. The identified trajectories can then be used to generate a visual diary of person behavior based on scene context information and person action estimation. The relevant modules were tested on public data sets and on our own captured video sets. The test results show that the method can generate visual person behavior diaries with reasonable accuracy.
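The Hungarian matching step mentioned in this record can be illustrated with a brute-force linear-assignment stand-in, adequate only for the handful of objects in a single frame; the cost matrix below is invented.

```python
# Minimal stand-in for the Hungarian step in detection-to-tracklet matching:
# exhaustively search permutations for the minimum-cost assignment. Real
# trackers use the O(n^3) Hungarian algorithm for larger matrices.

from itertools import permutations

def min_cost_assignment(cost):
    """cost[i][j] = distance between tracklet i and detection j (square matrix).
    Returns (tracklet->detection permutation, total cost)."""
    n = len(cost)
    best = min(permutations(range(n)),
               key=lambda p: sum(cost[i][p[i]] for i in range(n)))
    return best, sum(cost[i][best[i]] for i in range(n))

match, total = min_cost_assignment([[4, 1, 3],
                                    [2, 0, 5],
                                    [3, 2, 2]])
```

Here tracklet 0 takes detection 1, tracklet 1 takes detection 0, and tracklet 2 keeps detection 2, for a total cost of 5.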

  12. Preliminary Assessment of the Emporium Model in a Redesigned Engineering Mechanics Course

    ERIC Educational Resources Information Center

    Rais-Rohani, Masoud; Walters, Andrew

    2014-01-01

    A lecture-based engineering mechanics course (Statics) is redesigned using the Emporium model. Whereas students study the material outside of class via asynchronous online delivery of the content and instructional videos, they do all the other activities (e.g., assignments, tests) either individually or in groups inside the classroom. Computer-…

  13. Estimating connectivity in marine populations: an empirical evaluation of assignment tests and parentage analysis under different gene flow scenarios.

    PubMed

    Saenz-Agudelo, P; Jones, G P; Thorrold, S R; Planes, S

    2009-04-01

    The application of spatially explicit models of population dynamics to fisheries management and the design of marine reserve networks has been limited by a lack of empirical estimates of larval dispersal. Here we compared assignment tests and parentage analysis for examining larval retention and connectivity under two different gene flow scenarios using panda clownfish (Amphiprion polymnus) in Papua New Guinea. A metapopulation of panda clownfish in Bootless Bay, with little or no genetic differentiation among five spatially discrete locations separated by 2-6 km, provided the high gene flow scenario. The low gene flow scenario compared the Bootless Bay metapopulation with a genetically distinct population (FST = 0.1) located at Schumann Island, New Britain, 1500 km to the northeast. We used assignment tests and parentage analysis based on microsatellite DNA data to identify the natal origins of 177 juveniles in Bootless Bay and 73 juveniles at Schumann Island. At low rates of gene flow, assignment tests correctly classified juveniles to their source population. On the other hand, parentage analysis led to an overestimate of self-recruitment within the two populations due to the significant deviation from panmixia when both populations were pooled. At high gene flow (within Bootless Bay), assignment tests underestimated self-recruitment and connectivity among subpopulations, and grossly overestimated self-recruitment within the overall metapopulation. However, the assignment tests did identify immigrants from distant (genetically distinct) populations. Parentage analysis clearly provided the most accurate estimates of connectivity in situations of high gene flow.
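A toy version of the genetic assignment test this record evaluates: assign an individual to the candidate population under which its multilocus genotype is most likely. The allele frequencies are invented, and real analyses use many microsatellite loci and more careful likelihood handling.

```python
# Toy assignment test: pick the population maximizing the Hardy-Weinberg
# log-likelihood of the individual's genotype (illustrative frequencies only).

import math

def assign_individual(genotype, pop_freqs):
    """genotype: list of (allele1, allele2) per locus.
    pop_freqs: {pop: [{allele: freq} per locus]}. Returns best population."""
    def loglik(freqs):
        ll = 0.0
        for (a1, a2), locus in zip(genotype, freqs):
            p, q = locus.get(a1, 1e-6), locus.get(a2, 1e-6)
            # Hardy-Weinberg genotype probability: p^2 (homozygote) or 2pq.
            ll += math.log(p * q * (1 if a1 == a2 else 2))
        return ll
    return max(pop_freqs, key=lambda pop: loglik(pop_freqs[pop]))

pops = {"bootless_bay": [{"A": 0.9, "B": 0.1}, {"C": 0.8, "D": 0.2}],
        "schumann":     [{"A": 0.2, "B": 0.8}, {"C": 0.3, "D": 0.7}]}
origin = assign_individual([("A", "A"), ("C", "D")], pops)
```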

  14. A Fuzzy-Based Prior Knowledge Diagnostic Model with Multiple Attribute Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Huang, Yueh-Min

    2013-01-01

    Prior knowledge is a very important part of teaching and learning, as it affects how instructors and students interact with the learning materials. In general, tests are used to assess students' prior knowledge. Nevertheless, conventional testing approaches usually assign only an overall score to each student, and this may mean that students are…

  15. Predicate Argument Structure Analysis for Use Case Description Modeling

    NASA Astrophysics Data System (ADS)

    Takeuchi, Hironori; Nakamura, Taiga; Yamaguchi, Takahira

    In a large software system development project, many documents are prepared and updated frequently. In such a situation, support is needed for looking through these documents easily to identify inconsistencies and to maintain traceability. In this research, we focus on the requirements documents such as use cases and consider how to create models from the use case descriptions in unformatted text. In the model construction, we propose a few semantic constraints based on the features of the use cases and use them for a predicate argument structure analysis to assign semantic labels to actors and actions. With this approach, we show that we can assign semantic labels without enhancing any existing general lexical resources such as case frame dictionaries and design a less language-dependent model construction architecture. By using the constructed model, we consider a system for quality analysis of the use cases and automated test case generation to keep the traceability between document sets. We evaluated the reuse of the existing use cases and generated test case steps automatically with the proposed prototype system from real-world use cases in the development of a system using a packaged application. Based on the evaluation, we show how to construct models with high precision from English and Japanese use case data. Also, we could generate good test cases for about 90% of the real use cases through the manual improvement of the descriptions based on the feedback from the quality analysis system.

  16. Testing Structural Models of DSM-IV Symptoms of Common Forms of Child and Adolescent Psychopathology

    ERIC Educational Resources Information Center

    Lahey, Benjamin B.; Rathouz, Paul J.; Van Hulle, Carol; Urbano, Richard C.; Krueger, Robert F.; Applegate, Brooks; Garriock, Holly A.; Chapman, Derek A.; Waldman, Irwin D.

    2008-01-01

    Confirmatory factor analyses were conducted of "Diagnostic and Statistical Manual of Mental Disorders", Fourth Edition (DSM-IV) symptoms of common mental disorders derived from structured interviews of a representative sample of 4,049 twin children and adolescents and their adult caretakers. A dimensional model based on the assignment of symptoms…

  17. Mitigating active shooter impact: Analysis for policy options based on agent/computer-based modeling.

    PubMed

    Anklam, Charles; Kirby, Adam; Sharevski, Filipo; Dietz, J Eric

    2015-01-01

    Active shooting violence in confined settings, such as educational institutions, poses serious security concerns to public safety. In studying the effects of active shooter scenarios, the common denominator across all events, regardless of the shooter's motive or the type of weapons used, was the location chosen and the time elapsed between the beginning of the event and its culmination; this in turn directly correlates with the number of casualties incurred in any given event. The longer the event protracts, the more casualties are incurred until law enforcement or another barrier can react and end the situation. Using AnyLogic technology, we devised modeling scenarios to test multiple hypotheses against free-agent modeling simulation and determine the best method to reduce casualties associated with active shooter scenarios. We tested four possible responses to an active shooter in a public school setting using agent-based computer modeling techniques: scenario 1 assumes no access control or any other type of security within the school; scenario 2 assumes that concealed carry individuals (5-10 percent of the workforce) are present in the school; scenario 3 assumes that the school has an assigned resource officer; scenario 4 assumes that the school has both an assigned resource officer and concealed carry individuals (5-10 percent) present. Statistical data from the modeling scenarios indicated which tested hypothesis resulted in fewer casualties and quicker culmination of the event. The use of AnyLogic supported the initial hypothesis that a decrease in response time to an active shooter scenario directly reduces victim casualties: modeling tests show statistically significantly fewer casualties in scenarios where on-scene armed responders, such as resource officers and concealed carry personnel, were present.

  18. Evaluation of assigned-value uncertainty for complex calibrator value assignment processes: a prealbumin example.

    PubMed

    Middleton, John; Vaks, Jeffrey E

    2007-04-01

    Errors in calibrator-assigned values lead to errors in the testing of patient samples. The ability to estimate the uncertainties of calibrator-assigned values and other variables minimizes errors in testing processes. International Organization for Standardization guidelines provide simple equations for estimating calibrator uncertainty in simple value-assignment processes, but other methods are needed to estimate uncertainty in complex processes. We estimated the assigned-value uncertainty with a Monte Carlo computer simulation of a complex value-assignment process, based on a formalized description of the process, with measurement parameters estimated experimentally. This method was applied to study the uncertainty of a multilevel calibrator value assignment for a prealbumin immunoassay. The simulation results showed that the component of uncertainty added by the process of value transfer from the reference material CRM470 to the calibrator is smaller than that of the reference material itself (<0.8% vs 3.7%). Varying the process parameters in the simulation model allowed for optimizing the process while keeping the added uncertainty small. The patient result uncertainty caused by the calibrator uncertainty was also found to be small. This method of estimating uncertainty is a powerful tool that allows for estimation of calibrator uncertainty and optimization of various value-assignment processes, with a reduced number of measurements and lower reagent costs, while satisfying uncertainty requirements. The new method expands and augments existing methods to allow estimation of uncertainty in complex processes.
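The Monte Carlo approach this record describes can be sketched as follows. The two-step transfer model and its error magnitudes are simplified assumptions (chosen to echo the reported <0.8% vs 3.7% scale), not the paper's formalized process.

```python
# Monte Carlo sketch of assigned-value uncertainty in a two-step value
# transfer (reference material -> calibrator). Error magnitudes are
# illustrative, not the paper's measured parameters.

import random
import statistics

def simulate_assigned_values(true_value, ref_rel_u, transfer_rel_u,
                             n_trials=20000, seed=7):
    """Return the relative standard uncertainty of the assigned value."""
    rng = random.Random(seed)
    values = []
    for _ in range(n_trials):
        ref = true_value * (1 + rng.gauss(0, ref_rel_u))   # CRM uncertainty
        cal = ref * (1 + rng.gauss(0, transfer_rel_u))     # transfer step
        values.append(cal)
    return statistics.stdev(values) / true_value

# Combined relative uncertainty should land near sqrt(0.037^2 + 0.008^2) ~ 3.8%,
# i.e., the transfer step adds little on top of the reference material itself.
rel_u = simulate_assigned_values(100.0, ref_rel_u=0.037, transfer_rel_u=0.008)
```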

  19. Influence of Steering Control Devices Mounted in Cars for the Disabled on Passive Safety

    NASA Astrophysics Data System (ADS)

    Masiá, J.; Eixerés, B.; Dols, J. F.; Colomina, F. J.

    2009-11-01

    The purpose of this research is to analyze the influence of steering control devices for disabled people on passive safety. It is based on advances made in the modelling and simulation of the driver position and in the suit verification test. The influence of these devices is studied through airbag deployment and/or its influence on driver safety. We characterize the different adaptations used in adapted cars in order to generate models that are verified by experimental tests. A three-dimensional design software package was used to develop the model. The simulations were generated using a dynamic simulation program employing LS-DYNA finite elements. This program plots the geometry and assigns materials. The airbag is shaped, meshed, and folded just as it is mounted in current vehicles. The thermodynamic model of the expansion of gases is assigned and the contact interfaces are defined. Static airbag deployment tests were carried out to contrast with and validate the computational models and to measure the behaviour of the airbag when steering adaptations are mounted in the vehicle.

  20. Improving the Targeting of Treatment: Evidence from College Remediation

    ERIC Educational Resources Information Center

    Scott-Clayton, Judith; Crosta, Peter M.; Belfield, Clive R.

    2014-01-01

    Remediation is one of the largest single interventions intended to improve outcomes for underprepared college students, yet little is known about the remedial screening process. Using administrative data and a rich predictive model, we find that severe mis-assignments are common using current test-score-cutoff-based policies, with…

  1. Deriving flow directions for coarse-resolution (1-4 km) gridded hydrologic modeling

    NASA Astrophysics Data System (ADS)

    Reed, Seann M.

    2003-09-01

    The National Weather Service Hydrology Laboratory (NWS-HL) is currently testing a grid-based distributed hydrologic model at a resolution (4 km) commensurate with operational, radar-based precipitation products. To implement distributed routing algorithms in this framework, a flow direction must be assigned to each model cell. A new algorithm, referred to as cell outlet tracing with an area threshold (COTAT) has been developed to automatically, accurately, and efficiently assign flow directions to any coarse-resolution grid cells using information from any higher-resolution digital elevation model. Although similar to previously published algorithms, this approach offers some advantages. Use of an area threshold allows more control over the tendency for producing diagonal flow directions. Analyses of results at different output resolutions ranging from 300 m to 4000 m indicate that it is possible to choose an area threshold that will produce minimal differences in average network flow lengths across this range of scales. Flow direction grids at a 4 km resolution have been produced for the conterminous United States.
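The basic flow-direction assignment step underlying COTAT can be sketched as a D8 steepest-descent rule; the DEM below is invented, and COTAT's outlet tracing on a finer grid and its area threshold are not shown.

```python
# Sketch of D8 flow-direction assignment: each cell drains toward its
# steepest-descent neighbor. COTAT builds on this by tracing cell outlets
# on a higher-resolution DEM with an area threshold.

def d8_direction(dem, r, c):
    """Return (dr, dc) toward the steepest downslope neighbor, or None for a pit."""
    best, step = 0.0, None
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(dem) and 0 <= cc < len(dem[0]):
                dist = (dr * dr + dc * dc) ** 0.5   # diagonals are farther
                drop = (dem[r][c] - dem[rr][cc]) / dist
                if drop > best:
                    best, step = drop, (dr, dc)
    return step

dem = [[9, 8, 7],
       [8, 6, 4],
       [7, 5, 2]]
direction = d8_direction(dem, 1, 1)   # center cell drains to the SE corner
```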

  2. A test of geographic assignment using isotope tracers in feathers of known origin

    USGS Publications Warehouse

    Wunder, Michael B.; Kester, C.L.; Knopf, F.L.; Rye, R.O.

    2005-01-01

    We used feathers of known origin collected from across the breeding range of a migratory shorebird to test the use of isotope tracers for assigning breeding origins. We analyzed δD, δ13C, and δ15N in feathers from 75 mountain plover (Charadrius montanus) chicks sampled in 2001 and from 119 chicks sampled in 2002. We estimated parameters for continuous-response inverse regression models and for discrete-response Bayesian probability models from data for each year independently. We evaluated model predictions with both the training data and by using the alternate year as an independent test dataset. Our results provide weak support for modeling latitude and isotope values as monotonic functions of one another, especially when data are pooled over known sources of variation such as sample year or location. We were unable to make even qualitative statements, such as north versus south, about the likely origin of birds using both δD and δ13C in inverse regression models; results were no better than random assignment. Probability models provided better results and a more natural framework for the problem. Correct assignment rates were highest when considering all three isotopes in the probability framework, but the use of even a single isotope was better than random assignment. The method appears relatively robust to temporal effects and is most sensitive to the isotope discrimination gradients over which samples are taken. We offer that the problem of using isotope tracers to infer geographic origin is best framed as one of assignment, rather than prediction.
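The discrete-response Bayesian probability model this record favors can be sketched as a simple posterior over candidate regions; the region means, standard deviations, and normal likelihoods are illustrative assumptions, not the study's fitted models.

```python
# Toy discrete-response Bayesian assignment: posterior probability of each
# breeding region given a feather's deltaD value, with a normal likelihood
# per region (means/SDs invented for illustration).

import math

def region_posterior(delta_d, regions, priors=None):
    """regions: {name: (mean, sd)}. Returns {name: posterior probability}."""
    priors = priors or {r: 1 / len(regions) for r in regions}
    def norm_pdf(x, mu, sd):
        return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
    joint = {r: priors[r] * norm_pdf(delta_d, mu, sd)
             for r, (mu, sd) in regions.items()}
    z = sum(joint.values())
    return {r: v / z for r, v in joint.items()}

post = region_posterior(-95.0, {"north": (-110.0, 8.0), "south": (-70.0, 8.0)})
```

Extending to multiple isotopes just multiplies independent likelihoods per region, which mirrors the finding that three isotopes beat any single one.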

  3. Information needs for increasing log transport efficiency

    Treesearch

    Timothy P. McDonald; Steven E. Taylor; Robert B. Rummer; Jorge Valenzuela

    2001-01-01

    Three methods of dispatching trucks to loggers were tested using a log transport simulation model: random allocation, fixed assignment of trucks to loggers, and dispatch based on knowledge of the current status of trucks and loggers within the system. This 'informed' dispatch algorithm attempted to minimize the difference in time between when a logger would...

  4. Creativity Processes of Students in the Design Studio

    ERIC Educational Resources Information Center

    Huber, Amy Mattingly; Leigh, Katharine E.; Tremblay, Kenneth R., Jr.

    2012-01-01

    The creative process is a multifaceted and dynamic path of thinking required to execute a project in design-based disciplines. The goal of this research was to test a model outlining the creative design process by investigating student experiences in a design project assignment. The study used an exploratory design to collect data from student…

  5. Bag-of-features based medical image retrieval via multiple assignment and visual words weighting.

    PubMed

    Wang, Jingyan; Li, Yongping; Zhang, Ying; Wang, Chao; Xie, Honglan; Chen, Guoling; Gao, Xin

    2011-11-01

    Bag-of-features based approaches have become prominent for image retrieval and image classification tasks in the past decade. Such methods represent an image as a collection of local features, such as image patches and key points with scale invariant feature transform (SIFT) descriptors. To improve the bag-of-features methods, we first model the assignments of local descriptors as contribution functions, and then propose a novel multiple assignment strategy. Assuming the local features can be reconstructed by their neighboring visual words in a vocabulary, reconstruction weights can be solved by quadratic programming. The weights are then used to build contribution functions, resulting in a novel assignment method, called quadratic programming (QP) assignment. We further propose a novel visual word weighting method. The discriminative power of each visual word is analyzed by the sub-similarity function in the bin that corresponds to the visual word. Each sub-similarity function is then treated as a weak classifier. A strong classifier is learned by boosting methods that combine those weak classifiers. The weighting factors of the visual words are learned accordingly. We evaluate the proposed methods on medical image retrieval tasks. The methods are tested on three well-known data sets, i.e., the ImageCLEFmed data set, the 304 CT Set, and the basal-cell carcinoma image set. Experimental results demonstrate that the proposed QP assignment outperforms the traditional nearest neighbor assignment, the multiple assignment, and the soft assignment, whereas the proposed boosting based weighting strategy outperforms the state-of-the-art weighting methods, such as the term frequency weights and the term frequency-inverse document frequency weights.
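A minimal sketch of multiple/soft assignment of a local descriptor to its nearest visual words, standing in for the paper's quadratic-programming reconstruction weights; the vocabulary, descriptor, and Gaussian weighting are invented for illustration.

```python
# Soft/multiple assignment of a descriptor to its k nearest visual words,
# with weights decaying in distance (a stand-in for QP reconstruction weights).

import math

def soft_assign(descriptor, vocabulary, k=2, sigma=1.0):
    """Return {word_index: weight} over the k nearest visual words; weights sum to 1."""
    dists = sorted(
        (sum((d - w) ** 2 for d, w in zip(descriptor, word)), i)
        for i, word in enumerate(vocabulary)
    )[:k]
    raw = {i: math.exp(-d2 / (2 * sigma ** 2)) for d2, i in dists}
    z = sum(raw.values())
    return {i: w / z for i, w in raw.items()}

vocab = [(0.0, 0.0), (1.0, 0.0), (5.0, 5.0)]
weights = soft_assign((0.4, 0.0), vocab, k=2)
```

With k=1 this degenerates to the traditional nearest-neighbor (hard) assignment that the paper reports being outperformed.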

  6. Hydrochemical analysis of groundwater using a tree-based model

    NASA Astrophysics Data System (ADS)

    Litaor, M. Iggy; Brielmann, H.; Reichmann, O.; Shenker, M.

    2010-06-01

    Hydrochemical indices are commonly used to ascertain aquifer characteristics, salinity problems, anthropogenic inputs and resource management, among others. This study was conducted to test the applicability of a binary decision tree model to aquifer evaluation using hydrochemical indices as input. The main advantage of the tree-based model compared to other commonly used statistical procedures such as cluster and factor analyses is the ability to classify groundwater samples with assigned probability and the reduction of a large data set into a few significant variables without creating new factors. We tested the model using data sets collected from headwater springs of the Jordan River, Israel. The model evaluation consisted of several levels of complexity, from simple separation between the calcium-magnesium-bicarbonate water type of karstic aquifers to the more challenging separation of calcium-sodium-bicarbonate water type flowing through perched and regional basaltic aquifers. In all cases, the model assigned measures for goodness of fit in the form of misclassification errors and singled out the most significant variable in the analysis. The model proceeded through a sequence of partitions providing insight into different possible pathways and changing lithology. The model results were extremely useful in constraining the interpretation of geological heterogeneity and constructing a conceptual flow model for a given aquifer. The tree model clearly identified the hydrochemical indices that were excluded from the analysis, thus providing information that can lead to a decrease in the number of routinely analyzed variables and a significant reduction in laboratory cost.
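
    The core operation of such a binary decision tree, choosing the split on a hydrochemical index that best separates water types, can be sketched as follows; the index values and class labels are invented for illustration:

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(values, labels):
    """Exhaustively search thresholds on one hydrochemical index and
    return the (threshold, weighted Gini impurity) of the best binary
    partition -- the elementary step a tree model repeats recursively."""
    best = (None, float("inf"))
    for t in sorted(set(values))[:-1]:
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best[1]:
            best = (t, score)
    return best

# Hypothetical Na/Cl ratios separating karstic from basaltic samples
ratios = [0.2, 0.3, 0.35, 0.8, 0.9, 1.1]
aquifer = ["karst", "karst", "karst", "basalt", "basalt", "basalt"]
threshold, impurity = best_split(ratios, aquifer)
```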

  7. Validity and sensitivity of a model for assessment of impacts of river floodplain reconstruction on protected and endangered species

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nooij, R.J.W. de; Lotterman, K.M.; Sande, P.H.J. van de

    Environmental Impact Assessment (EIA) must account for legally protected and endangered species. Uncertainties relating to the validity and sensitivity of EIA arise from predictions and valuation of effects on these species. This paper presents a validity and sensitivity analysis of a model (BIO-SAFE) for assessment of impacts of land use changes and physical reconstruction measures on legally protected and endangered river species. The assessment is based on links between species (higher plants, birds, mammals, reptiles and amphibians, butterflies and dragon- and damselflies) and ecotopes (landscape ecological units, e.g., river dune, soft wood alluvial forests), and on value assignment to protected and endangered species using different valuation criteria (i.e., EU Habitats and Birds directive, Conventions of Bern and Bonn and Red Lists). The validity of BIO-SAFE has been tested by comparing predicted effects of landscape changes on the diversity of protected and endangered species with observed changes in biodiversity in five reconstructed floodplains. The sensitivity of BIO-SAFE to value assignment has been analysed using data of a Strategic Environmental Assessment concerning the Spatial Planning Key Decision for reconstruction of the Dutch floodplains of the river Rhine, aimed at flood defence and ecological rehabilitation. The weights given to the valuation criteria for protected and endangered species were varied and the effects on ranking of alternatives were quantified. A statistically significant correlation (p < 0.01) between predicted and observed values for protected and endangered species was found. The sensitivity of the model to value assignment proved to be low. Comparison of five realistic valuation options showed that different rankings of scenarios predominantly occur when valuation criteria are left out of the assessment. Based on these results we conclude that linking species to ecotopes can be used for adequate impact assessments. Quantification of sensitivity of impact assessment to value assignment shows that a model like BIO-SAFE is relatively insensitive to assignment of values to different policy and legislation based criteria. Arbitrariness of the value assignment therefore has a very limited effect on assessment outcomes. However, the decision to include valuation criteria or not is very important.

  8. Solar array electrical performance assessment for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Smith, Bryan K.; Brisco, Holly

    1993-01-01

    Electrical power for Space Station Freedom will be generated by large Photovoltaic arrays with a beginning of life power requirement of 30.8 kW per array. The solar arrays will operate in a Low Earth Orbit (LEO) over a design life of fifteen years. This paper provides an analysis of the predicted solar array electrical performance over the design life and presents a summary of supporting analysis and test data for the assigned model parameters and performance loss factors. Each model parameter and loss factor is assessed based upon program requirements, component analysis, and test data to date. A description of the LMSC performance model, future test plans, and predicted performance ranges are also given.
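
    The kind of bookkeeping such a performance model performs, compounding an annual degradation rate over the design life and applying multiplicative loss factors, can be sketched as below; the degradation rate and loss factors are hypothetical, not LMSC values:

```python
def predicted_array_power(bol_power_kw, annual_degradation, years, loss_factors):
    """End-of-life power estimate: beginning-of-life power, an annual
    degradation rate compounded over the design life, and multiplicative
    one-time loss factors (all parameter values are illustrative)."""
    p = bol_power_kw * (1.0 - annual_degradation) ** years
    for f in loss_factors:
        p *= f
    return p

# 30.8 kW beginning-of-life power, 15-year design life in LEO,
# hypothetical 1 %/yr degradation and two hypothetical loss factors
p_eol = predicted_array_power(30.8, 0.01, 15, [0.98, 0.99])
```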

  10. Source apportionment of airborne particulate matter using organic compounds as tracers

    NASA Astrophysics Data System (ADS)

    Schauer, James J.; Rogge, Wolfgang F.; Hildemann, Lynn M.; Mazurek, Monica A.; Cass, Glen R.; Simoneit, Bernd R. T.

    A chemical mass balance receptor model based on organic compounds has been developed that relates source contributions to airborne fine particle mass concentrations. Source contributions to the concentrations of specific organic compounds are revealed as well. The model is applied to four air quality monitoring sites in southern California using atmospheric organic compound concentration data and source test data collected specifically for the purpose of testing this model. The contributions of up to nine primary particle source types can be separately identified in ambient samples based on this method, and approximately 85% of the organic fine aerosol is assigned to primary sources on an annual average basis. The model provides information on source contributions to fine mass concentrations, fine organic aerosol concentrations and individual organic compound concentrations. The largest primary source contributors to fine particle mass concentrations in Los Angeles are found to include diesel engine exhaust, paved road dust, gasoline-powered vehicle exhaust, plus emissions from food cooking and wood smoke, with smaller contribution from tire dust, plant fragments, natural gas combustion aerosol, and cigarette smoke. Once these primary aerosol source contributions are added to the secondary sulfates, nitrates and organics present, virtually all of the annual average fine particle mass at Los Angeles area monitoring sites can be assigned to its source.
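
    A chemical mass balance receptor model is, at its core, a least-squares fit of ambient tracer concentrations to a linear combination of source profiles. A minimal sketch with two sources and three tracers, solved via the normal equations (all numbers invented):

```python
def solve2(a, b):
    """Solve a 2x2 linear system a x = b by Cramer's rule."""
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    return [(b[0] * a[1][1] - a[0][1] * b[1]) / det,
            (a[0][0] * b[1] - b[0] * a[1][0]) / det]

def cmb_contributions(profiles, ambient):
    """Least-squares chemical mass balance: find source contributions s
    minimizing ||A s - c||^2, where column j of A is the tracer profile
    of source j and c is the ambient tracer concentration vector.
    Normal equations (A^T A) s = A^T c, here for exactly two sources."""
    n = len(ambient)
    ata = [[sum(profiles[i][j] * profiles[i][k] for i in range(n))
            for k in range(2)] for j in range(2)]
    atc = [sum(profiles[i][j] * ambient[i] for i in range(n))
           for j in range(2)]
    return solve2(ata, atc)

# Hypothetical tracer profiles (rows: tracers; cols: diesel, wood smoke)
A = [[1.0, 0.1],
     [0.2, 1.0],
     [0.5, 0.5]]
c = [2.1, 1.4, 1.5]   # ambient concentrations consistent with s = [2, 1]
s = cmb_contributions(A, c)
```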

  12. Application of dynamic traffic assignment to advanced managed lane modeling.

    DOT National Transportation Integrated Search

    2013-11-01

    In this study, a demand estimation framework is developed for assessing the managed lane (ML) strategies by utilizing dynamic traffic assignment (DTA) modeling, instead of the traditional approaches that are based on the static traffic assignment...

  13. Study on store-space assignment based on logistic AGV in e-commerce goods to person picking pattern

    NASA Astrophysics Data System (ADS)

    Xu, Lijuan; Zhu, Jie

    2017-10-01

    This paper studies store-space assignment based on logistic AGVs in the e-commerce goods-to-person picking pattern. A store-space assignment model minimizing picking cost is established, and a store-space assignment algorithm is designed following a cluster analysis based on similarity coefficients. An example analysis then compares the picking cost of the proposed algorithm against allocation by item number and against ABC-classification storage, verifying the effectiveness of the designed store-space assignment algorithm.
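
    A minimal sketch of the picking-cost logic behind store-space assignment: the most frequently picked SKUs are greedily given the slots cheapest for the AGV to reach. The SKUs, slots, and costs are illustrative, and the simple greedy rule stands in for the paper's cluster-based algorithm:

```python
def assign_slots(pick_frequency, slot_cost):
    """Greedy store-space assignment: the most frequently picked SKU gets
    the slot with the lowest AGV travel cost. Returns {sku: slot} and the
    total expected picking cost (sum of frequency * slot cost)."""
    skus = sorted(pick_frequency, key=pick_frequency.get, reverse=True)
    slots = sorted(slot_cost, key=slot_cost.get)
    plan = dict(zip(skus, slots))
    cost = sum(pick_frequency[s] * slot_cost[plan[s]] for s in plan)
    return plan, cost

freq = {"A": 50, "B": 20, "C": 5}            # picks per day (hypothetical)
cost_of = {"s1": 1.0, "s2": 2.0, "s3": 4.0}  # AGV travel cost per pick
plan, total = assign_slots(freq, cost_of)
```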

  14. Support for Simulation-Based Learning; The Effects of Model Progression and Assignments on Learning about Oscillatory Motion.

    ERIC Educational Resources Information Center

    Swaak, Janine; And Others

    In this study, learners worked with a simulation of harmonic oscillation. Two supportive measures were introduced: model progression and assignments. In model progression, the model underlying the simulation is not offered in its full complexity from the start, but variables are gradually introduced. Assignments are small exercises that help the…

  15. A soft-computing methodology for noninvasive time-spatial temperature estimation.

    PubMed

    Teixeira, César A; Ruano, Maria Graça; Ruano, António E; Pereira, Wagner C A

    2008-02-01

    The safe and effective application of thermal therapies is restricted by the lack of reliable noninvasive temperature estimators. In this paper, the temporal echo-shifts of backscattered ultrasound signals, collected from a gel-based phantom, were tracked and, together with past temperature values, used as input information for radial basis function neural networks. The phantom was heated using a piston-like therapeutic ultrasound transducer. The neural models were assigned to estimate the temperature at different intensities and points arranged along the therapeutic transducer radial line (60 mm apart from the transducer face). Model inputs, as well as the number of neurons, were selected using the multiobjective genetic algorithm (MOGA). The best attained models present, on average, a maximum absolute error of less than 0.5 degrees C, which is regarded as the borderline between a reliable and an unreliable estimator in hyperthermia/diathermia. In order to test the spatial generalization capacity, the best models were tested using spatial points not yet assessed; those with a maximum absolute error below 0.5 degrees C were selected as the best models. It should also be stressed that these best models have low implementational complexity, as desired for real-time applications.
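
    The estimator's forward computation, a radial basis function network mapping echo shifts and past temperatures to a temperature estimate, can be sketched as follows; all parameter values are illustrative, not the MOGA-selected ones:

```python
import math

def rbf_predict(x, centers, widths, weights, bias):
    """Forward pass of a radial basis function network: the inputs x are,
    e.g., tracked echo shifts plus past temperature values; the output is
    the estimated temperature at the assessed point."""
    hidden = [math.exp(-sum((xi - ci) ** 2 for xi, ci in zip(x, c))
                       / (2.0 * w ** 2))
              for c, w in zip(centers, widths)]
    return bias + sum(wi * hi for wi, hi in zip(weights, hidden))

# Two hidden neurons, two inputs (echo shift, previous temperature)
t_est = rbf_predict([0.0, 37.0],
                    centers=[[0.0, 37.0], [1.0, 40.0]],
                    widths=[1.0, 1.0],
                    weights=[2.0, 1.0],
                    bias=36.0)
```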

  16. Flow assignment model for quantitative analysis of diverting bulk freight from road to railway

    PubMed Central

    Liu, Chang; Wang, Jiaxi; Xiao, Jie; Liu, Siqi; Wu, Jianping; Li, Jian

    2017-01-01

    Since railway transport possesses the advantage of high volume and low carbon emissions, diverting some freight from road to railway will help reduce the negative environmental impacts associated with transport. This paper develops a flow assignment model for quantitative analysis of diverting truck freight to railway. First, a general network which considers road transportation, railway transportation, handling and transferring is established according to all the steps in the whole transportation process. Then general cost functions are formulated to embody the factors shippers pay attention to when choosing mode and path; these functions capture the congestion cost on roads and the capacity constraints of railways and freight stations. Based on the general network and general cost functions, a user equilibrium flow assignment model is developed to simulate the flow distribution on the general network under the condition that all shippers choose transportation mode and path independently. Since the model is nonlinear and challenging, we adopt a method that uses tangent lines to constitute an envelope curve to linearize it. Finally, a numerical example is presented to test the model and show the method for quantitative analysis of bulk freight modal shift between road and railway. PMID:28771536
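
    The tangent-line linearization the authors adopt can be illustrated on a single convex congestion-cost function: the pointwise maximum of tangent lines forms a piecewise-linear envelope that under-approximates the cost and can be embedded in a linear model. The quadratic cost function here is a stand-in:

```python
def tangent_envelope(f, df, points):
    """Build the piecewise-linear outer approximation of a convex cost
    function as the pointwise maximum of its tangent lines at the given
    points -- the linearization device for the nonlinear assignment model."""
    lines = [(df(p), f(p) - df(p) * p) for p in points]   # (slope, intercept)
    return lambda x: max(a * x + b for a, b in lines)

# Convex congestion cost c(x) = x**2 linearized with three tangents
approx = tangent_envelope(lambda x: x * x, lambda x: 2.0 * x, [0.0, 1.0, 2.0])
```

    The envelope is exact at the tangency points and never exceeds the true convex cost in between; adding tangents tightens it.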

  17. A Comparison of Guided Assignments and NAEP Format Tests on Adolescent Response to Literature.

    ERIC Educational Resources Information Center

    Appleman, Deborah

    This study investigated the effects of heuristically based assignments and tests on adolescents' written responses to literature. In order to examine both tests and instruction, two separate but related experiments were conducted. Experiment 1 investigated whether the inclusion of guided prewriting on essay tests for two poems and two short…

  18. Phylogenetic Relationships within the Opisthokonta Based on Phylogenomic Analyses of Conserved Single-Copy Protein Domains

    PubMed Central

    Torruella, Guifré; Derelle, Romain; Paps, Jordi; Lang, B. Franz; Roger, Andrew J.; Shalchian-Tabrizi, Kamran; Ruiz-Trillo, Iñaki

    2012-01-01

    Many of the eukaryotic phylogenomic analyses published to date were based on alignments of hundreds to thousands of genes. In such analyses, the most realistic evolutionary models currently available are often used to minimize the impact of systematic error. However, controversy remains over whether or not idiosyncratic gene family dynamics (i.e., gene duplications and losses) and incorrect orthology assignments are always appropriately taken into account. In this paper, we present an innovative strategy for overcoming orthology assignment problems. Rather than identifying and eliminating genes with paralogy problems, we have constructed a data set composed exclusively of conserved single-copy protein domains that, unlike most of the commonly used phylogenomic data sets, should be less confounded by orthology mis-assignments. To evaluate the power of this approach, we performed maximum likelihood and Bayesian analyses to infer the evolutionary relationships within the opisthokonts (which includes Metazoa, Fungi, and related unicellular lineages). We used this approach to test 1) whether Filasterea and Ichthyosporea form a clade, 2) the interrelationships of early-branching metazoans, and 3) the relationships among early-branching fungi. We also assessed the impact of some methods that are known to minimize systematic error, including reducing the distance between the outgroup and ingroup taxa or using the CAT evolutionary model. Overall, our analyses support the Filozoa hypothesis in which Ichthyosporea are the first holozoan lineage to emerge followed by Filasterea, Choanoflagellata, and Metazoa. Blastocladiomycota appears as a lineage separate from Chytridiomycota, although this result is not strongly supported. These results represent independent tests of previous phylogenetic hypotheses, highlighting the importance of sophisticated approaches for orthology assignment in phylogenomic analyses. PMID:21771718

  19. Adaptive Control Strategies for Flexible Robotic Arm

    NASA Technical Reports Server (NTRS)

    Bialasiewicz, Jan T.

    1996-01-01

    The control problem of a flexible robotic arm has been investigated. The control strategies that have been developed have a wide application in approaching the general control problem of flexible space structures. The following control strategies have been developed and evaluated: neural self-tuning control algorithm, neural-network-based fuzzy logic control algorithm, and adaptive pole assignment algorithm. All of the above algorithms have been tested through computer simulation. In addition, the hardware implementation of a computer control system that controls the tip position of a flexible arm clamped on a rigid hub mounted directly on the vertical shaft of a dc motor, has been developed. An adaptive pole assignment algorithm has been applied to suppress vibrations of the described physical model of flexible robotic arm and has been successfully tested using this testbed.

  20. Design of a randomized, controlled, comparative-effectiveness trial testing a Family Model of Diabetes Self-Management Education (DSME) vs. Standard DSME for Marshallese in the United States.

    PubMed

    Kim Yeary, Karen Hye-Cheon; Long, Christopher R; Bursac, Zoran; McElfish, Pearl Anna

    2017-06-01

    Type 2 diabetes (T2D) is a significant public health problem, with U.S. Pacific Islander communities, such as the Marshallese, bearing a disproportionate burden. Using a community-based participatory research (CBPR) approach that engages the strong family-based social infrastructure characteristic of Marshallese communities is a promising way to manage T2D. Led by a collaborative community-academic partnership, the Family Model of Diabetes Self-Management Education (DSME) aimed to change diabetes management behaviors to improve glycemic control in Marshallese adults with T2D by engaging the entire family. To test the Family Model of DSME, a randomized, controlled, comparative effectiveness trial with 240 primary participants was implemented. Half of the primary participants were randomly assigned to the Standard DSME and half were randomly assigned to the Family Model DSME. Both arms received ten hours of content comprised of 6-8 sessions delivered over a 6-8 week period. The Family Model DSME was a cultural adaptation of DSME, whereby the intervention focused on engaging family support for the primary participant with T2D. The Standard DSME was delivered to the primary participant in a community-based group format. Primary participants and participating family members were assessed at baseline and immediate post-intervention, and will also be assessed at 6 and 12 months. The Family Model of DSME aimed to improve glycemic control in Marshallese with T2D. The utilization of a CBPR approach that involves the local stakeholders and the engagement of the family-based social infrastructure of Marshallese communities increases the potential for the intervention's success and sustainability.

  1. Entropy Based Genetic Association Tests and Gene-Gene Interaction Tests

    PubMed Central

    de Andrade, Mariza; Wang, Xin

    2011-01-01

    In the past few years, several entropy-based tests have been proposed for testing either single SNP association or gene-gene interaction. These tests are mainly based on Shannon entropy and have higher statistical power when compared to standard χ2 tests. In this paper, we extend some of these tests using a more generalized entropy definition, Rényi entropy, where Shannon entropy is a special case of order 1. The order λ (>0) of Rényi entropy weights the events (genotype/haplotype) according to their probabilities (frequencies). Higher λ places more emphasis on higher probability events while smaller λ (close to 0) tends to assign weights more equally. Thus, by properly choosing the λ, one can potentially increase the power of the tests or the p-value level of significance. We conducted simulation as well as real data analyses to assess the impact of the order λ and the performance of these generalized tests. The results showed that for the dominant model the order 2 test was more powerful, and for the multiplicative model the order 1 and order 2 tests had similar power. The analyses indicate that the choice of λ depends on the underlying genetic model and Shannon entropy is not necessarily the most powerful entropy measure for constructing genetic association or interaction tests. PMID:23089811
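
    A direct sketch of the entropy measure involved, with Shannon entropy handled as the order-1 limit; the genotype frequencies are illustrative:

```python
import math

def renyi_entropy(p, lam):
    """Rényi entropy of order lam (> 0) of a probability vector p.
    The limit lam -> 1 recovers Shannon entropy, computed here directly
    for that special case."""
    if abs(lam - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** lam for pi in p)) / (1.0 - lam)

genotype_freqs = [0.25, 0.5, 0.25]        # e.g. AA, Aa, aa under HWE
h1 = renyi_entropy(genotype_freqs, 1.0)   # Shannon (order 1)
h2 = renyi_entropy(genotype_freqs, 2.0)   # order 2 emphasizes common genotypes
```

    As the abstract notes, larger orders weight high-frequency genotypes more heavily, which is why h2 < h1 for any non-uniform distribution.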

  2. Using Educational Data Mining Methods to Assess Field-Dependent and Field-Independent Learners' Complex Problem Solving

    ERIC Educational Resources Information Center

    Angeli, Charoula; Valanides, Nicos

    2013-01-01

    The present study investigated the problem-solving performance of 101 university students and their interactions with a computer modeling tool in order to solve a complex problem. Based on their performance on the hidden figures test, students were assigned to three groups of field-dependent (FD), field-mixed (FM), and field-independent (FI)…

  3. Urn models for response-adaptive randomized designs: a simulation study based on a non-adaptive randomized trial.

    PubMed

    Ghiglietti, Andrea; Scarale, Maria Giovanna; Miceli, Rosalba; Ieva, Francesca; Mariani, Luigi; Gavazzi, Cecilia; Paganoni, Anna Maria; Edefonti, Valeria

    2018-03-22

    Recently, response-adaptive designs have been proposed in randomized clinical trials to achieve ethical and/or cost advantages by using sequential accrual information collected during the trial to dynamically update the probabilities of treatment assignments. In this context, urn models-where the probability to assign patients to treatments is interpreted as the proportion of balls of different colors available in a virtual urn-have been used as response-adaptive randomization rules. We propose the use of Randomly Reinforced Urn (RRU) models in a simulation study based on a published randomized clinical trial on the efficacy of home enteral nutrition in cancer patients after major gastrointestinal surgery. We compare results with the RRU design with those previously published with the non-adaptive approach. We also provide a code written with the R software to implement the RRU design in practice. In detail, we simulate 10,000 trials based on the RRU model in three set-ups of different total sample sizes. We report information on the number of patients allocated to the inferior treatment and on the empirical power of the t-test for the treatment coefficient in the ANOVA model. We carry out a sensitivity analysis to assess the effect of different urn compositions. For each sample size, in approximately 75% of the simulation runs, the number of patients allocated to the inferior treatment by the RRU design is lower, as compared to the non-adaptive design. The empirical power of the t-test for the treatment effect is similar in the two designs.
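
    The RRU mechanism can be sketched as a short simulation: each patient is assigned by drawing from the urn, and the drawn color is reinforced according to the response of its arm (fixed and hypothetical here; the paper's simulations draw responses from fitted outcome models):

```python
import random

def rru_trial(n_patients, urn=(1.0, 1.0), reinforce=(1.0, 2.0), seed=7):
    """Simulate a Randomly Reinforced Urn: draw a ball to assign each
    patient to arm 0 or arm 1, then add reinforcement for the drawn
    color. The better-responding arm (larger reinforcement) accumulates
    balls, so later patients are more likely to receive it."""
    rng = random.Random(seed)
    balls = list(urn)                     # ball mass for arm 0 and arm 1
    counts = [0, 0]
    for _ in range(n_patients):
        arm = 0 if rng.random() < balls[0] / (balls[0] + balls[1]) else 1
        counts[arm] += 1
        balls[arm] += reinforce[arm]
    return counts

counts = rru_trial(240)                   # sample size from the trial
```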

  4. Efficient Storage Scheme of Covariance Matrix during Inverse Modeling

    NASA Astrophysics Data System (ADS)

    Mao, D.; Yeh, T. J.

    2013-12-01

    During stochastic inverse modeling, the covariance matrix of geostatistics-based methods carries the information about the geologic structure. Its update during iterations reflects the decrease of uncertainty as observed data are incorporated. For large-scale problems, its storage and update consume excessive memory and computational resources. In this study, we propose a new efficient scheme for its storage and update. The Compressed Sparse Column (CSC) format is utilized to store the covariance matrix, and users can specify how much data to store based on correlation scales, since data beyond several correlation scales are usually not very informative for inverse modeling. After every iteration, only the diagonal terms of the covariance matrix are updated. The off-diagonal terms are calculated and updated based on shortened correlation scales with a pre-assigned exponential model. The correlation scales are shortened by a coefficient, e.g. 0.95, every iteration to reflect the decrease of uncertainty. There is no universal coefficient for all problems, and users are encouraged to try several values. This new scheme is first tested with 1D examples, and the estimated results and uncertainty are compared with those of the traditional full-storage method. In the end, a large-scale numerical model is used to validate the new scheme.
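
    A sketch of the storage idea, using a plain dictionary as a stand-in for CSC storage: only entries within a few correlation scales of each point are kept, with an exponential covariance model, and re-generating with a shrunken scale mimics the per-iteration update. All values are illustrative:

```python
import math

def exp_cov_sparse(coords, variance, corr_scale, cutoff_scales=3.0):
    """Store only covariance entries within cutoff_scales correlation
    scales, keyed by (row, col) -- a stand-in for the CSC storage
    described above. Exponential model: C(h) = variance * exp(-h / L)."""
    cov = {}
    for i, xi in enumerate(coords):
        for j, xj in enumerate(coords):
            h = abs(xi - xj)
            if h <= cutoff_scales * corr_scale:
                cov[(i, j)] = variance * math.exp(-h / corr_scale)
    return cov

coords = [0.0, 1.0, 2.0, 10.0]            # 1-D grid; far point decoupled
cov = exp_cov_sparse(coords, variance=1.0, corr_scale=1.0)
# Next-iteration update: correlation scale shrunk by the coefficient 0.95
shrunk = exp_cov_sparse(coords, 1.0, corr_scale=0.95)
```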

  5. Galaxy groups in the low-redshift Universe

    NASA Astrophysics Data System (ADS)

    Lim, S. H.; Mo, H. J.; Lu, Yi; Wang, Huiyuan; Yang, Xiaohu

    2017-09-01

    We apply a halo-based group finder to four large redshift surveys, the 2MRS (Two Micron All-Sky Redshift Survey), 6dFGS (Six-degree Field Galaxy Survey), SDSS (Sloan Digital Sky Survey) and 2dFGRS (Two-degree Field Galaxy Redshift Survey), to construct group catalogues in the low-redshift Universe. The group finder is based on that of Yang et al. but with an improved halo mass assignment so that it can be applied uniformly to various redshift surveys of galaxies. Halo masses are assigned to groups according to proxies based on the stellar mass/luminosity of member galaxies. The performances of the group finder in grouping galaxies according to common haloes and in halo mass assignments are tested using realistic mock samples constructed from hydrodynamical simulations and empirical models of galaxy occupation in dark matter haloes. Our group finder finds ∼94 per cent of the correct true member galaxies for 90-95 per cent of the groups in the mock samples; the halo masses assigned by the group finder are un-biased with respect to the true halo masses, and have a typical uncertainty of ∼0.2 dex. The properties of group catalogues constructed from the observational samples are described and compared with other similar catalogues in the literature.

  6. An investigation of the use of temporal decomposition in space mission scheduling

    NASA Technical Reports Server (NTRS)

    Bullington, Stanley E.; Narayanan, Venkat

    1994-01-01

    This research involves an examination of techniques for solving scheduling problems in long-duration space missions. The mission timeline is broken up into several time segments, which are then scheduled incrementally. Three methods are presented for identifying the activities that are to be attempted within these segments. The first method is a mathematical model, which is presented primarily to illustrate the structure of the temporal decomposition problem. Since the mathematical model is bound to be computationally prohibitive for realistic problems, two heuristic assignment procedures are also presented. The first heuristic method is based on dispatching rules for activity selection, and the second heuristic assigns performances of a model evenly over timeline segments. These heuristics are tested using a sample Space Station mission and a Spacelab mission. The results are compared with those obtained by scheduling the missions without any problem decomposition. The applicability of this approach to large-scale mission scheduling problems is also discussed.
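
    The dispatching-rule heuristic can be sketched as follows: activities are sorted by a priority rule and each is placed in the earliest timeline segment with remaining capacity, with unplaceable activities left for a later planning pass. Names, durations and priorities are invented:

```python
def dispatch_to_segments(activities, segment_capacity, n_segments):
    """Dispatching-rule heuristic for temporal decomposition: sort
    activities by descending priority and place each in the earliest
    segment whose remaining capacity can hold its duration. Activities
    that fit nowhere are simply omitted from the schedule."""
    load = [0.0] * n_segments
    schedule = {}
    for name, duration, priority in sorted(activities, key=lambda a: -a[2]):
        for seg in range(n_segments):
            if load[seg] + duration <= segment_capacity:
                load[seg] += duration
                schedule[name] = seg
                break
    return schedule, load

acts = [("exp1", 3.0, 5), ("exp2", 2.0, 9), ("exp3", 4.0, 1), ("exp4", 2.0, 7)]
schedule, load = dispatch_to_segments(acts, segment_capacity=5.0, n_segments=2)
```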

  7. The effect of the flipped model on achievement in an introductory college physics course

    NASA Astrophysics Data System (ADS)

    Winter, Joshua Brian

    The flipped or inverted classroom model is one in which the time and place for traditional lecture and homework are reversed. Traditional lecture is replaced by online videos assigned as homework. This frees up time in class to be spent with more student centered activities such as discussion based concept questions and group problem solving. While growing in popularity, research on the effectiveness of this format is sparse. In this quasi-experimental study, two sections of an introductory algebra-based college physics course were examined over a five week period. Each section was taught with either the traditional or flipped model and physics knowledge achieved was compared using independent samples t-tests on both the instructor's unit exam and the Mechanics Baseline Test pre/posttest normalized gain. Results indicated that there was no statistically significant difference between the flipped model and the traditional lecture format. Avenues for further research are discussed.
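
    The pre/post comparison rests on the normalized gain, g = (post - pre) / (max - pre), the metric commonly reported with instruments such as the Mechanics Baseline Test; the section means below are hypothetical, chosen to illustrate a "no significant difference" outcome:

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: the fraction of the available improvement
    (max_score - pre) that a group actually achieved."""
    return (post - pre) / (max_score - pre)

g_flipped = normalized_gain(40.0, 64.0)       # hypothetical section means
g_traditional = normalized_gain(42.0, 65.2)
```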

  8. Sensing Attribute Weights: A Novel Basic Belief Assignment Method

    PubMed Central

    Jiang, Wen; Zhuang, Miaoyan; Xie, Chunhe; Wu, Jun

    2017-01-01

    Dempster–Shafer evidence theory is widely used in many soft sensors data fusion systems on account of its good performance for handling the uncertainty information of soft sensors. However, how to determine basic belief assignment (BBA) is still an open issue. The existing methods to determine BBA do not consider the reliability of each attribute; at the same time, they cannot effectively determine BBA in the open world. In this paper, based on attribute weights, a novel method to determine BBA is proposed not only in the closed world, but also in the open world. First, the Gaussian model of each attribute is built using the training samples. Second, the similarity between the test sample and the attribute model is measured based on the Gaussian membership functions. Then, the attribute weights are generated using the overlap degree among the classes. Finally, BBA is determined according to the sensed attribute weights. Several examples with small datasets show the validity of the proposed method. PMID:28358325
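
    The BBA construction can be sketched for a single attribute: Gaussian memberships are normalized, discounted by the attribute weight, and the remainder is assigned to the frame Theta as open-world ignorance. The class models, the weight, and the single-attribute setting are illustrative only:

```python
import math

def gaussian_membership(x, mean, std):
    """Unnormalized Gaussian membership of sample x in a class model."""
    return math.exp(-((x - mean) ** 2) / (2.0 * std ** 2))

def determine_bba(sample, class_models, attr_weight):
    """Turn per-class Gaussian memberships of one attribute into a basic
    belief assignment: memberships are normalized, discounted by the
    attribute weight, and the residual belief goes to the frame Theta."""
    m = {c: gaussian_membership(sample, mu, sd)
         for c, (mu, sd) in class_models.items()}
    total = sum(m.values())
    bba = {c: attr_weight * v / total for c, v in m.items()}
    bba["Theta"] = 1.0 - attr_weight          # unassigned (open-world) belief
    return bba

# Hypothetical sepal-length models for two classes, attribute weight 0.8
models = {"setosa": (5.0, 0.4), "versicolor": (5.9, 0.5)}
bba = determine_bba(5.1, models, attr_weight=0.8)
```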

  9. Sensing Attribute Weights: A Novel Basic Belief Assignment Method.

    PubMed

    Jiang, Wen; Zhuang, Miaoyan; Xie, Chunhe; Wu, Jun

    2017-03-30

    Dempster-Shafer evidence theory is widely used in many soft sensors data fusion systems on account of its good performance for handling the uncertainty information of soft sensors. However, how to determine basic belief assignment (BBA) is still an open issue. The existing methods to determine BBA do not consider the reliability of each attribute; at the same time, they cannot effectively determine BBA in the open world. In this paper, based on attribute weights, a novel method to determine BBA is proposed not only in the closed world, but also in the open world. First, the Gaussian model of each attribute is built using the training samples. Second, the similarity between the test sample and the attribute model is measured based on the Gaussian membership functions. Then, the attribute weights are generated using the overlap degree among the classes. Finally, BBA is determined according to the sensed attribute weights. Several examples with small datasets show the validity of the proposed method.

  10. The Effect of Scheduling Models for Introductory Algebra on 9th-Grade Students' Test Scores and Grades

    ERIC Educational Resources Information Center

    O'Hanlon, Angela L.

    2011-01-01

    The purpose of the study was to determine the effect of pacing and scheduling of algebra coursework on assigned 9th-grade students who traditionally would qualify for pre-algebra instruction and same course 9th-grade students who traditionally would qualify for standard algebra instruction. Students were selected based on completion of first-year…

  11. Methods and Models for the Construction of Weakly Parallel Tests. Research Report 90-4.

    ERIC Educational Resources Information Center

    Adema, Jos J.

    Methods are proposed for the construction of weakly parallel tests, that is, tests with the same test information function. A mathematical programming model for constructing tests with a prespecified test information function and a heuristic for assigning items to tests such that their information functions are equal play an important role in the…

  12. Ability-Grouping and Academic Inequality: Evidence from Rule-Based Student Assignments. NBER Working Paper No. 14911

    ERIC Educational Resources Information Center

    Jackson, C. Kirabo

    2009-01-01

    In Trinidad and Tobago students are assigned to secondary schools after fifth grade based on achievement tests, leading to large differences in the school environments to which students of differing initial levels of achievement are exposed. Using both a regression discontinuity design and rule-based instrumental variables to address…

  13. Earthquake likelihood model testing

    USGS Publications Warehouse

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful, but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models.
A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task, and a wide range of possible testing procedures exist. Jolliffe and Stephenson (2003) present different forecast verifications from atmospheric science, among them likelihood testing of probability forecasts and testing the occurrence of binary events. Testing binary events requires that for each forecasted event, the spatial, temporal, and magnitude limits be given. Although major earthquakes can be considered binary events, the models within the RELM project express their forecasts on a spatial grid and in 0.1 magnitude units; thus the results are a distribution of rates over space and magnitude. These forecasts can be tested with likelihood tests. In general, likelihood tests assume a valid null hypothesis against which a given hypothesis is tested. The outcome is either a rejection of the null hypothesis in favor of the test hypothesis or a nonrejection, meaning the test hypothesis cannot outperform the null hypothesis at a given significance level. Within RELM, there is no accepted null hypothesis, and thus the likelihood test needs to be expanded to allow comparable testing of equipollent hypotheses. To test models against one another, we require that forecasts be expressed in a standard format: the average rate of earthquake occurrence within pre-specified limits of hypocentral latitude, longitude, depth, magnitude, time period, and focal mechanisms. Focal mechanisms should be described either as the inclination of the P-axis, the declination of the P-axis, and the inclination of the T-axis, or as strike, dip, and rake angles. Schorlemmer and Gerstenberger (2007, this issue) designed classes of these parameters such that similar models will be tested against each other. These classes make the forecasts comparable between models.
Additionally, we are limited to testing only what is precisely defined and consistently reported in earthquake catalogs. Therefore it is currently not possible to test such information as fault rupture length or area, asperity location, etc. Also, to account for data quality issues, we allow for location and magnitude uncertainties as well as the probability that an event is dependent on another event. As we mentioned above, only models with comparable forecasts can be tested against each other. Our current tests are designed to examine grid-based models. This requires that any fault-based model be adapted to a grid before testing is possible. While this is a limitation of the testing, it is an inherent difficulty in any such comparative testing. Please refer to appendix B for a statistical evaluation of the application of the Poisson hypothesis to fault-based models. The testing suite we present consists of three different tests: L-Test, N-Test, and R-Test. These tests are defined similarly to Kagan and Jackson (1995). The first two tests examine the consistency of the hypotheses with the observations, while the last test compares the spatial performances of the models.
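    As a rough illustration of the grid-based likelihood testing described above, the sketch below scores a forecast of per-bin Poisson rates against observed counts and computes an L-test-style consistency quantile by simulating catalogs from the forecast itself. The rates, counts, and simulation settings are hypothetical; this is not the RELM implementation.

```python
import math
import random

def poisson_log_likelihood(rates, counts):
    """Joint log-likelihood of observed bin counts under independent
    Poisson rates (one rate per space-magnitude bin)."""
    return sum(-r + n * math.log(r) - math.lgamma(n + 1)
               for r, n in zip(rates, counts))

def sample_poisson(rng, lam):
    # Knuth's method; adequate for the small per-bin rates used here
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def l_test(rates, counts, n_sim=2000, seed=1):
    """Consistency (L-) test: the quantile of the observed likelihood among
    likelihoods of catalogs simulated from the forecast. Values near zero
    mean the observation is unlikely under the forecast."""
    rng = random.Random(seed)
    obs = poisson_log_likelihood(rates, counts)
    below = sum(
        poisson_log_likelihood(rates, [sample_poisson(rng, r) for r in rates]) <= obs
        for _ in range(n_sim))
    return below / n_sim

# Hypothetical forecast: expected event rates in three bins vs. observed counts
gamma = l_test([0.5, 1.0, 2.0], [0, 1, 3])
print(0.0 <= gamma <= 1.0)  # → True
```

    A prospective comparison of two forecasts (the R-Test) would contrast such likelihoods between models rather than against simulations of a single model.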

  14. Directed Design of Experiments for Validating Probability of Detection Capability of a Testing System

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2012-01-01

    A method of validating a probability of detection (POD) testing system using directed design of experiments (DOE) includes recording an input data set of observed hit and miss or analog data for sample components as a function of size of a flaw in the components. The method also includes processing the input data set to generate an output data set having an optimal class width, assigning a case number to the output data set, and generating validation instructions based on the assigned case number. An apparatus includes a host machine for receiving the input data set from the testing system and an algorithm for executing DOE to validate the test system. The algorithm applies DOE to the input data set to determine a data set having an optimal class width, assigns a case number to that data set, and generates validation instructions based on the case number.

  15. Identification of Heterogeneous Cognitive Subgroups in Community-Dwelling Older Adults: A Latent Class Analysis of the Einstein Aging Study.

    PubMed

    Zammit, Andrea R; Hall, Charles B; Lipton, Richard B; Katz, Mindy J; Muniz-Terrera, Graciela

    2018-05-01

    The aim of this study was to identify natural subgroups of older adults based on cognitive performance, and to establish each subgroup's characteristics based on demographic factors, physical function, psychosocial well-being, and comorbidity. We applied latent class (LC) modeling to identify subgroups in baseline assessments of 1345 Einstein Aging Study (EAS) participants free of dementia. The EAS is a community-dwelling cohort study of 70+ year-old adults living in the Bronx, NY. We used 10 neurocognitive tests and 3 covariates (age, sex, education) to identify latent subgroups. We used goodness-of-fit statistics to identify the optimal class solution and assess model adequacy. We also validated our model using two-fold split-half cross-validation. The sample had a mean age of 78.0 (SD=5.4) and a mean of 13.6 years of education (SD=3.5). A 9-class solution based on cognitive performance at baseline was the best-fitting model. We characterized the 9 identified classes as (i) disadvantaged, (ii) poor language, (iii) poor episodic memory and fluency, (iv) poor processing speed and executive function, (v) low average, (vi) high average, (vii) average, (viii) poor executive and poor working memory, (ix) elite. The cross validation indicated stable class assignment with the exception of the average and high average classes. LC modeling in a community sample of older adults revealed 9 cognitive subgroups. Assignment of subgroups was reliable and associated with external validators. Future work will test the predictive validity of these groups for outcomes such as Alzheimer's disease, vascular dementia and death, as well as markers of biological pathways that contribute to cognitive decline. (JINS, 2018, 24, 511-523).

  16. Introduction of a new laboratory test: an econometric approach with the use of neural network analysis.

    PubMed

    Jabor, A; Vlk, T; Boril, P

    1996-04-15

    We designed a simulation model for assessing the financial risks involved when a new diagnostic test is introduced in the laboratory. The model is based on a neural network consisting of ten neurons and assumes that appropriate uncertainty can be assigned to the input entities. Simulations are done on a 1-day interval basis. Risk analysis completes the model, and the financial effects are evaluated for a selected time period. The basic output of the simulation consists of the total expenses and income during the simulation time, the net present value of the project at the end of the simulation, the total number of control samples during the simulation, the total number of patients evaluated, and the total number of kits used.
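    The net present value the simulation reports is the standard discounted sum of cash flows. A minimal sketch, with a hypothetical discount rate and cash-flow series rather than figures from the study:

```python
def npv(rate, cashflows):
    """Net present value of a cash-flow series; cashflows[0] is at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical example: initial kit/instrument outlay, then periodic net income
flows = [-10000, 1200, 1500, 1800, 2000, 2200, 2400]
print(npv(0.01, flows) > 0)  # positive NPV at a 1% per-period rate
```

    A project is financially attractive over the simulated period when this discounted sum is positive at the chosen rate.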

  17. The Academic Effects of Summer Instruction and Retention in New York City

    ERIC Educational Resources Information Center

    Mariano, Louis T.; Martorell, Paco

    2013-01-01

    This article examines the impacts of summer instruction and test-based grade retention in New York City. We use a research design that exploits test score cutoffs used in assignment to these treatments. We find modest positive effects of summer instruction on English language arts (ELA) achievement for students assigned to summer instruction…

  18. Effectiveness of a Case-Based Computer Program on Students' Ethical Decision Making.

    PubMed

    Park, Eun-Jun; Park, Mihyun

    2015-11-01

    The aim of this study was to test the effectiveness of a case-based computer program, using an integrative ethical decision-making model, on the ethical decision-making competency of nursing students in South Korea. This study used a pre- and posttest comparison design. Students in the intervention group used a computer program for case analysis assignments, whereas students in the standard group used a traditional paper assignment for case analysis. The findings showed that using the case-based computer program as a complementary tool for the ethics courses offered at the university enhanced students' ethical preparedness and satisfaction with the course. On the basis of the findings, it is recommended that nurse educators use a case-based computer program as a complementary self-study tool in ethics courses to supplement student learning without an increase in course hours, particularly in terms of analyzing ethics cases with dilemma scenarios and exercising ethical decision making. Copyright 2015, SLACK Incorporated.

  19. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted for gathering information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  20. Using School Lotteries to Evaluate the Value-Added Model

    ERIC Educational Resources Information Center

    Deutsch, Jonah

    2013-01-01

    There has been an active debate in the literature over the validity of value-added models. In this study, the author tests the central assumption of value-added models that school assignment is random relative to expected test scores conditional on prior test scores, demographic variables, and other controls. He uses a Chicago charter school's…

  1. Approach for Text Classification Based on the Similarity Measurement between Normal Cloud Models

    PubMed Central

    Dai, Jin; Liu, Xin

    2014-01-01

    The similarity between objects is a core research area of data mining. In order to reduce the interference of the uncertainty of natural language, a similarity measurement between normal cloud models is adopted for text classification research. On this basis, a novel text classifier based on cloud concept jumping up (CCJU-TC) is proposed. It can efficiently accomplish the conversion between qualitative concepts and quantitative data. Through the conversion from a text set to a text information table based on the VSM model, the qualitative text concept, which is extracted from the same category, is jumped up into a whole category concept. According to the cloud similarity between the test text and each category concept, the test text is assigned to the most similar category. Comparisons among different text classifiers over different feature selection sets fully prove that not only does CCJU-TC have a strong ability to adapt to different text features, but its classification performance is also better than that of traditional classifiers. PMID:24711737
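    The final assignment step, picking the category concept most similar to the test text, can be sketched generically. Cosine similarity over VSM vectors stands in here for the paper's cloud-model similarity measure, and the category vectors are hypothetical:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two VSM feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def classify(text_vec, concepts):
    """Assign a text vector to the most similar category concept."""
    return max(concepts, key=lambda c: cosine_similarity(text_vec, concepts[c]))

# Hypothetical category concept vectors extracted per category
concepts = {"sports": [0.9, 0.1, 0.2], "finance": [0.1, 0.8, 0.3]}
print(classify([0.85, 0.05, 0.1], concepts))  # → sports
```

    Replacing the similarity function with a cloud-model measure leaves the max-similarity assignment rule unchanged.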

  2. Impacts of high resolution data on traveler compliance levels in emergency evacuation simulations

    DOE PAGES

    Lu, Wei; Han, Lee D.; Liu, Cheng; ...

    2016-05-05

    In this article, we conducted a comparison study of evacuation assignment based on Traffic Analysis Zones (TAZ) and high-resolution LandScan USA Population Cells (LPC) with a detailed real-world road network. A platform for evacuation modeling built on high-resolution population distribution data and activity-based microscopic traffic simulation was proposed. This platform can be extended to any city in the world. The results indicated that evacuee compliance behavior affects evacuation efficiency with traditional TAZ assignment, but it did not significantly compromise performance with high-resolution LPC assignment. The TAZ assignment also underestimated the real travel time during evacuation. This suggests that high data resolution can improve the accuracy of traffic modeling and simulation. Evacuation managers should consider more diverse assignment strategies during emergency evacuation to avoid congestion.

  3. Educating resident physicians using virtual case-based simulation improves diabetes management: a randomized controlled trial.

    PubMed

    Sperl-Hillen, JoAnn; O'Connor, Patrick J; Ekstrom, Heidi L; Rush, William A; Asche, Stephen E; Fernandes, Omar D; Appana, Deepika; Amundson, Gerald H; Johnson, Paul E; Curran, Debra M

    2014-12-01

    To test a virtual case-based Simulated Diabetes Education intervention (SimDE) developed to teach primary care residents how to manage diabetes. Nineteen primary care residency programs, with 341 volunteer residents in all postgraduate years (PGY), were randomly assigned to a SimDE intervention group or control group (CG). The Web-based interactive educational intervention used computerized virtual patients who responded to provider actions through programmed simulation models. Eighteen distinct learning cases (L-cases) were assigned to SimDE residents over six months from 2010 to 2011. Impact was assessed using performance on four virtual assessment cases (A-cases), an objective knowledge test, and pre-post changes in self-assessed diabetes knowledge and confidence. Group comparisons were analyzed using generalized linear mixed models, controlling for clustering of residents within residency programs and differences in baseline knowledge. The percentages of residents appropriately achieving A-case composite clinical goals for glucose, blood pressure, and lipids were as follows: A-case 1: SimDE = 21.2%, CG = 1.8%, P = .002; A-case 2: SimDE = 15.7%, CG = 4.7%, P = .02; A-case 3: SimDE = 48.0%, CG = 10.4%, P < .001; and A-case 4: SimDE = 42.1%, CG = 18.7%, P = .004. The mean knowledge score and pre-post changes in self-assessed knowledge and confidence were significantly better for the SimDE group than for the CG participants. A virtual case-based simulated diabetes education intervention improved diabetes management skills, knowledge, and confidence for primary care residents.

  4. Training Post-9/11 Police Officers with a Counter-Terrorism Reality-Based Training Model: A Case Study

    ERIC Educational Resources Information Center

    Biddle, Christopher J.

    2013-01-01

    The purpose of this qualitative holistic multiple-case study was to identify the optimal theoretical approach for a Counter-Terrorism Reality-Based Training (CTRBT) model to train post-9/11 police officers to perform effectively in their counter-terrorism assignments. Post-9/11 police officers assigned to counter-terrorism duties are not trained…

  5. Online Assignments in Economics: A Test of Their Effectiveness

    ERIC Educational Resources Information Center

    Kennelly, Brendan; Considine, John; Flannery, Darragh

    2011-01-01

    This article compares the effectiveness of online and paper-based assignments and tutorials using summative assessment results. All of the students in a large managerial economics course at National University of Ireland, Galway were asked to do six assignments online using Aplia and to do two on paper. The authors examined whether a student's…

  6. Mind map learning for advanced engineering study: case study in system dynamics

    NASA Astrophysics Data System (ADS)

    Woradechjumroen, Denchai

    2018-01-01

    System Dynamics (SD) is one of the subjects used in learning automatic control systems in the dynamics and control field. Mathematical modeling and solving skills for engineering systems are the expected outcomes of the course, which can be further used to study control systems and mechanical vibration efficiently; however, the fundamentals of SD require strong backgrounds in dynamics and differential equations, which suit students in governmental universities who have strong skills in mathematics and science. In private universities, students are weak in the above subjects, since they obtained a high vocational certificate from a technical college or polytechnic school, which emphasizes practical content. To enhance their learning and improve their backgrounds, this paper applies mind-map-based problem-based learning to relate the essential mathematical and physical equations. Taking advantage of mind maps, each student is assigned to design individual mind maps for self-learning development after attending class and learning the overall picture of each chapter from the instructor. Four mind-map-based problems are assigned to each student. Each assignment is evaluated via mid-term and final examinations, which are issued in terms of learning concepts and applications. In the method testing, thirty students are tested and evaluated with respect to their past learning backgrounds. The result shows that well-designed mind maps can improve learning performance based on outcome evaluation. In particular, mind maps can significantly reduce the time spent reviewing mathematics and physics in SD.

  7. Model Based Reconstruction of UT Array Data

    NASA Astrophysics Data System (ADS)

    Calmon, P.; Iakovleva, E.; Fidahoussen, A.; Ribay, G.; Chatillon, S.

    2008-02-01

    Beyond the detection of defects, their characterization (identification, positioning, sizing) is a goal of great importance often assigned to the analysis of NDT data. In the case of ultrasonic testing, the first step of such analysis amounts to imaging the detected echoes within the part. This operation is generally achieved by considering times of flight and applying simplified algorithms that are often valid only in canonical situations. In this communication we present an overview of different imaging techniques studied at CEA LIST that are based on the exploitation of direct models, which make it possible to address complex configurations and are available in the CIVA software platform. We discuss in particular ray-model-based algorithms, algorithms derived from classical synthetic focusing, and processing of the full inter-element matrix (the MUSIC algorithm).

  8. Optimizing and accelerating the assignation of lineages in Mycobacterium tuberculosis using novel alternative single-tube assays

    PubMed Central

    Carcelén, María; Abascal, Estefanía; Herranz, Marta; Santantón, Sheila; Zenteno, Roberto; Ruiz Serrano, María Jesús; Bouza, Emilio

    2017-01-01

    The assignation of lineages in Mycobacterium tuberculosis (MTB) provides valuable information for evolutionary and phylogeographic studies and makes for more accurate knowledge of the distribution of this pathogen worldwide. Differences in virulence have also been found for certain lineages. MTB isolates were initially assigned to lineages based on data obtained from genotyping techniques, such as spoligotyping or MIRU-VNTR analysis, some of which are more suitable for molecular epidemiology studies. However, since these methods are subject to a certain degree of homoplasy, other criteria have been chosen to assign lineages. These are based on targeting robust and specific SNPs for each lineage. Here, we propose two newly designed multiplex targeting methods—both of which are single-tube tests—to optimize the assignation of the six main lineages in MTB. The first method is based on ASO-PCR and offers an inexpensive and easy-to-implement assay for laboratories with limited resources. The other, which is based on SNaPshot, enables more refined standardized assignation of lineages for laboratories with better resources. Both methods performed well when assigning lineages from cultured isolates from a control panel, a test panel, and a problem panel from an unrelated population, Mexico, which included isolates in which standard genotyping was not able to classify lineages. Both tests were also able to assign lineages from stored isolates, without the need for subculture or purification of DNA, and even directly from clinical specimens with a medium-high bacilli burden. Our assays could broaden the contexts where information on lineages can be acquired, thus enabling us to quickly update data from retrospective collections and to merge data with those obtained at the time of diagnosis of a new TB case. PMID:29091913
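    Conceptually, the SNP-based assignation described above reduces to a lookup of lineage-specific markers in an isolate's genotype. The positions and alleles below are hypothetical placeholders, not the assay's actual targets:

```python
# Hypothetical lineage-specific SNP markers: position -> (allele, lineage).
# A real assay targets robust, phylogenetically validated SNPs.
LINEAGE_MARKERS = {
    1977: ("A", "Lineage 1"),
    3352: ("C", "Lineage 2"),
    2154: ("G", "Lineage 3"),
}

def assign_lineage(genotype):
    """Assign an MTB isolate to a lineage from its genotyped SNP calls.

    genotype: {position: called_allele} for the positions the assay probes.
    """
    hits = [lineage for pos, (allele, lineage) in LINEAGE_MARKERS.items()
            if genotype.get(pos) == allele]
    if len(hits) == 1:
        return hits[0]
    return "unassigned"  # no marker hit, or conflicting markers

print(assign_lineage({1977: "A", 3352: "T"}))  # → Lineage 1
```

    Because each marker is lineage-specific rather than homoplastic, a single confident hit suffices, which is what makes single-tube multiplex assays practical.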

  9. Stimulating Scientific Reasoning with Drawing-Based Modeling

    NASA Astrophysics Data System (ADS)

    Heijnes, Dewi; van Joolingen, Wouter; Leenaars, Frank

    2018-02-01

    We investigate the way students' reasoning about evolution can be supported by drawing-based modeling. We modified the drawing-based modeling tool SimSketch to allow for modeling evolutionary processes. In three iterations of development and testing, students in lower secondary education worked on creating an evolutionary model. After each iteration, the user interface and instructions were adjusted based on students' remarks and the teacher's observations. Students' conversations were analyzed on reasoning complexity as a measurement of efficacy of the modeling tool and the instructions. These findings were also used to compose a set of recommendations for teachers and curriculum designers for using and constructing models in the classroom. Our findings suggest that to stimulate scientific reasoning in students working with a drawing-based modeling, tool instruction about the tool and the domain should be integrated. In creating models, a sufficient level of scaffolding is necessary. Without appropriate scaffolds, students are not able to create the model. With scaffolding that is too high, students may show reasoning that incorrectly assigns external causes to behavior in the model.

  10. The direct assignment option as a modular design component: an example for the setting of two predefined subgroups.

    PubMed

    An, Ming-Wen; Lu, Xin; Sargent, Daniel J; Mandrekar, Sumithra J

    2015-01-01

    A phase II design with an option for direct assignment (stop randomization and assign all patients to experimental treatment based on interim analysis, IA) for a predefined subgroup was previously proposed. Here, we illustrate the modularity of the direct assignment option by applying it to the setting of two predefined subgroups and testing for separate subgroup main effects. We power the 2-subgroup direct assignment option design with 1 IA (DAD-1) to test for separate subgroup main effects, with assessment of power to detect an interaction in a post-hoc test. Simulations assessed the statistical properties of this design compared to the 2-subgroup balanced randomized design with 1 IA, BRD-1. Different response rates for treatment/control in subgroup 1 (0.4/0.2) and in subgroup 2 (0.1/0.2, 0.4/0.2) were considered. The 2-subgroup DAD-1 preserves power and type I error rate compared to the 2-subgroup BRD-1, while exhibiting reasonable power in a post-hoc test for interaction. The direct assignment option is a flexible design component that can be incorporated into broader design frameworks, while maintaining desirable statistical properties, clinical appeal, and logistical simplicity.
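    The interim logic of the direct assignment option can be illustrated with a small simulation: stage 1 randomizes 1:1, and if the experimental arm leads at the interim analysis, all stage-2 patients are assigned directly to it. The response rates echo the abstract's 0.4/0.2 scenario, but the sample sizes and the crude detection criterion are hypothetical, not the authors' simulation design.

```python
import random

def one_trial(p_trt, p_ctl, n1=40, n2=40, rng=random):
    """One trial with a direct-assignment option at a single interim
    analysis (IA): stage 1 randomizes 1:1; if the experimental arm leads
    at the IA, all stage-2 patients are assigned directly to it."""
    trt = [rng.random() < p_trt for _ in range(n1 // 2)]
    ctl = [rng.random() < p_ctl for _ in range(n1 // 2)]
    if sum(trt) > sum(ctl):                      # IA favors experimental
        trt += [rng.random() < p_trt for _ in range(n2)]
    else:                                        # keep randomizing 1:1
        trt += [rng.random() < p_trt for _ in range(n2 // 2)]
        ctl += [rng.random() < p_ctl for _ in range(n2 // 2)]
    return sum(trt) / len(trt), sum(ctl) / len(ctl)

rng = random.Random(7)
sims = [one_trial(0.4, 0.2, rng=rng) for _ in range(2000)]
frac = sum(t > c for t, c in sims) / len(sims)   # crude detection rate
print(0.5 < frac <= 1.0)
```

    A full evaluation of the design would instead track type I error and power under formal test statistics, as the paper does for the 2-subgroup DAD-1 versus BRD-1.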

  11. ConfChem Conference on Select 2016 BCCE Presentations: Specifications Grading in the Flipped Organic Classroom

    ERIC Educational Resources Information Center

    Ring, Joshua

    2017-01-01

    Specifications Grading is a system of course-long student assessment based on the division of learning objectives into clearly defined skill tests or assignments. Each skill is evaluated at a mastery level, with opportunities for students to learn from their mistakes and then be re-evaluated for skill tests, or resubmit assignments. Specifications…

  12. Nursing research on a first aid model of double personnel for major burn patients.

    PubMed

    Wu, Weiwei; Shi, Kai; Jin, Zhenghua; Liu, Shuang; Cai, Duo; Zhao, Jingchun; Chi, Cheng; Yu, Jiaao

    2015-03-01

    This study explored the effect of a first aid model employing two nurses on the efficient rescue operation time and the efficient resuscitation time for major burn patients. A two-nurse model of first aid was designed for major burn patients. The model includes a division of labor between the first aid nurses and the re-organization of emergency carts. The clinical effectiveness of the process was examined in a retrospective chart review of 156 cases of major burn patients, experiencing shock and low blood volume, who were admitted to the intensive care unit of the department of burn surgery between November 2009 and June 2013. Of the 156 major burn cases, 87 patients who received first aid using the double personnel model were assigned to the test group, and the 69 patients who received first aid using the standard first aid model were assigned to the control group. The efficient rescue operation time and the efficient resuscitation time for the patients were compared between the two groups. Student's t tests were used to compare the mean difference between the groups. Statistically significant differences between the two groups were found on both measures (P's < 0.05), with the test group having lower times than the control group. The efficient rescue operation time was 14.90 ± 3.31 min in the test group and 30.42 ± 5.65 min in the control group. The efficient resuscitation time was 7.4 ± 3.2 h in the test group and 9.5 ± 2.7 h in the control group. A two-nurse first aid model based on scientifically validated procedures and a reasonable division of labor can shorten the efficient rescue operation time and the efficient resuscitation time for major burn patients. Given these findings, the model appears to be worthy of clinical application.
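    The group comparison reported above uses Student's t test on the two sets of times. A self-contained sketch with illustrative samples (the study reports only summary statistics, so the values below are hypothetical):

```python
import math

def students_t(x, y):
    """Pooled two-sample Student's t statistic."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)   # sample variances
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    sp2 = ((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2)  # pooled variance
    return (mx - my) / math.sqrt(sp2 * (1 / nx + 1 / ny))

# Hypothetical efficient rescue operation times (minutes)
two_nurse = [14.2, 15.1, 14.8, 16.0, 13.9]
standard = [29.5, 31.2, 30.0, 32.1, 28.9]
t = students_t(two_nurse, standard)
print(t < 0)  # → True  (the two-nurse model is faster)
```

    A large-magnitude negative t against the t distribution with nx + ny − 2 degrees of freedom yields the reported P < 0.05.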

  13. Increasing School Success Through Partnership-Based Family Competency Training

    PubMed Central

    Spoth, Richard; Randall, G. Kevin; Shin, Chungyeol

    2008-01-01

    An expanding body of research suggests an important role for parent or family competency training in children’s social-emotional learning and related school success. This article summarizes a test of a longitudinal model examining partnership-based family competency training effects on academic success in a general population. Specifically, it examines indirect effects of the Iowa Strengthening Families Program (ISFP) on school engagement in 8th grade and academic success in the 12th grade, through direct ISFP effects on intervention-targeted outcomes—parenting competencies and student substance-related risk—in 6th grade. Twenty-two rural schools were randomly assigned to either ISFP or a minimal-contact control group; data were collected from 445 families. Following examination of the equivalence of the measurement model across group and time, a structural equation modeling approach was used to test the hypothesized model and corresponding hypothesized structural paths. Significant effects of the ISFP were found on proximal intervention outcomes, intermediate school engagement, and the academic success of high school seniors. PMID:20376279

  14. Implementation of K-Means Clustering Method for Electronic Learning Model

    NASA Astrophysics Data System (ADS)

    Latipa Sari, Herlina; Suranti Mrs., Dewi; Natalia Zulita, Leni

    2017-12-01

    The teaching and learning process at SMK Negeri 2 Bengkulu Tengah has applied an e-learning system for teachers and students. The e-learning was based on the classification of normative, productive, and adaptive subjects. SMK Negeri 2 Bengkulu Tengah consisted of 394 students and 60 teachers with 16 subjects. The records of the e-learning database were used in this research to observe students' activity patterns in attending class. The K-Means algorithm in this research was used to classify students' learning activities in the e-learning system, yielding clusters of student activity and of improvement in student ability. The implementation of the K-Means clustering method for the electronic learning model at SMK Negeri 2 Bengkulu Tengah was conducted by observing 10 student activities, namely participation of students in the classroom, submit assignment, view assignment, add discussion, view discussion, add comment, download course materials, view article, view test, and submit test. In the e-learning model, testing was conducted on 10 students and yielded 2 clusters of membership data (C1 and C2). Cluster 1, with a membership percentage of 70%, consisted of 6 members, namely 1112438 Anggi Julian, 1112439 Anis Maulita, 1112441 Ardi Febriansyah, 1112452 Berlian Sinurat, 1112460 Dewi Anugrah Anwar, and 1112467 Eka Tri Oktavia Sari. Cluster 2, with a membership percentage of 30%, consisted of 4 members, namely 1112463 Dosita Afriyani, 1112471 Erda Novita, 1112474 Eskardi, and 1112477 Fachrur Rozi.
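    The clustering step can be illustrated with a self-contained Lloyd's-algorithm sketch on invented activity counts; the feature values and student groupings below are made up for the demo and are not the study's data:

```python
import math

def kmeans(points, k, iters=50):
    # Lloyd's algorithm; centroids seeded with the first k points so the
    # demo is deterministic (real use would seed randomly with restarts).
    centroids = [list(p) for p in points[:k]]
    assign = [0] * len(points)
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centroid.
        for i, p in enumerate(points):
            assign[i] = min(range(k), key=lambda c: math.dist(p, centroids[c]))
        # Update step: move each centroid to the mean of its members.
        for c in range(k):
            members = [p for i, p in enumerate(points) if assign[i] == c]
            if members:
                centroids[c] = [sum(d) / len(members) for d in zip(*members)]
    return assign, centroids

# Invented activity counts per student: (assignments submitted,
# discussion posts, material downloads).
activity = [
    (9, 7, 8), (8, 6, 9), (9, 8, 7), (8, 7, 8), (9, 6, 9), (8, 8, 8),  # active
    (2, 1, 3), (1, 2, 2), (3, 1, 1), (2, 2, 2),                        # less active
]
labels, centers = kmeans(activity, k=2)
```

    On this toy data the two activity levels separate cleanly into two cluster labels, mirroring the C1/C2 split described in the abstract.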

  15. Investigating the Correlation Between Pharmacy Student Performance on the Health Science Reasoning Test and a Critical Thinking Assignment.

    PubMed

    Nornoo, Adwoa O; Jackson, Jonathan; Axtell, Samantha

    2017-03-25

    Objective. To determine whether there is a correlation between pharmacy students' scores on the Health Science Reasoning Test (HSRT) and their grade on a package insert assignment designed to assess critical thinking. Methods. The HSRT was administered to first-year pharmacy students during a critical-thinking course in the spring semester. In the same semester, a required package insert assignment was completed in a pharmacokinetics course. To determine whether there was a relationship between HSRT scores and grades on the assignment, a Spearman's rho correlation test was performed. Results. A very weak but significant positive correlation was found between students' grades on the assignment and their overall HSRT score (r=0.19, p <0.05), as well as deduction (a scale score of the HSRT; r=0.26, p <0.01). Conclusion. Based on a very weak but significant correlation to HSRT scores, this study demonstrated the potential of a package insert assignment to be used as one of the components to measure critical-thinking skills in pharmacy students.
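    The correlation analysis is a standard Spearman's rho: rank both variables (with average ranks for ties) and take Pearson's correlation of the ranks. A pure-Python sketch:

```python
def ranks(xs):
    # Average ranks (1-based), assigning tied values the mean of their positions.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        for t in range(i, j + 1):
            r[order[t]] = (i + j) / 2 + 1
        i = j + 1
    return r

def spearman(x, y):
    # Spearman's rho = Pearson correlation of the rank-transformed data.
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

    In practice one would use scipy.stats.spearmanr, which also returns the p-values reported in the abstract.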

  16. Investigating the Correlation Between Pharmacy Student Performance on the Health Science Reasoning Test and a Critical Thinking Assignment

    PubMed Central

    Jackson, Jonathan; Axtell, Samantha

    2017-01-01

    Objective. To determine whether there is a correlation between pharmacy students’ scores on the Health Science Reasoning Test (HSRT) and their grade on a package insert assignment designed to assess critical thinking. Methods. The HSRT was administered to first-year pharmacy students during a critical-thinking course in the spring semester. In the same semester, a required package insert assignment was completed in a pharmacokinetics course. To determine whether there was a relationship between HSRT scores and grades on the assignment, a Spearman’s rho correlation test was performed. Results. A very weak but significant positive correlation was found between students’ grades on the assignment and their overall HSRT score (r=0.19, p<0.05), as well as deduction (a scale score of the HSRT; r=0.26, p<0.01). Conclusion. Based on a very weak but significant correlation to HSRT scores, this study demonstrated the potential of a package insert assignment to be used as one of the components to measure critical-thinking skills in pharmacy students. PMID:28381884

  17. Genetic screening and testing in an episode-based payment model: preserving patient autonomy.

    PubMed

    Sutherland, Sharon; Farrell, Ruth M; Lockwood, Charles

    2014-11-01

    The State of Ohio is implementing an episode-based payment model for perinatal care. All costs of care will be tabulated for each live birth and assigned to the delivering provider, creating a three-tiered model for reimbursement for care. Providers will be reimbursed as usual for care that is average in cost and quality, while instituting rewards or penalties for those outside the expected range in either domain. There are few exclusions, and all methods of genetic screening and diagnostic testing are included in the episode cost calculation as proposed. Prenatal ultrasonography, genetic screening, and diagnostic testing are critical components of the delivery of high-quality, evidence-based prenatal care. These tests provide pregnant women with key information about the pregnancy, which, in turn, allows them to work closely with their health care provider to determine optimal prenatal care. The concepts of informed consent and decision-making, cornerstones of the ethical practice of medicine, are founded on the principles of autonomy and respect for persons. These principles recognize that patients' rights to make choices and take actions are based on their personal beliefs and values. Given the personal nature of such decisions, it is critical that patients have unbarred access to prenatal genetic tests if they elect to use them as part of their prenatal care. The proposed restructuring of reimbursement creates a clear conflict between patient autonomy and physician financial incentives.

  18. USE OF TRANS-CONTEXTUAL MODEL-BASED PHYSICAL ACTIVITY COURSE IN DEVELOPING LEISURE-TIME PHYSICAL ACTIVITY BEHAVIOR OF UNIVERSITY STUDENTS.

    PubMed

    Müftüler, Mine; İnce, Mustafa Levent

    2015-08-01

    This study examined how a physical activity course based on the Trans-Contextual Model affected the variables of perceived autonomy support, autonomous motivation, determinants of leisure-time physical activity behavior, basic psychological needs satisfaction, and leisure-time physical activity behaviors. The participants were 70 Turkish university students (M age=23.3 yr., SD=3.2). A pre-test-post-test control group design was constructed. Initially, the participants were randomly assigned into an experimental (n=35) and a control (n=35) group. The experimental group followed a 12 wk. trans-contextual model-based intervention. The participants were pre- and post-tested in terms of Trans-Contextual Model constructs and of self-reported leisure-time physical activity behaviors. Multivariate analyses showed significant increases over the 12 wk. period for perceived autonomy support from instructor and peers, autonomous motivation in leisure-time physical activity setting, positive intention and perceived behavioral control over leisure-time physical activity behavior, more fulfillment of psychological needs, and more engagement in leisure-time physical activity behavior in the experimental group. These results indicated that the intervention was effective in developing leisure-time physical activity and indicated that the Trans-Contextual Model is a useful way to conceptualize these relationships.

  19. A feedback model of figure-ground assignment.

    PubMed

    Domijan, Drazen; Setić, Mia

    2008-05-30

    A computational model is proposed in order to explain how bottom-up and top-down signals are combined into a unified perception of figure and background. The model is based on the interaction between the ventral and the dorsal stream. The dorsal stream computes saliency based on boundary signals provided by the simple and the complex cortical cells. Output from the dorsal stream is projected to the surface network which serves as a blackboard on which the surface representation is formed. The surface network is a recurrent network which segregates different surfaces by assigning different firing rates to them. The figure is labeled by the maximal firing rate. Computer simulations showed that the model correctly assigns figural status to the surface with a smaller size, a greater contrast, convexity, surroundedness, horizontal-vertical orientation and a higher spatial frequency content. The simple gradient of activity in the dorsal stream enables the simulation of the new principles of the lower region and the top-bottom polarity. The model also explains how the exogenous attention and the endogenous attention may reverse the figural assignment. Due to the local excitation in the surface network, neural activity at the cued region will spread over the whole surface representation. Therefore, the model implements the object-based attentional selection.

  20. Fully automatic assignment of small molecules' NMR spectra without relying on chemical shift predictions.

    PubMed

    Castillo, Andrés M; Bernal, Andrés; Patiny, Luc; Wist, Julien

    2015-08-01

    We present a method for the automatic assignment of small molecules' NMR spectra. The method includes an automatic and novel self-consistent peak-picking routine that validates NMR peaks in each spectrum against peaks in the same or other spectra that are due to the same resonances. The auto-assignment routine used is based on branch-and-bound optimization and relies predominantly on integration and correlation data; chemical shift information may be included when available to speed up the search and shorten the list of viable assignments, but in most cases tested, it is not required in order to find the correct assignment. This automatic assignment method is implemented as a web-based tool that runs without any user input other than the acquired spectra. Copyright © 2015 John Wiley & Sons, Ltd.
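    The paper's optimizer scores candidate assignments against integration and correlation data; purely as a structural illustration, here is a generic branch-and-bound over peak-to-atom assignments that prunes any partial assignment already costlier than the best complete one. The cost matrix is hypothetical:

```python
def branch_and_bound_assign(cost):
    # cost[i][j]: penalty for assigning resonance i to candidate position j.
    n = len(cost)
    best_order, best_cost = None, float("inf")

    def extend(assigned, used, total):
        nonlocal best_order, best_cost
        if total >= best_cost:
            return  # bound: this branch cannot beat the incumbent
        if len(assigned) == n:
            best_order, best_cost = assigned[:], total
            return
        i = len(assigned)
        for j in range(n):
            if j not in used:
                assigned.append(j); used.add(j)
                extend(assigned, used, total + cost[i][j])
                assigned.pop(); used.remove(j)

    extend([], set(), 0.0)
    return best_order, best_cost

# Hypothetical 3x3 cost matrix; the diagonal is the correct assignment.
order, total = branch_and_bound_assign(
    [[1.0, 10.0, 4.0], [10.0, 1.0, 4.0], [4.0, 4.0, 1.0]]
)
```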

  1. Expanding the Parameters for Excellence in Patient Assignments: Is Leveraging an Evidence-Data-Based Acuity Methodology Realistic?

    PubMed

    Gray, Joel; Kerfoot, Karlene

    2016-01-01

    Finding the balance of equitable assignments continues to be a challenge for health care organizations seeking to leverage evidence-based leadership practices. Ratios and subjective acuity strategies for nurse-patient staffing continue to be the dominant approach in health care organizations. In addition to ratio-based assignments and acuity-based assignment models driven by financial targets, more emphasis on using evidence-based leadership strategies to manage and create science for effective staffing is needed. In particular, nurse leaders are challenged to increase the sophistication of management of patient turnover (admissions, discharges, and transfers) and integrate tools from Lean methodologies and quality management strategies to determine the effectiveness of nurse-patient staffing.

  2. A stochastic inference of de novo CNV detection and association test in multiplex schizophrenia families.

    PubMed

    Wang, Shi-Heng; Chen, Wei J; Tsai, Yu-Chin; Huang, Yung-Hsiang; Hwu, Hai-Gwo; Hsiao, Chuhsing K

    2013-01-01

    The copy number variation (CNV) is a type of genetic variation in the genome. It is measured based on signal intensity measures and can be assessed repeatedly to reduce the uncertainty in PCR-based typing. Studies have shown that CNVs may lead to phenotypic variation and modification of disease expression. Various challenges exist, however, in the exploration of CNV-disease association. Here we construct latent variables to infer the discrete CNV values and to estimate the probability of mutations. In addition, we propose to pool rare variants to increase the statistical power and we conduct family studies to mitigate the computational burden in determining the composition of CNVs on each chromosome. To explore in a stochastic sense the association between the collapsing CNV variants and disease status, we utilize a Bayesian hierarchical model incorporating the mutation parameters. This model assigns integers in a probabilistic sense to the quantitatively measured copy numbers, and is able to test simultaneously the association for all variants of interest in a regression framework. This integrative model can account for the uncertainty in copy number assignment and differentiate if the variation was de novo or inherited on the basis of posterior probabilities. For family studies, this model can accommodate the dependence within family members and among repeated CNV data. Moreover, the Mendelian rule can be assumed under this model and yet the genetic variation, including de novo and inherited variation, can still be included and quantified directly for each individual. Finally, simulation studies show that this model has high true positive and low false positive rates in the detection of de novo mutation.
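    The core idea of assigning integers probabilistically to continuous intensity measures can be sketched as a simple Gaussian-mixture posterior; the cluster means, spread, and flat priors below are illustrative assumptions, not the paper's hierarchical model:

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def copy_number_posterior(intensity, mus=(0.0, 1.0, 2.0, 3.0), sigma=0.3, priors=None):
    # Posterior P(copy number = k | intensity), assuming one Gaussian
    # intensity cluster per integer copy number (hypothetical parameters).
    k = len(mus)
    priors = priors or [1.0 / k] * k
    w = [p * normal_pdf(intensity, m, sigma) for p, m in zip(priors, mus)]
    total = sum(w)
    return [x / total for x in w]

post = copy_number_posterior(1.98)  # an intensity near the 2-copy cluster
```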

  3. Meta-cognitive student reflections

    NASA Astrophysics Data System (ADS)

    Barquist, Britt; Stewart, Jim

    2009-05-01

    We have recently concluded a project testing the effectiveness of a weekly assignment designed to encourage awareness and improvement of meta-cognitive skills. The project is based on the idea that successful problem solvers implement a meta-cognitive process in which they identify the specific concept they are struggling with, and then identify what they understand, what they don't understand, and what they need to know in order to resolve their problem. The assignment required the students to write an email assessing the level of completion of a weekly workbook assignment and to examine in detail their experiences regarding a specific topic they struggled with. The assignment guidelines were designed to coach them through this meta-cognitive process. We responded to most emails with advice for next week's assignment. Our data follow 12 students through a quarter consisting of 11 email assignments which were scored using a rubric based on the assignment guidelines. We found no correlation between rubric scores and final grades. We do have anecdotal evidence that the assignment was beneficial.

  4. A modeling of dynamic storage assignment for order picking in beverage warehousing with Drive-in Rack system

    NASA Astrophysics Data System (ADS)

    Hadi, M. Z.; Djatna, T.; Sugiarto

    2018-04-01

    This paper develops a dynamic storage assignment model to solve the storage assignment problem (SAP) for beverage order picking in a drive-in rack warehousing system, determining the appropriate storage location and space for each beverage product dynamically so that the performance of the system can be improved. This study constructs a graph model to represent drive-in rack storage positions, then combines association rule mining, class-based storage policies, and an arrangement rule algorithm to determine an appropriate storage location and arrangement of the products according to dynamic orders from customers. The performance of the proposed model is measured as rule adjacency accuracy, travel distance (for the picking process), and the probability that a product expires, using a Last Come First Serve (LCFS) queue approach. Finally, the proposed model is implemented through computer simulation and its performance is compared with that of other storage assignment methods. The results indicate that the proposed model outperforms the other storage assignment methods.
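    In its simplest form, the class-based storage policy component reduces to giving the fastest-moving products the closest slots. A toy sketch with hypothetical picking frequencies and slot distances:

```python
def class_based_assignment(pick_freq, slot_dist):
    # Products sorted by descending picking frequency take the slots
    # sorted by ascending travel distance from the depot.
    products = sorted(pick_freq, key=pick_freq.get, reverse=True)
    slots = sorted(slot_dist, key=slot_dist.get)
    return dict(zip(products, slots))

# Hypothetical data: frequencies from order history, distances in metres.
assignment = class_based_assignment(
    {"cola": 50, "water": 20, "juice": 5},
    {"slot1": 1.0, "slot2": 2.5, "slot3": 4.0},
)
```

    The paper's model goes further by re-running the assignment as orders arrive and by adding drive-in-rack arrangement rules, but the frequency-to-distance pairing above is the baseline it improves on.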

  5. Training for the Future. How Can Trainees Meet Current and Future Needs of Industry? Guidelines and Models for the Development of Interdisciplinary Assignments Based on the Concept of Key Technologies.

    ERIC Educational Resources Information Center

    Bolton, William; Clyde, Albert

    This document provides guidelines for the development of interdisciplinary assignments to help prepare learners for the developing needs of industry; it also contains a collection of model assignments produced by 12 British colleges. An introduction explains how to use the document and offers a checklist for the development of interdisciplinary…

  6. Preparation of Term Papers Based upon a Research-Process Model.

    ERIC Educational Resources Information Center

    Feldmann, Rodney Mansfield; Schloman, Barbara Frick

    1990-01-01

    Described is an alternative method of term paper preparation which provides a step-by-step sequence of assignments and provides feedback to the students at all stages in the preparation of the report. An example of this model is provided including 13 sequential assignments. (CW)

  7. Post-processing techniques to enhance reliability of assignment algorithm based performance measures : [technical summary].

    DOT National Transportation Integrated Search

    2011-01-01

    Travel demand modeling plays a key role in the transportation system planning and evaluation process. The four-step sequential travel demand model is the most widely used technique in practice. Traffic assignment is the key step in the conventional f...

  8. Educational Strategies for Learning to Learn from Role Models.

    ERIC Educational Resources Information Center

    Williams, Martha

    The way that socialization, via role modeling, can be enhanced in professional education is discussed, and 10 class assignments are used to illustrate teaching methods for enhancing role modeling, based on a course on women in administration at The University of Texas at Austin. Among the objectives of the course assignments are the following: to…

  9. Joint Inference of Population Assignment and Demographic History

    PubMed Central

    Choi, Sang Chul; Hey, Jody

    2011-01-01

    A new approach to assigning individuals to populations using genetic data is described. Most existing methods work by maximizing Hardy–Weinberg and linkage equilibrium within populations, neither of which will apply for many demographic histories. By including a demographic model, within a likelihood framework based on coalescent theory, we can jointly study demographic history and population assignment. Genealogies and population assignments are sampled from a posterior distribution using a general isolation-with-migration model for multiple populations. A measure of partition distance between assignments facilitates not only the summary of a posterior sample of assignments, but also the estimation of the posterior density for the demographic history. It is shown that joint estimates of assignment and demographic history are possible, including estimation of population phylogeny for samples from three populations. The new method is compared to results of a widely used assignment method, using simulated and published empirical data sets. PMID:21775468

  10. RMS Cost Model User’s Manual

    DTIC Science & Technology

    1975-09-01


  11. Systematic assignment of thermodynamic constraints in metabolic network models

    PubMed Central

    Kümmel, Anne; Panke, Sven; Heinemann, Matthias

    2006-01-01

    Background The availability of genome sequences for many organisms enabled the reconstruction of several genome-scale metabolic network models. Currently, significant efforts are put into the automated reconstruction of such models. For this, several computational tools have been developed that particularly assist in identifying and compiling the organism-specific lists of metabolic reactions. In contrast, the last step of the model reconstruction process, which is the definition of the thermodynamic constraints in terms of reaction directionalities, still needs to be done manually. No computational method exists that allows for an automated and systematic assignment of reaction directions in genome-scale models. Results We present an algorithm that – based on thermodynamics, network topology and heuristic rules – automatically assigns reaction directions in metabolic models such that the reaction network is thermodynamically feasible with respect to the production of energy equivalents. It first exploits all available experimentally derived Gibbs energies of formation to identify irreversible reactions. As these thermodynamic data are not available for all metabolites, in a next step, further reaction directions are assigned on the basis of network topology considerations and thermodynamics-based heuristic rules. Briefly, the algorithm identifies reaction subsets from the metabolic network that are able to convert low-energy co-substrates into their high-energy counterparts and thus net produce energy. Our algorithm aims at disabling such thermodynamically infeasible cyclic operation of reaction subnetworks by assigning reaction directions based on a set of thermodynamics-derived heuristic rules. We demonstrate our algorithm on a genome-scale metabolic model of E. coli. 
The introduced systematic direction assignment yielded 130 irreversible reactions (out of 920 total reactions), which corresponds to about 70% of all irreversible reactions that are required to disable thermodynamically infeasible energy production. Conclusion Although not being fully comprehensive, our algorithm for systematic reaction direction assignment could define a significant number of irreversible reactions automatically with low computational effort. We envision that the presented algorithm is a valuable part of a computational framework that assists the automated reconstruction of genome-scale metabolic models. PMID:17123434
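    The first stage of the algorithm, fixing directions from experimentally derived Gibbs energies, can be sketched as follows; the formation energies, abstract metabolite names, and the 30 kJ/mol cutoff are illustrative assumptions, not the paper's data:

```python
# Hypothetical Gibbs energies of formation (kJ/mol) for abstract metabolites.
dGf = {"A": -100.0, "B": -180.0, "C": -110.0}

def reaction_dG(stoich):
    # stoich: metabolite -> signed stoichiometric coefficient
    # (negative = substrate, positive = product).
    return sum(coef * dGf[m] for m, coef in stoich.items())

def assign_direction(stoich, threshold=30.0):
    # Reactions far from equilibrium are fixed as irreversible;
    # the rest stay reversible pending topology-based heuristics.
    dG = reaction_dG(stoich)
    if dG < -threshold:
        return "irreversible_forward"
    if dG > threshold:
        return "irreversible_backward"
    return "reversible"
```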

  12. A web-based portfolio model as the students' final assignment: Dealing with the development of higher education trend

    NASA Astrophysics Data System (ADS)

    Utanto, Yuli; Widhanarto, Ghanis Putra; Maretta, Yoris Adi

    2017-03-01

    This study aims to develop a web-based portfolio model. The model developed in this study reveals the effectiveness of the new model in experiments conducted with research respondents in the Department of Curriculum and Educational Technology, FIP Unnes. In particular, the research objectives to be achieved through this development research are: (1) describing the process of implementing a portfolio in a web-based model; (2) assessing the effectiveness of the web-based portfolio model for the final assignment, especially in Web-Based Learning courses. This is development research in the sense of Borg and Gall (2008: 589): "educational research and development (R & D) is a process used to develop and validate educational production". The research and development sequence started with exploration and conceptual studies, followed by testing and evaluation, and then implementation. For the data analysis, the techniques used were simple descriptive analysis and analysis of learning completeness, followed by prerequisite tests for normality and homogeneity before performing a t-test. Based on the data analysis, it was concluded that: (1) a web-based portfolio model can be applied to the learning process in higher education; (2) the web-based portfolio model is effective: in the field trial with the large group of respondents, the number of respondents who reached mastery learning (a score of 60 and above) was 24 (92.3%), indicating that the web-based portfolio model is effective. The implication of this development research is that subsequent researchers are expected to be able to use the development model from this research on other subjects.

  13. Validation of a semi-quantitative job exposure matrix at a Söderberg aluminum smelter.

    PubMed

    Friesen, M C; Demers, P A; Spinelli, J J; Le, N D

    2003-08-01

    We tested the validity of a job exposure matrix (JEM) for coal tar pitch volatiles (CTPV) at a Söderberg aluminum smelter. The JEM had been developed by a committee of company hygienists and union representatives for an earlier study of cancer incidence and mortality. Our aim was to test the validity and reliability of the expert-based assignments. Personal CTPV exposure measurements (n = 1879) overlapped 11 yr of the JEM. The arithmetic mean was calculated for 35 job/time period combinations (35% of the exposed work history), categorized using the original exposure intervals, and compared with the expert-based assignments. The expert-based and the measurement-based exposure assignments were only moderately correlated (Spearman's rho = 0.42; weighted kappa = 0.39, CI 0.10-0.69). Only 40% of the expert-based medium category assignments were correctly assigned, with better agreement in the low (84%) and high (100%) categories. Pot operation jobs exhibited better agreement (rho = 0.60) than the maintenance and pot shell repair jobs (rho = 0.25). The mid-point value of the medium category was overestimated by 0.3 mg/m(3). The expert-based exposure assignments may be improved by better characterizing the transitions between exposure categories, by accounting for exposure differences between pot lines and by re-examining the category mid-point values used in calculating the cumulative exposure. Lack of historical exposure measurements often requires reliance on expert knowledge to assess exposure levels. Validating the experts' estimates against available exposure measurements may help to identify weaknesses in the exposure assessment where improvements may be possible, as was shown here.
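    The agreement statistic reported above (weighted kappa = 0.39) penalizes disagreements by how many ordered exposure categories apart the two ratings fall. A pure-Python sketch of weighted kappa; the paper does not state the weighting scheme, so linear weights are an assumption:

```python
def weighted_kappa(a, b, k):
    # a, b: lists of ordinal category codes 0..k-1 from two rating sources
    # (here: expert-based vs measurement-based exposure categories).
    n = len(a)
    obs = [[0] * k for _ in range(k)]
    for x, y in zip(a, b):
        obs[x][y] += 1
    pa = [sum(obs[i][j] for j in range(k)) / n for i in range(k)]  # rater 1 marginals
    pb = [sum(obs[i][j] for i in range(k)) / n for j in range(k)]  # rater 2 marginals
    w = lambda i, j: abs(i - j) / (k - 1)  # linear disagreement weights
    d_obs = sum(w(i, j) * obs[i][j] / n for i in range(k) for j in range(k))
    d_exp = sum(w(i, j) * pa[i] * pb[j] for i in range(k) for j in range(k))
    return 1 - d_obs / d_exp
```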

  14. Fluidity models in ancient Greece and current practices of sex assignment

    PubMed Central

    Chen, Min-Jye; McCann-Crosby, Bonnie; Gunn, Sheila; Georgiadis, Paraskevi; Placencia, Frank; Mann, David; Axelrad, Marni; Karaviti, L.P; McCullough, Laurence B.

    2018-01-01

    Disorders of sexual differentiation such as androgen insensitivity and gonadal dysgenesis can involve an intrinsic fluidity at different levels, from the anatomical and biological to the social (gender) that must be considered in the context of social constraints. Sex assignment models based on George Engel’s biopsychosocial aspects model of biology accept fluidity of gender as a central concept and therefore help establish expectations within the uncertainty of sex assignment and anticipate potential changes. The biology underlying the fluidity inherent to these disorders should be presented to parents at diagnosis, an approach that the gender medicine field should embrace as good practice. Greek mythology provides many accepted archetypes of change, and the ancient Greek appreciation of metamorphosis can be used as context with these patients. Our goal is to inform expertise and optimal approaches, knowing that this fluidity may eventually necessitate sex reassignment. Physicians should provide sex assignment education based on different components of sexual differentiation, prepare parents for future hormone-triggered changes in their children, and establish a sex-assignment algorithm. PMID:28478088

  15. Fluidity models in ancient Greece and current practices of sex assignment.

    PubMed

    Chen, Min-Jye; McCann-Crosby, Bonnie; Gunn, Sheila; Georgiadis, Paraskevi; Placencia, Frank; Mann, David; Axelrad, Marni; Karaviti, L P; McCullough, Laurence B

    2017-06-01

    Disorders of sexual differentiation such as androgen insensitivity and gonadal dysgenesis can involve an intrinsic fluidity at different levels, from the anatomical and biological to the social (gender) that must be considered in the context of social constraints. Sex assignment models based on George Engel's biopsychosocial aspects model of biology accept fluidity of gender as a central concept and therefore help establish expectations within the uncertainty of sex assignment and anticipate potential changes. The biology underlying the fluidity inherent to these disorders should be presented to parents at diagnosis, an approach that the gender medicine field should embrace as good practice. Greek mythology provides many accepted archetypes of change, and the ancient Greek appreciation of metamorphosis can be used as context with these patients. Our goal is to inform expertise and optimal approaches, knowing that this fluidity may eventually necessitate sex reassignment. Physicians should provide sex assignment education based on different components of sexual differentiation, prepare parents for future hormone-triggered changes in their children, and establish a sex-assignment algorithm. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Genetic tests for estimating dairy breed proportion and parentage assignment in East African crossbred cattle.

    PubMed

    Strucken, Eva M; Al-Mamun, Hawlader A; Esquivelzeta-Rabell, Cecilia; Gondro, Cedric; Mwai, Okeyo A; Gibson, John P

    2017-09-12

    Smallholder dairy farming in much of the developing world is based on the use of crossbred cows that combine local adaptation traits of indigenous breeds with high milk yield potential of exotic dairy breeds. Pedigree recording is rare in such systems which means that it is impossible to make informed breeding decisions. High-density single nucleotide polymorphism (SNP) assays allow accurate estimation of breed composition and parentage assignment but are too expensive for routine application. Our aim was to determine the level of accuracy achieved with low-density SNP assays. We constructed subsets of 100 to 1500 SNPs from the 735k-SNP Illumina panel by selecting: (a) on high minor allele frequencies (MAF) in a crossbred population; (b) on large differences in allele frequency between ancestral breeds; (c) at random; or (d) with a differential evolution algorithm. These panels were tested on a dataset of 1933 crossbred dairy cattle from Kenya/Uganda and on crossbred populations from Ethiopia (N = 545) and Tanzania (N = 462). Dairy breed proportions were estimated by using the ADMIXTURE program, a regression approach, and SNP-best linear unbiased prediction, and tested against estimates obtained by ADMIXTURE based on the 735k-SNP panel. Performance for parentage assignment was based on opposing homozygotes which were used to calculate the separation value (sv) between true and false assignments. Panels of SNPs based on the largest differences in allele frequency between European dairy breeds and a combined Nelore/N'Dama population gave the best predictions of dairy breed proportion (r 2  = 0.962 to 0.994 for 100 to 1500 SNPs) with an average absolute bias of 0.026. Panels of SNPs based on the highest MAF in the crossbred population (Kenya/Uganda) gave the most accurate parentage assignments (sv = -1 to 15 for 100 to 1500 SNPs). 
Due to the different required properties of SNPs, panels that did well for breed composition did poorly for parentage assignment and vice versa. A combined panel of 400 SNPs was not able to assign parentages correctly, thus we recommend the use of 200 SNPs either for breed proportion prediction or parentage assignment, independently.
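    The parentage check via opposing homozygotes reduces to counting loci where one animal is homozygous for one allele and the other is homozygous for the alternate allele; with genotypes coded as 0/1/2 alternate-allele counts (a common convention, assumed here):

```python
def opposing_homozygotes(geno_a, geno_b):
    # Genotypes coded 0, 1, 2 = copies of the alternate allele at each SNP.
    # A true parent-offspring pair shows (almost) no opposing homozygotes,
    # while unrelated pairs show many; the separation value (sv) measures
    # the gap between those two count distributions.
    return sum(1 for a, b in zip(geno_a, geno_b) if {a, b} == {0, 2})

n_oh = opposing_homozygotes([0, 2, 1, 0, 2], [2, 0, 0, 1, 2])
```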

  17. Assessment of a novel group-centered testing schema in an upper-level undergraduate molecular biotechnology course.

    PubMed

    Srougi, Melissa C; Miller, Heather B; Witherow, D Scott; Carson, Susan

    2013-01-01

    Providing students with assignments that focus on critical thinking is an important part of their scientific and intellectual development. However, as class sizes increase, so does the grading burden, prohibiting many faculty from incorporating critical thinking assignments in the classroom. In an effort to continue to provide our students with meaningful critical thinking exercises, we implemented a novel group-centered, problem-based testing scheme. We wanted to assess how performing critical thinking problem sets as group work compares to performing the sets as individual work, in terms of student attitudes and learning outcomes. During two semesters of our recombinant DNA course, students had the same lecture material and similar assessments. In the Fall semester, student learning was assessed by two collaborative take-home exams, followed immediately by individual, closed-book in-class exams on the same content, as well as a final cumulative exam. Student teams on the take-home exams were instructor-assigned, and each team turned in one collaborative exam. In the Spring semester, the control group of students were required to turn in their own individual take-home exams, followed by the in-class exams and final cumulative exam. For the majority of students, learning outcomes were met, regardless of whether they worked in teams. In addition, collaborative learning was favorably received by students and grading was reduced for instructors. These data suggest that group-centered, problem-based learning is a useful model for achievement of student learning outcomes in courses where it would be infeasible to provide feedback on individual critical thinking assignments due to grading volume. Copyright © 2013 Wiley Periodicals, Inc.

  18. The effect of reflective writing interventions on the critical thinking skills and dispositions of baccalaureate nursing students.

    PubMed

    Naber, Jessica; Wyatt, Tami H

    2014-01-01

The importance of critical thinking is well-documented by the American Association of Colleges of Nursing and the National League for Nursing. Reflective writing is often used to increase understanding and analytical ability. The lack of empirical evidence about the effect of reflective writing interventions on critical thinking supports the examination of this concept. This study used an experimental, pretest-posttest design. The setting was two schools of nursing at universities in the southern United States. The convenience sample included 70 fourth-semester students in baccalaureate nursing programs. Randomly assigned control and experimental groups completed the California Critical Thinking Skills Test (CCTST) and the California Critical Thinking Dispositions Inventory Test (CCTDI). The experimental group completed six reflective writing assignments. Both groups completed the two tests again. Results showed that the experimental group had a significant increase (p = 0.03) on the truthseeking subscale of the CCTDI when compared to the control group. The experimental group's scores increased on four CCTST subscales and were higher than the control group's on three CCTST subscales. The results of this study make it imperative for nursing schools to consider including reflective writing, especially assignments based on Paul's (1993) model, in nursing courses. If future studies, testing over longer periods of time, show significant increases in critical thinking, those interventions could be incorporated into nursing curricula and change the way nurse educators evaluate students. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. More reliable inference for the dissimilarity index of segregation

    PubMed Central

    Allen, Rebecca; Burgess, Simon; Davidson, Russell; Windmeijer, Frank

    2015-01-01

The most widely used measure of segregation is the so-called dissimilarity index. It is now well understood that this measure also reflects randomness in the allocation of individuals to units (i.e. it measures deviations from evenness, not deviations from randomness). This leads to potentially large values of the segregation index when unit sizes and/or minority proportions are small, even if there is no underlying systematic segregation. Our response to this is to produce adjustments to the index, based on an underlying statistical model. We specify the assignment problem in a very general way, with differences in conditional assignment probabilities underlying the resulting segregation. From this, we derive a likelihood ratio test for the presence of any systematic segregation, and bias adjustments to the dissimilarity index. We further develop the asymptotic distribution theory for testing hypotheses concerning the magnitude of the segregation index and show that the use of bootstrap methods can improve the size and power properties of test procedures considerably. We illustrate these methods by comparing dissimilarity indices across school districts in England to measure social segregation. PMID:27774035
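    The index itself is simple to compute; the paper's contribution is the inference around it. Below is a minimal sketch of the classic, unadjusted dissimilarity index, D = 0.5 * Σ|m_i/M - n_i/N|, with hypothetical unit counts; the bias adjustments, likelihood ratio test and bootstrap procedures proposed in the paper are not reproduced here.

```python
# Classic dissimilarity index of segregation: D = 0.5 * sum_i |m_i/M - n_i/N|,
# where m_i and n_i are the minority- and majority-group counts in unit i.
# Hypothetical counts; the paper's bias adjustments are not implemented.

def dissimilarity_index(minority, majority):
    M = sum(minority)
    N = sum(majority)
    if M == 0 or N == 0:
        raise ValueError("both groups must have at least one member")
    return 0.5 * sum(abs(m / M - n / N) for m, n in zip(minority, majority))

# A perfectly even allocation across units gives D = 0;
# complete separation gives D = 1.
even = dissimilarity_index([10, 10], [30, 30])
separated = dissimilarity_index([20, 0], [0, 60])
```

    Note that with small units, even a random allocation yields D well above zero, which is exactly the bias the paper addresses.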

  20. Using parentage analysis to examine gene flow and spatial genetic structure.

    PubMed

    Kane, Nolan C; King, Matthew G

    2009-04-01

    Numerous approaches have been developed to examine recent and historical gene flow between populations, but few studies have used empirical data sets to compare different approaches. Some methods are expected to perform better under particular scenarios, such as high or low gene flow, but this, too, has rarely been tested. In this issue of Molecular Ecology, Saenz-Agudelo et al. (2009) apply assignment tests and parentage analysis to microsatellite data from five geographically proximal (2-6 km) and one much more distant (1500 km) panda clownfish populations, showing that parentage analysis performed better in situations of high gene flow, while their assignment tests did better with low gene flow. This unusually complete data set is comprised of multiple exhaustively sampled populations, including nearly all adults and large numbers of juveniles, enabling the authors to ask questions that in many systems would be impossible to answer. Their results emphasize the importance of selecting the right analysis to use, based on the underlying model and how well its assumptions are met by the populations to be analysed.

  1. Pain sensitivity and torque used during measurement predicts change in range of motion at the knee.

    PubMed

    Bishop, Mark D; George, Steven Z

    2017-01-01

To determine the extent to which changes in knee range of motion (ROM) after a stretching program are related to sensory factors at the time of testing and the amount of force used during the measurement of ROM, rather than changes in soft-tissue properties. Randomized, single-blind design. Participants were randomly assigned to a control or stretching group. Research laboratory. Forty-four healthy volunteers (22.8±2.8 years of age; 23 men). The stretching group undertook static stretching twice a day for 8 weeks. The control group continued with routine activity, but was discouraged from starting a flexibility program. ROM and tissue extensibility were assessed using a Biodex3 dynamometer, and ratings of thermal pain were collected at baseline and at 4 and 8 weeks by an examiner blinded to group assignment. Multilevel modeling was used to examine predictors of ROM across time. The stretching group showed a 6% increase, and the control group had a 2% increase, in ROM over the 8-week program. However, when fixed and random effects were tested in a complete model, the group assignment was not significant. End-point torque during ROM testing (p = 0.021) and the ratings in response to thermal testing (p < 0.001) were significant, however. ROM measured in a testing session was not predicted by assignment to a stretching program. Rather, ROM was predicted by the ratings of thermal stimuli and the peak torque used to apply the stretch.

  2. Progress toward the determination of correct classification rates in fire debris analysis.

    PubMed

    Waddell, Erin E; Song, Emma T; Rinke, Caitlin N; Williams, Mary R; Sigman, Michael E

    2013-07-01

    Principal components analysis (PCA), linear discriminant analysis (LDA), and quadratic discriminant analysis (QDA) were used to develop a multistep classification procedure for determining the presence of ignitable liquid residue in fire debris and assigning any ignitable liquid residue present into the classes defined under the American Society for Testing and Materials (ASTM) E 1618-10 standard method. A multistep classification procedure was tested by cross-validation based on model data sets comprised of the time-averaged mass spectra (also referred to as total ion spectra) of commercial ignitable liquids and pyrolysis products from common building materials and household furnishings (referred to simply as substrates). Fire debris samples from laboratory-scale and field test burns were also used to test the model. The optimal model's true-positive rate was 81.3% for cross-validation samples and 70.9% for fire debris samples. The false-positive rate was 9.9% for cross-validation samples and 8.9% for fire debris samples. © 2013 American Academy of Forensic Sciences.

  3. Students' Engagement in Collaborative Knowledge Construction in Group Assignments for Information Literacy

    ERIC Educational Resources Information Center

    Sormunen, Eero; Tanni, Mikko; Heinström, Jannica

    2013-01-01

Introduction: Information literacy instruction is often undertaken in schools as collaborative source-based writing assignments. This paper presents the findings of a study on collaboration in two school assignments designed for information literacy. Method: The study draws on the models of cooperative and collaborative learning and the task-based…

  4. A discrete decentralized variable structure robotic controller

    NASA Technical Reports Server (NTRS)

    Tumeh, Zuheir S.

    1989-01-01

A decentralized trajectory controller for robotic manipulators is designed and tested using a multiprocessor architecture and a PUMA 560 robot arm. The controller is made up of a nominal model-based component and a correction component based on a variable structure suction control approach. The second control component is designed using bounds on the difference between the used and actual values of the model parameters. Since the continuous manipulator system is digitally controlled along a trajectory, a discretized equivalent model of the manipulator is used to derive the controller. The motivation for decentralized control is that the derived algorithms can be executed in parallel using a distributed, relatively inexpensive, architecture where each joint is assigned a microprocessor. Nonlinear interaction and coupling between joints are treated as a disturbance torque that is estimated and compensated for.

  5. The Learning Journal Bridge: From Classroom Concepts to Leadership Practices

    ERIC Educational Resources Information Center

    Maellaro, Rosemary

    2013-01-01

    The value of reflective writing assignments as learning tools for business students has been well-established. While the management education literature includes numerous examples of such assignments that are based on Kolb's (1984) experiential learning model, many of them engage only the first two phases of the model. When students do not move…

  6. An Improved SoC Test Scheduling Method Based on Simulated Annealing Algorithm

    NASA Astrophysics Data System (ADS)

    Zheng, Jingjing; Shen, Zhihang; Gao, Huaien; Chen, Bianna; Zheng, Weida; Xiong, Xiaoming

    2017-02-01

In this paper, we propose an improved SoC test scheduling method based on the simulated annealing algorithm (SA). We first perturb the IP core assignment for each TAM to produce a new candidate solution, allocate the TAM width for each TAM using a greedy algorithm, and calculate the corresponding testing time. The core assignment is then accepted or rejected according to the simulated annealing criterion, until the optimum solution is attained. We ran the test scheduling experiment on the international reference circuits provided by the International Test Conference 2002 (ITC'02) benchmarks, and the results show that our algorithm is superior to the conventional integer linear programming algorithm (ILP), simulated annealing algorithm (SA) and genetic algorithm (GA). When the TAM width reaches 48, 56 and 64, the testing time based on our algorithm is less than that of the classic methods, with optimization rates of 30.74%, 3.32% and 16.13%, respectively. Moreover, the testing time based on our algorithm is very close to that of the improved genetic algorithm (IGA), which is the current state of the art.

  7. Future aircraft networks and schedules

    NASA Astrophysics Data System (ADS)

    Shu, Yan

    2011-07-01

    Because of the importance of air transportation scheduling, the emergence of small aircraft and the vision of future fuel-efficient aircraft, this thesis has focused on the study of aircraft scheduling and network design involving multiple types of aircraft and flight services. It develops models and solution algorithms for the schedule design problem and analyzes the computational results. First, based on the current development of small aircraft and on-demand flight services, this thesis expands a business model for integrating on-demand flight services with the traditional scheduled flight services. This thesis proposes a three-step approach to the design of aircraft schedules and networks from scratch under the model. In the first step, both a frequency assignment model for scheduled flights that incorporates a passenger path choice model and a frequency assignment model for on-demand flights that incorporates a passenger mode choice model are created. In the second step, a rough fleet assignment model that determines a set of flight legs, each of which is assigned an aircraft type and a rough departure time is constructed. In the third step, a timetable model that determines an exact departure time for each flight leg is developed. Based on the models proposed in the three steps, this thesis creates schedule design instances that involve almost all the major airports and markets in the United States. The instances of the frequency assignment model created in this thesis are large-scale non-convex mixed-integer programming problems, and this dissertation develops an overall network structure and proposes iterative algorithms for solving these instances. The instances of both the rough fleet assignment model and the timetable model created in this thesis are large-scale mixed-integer programming problems, and this dissertation develops subproblem schemes for solving these instances. 
Based on these solution algorithms, this dissertation also presents computational results of these large-scale instances. To validate the models and solution algorithms developed, this thesis also compares the daily flight schedules that it designs with the schedules of the existing airlines. Furthermore, it creates instances that represent different economic and fuel-price conditions and derives schedules under these different conditions. In addition, it discusses the implication of using new aircraft in the future flight schedules. Finally, future research in three areas (model, computational method, and simulation for validation) is proposed.

  8. Capacity-constrained traffic assignment in networks with residual queues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lam, W.H.K.; Zhang, Y.

    2000-04-01

This paper proposes a capacity-constrained traffic assignment model for strategic transport planning in which the steady-state user equilibrium principle is extended for road networks with residual queues. Therefore, the road-exit capacity and the queuing effects can be incorporated into the strategic transport model for traffic forecasting. The proposed model is applicable to the congested network particularly when the traffic demand exceeds the capacity of the network during the peak period. An efficient solution method is proposed for solving the steady-state traffic assignment problem with residual queues. Then a simple numerical example is employed to demonstrate the application of the proposed model and solution method, while an example of a medium-sized arterial highway network in Sioux Falls, South Dakota, is used to test the applicability of the proposed solution to real problems.
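    For readers unfamiliar with equilibrium assignment, here is a minimal two-route sketch solved by the method of successive averages (MSA) with the standard BPR link travel-time function. The link data and demand are hypothetical, and the paper's residual-queue and road-exit-capacity extensions are not reproduced.

```python
# Two-route user-equilibrium assignment via the method of successive
# averages (MSA). Hypothetical link data; no residual-queue modeling.

def bpr(flow, free_time, capacity, alpha=0.15, beta=4):
    """Bureau of Public Roads link travel-time function."""
    return free_time * (1 + alpha * (flow / capacity) ** beta)

def assign(demand, links, iterations=200):
    flows = [demand / len(links)] * len(links)
    for n in range(1, iterations + 1):
        times = [bpr(f, t0, cap) for f, (t0, cap) in zip(flows, links)]
        shortest = times.index(min(times))
        # All-or-nothing assignment to the currently shortest route...
        target = [demand if i == shortest else 0.0 for i in range(len(links))]
        # ...blended in with a step size of 1/n (the MSA step).
        flows = [f + (aon - f) / n for f, aon in zip(flows, target)]
    return flows

links = [(10.0, 600.0), (15.0, 900.0)]   # (free-flow time, capacity) per route
flows = assign(1000.0, links)
times = [bpr(f, t0, cap) for f, (t0, cap) in zip(flows, links)]
# At user equilibrium, all used routes have (approximately) equal travel times.
```

    The capacity-constrained formulation in the paper differs in that flow on a link cannot exceed its exit capacity, with the excess held in residual queues; the sketch above only penalizes overload through the BPR function.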

  9. A Self-Instructional Approach To the Teaching of Enzymology Involving Computer-Based Sequence Analysis and Molecular Modelling.

    ERIC Educational Resources Information Center

    Attwood, Paul V.

    1997-01-01

    Describes a self-instructional assignment approach to the teaching of advanced enzymology. Presents an assignment that offers a means of teaching enzymology to students that exposes them to modern computer-based techniques of analyzing protein structure and relates structure to enzyme function. (JRH)

  10. Dynamic route and departure time choice model based on self-adaptive reference point and reinforcement learning

    NASA Astrophysics Data System (ADS)

    Li, Xue-yan; Li, Xue-mei; Yang, Lingrun; Li, Jing

    2018-07-01

Most of the previous studies on dynamic traffic assignment are based on a traditional analytical framework; for instance, the idea of Dynamic User Equilibrium has been widely used in depicting both the route choice and the departure time choice. However, some recent studies have demonstrated that dynamic traffic flow assignment largely depends on travelers' degree of rationality, travelers' heterogeneity and what traffic information the travelers have. In this paper, we develop a new self-adaptive multi-agent model to depict travelers' behavior in Dynamic Traffic Assignment. We use Cumulative Prospect Theory with heterogeneous reference points to illustrate travelers' bounded rationality. We use a reinforcement-learning model to depict travelers' route and departure time choice behavior under the condition of imperfect information. We design the evolution rule of travelers' expected arrival time and the algorithm of traffic flow assignment. Compared with the traditional model, the self-adaptive multi-agent model proposed in this paper can effectively help travelers avoid the rush hour. Finally, we report and analyze the effect of travelers' group behavior on the transportation system, and give some insights into the relation between travelers' group behavior and the performance of the transportation system.
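    The bounded-rationality component rests on the Cumulative Prospect Theory value and probability-weighting functions. Below is a minimal sketch using the Tversky-Kahneman (1992) functional forms and their published parameter estimates; the paper's self-adaptive reference points and reinforcement-learning dynamics are not reproduced.

```python
# Cumulative Prospect Theory building blocks (Tversky & Kahneman, 1992).
# Outcomes are deviations from a reference point (e.g., minutes early/late
# relative to a traveler's expected arrival time).

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Value function: concave for gains, convex and steeper for losses
    (loss aversion via lam > 1)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma=0.61):
    """Inverse-S probability weighting: small probabilities are
    overweighted, moderate-to-large ones underweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# A 10-minute-late arrival hurts more than a 10-minute-early one helps:
gain, loss = value(10.0), value(-10.0)
overweighted = weight(0.05)   # > 0.05: rare delays loom large in choice
```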

  11. A nonparametric clustering technique which estimates the number of clusters

    NASA Technical Reports Server (NTRS)

    Ramey, D. B.

    1983-01-01

    In applications of cluster analysis, one usually needs to determine the number of clusters, K, and the assignment of observations to each cluster. A clustering technique based on recursive application of a multivariate test of bimodality which automatically estimates both K and the cluster assignments is presented.
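    A one-dimensional toy sketch of the recursive idea: split a cluster only when a bimodality check fires, so that K emerges from the data rather than being fixed in advance. The gap-ratio heuristic below is a hypothetical stand-in for the multivariate bimodality test used in the paper.

```python
# Recursive clustering that estimates K: split a sorted 1-D cluster at its
# widest gap whenever that gap dominates the typical spacing, then recurse.
# The gap-ratio rule is an illustrative stand-in for a bimodality test.

def split_if_bimodal(points, ratio=3.0):
    """Return (left, right) if the widest gap is >= ratio times the median
    gap; otherwise None (the cluster is treated as unimodal)."""
    pts = sorted(points)
    if len(pts) < 4:
        return None
    gaps = [b - a for a, b in zip(pts, pts[1:])]
    widest = max(range(len(gaps)), key=gaps.__getitem__)
    typical = sorted(gaps)[len(gaps) // 2]        # median gap
    if typical > 0 and gaps[widest] / typical >= ratio:
        return pts[: widest + 1], pts[widest + 1 :]
    return None

def cluster(points):
    parts = split_if_bimodal(points)
    if parts is None:
        return [sorted(points)]
    return cluster(parts[0]) + cluster(parts[1])

data = [1.0, 1.2, 1.1, 5.0, 5.2, 5.1, 9.4, 9.5, 9.3]
clusters = cluster(data)   # K is estimated from the data, not supplied
```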

  12. Accuracy of pastoralists' memory-based kinship assignment of Ankole cattle: a microsatellite DNA analysis.

    PubMed

    Kugonza, D R; Kiwuwa, G H; Mpairwe, D; Jianlin, H; Nabasirye, M; Okeyo, A M; Hanotte, O

    2012-02-01

    This study aimed to estimate the level of relatedness within Ankole cattle herds using autosomal microsatellite markers and to assess the accuracy of relationship assignment based on farmers' memory. Eight cattle populations (four from each of two counties in Mbarara district in Uganda) were studied. Cattle in each population shared varying degrees of relatedness (first-, second- and third-degree relatives and unrelated individuals). Only memory-based kinship assignments which farmers knew with some confidence were tested in this experiment. DNA isolated from the blood of a subsample of 304 animals was analysed using 19 microsatellite markers. Average within population relatedness coefficients ranged from 0.010 ± 0.005 (Nshaara) to 0.067 ± 0.004 (Tayebwa). An exclusion probability of 99.9% was observed for both sire-offspring and dam-offspring relationships using the entire panel of 19 markers. Confidence from likelihood tests performed on 292 dyads showed that first-degree relatives were more easily correctly assigned by farmers than second-degree ones (p < 0.01), which were also easier to assign than third-degree relatives (p < 0.01). Accuracy of kinship assignment by the farmers was 91.9% ± 5.0 for dam-offspring dyads, 85.5% ± 3.4 for sire-offspring dyads, 75.6% ± 12.3 for half-sib and 60.0% ± 5.0 for grand dam-grand offspring dyads. Herd size, number of dyads assigned and length of time spent by the herder with their cattle population did not correlate with error in memorizing relationships. However, herd size strongly correlated with number of dyads assigned by the herder (r = 0.967, p < 0.001). Overall, we conclude that memorized records of pastoralists can be used to trace relationships and for pedigree reconstruction within Ankole cattle populations, but with the awareness that herd size constrains the number of kinship assignments remembered by the farmer. © 2011 Blackwell Verlag GmbH.

  13. Pediatric Online Evidence-Based Medicine Assignment Is a Novel Effective Enjoyable Undergraduate Medical Teaching Tool

    PubMed Central

    Kotb, Magd A.; Elmahdy, Hesham Nabeh; Khalifa, Nour El Deen Mahmoud; El-Deen, Mohamed Hamed Nasr; Lotfi, Mohamed Amr N.

    2015-01-01

Evidence-based medicine (EBM) is delivered through a didactic, blended learning, and mixed models. Students are supposed to construct an answerable question in PICO (patient, intervention, comparison, and outcome) framework, acquire evidence through search of literature, appraise evidence, apply it to the clinical case scenario, and assess the evidence in relation to clinical context. Yet these teaching models have limitations especially those related to group work, for example, handling uncooperative students, students who fail to contribute, students who domineer, students who have personal conflict, their impact upon progress of their groups, and inconsistent individual acquisition of required skills. At Pediatrics Department, Faculty of Medicine, Cairo University, we designed a novel undergraduate pediatric EBM assignment online system to overcome shortcomings of previous didactic method and aimed to assess its effectiveness by prospective follow-up during academic years 2012 to 2013 and 2013 to 2014. The novel web-based online interactive system was tailored to provide sequential single and group assignments for each student. Single assignment addressed a specific case scenario question, while group assignment was teamwork that addressed different questions of same case scenario. Assignment comprised scholar content and skills. We objectively analyzed students’ performance by criterion-based assessment and subjectively by anonymous student questionnaire. A total of 2879 were enrolled in 5th year Pediatrics Course consecutively, of them 2779 (96.5%) logged in and 2554 (88.7%) submitted their work. They were randomly assigned to 292 groups. A total of 2277 (89.15%) achieved ≥80% of total mark (4/5), of them 717 (28.1%) achieved a full mark. A total of 2178 (85.27%) and 2359 (92.36%) made evidence-based conclusions and recommendations in single and group assignment, respectively (P < 0.001). 
A total of 1102 (43.1%) answered the student questionnaire, of them 898 (81.48%) found the e-educational experience satisfactory, 175 (15.88%) disagreed, and 29 (2.6%) could not decide. A total of 964 (87.47%) found single assignment educational, 913 (82.84%) found group assignment educational, and 794 (72.3%) enjoyed it. Web-based online interactive undergraduate EBM assignment was found effective in teaching medical students and assured individual student acquisition of concepts and skills of pediatric EBM. It was effective in mass education, data collection, and storage essential for system and student assessment. PMID:26200621

  14. Pediatric Online Evidence-Based Medicine Assignment Is a Novel Effective Enjoyable Undergraduate Medical Teaching Tool: A SQUIRE Compliant Study.

    PubMed

    Kotb, Magd A; Elmahdy, Hesham Nabeh; Khalifa, Nour El Deen Mahmoud; El-Deen, Mohamed Hamed Nasr; Lotfi, Mohamed Amr N

    2015-07-01

    Evidence-based medicine (EBM) is delivered through a didactic, blended learning, and mixed models. Students are supposed to construct an answerable question in PICO (patient, intervention, comparison, and outcome) framework, acquire evidence through search of literature, appraise evidence, apply it to the clinical case scenario, and assess the evidence in relation to clinical context. Yet these teaching models have limitations especially those related to group work, for example, handling uncooperative students, students who fail to contribute, students who domineer, students who have personal conflict, their impact upon progress of their groups, and inconsistent individual acquisition of required skills. At Pediatrics Department, Faculty of Medicine, Cairo University, we designed a novel undergraduate pediatric EBM assignment online system to overcome shortcomings of previous didactic method and aimed to assess its effectiveness by prospective follow-up during academic years 2012 to 2013 and 2013 to 2014. The novel web-based online interactive system was tailored to provide sequential single and group assignments for each student. Single assignment addressed a specific case scenario question, while group assignment was teamwork that addressed different questions of same case scenario. Assignment comprised scholar content and skills. We objectively analyzed students' performance by criterion-based assessment and subjectively by anonymous student questionnaire. A total of 2879 were enrolled in 5th year Pediatrics Course consecutively, of them 2779 (96.5%) logged in and 2554 (88.7%) submitted their work. They were randomly assigned to 292 groups. A total of 2277 (89.15%) achieved ≥ 80% of total mark (4/5), of them 717 (28.1%) achieved a full mark. A total of 2178 (85.27%) and 2359 (92.36%) made evidence-based conclusions and recommendations in single and group assignment, respectively (P < 0.001). 
A total of 1102 (43.1%) answered the student questionnaire, of them 898 (81.48%) found the e-educational experience satisfactory, 175 (15.88%) disagreed, and 29 (2.6%) could not decide. A total of 964 (87.47%) found single assignment educational, 913 (82.84%) found group assignment educational, and 794 (72.3%) enjoyed it. Web-based online interactive undergraduate EBM assignment was found effective in teaching medical students and assured individual student acquisition of concepts and skills of pediatric EBM. It was effective in mass education, data collection, and storage essential for system and student assessment.

  15. The effectiveness of advance organiser model on students' academic achievement in learning work and energy

    NASA Astrophysics Data System (ADS)

    Gidena, Asay; Gebeyehu, Desta

    2017-11-01

The purpose of this study was to investigate the effectiveness of the advance organiser model (AOM) on students' academic achievement in learning work and energy. The study used a quasi-experimental pretest-posttest nonequivalent control-group design. The total population of the study was 139 students in three sections of Endabaguna preparatory school in Tigray Region, Ethiopia. Two sections with equivalent means on the pretest were purposively selected to participate in the study; one section was randomly assigned as the experimental group and the other as the control group. The experimental group was taught using a lesson plan based on the AOM, and the control group was taught using a lesson plan based on the conventional teaching method. The pretest and posttest were administered before and after the treatment, respectively. An independent-samples t-test was used to analyse the data at the 0.05 probability level. The findings of the study showed that the AOM was more effective than the conventional teaching method, with an effect size of 0.49. The model was also effective for teaching both male and female students and for the understanding and application objectives. However, both methods were equally effective in teaching work and energy at the knowledge level.
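    A minimal sketch of the effect-size calculation behind a figure like the reported 0.49: Cohen's d with a pooled standard deviation for two independent groups. The score lists below are hypothetical, not the study's data.

```python
import math
from statistics import mean, stdev

# Cohen's d for two independent groups, using the pooled sample standard
# deviation. The posttest scores below are made up for illustration.

def cohens_d(group1, group2):
    n1, n2 = len(group1), len(group2)
    s1, s2 = stdev(group1), stdev(group2)     # sample (n-1) standard deviations
    pooled = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2))
    return (mean(group1) - mean(group2)) / pooled

experimental = [14, 16, 15, 18, 17, 15]
control = [13, 14, 15, 14, 16, 13]
d = cohens_d(experimental, control)   # positive: experimental group scored higher
```

    By the usual rule of thumb, d around 0.5 (as in the study) is a medium effect.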

  16. Model of load distribution for earth observation satellite

    NASA Astrophysics Data System (ADS)

    Tu, Shumin; Du, Min; Li, Wei

    2017-03-01

For a system with multiple types of EOS (Earth Observing Satellites), it is a vital issue in the astronautics field to ensure that each type of payload carried by the group of EOS can be used efficiently and reasonably. Currently, most research on the configuration of satellites and payloads focuses on scheduling for launched satellites. However, the assignment of payloads for un-launched satellites is little researched, although it is just as crucial as the scheduling of tasks. Moreover, current models of satellite resource scheduling lack generality. Borrowing the idea of role-based access control (RBAC) from information systems, this paper brings forward a model based on role mining in RBAC to improve the generality and foresight of the satellite-payload assignment method. In this way, the satellite-payload assignment can be mapped onto the problem of role mining. A novel method is introduced, based on the idea of biclique combination from graph theory and evolutionary algorithms from intelligent computing, to address the role-mining problem of satellite-payload assignment. Simulation experiments are performed to verify the novel method. Finally, the work of this paper is concluded.

  17. Simulating the Response of a Composite Honeycomb Energy Absorber. Part 2; Full-Scale Impact Testing

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Annett, Martin S.; Jackson, Karen E.; Polanco, Michael A.

    2012-01-01

NASA has sponsored research to evaluate an externally deployable composite honeycomb designed to attenuate loads in the event of a helicopter crash. The concept, designated the Deployable Energy Absorber (DEA), is an expandable Kevlar(Registered TradeMark) honeycomb. The DEA has a flexible hinge that allows the honeycomb to be stowed collapsed until needed during an emergency. Evaluation of the DEA began with material characterization of the Kevlar(Registered TradeMark)-129 fabric/epoxy, and ended with a full-scale crash test of a retrofitted MD-500 helicopter. During each evaluation phase, finite element models of the test articles were developed and simulations were performed using the dynamic finite element code, LS-DYNA(Registered TradeMark). The paper will focus on simulations of two full-scale impact tests involving the DEA: a mass-simulator test and a full-scale crash of an instrumented MD-500 helicopter. Isotropic (MAT24) and composite (MAT58) material models, which were assigned to DEA shell elements, were compared. Based on simulation results, the MAT58 model showed better agreement with the test data.

  18. Text Messaging: An Intervention to Increase Physical Activity among African American Participants in a Faith-Based, Competitive Weight Loss Program.

    PubMed

    McCoy, Pamela; Leggett, Sophia; Bhuiyan, Azad; Brown, David; Frye, Patricia; Williams, Bryman

    2017-03-29

African American adults are less likely to meet the recommended physical activity guidelines for aerobic and muscle-strengthening activity than Caucasian adults. The purpose of this study was to assess whether a text message intervention would increase physical activity in this population. This pilot study used a pre-/post-questionnaire non-randomized design. Participants in a faith-based weight loss competition who agreed to participate in the text messaging were assigned to the intervention group (n = 52). Participants who declined to participate in the intervention, but agreed to participate in the study, were assigned to the control group (n = 30). The text messages provided strategies for increasing physical activity and were based on constructs of the Health Belief Model and the Information-Motivation-Behavioral Skills Model. Chi-square tests determined that the intervention group participants increased exercise time by approximately eight percent (p = 0.03), while the control group's exercise time remained constant. The intervention group increased walking and running. The control group increased running. Most participants indicated that the health text messages were effective. The results of this pilot study suggest that text messaging may be an effective method for providing options for motivating individuals to increase physical activity.
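    A minimal sketch of the chi-square test of independence for a 2 x 2 table, of the kind used to compare the groups' outcomes above. The counts below are hypothetical, not the study's data.

```python
# Chi-square test of independence for a 2 x 2 contingency table.
# Rows: group (intervention / control); columns: outcome (increased / not).
# Hypothetical counts, chosen only to illustrate the computation.

def chi_square_2x2(a, b, c, d):
    """Return the chi-square statistic (1 degree of freedom) for the table
    [[a, b], [c, d]], comparing observed counts with expected counts under
    independence (row total * column total / grand total)."""
    n = a + b + c + d
    expected = [
        ((a + b) * (a + c) / n, (a + b) * (b + d) / n),
        ((c + d) * (a + c) / n, (c + d) * (b + d) / n),
    ]
    observed = [(a, b), (c, d)]
    return sum(
        (o - e) ** 2 / e
        for row_o, row_e in zip(observed, expected)
        for o, e in zip(row_o, row_e)
    )

stat = chi_square_2x2(30, 22, 10, 20)
# With df = 1 the 5% critical value is 3.841: reject independence if exceeded.
significant = stat > 3.841
```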

  19. Text Messaging: An Intervention to Increase Physical Activity among African American Participants in a Faith-Based, Competitive Weight Loss Program

    PubMed Central

    McCoy, Pamela; Leggett, Sophia; Bhuiyan, Azad; Brown, David; Frye, Patricia; Williams, Bryman

    2017-01-01

    African American adults are less likely to meet the recommended physical activity guidelines for aerobic and muscle-strengthening activity than Caucasian adults. The purpose of this study was to assess whether a text message intervention would increase physical activity in this population. This pilot study used a pre-/post-questionnaire non-randomized design. Participants in a faith-based weight loss competition who agreed to participate in the text messaging were assigned to the intervention group (n = 52). Participants who declined to participate in the intervention, but agreed to participate in the study, were assigned to the control group (n = 30). The text messages provided strategies for increasing physical activity and were based on constructs of the Health Belief Model and the Information-Motivation-Behavioral Skills Model. Chi square tests determined the intervention group participants increased exercise time by approximately eight percent (p = 0.03), while the control group’s exercise time remained constant. The intervention group increased walking and running. The control group increased running. Most participants indicated that the health text messages were effective. The results of this pilot study suggest that text messaging may be an effective method for providing options for motivating individuals to increase physical activity. PMID:28353650

  20. Within-Site Variation in Feather Stable Hydrogen Isotope (δ2Hf) Values of Boreal Songbirds: Implications for Assignment to Molt Origin.

    PubMed

    Nordell, Cameron J; Haché, Samuel; Bayne, Erin M; Sólymos, Péter; Foster, Kenneth R; Godwin, Christine M; Krikun, Richard; Pyle, Peter; Hobson, Keith A

    2016-01-01

    Understanding bird migration and dispersal is important to inform full life-cycle conservation planning. Stable hydrogen isotope ratios from feathers (δ2Hf) can be linked to amount-weighted long-term, growing season precipitation δ2H (δ2Hp) surfaces to create δ2Hf isoscapes for assignment to molt origin. However, transfer functions linking δ2Hp with δ2Hf are influenced by physiological and environmental processes. A better understanding of the causes and consequences of variation in δ2Hf values among individuals and species will improve the predictive ability of geographic assignment tests. We tested for effects of species, land cover, forage substrate, nest substrate, diet composition, body mass, sex, and phylogenetic relatedness on δ2Hf values from individuals of 21 songbird species, all at least two years old, captured during the same breeding season at a site in northeastern Alberta, Canada. For four species, we also tested for a year × species interaction effect on δ2Hf. A model including species as the single predictor received the most support (AIC weight = 0.74) in explaining variation in δ2Hf. A species-specific variance parameter was part of all best-ranked models, suggesting variation in δ2Hf was not consistent among species. The second best-ranked model included a forage substrate × diet interaction term (AIC weight = 0.16). There was a significant year × species interaction effect on δ2Hf, suggesting that interspecific differences in δ2Hf can differ among years. Our results suggest that within- and among-year interspecific variation in δ2Hf is the most important source of variance that is typically not explicitly quantified in geographic assignment tests using non-specific transfer functions to convert δ2Hp into δ2Hf. However, this source of variation is consistent with the range of variation from the transfer functions most commonly propagated in assignment tests of geographic origins for passerines breeding in North America.
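    The "AIC weight" values reported above (0.74 and 0.16) are Akaike weights. A minimal sketch of how they are computed, using made-up AIC scores rather than the study's actual model fits:

    ```python
    import math

    def aic_weights(aics):
        """Convert AIC scores into Akaike weights: each model's relative
        likelihood exp(-delta/2), normalized so the weights sum to 1."""
        best = min(aics)
        rel = [math.exp(-0.5 * (a - best)) for a in aics]
        total = sum(rel)
        return [r / total for r in rel]

    # Hypothetical AIC scores for three candidate models
    weights = aic_weights([100.0, 103.1, 104.7])
    ```

    The lowest-AIC model always receives the largest weight, and the weights can be read as relative support among the candidate set.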

  1. NVR-BIP: Nuclear Vector Replacement using Binary Integer Programming for NMR Structure-Based Assignments.

    PubMed

    Apaydin, Mehmet Serkan; Çatay, Bülent; Patrick, Nicholas; Donald, Bruce R

    2011-05-01

    Nuclear magnetic resonance (NMR) spectroscopy is an important experimental technique that allows one to study protein structure and dynamics in solution. An important bottleneck in NMR protein structure determination is the assignment of NMR peaks to the corresponding nuclei. Structure-based assignment (SBA) aims to solve this problem with the help of a template protein which is homologous to the target and has applications in the study of structure-activity relationships, protein-protein and protein-ligand interactions. We formulate SBA as a linear assignment problem with additional nuclear Overhauser effect (NOE) constraints, which can be solved within the nuclear vector replacement (NVR) framework (Langmead, C., Yan, A., Lilien, R., Wang, L. and Donald, B. (2003) A Polynomial-Time Nuclear Vector Replacement Algorithm for Automated NMR Resonance Assignments. Proc. 7th Annual Int. Conf. on Research in Computational Molecular Biology (RECOMB), Berlin, Germany, April 10-13, pp. 176-187. ACM Press, New York, NY; J. Comp. Bio. (2004), 11, pp. 277-298; Langmead, C. and Donald, B. (2004) An expectation/maximization nuclear vector replacement algorithm for automated NMR resonance assignments. J. Biomol. NMR, 29, 111-138). Our approach uses NVR's scoring function and data types and also gives the option of using CH and NH residual dipolar couplings (RDCs), instead of the NH RDCs that NVR requires. We test our technique on NVR's data set as well as on four new proteins. Our results are comparable to NVR's assignment accuracy on NVR's test set, but higher on novel proteins. Our approach allows partial assignments. It is also complete and can return the optimum as well as near-optimum assignments. Furthermore, it allows us to analyze the information content of each data type and is easily extendable to accept new forms of input data, such as additional RDCs.
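    The unconstrained core of such a formulation is a classic linear assignment problem: find the one-to-one peak-to-residue mapping minimizing a total score. A sketch with a hypothetical 3×3 score matrix (NVR-BIP additionally enforces the NOE constraints via binary integer programming, which this sketch omits):

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # Hypothetical score matrix: cost[i, j] is the penalty for assigning
    # NMR peak i to residue j (lower = better fit to the template).
    cost = np.array([
        [0.2, 0.9, 0.8],
        [0.7, 0.1, 0.9],
        [0.8, 0.8, 0.3],
    ])

    # The Hungarian algorithm returns the minimum-cost one-to-one assignment.
    peaks, residues = linear_sum_assignment(cost)
    total_cost = cost[peaks, residues].sum()
    ```

    Here each peak is matched to the residue on the diagonal, for a total cost of 0.6.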

  2. EFICAz2: enzyme function inference by a combined approach enhanced by machine learning.

    PubMed

    Arakaki, Adrian K; Huang, Ying; Skolnick, Jeffrey

    2009-04-13

    We previously developed EFICAz, an enzyme function inference approach that combines predictions from non-completely overlapping component methods. Two of the four components in the original EFICAz are based on the detection of functionally discriminating residues (FDRs). FDRs distinguish between members of an enzyme family that are homofunctional (classified under the EC number of interest) or heterofunctional (annotated with another EC number or lacking enzymatic activity). Each of the two FDR-based components is associated with one of two specific kinds of enzyme families. EFICAz exhibits high precision performance, except when the maximal test to training sequence identity (MTTSI) is lower than 30%. To improve EFICAz's performance in this regime, we: i) increased the number of predictive components and ii) took advantage of consensual information from the different components to make the final EC number assignment. We have developed two new EFICAz components, analogous to the two FDR-based components, where the discrimination between homo- and heterofunctional members is based on the evaluation, via Support Vector Machine models, of all the aligned positions between the query sequence and the multiple sequence alignments associated with the enzyme families. Benchmark results indicate that: i) the new SVM-based components outperform their FDR-based counterparts, and ii) both SVM-based and FDR-based components generate unique predictions. We developed classification tree models to optimally combine the results from the six EFICAz components into a final EC number prediction. The new implementation of our approach, EFICAz2, exhibits a highly improved prediction precision at MTTSI < 30% compared to the original EFICAz, with only a slight decrease in prediction recall. A comparative analysis of enzyme function annotation of the human proteome by EFICAz2 and KEGG shows that: i) when both sources make EC number assignments for the same protein sequence, the assignments tend to be consistent and ii) EFICAz2 generates considerably more unique assignments than KEGG. Performance benchmarks and the comparison with KEGG demonstrate that EFICAz2 is a powerful and precise tool for enzyme function annotation, with multiple applications in genome analysis and metabolic pathway reconstruction. The EFICAz2 web service is available at: http://cssb.biology.gatech.edu/skolnick/webservice/EFICAz2/index.html.
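    The SVM idea behind the new components can be sketched in miniature: encode each aligned position as a feature and train a classifier to separate homofunctional from heterofunctional family members. The encoding and data below are entirely hypothetical, not EFICAz2's actual features:

    ```python
    # Toy encoding: 1 if the residue at an alignment position matches the
    # family consensus, 0 otherwise (4 positions per sequence).
    from sklearn.svm import SVC

    X = [
        [1, 1, 1, 0],  # homofunctional members
        [1, 1, 0, 1],
        [1, 0, 1, 1],
        [0, 0, 1, 0],  # heterofunctional members
        [0, 1, 0, 0],
        [0, 0, 0, 1],
    ]
    y = [1, 1, 1, 0, 0, 0]

    # A linear SVM learns which positions discriminate function.
    clf = SVC(kernel="linear").fit(X, y)
    pred = clf.predict([[1, 1, 1, 1]])
    ```

    A query sequence matching the consensus at the discriminating positions is classified as homofunctional.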

  3. Comparison of the didactic lecture with the simulation/model approach for the teaching of a novel perioperative ultrasound curriculum to anesthesiology residents.

    PubMed

    Ramsingh, Davinder; Alexander, Brenton; Le, Khanhvan; Williams, Wendell; Canales, Cecilia; Cannesson, Maxime

    2014-09-01

    To expose residents to two methods of education for point-of-care ultrasound, a traditional didactic lecture and a model/simulation-based lecture, which focus on concepts of cardiopulmonary function, volume status, and evaluation of severe thoracic/abdominal injuries; and to assess which method is more effective. Single-center, prospective, blinded trial. University hospital. Anesthesiology residents who were assigned to an educational day during the two-month research study period. Residents were allocated to two groups to receive either a 90-minute, one-on-one didactic lecture or a 90-minute lecture in a simulation center, during which they practiced on a human model and simulation mannequin (normal pathology). Data points included a pre-lecture multiple-choice test, post-lecture multiple-choice test, and post-lecture, human model-based examination. Post-lecture tests were performed within three weeks of the lecture. An experienced sonographer who was blinded to the education modality graded the model-based skill assessment examinations. Participants completed a follow-up survey to assess the perceptions of the quality of their instruction between the two groups. 20 residents completed the study. No differences were noted between the two groups in pre-lecture test scores (P = 0.97), but significantly higher scores for the model/simulation group occurred on both the post-lecture multiple choice (P = 0.038) and post-lecture model (P = 0.041) examinations. Follow-up resident surveys showed significantly higher scores in the model/simulation group regarding overall interest in perioperative ultrasound (P = 0.047) as well as understanding of the physiologic concepts (P = 0.021). A model/simulation-based lecture series may be more effective in teaching the skills needed to perform a point-of-care ultrasound examination to anesthesiology residents. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Anesthesia machine checkout and room setup: a randomized, single-blind, comparison of two teaching modalities.

    PubMed

    Spofford, Christina M; Bayman, Emine O; Szeluga, Debra J; From, Robert P

    2012-01-01

    Novel methods for teaching are needed to enhance the efficiency of academic anesthesia departments as well as provide approaches to learning that are aligned with current trends and advances in technology. A video was produced that taught the key elements of anesthesia machine checkout and room setup. Novice learners were randomly assigned to receive either the new video format or the traditional lecture-based format for this topic during their regularly scheduled lecture series. The primary outcome was the difference in written examination score before and after teaching between the two groups. The secondary outcome was the satisfaction score of the trainees in the two groups. Forty-two students assigned to the video group and 36 students assigned to the lecture group completed the study. Students in each group had similar interest in anesthesia, pre-test scores, post-test scores, and final exam scores. The median post-test to pre-test difference was greater in the video group (3.5 (3.0-5.0) vs 2.5 (2.0-3.0) for the video and lecture groups, respectively; p = 0.002). Despite improved test scores, students reported higher satisfaction with the traditional, lecture-based format (22.0 (18.0-24.0) vs 24.0 (20.0-28.0) for the video and lecture groups, respectively; p < 0.004). Higher pre-test to post-test improvements were observed among students in the video-based teaching group; however, students rated traditional, live lectures higher than the newer video-based teaching.

  5. geomIO: A tool for geodynamicists to turn 2D cross-sections into 3D geometries

    NASA Astrophysics Data System (ADS)

    Baumann, Tobias; Bauville, Arthur

    2016-04-01

    In numerical deformation models, material properties are usually defined on elements (e.g., in body-fitted finite elements), or on a set of Lagrangian markers (Eulerian, ALE or mesh-free methods). In any case, geometrical constraints are needed to assign different material properties to the model domain. Whereas simple geometries such as spheres, layers or cuboids can easily be programmed, it quickly gets complex and time-consuming to create more complicated geometries for numerical model setups, especially in three dimensions. geomIO (geometry I/O, http://geomio.bitbucket.org/) is a MATLAB-based library that has two main functionalities. First, it can be used to create 3D volumes based on series of 2D vector drawings similar to a CAD program; and second, it uses these 3D volumes to assign material properties to the numerical model domain. The drawings can conveniently be created using the open-source vector graphics software Inkscape. Adobe Illustrator is also partially supported. The drawings represent a series of cross-sections in the 3D model domain, for example, cross-sectional interpretations of seismic tomography. geomIO is then used to read the drawings and to create 3D volumes by interpolating between the cross-sections. In the second part, the volumes are used to assign material phases to markers inside the volumes. Multiple volumes can be created at the same time and, depending on the order of assignment, unions or intersections can be built to assign additional material phases. geomIO also offers the possibility to create 3D temperature structures for geodynamic models based on depth-dependent parameterisations, for example the half-space cooling model. In particular, this can be applied to geometries of subducting slabs of arbitrary shape. Nevertheless, geomIO is kept very general and can be used for a variety of applications. We present examples of setup generation from pictures of micro-scale tectonics and lithospheric-scale setups of 3D present-day model geometries.
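    The marker-assignment step can be illustrated in 2D with a plain point-in-polygon test (a minimal sketch of the idea, not the geomIO API; polygon and marker coordinates are made up):

    ```python
    # Markers falling inside a drawn cross-section polygon receive that
    # polygon's material phase; all others keep the background phase.
    def point_in_polygon(x, y, poly):
        """Ray-casting test: is point (x, y) inside the polygon?"""
        inside = False
        n = len(poly)
        for i in range(n):
            x1, y1 = poly[i]
            x2, y2 = poly[(i + 1) % n]
            if (y1 > y) != (y2 > y):
                if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                    inside = not inside
        return inside

    # Hypothetical cross-section (a unit square) and two markers
    polygon = [(0, 0), (1, 0), (1, 1), (0, 1)]
    markers = [(0.5, 0.5), (2.0, 2.0)]
    background, inclusion = 0, 1
    phases = [inclusion if point_in_polygon(x, y, polygon) else background
              for x, y in markers]
    ```

    In 3D, the same inside/outside query is applied against volumes interpolated between the drawn cross-sections.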

  6. Tracing the geographic origin of traded leopard body parts in the Indian subcontinent with DNA-based assignment tests.

    PubMed

    Mondol, Samrat; Sridhar, Vanjulavalli; Yadav, Prasanjeet; Gubbi, Sanjay; Ramakrishnan, Uma

    2015-04-01

    Illicit trade in wildlife products is rapidly decimating many species across the globe. Such trade is often underestimated for wide-ranging species until it is too late for the survival of their remaining populations. Policing this trade could be vastly improved if one could reliably determine geographic origins of illegal wildlife products and identify areas where greater enforcement is needed. Using DNA-based assignment tests (i.e., samples are assigned to geographic locations), we addressed these factors for leopards (Panthera pardus) on the Indian subcontinent. We created geography-specific allele frequencies from a genetic reference database of 173 leopards across India to infer geographic origins of DNA samples from 40 seized leopard skins. Sensitivity analyses of samples of known geographic origins and assignments of seized skins demonstrated robust assignments for Indian leopards. We found that confiscated pelts seized in small numbers were not necessarily from local leopards. The geographic footprint of large seizures appeared to be bigger than the cumulative footprint of several smaller seizures, indicating widespread leopard poaching across the subcontinent. Our seized samples had male-biased sex ratios, especially the large seizures. From multiple seized sample assignments, we identified central India as a poaching hotspot for leopards. The techniques we applied can be used to identify origins of seized illegal wildlife products and trade routes at the subcontinent scale and beyond. © 2014 Society for Conservation Biology.
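    A frequency-based assignment test of this kind boils down to asking which region's allele frequencies make a sample's genotype most likely. A toy sketch with hypothetical allele frequencies (not the study's 173-leopard reference database):

    ```python
    import math

    # Hypothetical regional allele frequencies at one microsatellite locus
    regions = {
        "north": {"A": 0.8, "a": 0.2},
        "south": {"A": 0.3, "a": 0.7},
    }

    def genotype_loglik(genotype, freqs):
        """Log-likelihood of a diploid genotype assuming Hardy-Weinberg
        proportions; heterozygotes get a factor of 2."""
        p, q = freqs[genotype[0]], freqs[genotype[1]]
        multiplier = 1 if genotype[0] == genotype[1] else 2
        return math.log(multiplier * p * q)

    # Assign a seized sample to the region maximizing its likelihood
    sample = ("A", "A")
    assigned = max(regions, key=lambda r: genotype_loglik(sample, regions[r]))
    ```

    Real assignment tests multiply such likelihoods across many loci, which sharpens the geographic resolution considerably.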

  7. Learning Natural Selection in 4th Grade with Multi-Agent-Based Computational Models

    NASA Astrophysics Data System (ADS)

    Dickes, Amanda Catherine; Sengupta, Pratim

    2013-06-01

    In this paper, we investigate how elementary school students develop multi-level explanations of population dynamics in a simple predator-prey ecosystem, through scaffolded interactions with a multi-agent-based computational model (MABM). The term "agent" in an MABM indicates individual computational objects or actors (e.g., cars), and these agents obey simple rules assigned or manipulated by the user (e.g., speeding up, slowing down, etc.). It is the interactions between these agents, based on the rules assigned by the user, that give rise to emergent, aggregate-level behavior (e.g., formation and movement of the traffic jam). Natural selection is such an emergent phenomenon, which has been shown to be challenging for novices (K-16 students) to understand. Whereas prior research on learning evolutionary phenomena with MABMs has typically focused on high school students and beyond, we investigate how elementary students (4th graders) develop multi-level explanations of some introductory aspects of natural selection—species differentiation and population change—through scaffolded interactions with an MABM that simulates predator-prey dynamics in a simple birds-butterflies ecosystem. We conducted a semi-clinical, interview-based study with ten participants, in which we focused on the following: a) identifying the nature of learners' initial interpretations of salient events or elements of the represented phenomena, b) identifying the roles these interpretations play in the development of their multi-level explanations, and c) identifying how attending to different levels of the relevant phenomena can make different mechanisms explicit to the learners. In addition, our analysis also shows that although there were differences between high- and low-performing students (in terms of being able to explain population-level behaviors) in the pre-test, these differences disappeared in the post-test.
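    The "simple rules, emergent population change" idea can be sketched in a few lines. This toy loop is not the study's birds-butterflies model; the trait values and population size are invented for illustration:

    ```python
    import random

    random.seed(1)

    # Each butterfly agent is just its probability of escaping a predator:
    # camouflaged (0.9) vs conspicuous (0.2) variants, 50 of each.
    butterflies = [0.9] * 50 + [0.2] * 50

    for generation in range(20):
        # Agent-level rule: survive predation with your escape probability...
        survivors = [b for b in butterflies if random.random() < b]
        # ...then survivors reproduce to restore the population size.
        butterflies = [random.choice(survivors) for _ in range(100)]

    camouflaged_fraction = sum(1 for b in butterflies if b == 0.9) / 100
    ```

    No agent "knows about" selection, yet the population-level shift toward the camouflaged variant emerges from the individual rule, which is exactly the multi-level connection the study asks students to articulate.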

  8. Phenomenological model for transient deformation based on state variables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, M S; Cho, C W; Alexopoulos, P

    The state variable theory of Hart, while providing a unified description of plasticity-dominated deformation, exhibits deficiencies when it is applied to transient deformation phenomena at stresses below yield. It appears that the description of stored anelastic strain is oversimplified. Consideration of a simple physical picture based on continuum dislocation pileups suggests that the neglect of weak barriers to dislocation motion is the source of these inadequacies. An appropriately modified description incorporating such barriers then allows the construction of a macroscopic model including transient effects. Although the flow relations for the microplastic element required in the new theory are not known, tentative assignments may be made for such functions. The model then exhibits qualitatively correct behavior when tensile, loading-unloading, reverse loading, and load relaxation tests are simulated. Experimental procedures are described for determining the unknown parameters and functions in the new model.

  9. Guidelines for assigning allowable properties to visually graded foreign species based on test data from full sized specimens

    Treesearch

    David W. Green; Bradley E. Shelley

    2006-01-01

    The objective of this document is to provide philosophy and guidelines for the assignment of allowable properties to visually graded dimension lumber produced from trees not grown in the United States. This document assumes, as a starting point, the procedures of ASTM D 1990.

  10. Optimal Weight Assignment for a Chinese Signature File.

    ERIC Educational Resources Information Center

    Liang, Tyne; And Others

    1996-01-01

    Investigates the performance of a character-based Chinese text retrieval scheme in which monogram keys and bigram keys are encoded into document signatures. Tests and verifies the theoretical predictions of the optimal weight assignments and the minimal false hit rate in experiments using a real Chinese corpus for disyllabic queries of different…

  11. 47 CFR 87.303 - Frequencies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... only. (f) Frequency assignments for Flight Test VHF Stations may be based on either 8.33 kHz or 25 kHz spacing. Assignable frequencies include the interstitial frequencies 8.33 kHz from the VHF frequencies listed in paragraphs (a) and (b) of this section. Each 8.33 kHz interstitial frequency is subject to the...
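    As a quick numeric illustration of the channelization described above (a sketch, not the rule text): with 8.33 kHz spacing, each 25 kHz channel is flanked by interstitial frequencies one-third of 25 kHz away. Exact fractions avoid the rounding in "8.33":

    ```python
    from fractions import Fraction

    STEP = Fraction(25, 3)  # exactly 8.33... kHz

    def interstitials_khz(base_khz):
        """Return the two interstitial frequencies (in kHz) flanking a
        25 kHz-spaced base frequency, offset by exactly 25/3 kHz."""
        base = Fraction(base_khz)
        return (base - STEP, base + STEP)

    # Hypothetical base frequency: 123.450 MHz expressed in kHz
    lo, hi = interstitials_khz(123_450)
    ```

    The two interstitials sit 50/3 kHz apart, i.e., the original 25 kHz channel plus its neighbors divide into exact thirds.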

  12. 47 CFR 87.303 - Frequencies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... only. (f) Frequency assignments for Flight Test VHF Stations may be based on either 8.33 kHz or 25 kHz spacing. Assignable frequencies include the interstitial frequencies 8.33 kHz from the VHF frequencies listed in paragraphs (a) and (b) of this section. Each 8.33 kHz interstitial frequency is subject to the...

  13. 47 CFR 87.303 - Frequencies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... only. (f) Frequency assignments for Flight Test VHF Stations may be based on either 8.33 kHz or 25 kHz spacing. Assignable frequencies include the interstitial frequencies 8.33 kHz from the VHF frequencies listed in paragraphs (a) and (b) of this section. Each 8.33 kHz interstitial frequency is subject to the...

  14. 47 CFR 87.303 - Frequencies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... only. (f) Frequency assignments for Flight Test VHF Stations may be based on either 8.33 kHz or 25 kHz spacing. Assignable frequencies include the interstitial frequencies 8.33 kHz from the VHF frequencies listed in paragraphs (a) and (b) of this section. Each 8.33 kHz interstitial frequency is subject to the...

  15. High angle of attack control law development for a free-flight wind tunnel model using direct eigenstructure assignment

    NASA Technical Reports Server (NTRS)

    Wendel, Thomas R.; Boland, Joseph R.; Hahne, David E.

    1991-01-01

    Flight-control laws are developed for a wind-tunnel aircraft model flying at a high angle of attack by using a synthesis technique called direct eigenstructure assignment. The method employs flight guidelines and control-power constraints to develop the control laws, and gain schedules and nonlinear feedback compensation provide a framework for considering the nonlinear nature of the attack angle. Linear and nonlinear evaluations show that the control laws are effective, a conclusion that is further confirmed by a scale model used for free-flight testing.
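    The eigenvalue-placement core of such a synthesis can be sketched on a toy system. The double-integrator matrices and desired poles below are hypothetical, not the wind-tunnel model; full eigenstructure assignment additionally shapes the closed-loop eigenvectors, which this sketch omits:

    ```python
    import numpy as np
    from scipy.signal import place_poles

    # Toy double-integrator plant: x1' = x2, x2' = u
    A = np.array([[0.0, 1.0],
                  [0.0, 0.0]])
    B = np.array([[0.0],
                  [1.0]])
    desired = np.array([-2.0, -3.0])  # desired closed-loop eigenvalues

    result = place_poles(A, B, desired)
    K = result.gain_matrix            # state feedback u = -K x
    achieved = np.linalg.eigvals(A - B @ K)
    ```

    The feedback gains move the open-loop poles (both at the origin) to the requested stable locations.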

  16. Prediction of aquatic toxicity mode of action using linear discriminant and random forest models.

    PubMed

    Martin, Todd M; Grulke, Christopher M; Young, Douglas M; Russom, Christine L; Wang, Nina Y; Jackson, Crystal R; Barron, Mace G

    2013-09-23

    The ability to determine the mode of action (MOA) for a diverse group of chemicals is a critical part of ecological risk assessment and chemical regulation. However, existing MOA assignment approaches in ecotoxicology have been limited to relatively few MOAs, have high uncertainty, or rely on professional judgment. In this study, machine learning algorithms (linear discriminant analysis and random forest) were used to develop models for assigning aquatic toxicity MOA. These methods were selected since they have been shown to be able to correlate diverse data sets and provide an indication of the most important descriptors. A data set of MOA assignments for 924 chemicals was developed using a combination of high-confidence assignments, international consensus classifications, ASTER (ASsessment Tools for the Evaluation of Risk) predictions, and weight-of-evidence professional judgment based on an assessment of structure and literature information. The overall data set was randomly divided into a training set (75%) and a validation set (25%) and then used to develop linear discriminant analysis (LDA) and random forest (RF) MOA assignment models. The LDA and RF models had high internal concordance and specificity and were able to produce overall prediction accuracies ranging from 84.5 to 87.7% for the validation set. These results demonstrate that computational chemistry approaches can be used to determine the acute toxicity MOAs across a large range of structures and mechanisms.
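    The workflow above (75/25 split, fit LDA and RF, compare validation accuracy) is easy to sketch. The synthetic two-class descriptor data here stand in for the study's 924-chemical dataset:

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    # Two synthetic "modes of action", well separated in descriptor space
    X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(3, 1, (100, 5))])
    y = np.array([0] * 100 + [1] * 100)

    # 75% training / 25% validation, as in the study
    X_tr, X_va, y_tr, y_va = train_test_split(
        X, y, test_size=0.25, random_state=0)

    lda_acc = LinearDiscriminantAnalysis().fit(X_tr, y_tr).score(X_va, y_va)
    rf_acc = RandomForestClassifier(random_state=0).fit(X_tr, y_tr).score(X_va, y_va)
    ```

    On real chemical descriptors the classes overlap far more, which is why the reported validation accuracies land in the 84.5-87.7% range rather than near 100%.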

  17. Modeling Aspects Of Nature Of Science To Preservice Elementary Teachers

    NASA Astrophysics Data System (ADS)

    Ashcraft, Paul

    2007-01-01

    Nature of science was modeled using guided inquiry activities in the university classroom with elementary education majors. A physical science content course initially used an Aristotelian model where students discussed the relationship between distance from a constant radiation source and the amount of radiation received based on accepted "truths" or principles and concluded that there was an inverse relationship. The class became Galilean in nature, using the scientific method to test that hypothesis. Examining data, the class rejected their hypothesis and concluded that there is an inverse square relationship. Assignments, given before and after the hypothesis testing, show the students' misconceptions and their acceptance of scientifically acceptable conceptions. Answers on exam questions further support this conceptual change. Students spent less class time on the inverse square relationship later when examining electrostatic force, magnetic force, gravity, and planetary solar radiation because the students related this particular experience to other physical relationships.

  18. Dynamic traffic assignment based trailblazing guide signing for major traffic generator.

    DOT National Transportation Integrated Search

    2009-11-01

    The placement of guide signs and the display of dynamic message signs greatly affect drivers' understanding of the network and therefore their route choices. Most existing dynamic traffic assignment models assume that drivers heading to a Major...

  19. AssignFit: a program for simultaneous assignment and structure refinement from solid-state NMR spectra

    PubMed Central

    Tian, Ye; Schwieters, Charles D.; Opella, Stanley J.; Marassi, Francesca M.

    2011-01-01

    AssignFit is a computer program developed within the XPLOR-NIH package for the assignment of dipolar coupling (DC) and chemical shift anisotropy (CSA) restraints derived from the solid-state NMR spectra of protein samples with uniaxial order. The method is based on minimizing the difference between experimentally observed solid-state NMR spectra and the frequencies back calculated from a structural model. Starting with a structural model and a set of DC and CSA restraints grouped only by amino acid type, as would be obtained by selective isotopic labeling, AssignFit generates all of the possible assignment permutations and calculates the corresponding atomic coordinates oriented in the alignment frame, together with the associated set of NMR frequencies, which are then compared with the experimental data for best fit. Incorporation of AssignFit in a simulated annealing refinement cycle provides an approach for simultaneous assignment and structure refinement (SASR) of proteins from solid-state NMR orientation restraints. The methods are demonstrated with data from two integral membrane proteins, one α-helical and one β-barrel, embedded in phospholipid bilayer membranes. PMID:22036904
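    The core search described above — enumerate assignment permutations within an amino acid type and keep the one that best fits the back-calculated frequencies — can be shown in miniature. The frequencies below are invented, and real AssignFit runs inside XPLOR-NIH with DC/CSA restraints rather than this brute-force toy:

    ```python
    from itertools import permutations

    # Hypothetical frequencies for three peaks of one amino acid type
    observed = [102.5, 110.1, 118.9]          # measured from spectra
    back_calculated = [110.0, 119.2, 102.3]   # from the structural model

    def misfit(assignment):
        """Sum of squared differences for one peak-to-site permutation."""
        return sum((observed[i] - back_calculated[j]) ** 2
                   for i, j in enumerate(assignment))

    # Try every permutation and keep the best-fitting assignment
    best = min(permutations(range(3)), key=misfit)
    ```

    The winning permutation pairs each observed peak with its nearest back-calculated frequency; embedding this choice inside simulated annealing gives the simultaneous assignment-and-refinement cycle the abstract describes.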

  20. Fuzzy Computing Model of Activity Recognition on WSN Movement Data for Ubiquitous Healthcare Measurement.

    PubMed

    Chiang, Shu-Yin; Kan, Yao-Chiang; Chen, Yun-Shan; Tu, Ying-Ching; Lin, Hsueh-Chun

    2016-12-03

    Ubiquitous health care (UHC) is beneficial for patients to ensure they complete therapeutic exercises by self-management at home. We designed a fuzzy computing model that enables recognizing assigned movements in UHC with privacy. The movements are measured by the self-developed body motion sensor, which combines both accelerometer and gyroscope chips to make an inertial sensing node compliant with a wireless sensor network (WSN). The fuzzy logic process was studied to calculate the sensor signals that would entail necessary features of static postures and dynamic motions. Combinations of the features were studied and the proper feature sets were chosen with compatible fuzzy rules. Then, a fuzzy inference system (FIS) can be generated to recognize the assigned movements based on the rules. We thus implemented both fuzzy and adaptive neuro-fuzzy inference systems in the model to distinguish static and dynamic movements. The proposed model can effectively reach the recognition scope of the assigned activity. Furthermore, two exercises of upper-limb flexion in physical therapy were applied for the model in which the recognition rate can stand for the passing rate of the assigned motions. Finally, a web-based interface was developed to help remotely measure movement in physical therapy for UHC.

  1. Fuzzy Computing Model of Activity Recognition on WSN Movement Data for Ubiquitous Healthcare Measurement

    PubMed Central

    Chiang, Shu-Yin; Kan, Yao-Chiang; Chen, Yun-Shan; Tu, Ying-Ching; Lin, Hsueh-Chun

    2016-01-01

    Ubiquitous health care (UHC) is beneficial for patients to ensure they complete therapeutic exercises by self-management at home. We designed a fuzzy computing model that enables recognizing assigned movements in UHC with privacy. The movements are measured by the self-developed body motion sensor, which combines both accelerometer and gyroscope chips to make an inertial sensing node compliant with a wireless sensor network (WSN). The fuzzy logic process was studied to calculate the sensor signals that would entail necessary features of static postures and dynamic motions. Combinations of the features were studied and the proper feature sets were chosen with compatible fuzzy rules. Then, a fuzzy inference system (FIS) can be generated to recognize the assigned movements based on the rules. We thus implemented both fuzzy and adaptive neuro-fuzzy inference systems in the model to distinguish static and dynamic movements. The proposed model can effectively reach the recognition scope of the assigned activity. Furthermore, two exercises of upper-limb flexion in physical therapy were applied for the model in which the recognition rate can stand for the passing rate of the assigned motions. Finally, a web-based interface was developed to help remotely measure movement in physical therapy for UHC. PMID:27918482
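    The fuzzification step at the heart of such a model can be sketched with triangular membership functions. The feature, thresholds, and two-rule "rule base" below are hypothetical, far simpler than the study's FIS:

    ```python
    def triangular(x, a, b, c):
        """Triangular membership function peaking at b on [a, c]."""
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)

    def classify(accel_variance):
        """Fuzzify an accelerometer-variance feature and pick the label
        of the stronger membership (a two-rule toy inference)."""
        low = triangular(accel_variance, -0.5, 0.0, 0.5)   # "still"
        high = triangular(accel_variance, 0.3, 1.0, 1.7)   # "moving"
        return "static" if low > high else "dynamic"

    label = classify(0.1)
    ```

    A full FIS combines many such memberships over several features with a rule base, and the adaptive neuro-fuzzy variant tunes the membership parameters from training data.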

  2. Metabarcoding of marine nematodes – evaluation of reference datasets used in tree-based taxonomy assignment approach

    PubMed Central

    2016-01-01

    Background: Metabarcoding is becoming a common tool used to assess and compare diversity of organisms in environmental samples. Identification of OTUs is one of the critical steps in the process, and several taxonomy assignment methods have been proposed to accomplish this task. This publication evaluates the quality of reference datasets, along with several alignment and phylogeny inference methods used in one of the taxonomy assignment methods, called the tree-based approach. This approach assigns anonymous OTUs to taxonomic categories based on the relative placements of OTUs and reference sequences on the cladogram and the support that these placements receive. New information: In the tree-based taxonomy assignment approach, reliable identification of anonymous OTUs is based on their placement in monophyletic and highly supported clades together with identified reference taxa. Therefore, it requires a high-quality reference dataset. Resolution of phylogenetic trees is strongly affected by the presence of erroneous sequences as well as the alignment and phylogeny inference methods used in the process. Two preparation steps are essential for the successful application of the tree-based taxonomy assignment approach. Curated collections of genetic information do include erroneous sequences. These sequences have a detrimental effect on the resolution of cladograms used in the tree-based approach. They must be identified and excluded from the reference dataset beforehand. Various combinations of multiple sequence alignment and phylogeny inference methods provide cladograms with different topology and bootstrap support. These combinations of methods need to be tested in order to determine the one that gives the highest resolution for the particular reference dataset. Completing the above-mentioned preparation steps is expected to decrease the number of unassigned OTUs and thus improve the results of the tree-based taxonomy assignment approach. PMID:27932919

  3. Metabarcoding of marine nematodes - evaluation of reference datasets used in tree-based taxonomy assignment approach.

    PubMed

    Holovachov, Oleksandr

    2016-01-01

    Metabarcoding is becoming a common tool used to assess and compare the diversity of organisms in environmental samples. Identification of OTUs is one of the critical steps in the process, and several taxonomy assignment methods have been proposed to accomplish this task. This publication evaluates the quality of reference datasets, along with several alignment and phylogeny inference methods, used in one of the taxonomy assignment methods, called the tree-based approach. This approach assigns anonymous OTUs to taxonomic categories based on the relative placements of OTUs and reference sequences on the cladogram and the support that these placements receive. In the tree-based taxonomy assignment approach, reliable identification of anonymous OTUs depends on their placement in monophyletic, highly supported clades together with identified reference taxa. It therefore requires a high-quality reference dataset. The resolution of phylogenetic trees is strongly affected by the presence of erroneous sequences, as well as by the alignment and phylogeny inference methods used in the process. Two preparation steps are essential for the successful application of the tree-based taxonomy assignment approach. First, curated collections of genetic information do include erroneous sequences; these sequences have a detrimental effect on the resolution of the cladograms used in the tree-based approach and must be identified and excluded from the reference dataset beforehand. Second, various combinations of multiple sequence alignment and phylogeny inference methods produce cladograms with different topologies and bootstrap support; these combinations need to be tested to determine the one that gives the highest resolution for the particular reference dataset. Completing these preparation steps is expected to decrease the number of unassigned OTUs and thus improve the results of the tree-based taxonomy assignment approach. PMID:27932919
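The assignment rule described above can be sketched in a few lines: an anonymous OTU inherits a taxon name only if it falls inside a well-supported clade whose identified reference members all share that taxon. The clade representation, the 70% bootstrap cutoff, and the taxon names below are illustrative assumptions, not values from the study.

```python
SUPPORT_CUTOFF = 70  # percent bootstrap support; assumed threshold

def assign_otu(otu, clades, reference_taxa):
    """Return the taxon for `otu`, or None if no supported clade is conclusive.

    clades         -- list of (bootstrap_support, set_of_tip_labels),
                      ordered from smallest (most nested) to largest
    reference_taxa -- dict mapping reference tip label -> taxon name
    """
    for support, members in clades:
        if otu not in members or support < SUPPORT_CUTOFF:
            continue
        taxa = {reference_taxa[m] for m in members if m in reference_taxa}
        if len(taxa) == 1:          # monophyletic for a single taxon
            return taxa.pop()
    return None                     # remains unassigned

refs = {"ref1": "Enoplida", "ref2": "Enoplida", "ref3": "Chromadorida"}
clades = [
    (85, {"OTU_7", "ref1", "ref2"}),                   # well supported, one taxon
    (40, {"OTU_9", "ref3"}),                           # poorly supported
    (95, {"OTU_7", "OTU_9", "ref1", "ref2", "ref3"}),  # mixed taxa
]
print(assign_otu("OTU_7", clades, refs))  # Enoplida
print(assign_otu("OTU_9", clades, refs))  # None
```

An erroneous reference sequence would enlarge the set of taxa inside otherwise clean clades, which is exactly why the abstract insists on curating the reference dataset first.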

  4. Optimizing location of manufacturing industries in the context of economic globalization: A bi-level model based approach

    NASA Astrophysics Data System (ADS)

    Wu, Shanhua; Yang, Zhongzhen

    2018-07-01

    This paper aims to optimize the locations of manufacturing industries in the context of economic globalization by proposing a bi-level programming model that integrates a location optimization model with a traffic assignment model. In the model, the transport network is divided into subnetworks for raw materials and for products. The upper-level model determines the locations of industries and the OD matrices of raw materials and products. The lower-level model calculates the attributes of traffic flow under the given OD matrices. A genetic algorithm is designed to solve the model. The proposed method is tested using the Chinese steel industry as an example. The result indicates that the proposed method could help decision-makers implement location decisions for manufacturing industries effectively.
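The bi-level structure can be sketched with a toy instance: the upper level picks a plant location and the lower level returns the transport cost of routing raw materials in and products out. The sites, distances, and demands below are invented for illustration, and the tiny candidate set is enumerated exhaustively rather than searched with a genetic algorithm as in the paper.

```python
raw_material_sources = {"mine_A": 10.0}      # tonnes of ore shipped in
product_markets = {"city_B": 6.0}            # tonnes of steel shipped out
distance = {                                  # unit transport cost per tonne
    ("mine_A", "site_1"): 3.0, ("site_1", "city_B"): 8.0,
    ("mine_A", "site_2"): 7.0, ("site_2", "city_B"): 2.0,
}

def lower_level_cost(site):
    """Lower level: cost of the assigned raw-material and product flows."""
    inbound = sum(q * distance[(src, site)] for src, q in raw_material_sources.items())
    outbound = sum(q * distance[(site, dst)] for dst, q in product_markets.items())
    return inbound + outbound

def upper_level_best(sites):
    """Upper level: choose the location minimising the lower-level cost."""
    return min(sites, key=lower_level_cost)

best = upper_level_best(["site_1", "site_2"])
print(best, lower_level_cost(best))  # site_1 78.0
```

In the full model the lower level is a proper traffic assignment over a congested network, so each upper-level candidate is far more expensive to evaluate, which motivates the genetic algorithm.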

  5. Augmenting superpopulation capture-recapture models with population assignment data

    USGS Publications Warehouse

    Wen, Zhi; Pollock, Kenneth; Nichols, James; Waser, Peter

    2011-01-01

    Ecologists applying capture-recapture models to animal populations sometimes have access to additional information about individuals' populations of origin (e.g., information about genetics, stable isotopes, etc.). Tests that assign an individual's genotype to its most likely source population are increasingly used. Here we show how to augment a superpopulation capture-recapture model with such information. We consider a single superpopulation model without age structure, and split each entry probability into separate components due to births in situ and immigration. We show that it is possible to estimate these two probabilities separately. We first consider the case of perfect information about population of origin, where we can distinguish individuals born in situ from immigrants with certainty. Then we consider the more realistic case of imperfect information, where we use genetic or other information to assign probabilities to each individual's origin as in situ or outside the population. We use a resampling approach to impute the true population of origin from imperfect assignment information. The integration of data on population of origin with capture-recapture data allows us to determine the contributions of immigration and in situ reproduction to the growth of the population, an issue of importance to ecologists. We illustrate our new models with capture-recapture and genetic assignment data from a population of banner-tailed kangaroo rats Dipodomys spectabilis in Arizona.
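The imputation step described above can be sketched directly: when genetic assignment yields only a probability that each animal was born in situ, the true origin is resampled repeatedly and the in-situ fraction is estimated by averaging over replicates. The probabilities below are invented for illustration.

```python
import random

random.seed(1)

def impute_origins(p_in_situ, n_reps=1000):
    """Resample each individual's origin; return the mean in-situ fraction."""
    fractions = []
    for _ in range(n_reps):
        draws = [random.random() < p for p in p_in_situ]
        fractions.append(sum(draws) / len(draws))
    return sum(fractions) / n_reps

# assignment-test probabilities that each animal was born in situ
probs = [0.9, 0.8, 0.6, 0.2, 0.1]
print(round(impute_origins(probs), 2))  # close to the mean of probs (0.52)
```

In the full model each resampled origin feeds a separate entry probability (birth in situ versus immigration), so the replicates also propagate assignment uncertainty into the demographic estimates.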

  6. Predicting Daily Insolation with Hourly Cloud Height and Coverage.

    NASA Astrophysics Data System (ADS)

    Meyers, T. P.; Dale, R. F.

    1983-04-01

    Solar radiation information is used in crop growth, boundary layer, entomological and plant pathological models, and in determining the potential use of active and passive solar energy systems. Yet solar radiation is among the least measured meteorological variables. A semi-physical model based on standard meteorological data was developed to estimate solar radiation received at the earth's surface. The radiation model includes the effects of Rayleigh scattering, absorption by water vapor and permanent gases, and absorption and scattering by aerosols and clouds. Cloud attenuation is accounted for by assigning transmission coefficients based on cloud height and amount. The cloud transmission coefficients for various heights and coverages were derived empirically from hourly observations of solar radiation in conjunction with corresponding cloud observations at West Lafayette, Indiana. The model was tested with independent data from West Lafayette and Indianapolis, IN; Madison, WI; Omaha, NE; Columbia, MO; Nashville, TN; Seattle, WA; Los Angeles, CA; Phoenix, AZ; Lake Charles, LA; Miami, FL; and Sterling, VA. For each of these locations a 16% random sample of days was drawn within each of the 12 months in a year for testing the model. Excellent agreement between predicted and observed radiation values was obtained for all stations tested. Mean absolute errors ranged from 1.05 to 1.80 MJ m^-2 day^-1 and root-mean-square errors ranged from 1.31 to 2.32 MJ m^-2 day^-1. The model's performance judged by relative error was found to be independent of season and cloud amount for all locations tested.
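The cloud-attenuation idea can be sketched as a lookup: clear-sky irradiance is multiplied by an empirical transmission coefficient keyed on cloud height class and coverage, and hourly values are summed to a daily total. The coefficient table and clear-sky value below are invented placeholders, not the fitted West Lafayette values.

```python
# transmission coefficients by (cloud height class, coverage class); assumed
TRANSMISSION = {
    ("high", "scattered"): 0.90, ("high", "overcast"): 0.65,
    ("low",  "scattered"): 0.75, ("low",  "overcast"): 0.30,
}

def hourly_insolation(clear_sky_mj, height, coverage):
    """Attenuate one hour's clear-sky irradiance (MJ m^-2) for clouds."""
    return clear_sky_mj * TRANSMISSION[(height, coverage)]

# daily total: sum the attenuated hourly values over the observed hours
hours = [("high", "scattered"), ("low", "overcast"), ("high", "overcast")]
daily = sum(hourly_insolation(2.0, h, c) for h, c in hours)
print(round(daily, 2))  # 3.7
```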

  7. Measuring geographic access to health care: raster and network-based methods

    PubMed Central

    2012-01-01

    Background Inequalities in geographic access to health care result from the configuration of facilities, population distribution, and the transportation infrastructure. In recent accessibility studies, the traditional distance measure (Euclidean) has been replaced with more plausible measures such as travel distance or time. Both network and raster-based methods are often utilized for estimating travel time in a Geographic Information System. Therefore, exploring the differences in the underlying data models and associated methods and their impact on geographic accessibility estimates is warranted. Methods We examine the assumptions present in population-based travel time models. Conceptual and practical differences between raster and network data models are reviewed, along with methodological implications for service area estimates. Our case study investigates Limited Access Areas defined by Michigan’s Certificate of Need (CON) Program. Geographic accessibility is calculated by identifying the number of people residing more than 30 minutes from an acute care hospital. Both network and raster-based methods are implemented and their results are compared. We also examine sensitivity to changes in travel speed settings and population assignment. Results In both methods, the areas identified as having limited accessibility were similar in their location, configuration, and shape. However, the number of people identified as having limited accessibility varied substantially between methods. Over all permutations, the raster-based method identified more area and people with limited accessibility. The raster-based method was more sensitive to travel speed settings, while the network-based method was more sensitive to the specific population assignment method employed in Michigan. Conclusions Differences between the underlying data models help to explain the variation in results between raster and network-based methods. Considering that the choice of data model/method may substantially alter the outcomes of a geographic accessibility analysis, we advise researchers to use caution in model selection. For policy, we recommend that Michigan adopt the network-based method or reevaluate the travel speed assignment rule in the raster-based method. Additionally, we recommend that the state revisit the population assignment method. PMID:22587023
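The contrast between the two data models can be sketched with toy arithmetic: a raster method accumulates per-cell traversal times along a chain of grid cells, while a network method sums edge travel times along road segments. The cell size, speeds, and toy routes below are invented, and the point is only that the two estimates of the same trip can disagree.

```python
CELL_KM = 1.0  # assumed raster cell size

def raster_travel_minutes(path_cells, speed_kmh):
    """Travel time across a chain of raster cells at a uniform speed."""
    return len(path_cells) * CELL_KM / speed_kmh * 60

def network_travel_minutes(path_edges):
    """Travel time along road edges given (length_km, speed_kmh) pairs."""
    return sum(length / speed * 60 for length, speed in path_edges)

raster = raster_travel_minutes(path_cells=range(20), speed_kmh=40)   # 30.0
network = network_travel_minutes([(12.0, 60.0), (6.0, 30.0)])        # 24.0
print(raster, network, raster != network)  # the methods can disagree
```

Because the raster estimate depends on the assumed per-cell speed while the network estimate depends on which people are snapped to which road, the sensitivity pattern reported in the Results follows naturally.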

  8. Choosing colors for map display icons using models of visual search.

    PubMed

    Shive, Joshua; Francis, Gregory

    2013-04-01

    We show how to choose colors for icons on maps to minimize search time using predictions of a model of visual search. The model analyzes digital images of a search target (an icon on a map) and a search display (the map containing the icon) and predicts search time as a function of target-distractor color distinctiveness and target eccentricity. We parameterized the model using data from a visual search task and performed a series of optimization tasks to test the model's ability to choose colors for icons to minimize search time across icons. Map display designs made by this procedure were tested experimentally. In a follow-up experiment, we examined the model's flexibility to assign colors in novel search situations. The model fits human performance, performs well on the optimization tasks, and can choose colors for icons on maps with novel stimuli to minimize search time without requiring additional model parameter fitting. Models of visual search can suggest color choices that produce search time reductions for display icons. Designers should consider constructing visual search models as a low-cost method of evaluating color assignments.
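A stripped-down version of the color-assignment step can be sketched as follows: for an icon, pick the candidate color whose minimum Euclidean RGB distance to the map's distractor colors is largest, on the assumption that greater color distinctiveness predicts shorter search time. The palette and distractor colors are invented, and the real model also weighs target eccentricity.

```python
import math

def color_distance(c1, c2):
    """Euclidean distance between two RGB triples."""
    return math.dist(c1, c2)

def pick_color(candidates, distractors):
    """Choose the candidate maximising its minimum distance to distractors."""
    return max(candidates,
               key=lambda c: min(color_distance(c, d) for d in distractors))

map_colors = [(34, 139, 34), (30, 144, 255)]    # green terrain, blue water
palette = [(40, 150, 40), (255, 0, 0), (0, 0, 255)]
print(pick_color(palette, map_colors))  # (255, 0, 0)
```

Red wins here because the near-green and near-blue candidates each sit close to one of the map's own colors, mirroring the model's preference for target-distractor distinctiveness.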

  9. Cost-Effectiveness of POC Coagulation Testing Using Multiple Electrode Aggregometry.

    PubMed

    Straub, Niels; Bauer, Ekaterina; Agarwal, Seema; Meybohm, Patrick; Zacharowski, Kai; Hanke, Alexander A; Weber, Christian F

    2016-01-01

    The economic effects of Point-of-Care (POC) coagulation testing including Multiple Electrode Aggregometry (MEA) with the Multiplate device have not been examined. A health economic model with associated clinical endpoints was developed to calculate the effectiveness and estimated costs of coagulation analyses based on standard laboratory testing (SLT) or POC testing offering the possibility to assess platelet dysfunction using aggregometric measures. Cost estimates included pre- and perioperative costs of hemotherapy, intra- and post-operative coagulation testing costs, and hospitalization costs, including the costs of transfusion-related complications. Our model calculation using a simulated true-to-life cohort of 10,000 cardiac surgery patients assigned to each testing alternative demonstrated that there were 950 fewer patients in the POC branch who required any transfusion of red blood cells. The subsequent numbers of massive transfusions and patients with transfusion-related complications were reduced with the POC testing by 284 and 126, respectively. The average expected total cost in the POC branch was 288 Euro lower for every treated patient than that in the SLT branch. Incorporating aggregometric analyses using MEA into hemotherapy algorithms improved medical outcomes in cardiac surgery patients in the presented health economic model. There was an overall better economic outcome associated with POC testing compared with SLT testing despite the higher costs of testing.
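The decision-model arithmetic amounts to a probability-weighted cost sum per branch: testing cost plus expected transfusion and complication costs. All numbers below are invented placeholders, not the study's inputs; the comparison only illustrates how cheaper downstream events can outweigh a more expensive test.

```python
def expected_cost(p_transfusion, p_complication, costs):
    """Expected per-patient cost for one coagulation-testing branch."""
    return (costs["testing"]
            + p_transfusion * costs["transfusion"]
            + p_complication * costs["complication"])

base = {"testing": 80.0, "transfusion": 400.0, "complication": 3000.0}
slt = expected_cost(p_transfusion=0.50, p_complication=0.05,
                    costs=dict(base, testing=30.0))   # cheaper test, worse outcomes
poc = expected_cost(p_transfusion=0.40, p_complication=0.04, costs=base)
print(round(slt - poc, 2))  # positive difference favours the POC branch
```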

  10. A smoothed stochastic earthquake rate model considering seismicity and fault moment release for Europe

    NASA Astrophysics Data System (ADS)

    Hiemer, S.; Woessner, J.; Basili, R.; Danciu, L.; Giardini, D.; Wiemer, S.

    2014-08-01

    We present a time-independent gridded earthquake rate forecast for the European region including Turkey. The spatial component of our model is based on kernel density estimation techniques, which we applied to both past earthquake locations and fault moment release on mapped crustal faults and subduction zone interfaces with assigned slip rates. Our forecast relies on the assumption that the locations of past seismicity are a good guide to future seismicity, and that future large-magnitude events are more likely to occur in the vicinity of known faults. We show that the optimal weighted sum of the corresponding two spatial densities depends on the magnitude range considered. The kernel bandwidths and density weighting function are optimized using retrospective likelihood-based forecast experiments. We computed earthquake activity rates (a- and b-value) of the truncated Gutenberg-Richter distribution separately for crustal and subduction seismicity based on a maximum likelihood approach that considers the spatial and temporal completeness history of the catalogue. The final annual rate of our forecast is purely driven by the maximum likelihood fit of activity rates to the catalogue data, whereas its spatial component incorporates contributions from both earthquake and fault moment-rate densities. Our model constitutes one branch of the earthquake source model logic tree of the 2013 European seismic hazard model released by the EU-FP7 project `Seismic HAzard haRmonization in Europe' (SHARE) and contributes to the assessment of epistemic uncertainties in earthquake activity rates. We performed retrospective and pseudo-prospective likelihood consistency tests to underline the reliability of our model and SHARE's area source model (ASM) using the testing algorithms applied in the Collaboratory for the Study of Earthquake Predictability (CSEP). We comparatively tested our model's forecasting skill against the ASM and find a statistically significant better performance for testing periods of 10-20 yr. The testing results suggest that our model is a viable candidate model to serve for long-term forecasting on timescales of years to decades for the European region.
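The spatial component described above can be sketched in one dimension: the forecast density is a weighted sum of a seismicity-based kernel density and a fault-based density, with the fault weight increasing for larger magnitudes. The locations, bandwidth, and weights below are invented, and the real model works on a 2-D grid with optimized bandwidths.

```python
import math

def gaussian_kde_1d(points, x, bandwidth):
    """Gaussian kernel density estimate at x from past event locations."""
    norm = 1.0 / (len(points) * bandwidth * math.sqrt(2 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - p) / bandwidth) ** 2) for p in points)

def forecast_density(x, quakes, faults, weight_fault):
    """Weighted combination of earthquake- and fault-based densities."""
    d_eq = gaussian_kde_1d(quakes, x, bandwidth=10.0)
    d_f = gaussian_kde_1d(faults, x, bandwidth=10.0)
    return (1 - weight_fault) * d_eq + weight_fault * d_f

quakes = [0.0, 5.0, 12.0]      # past epicentre coordinates (1-D toy)
faults = [50.0]                # mapped fault location
# near the fault, a larger fault weight raises the forecast density
print(forecast_density(48.0, quakes, faults, weight_fault=0.8) >
      forecast_density(48.0, quakes, faults, weight_fault=0.2))  # True
```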

  11. Developing a Model Component

    NASA Technical Reports Server (NTRS)

    Fields, Christina M.

    2013-01-01

    The Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI) is responsible for providing simulations to support test and verification of SCCS hardware and software. The Universal Coolant Transporter System (UCTS) is a Space Shuttle Orbiter support piece of the Ground Servicing Equipment (GSE). The purpose of the UCTS is to provide two support services to the Space Shuttle Orbiter immediately after landing at the Shuttle Landing Facility. The Simulation uses GSE models to stand in for the actual systems to support testing of SCCS systems during their development. As an intern at KSC, my assignment was to develop a model component for the UCTS. I was given a fluid component (drier) to model in Matlab. The drier was a Catch All replaceable-core type filter-drier. The filter-drier provides maximum protection for the thermostatic expansion valve and solenoid valve from dirt that may be in the system. The filter-drier also protects the valves from freezing up. I researched fluid dynamics to understand the function of my component. I completed training for UNIX and Simulink to help aid in my assignment. The filter-drier was modeled by determining the effects it has on the pressure, velocity, and temperature of the system. I used Bernoulli's equation to calculate the pressure and velocity differential through the drier. I created my model filter-drier in Simulink and wrote the test script to test the component. I completed component testing and captured test data. The finalized model was sent for peer review for any improvements.
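The Bernoulli calculation mentioned above relates pressure and velocity on either side of the filter-drier via p1 + (1/2)·rho·v1^2 = p2 + (1/2)·rho·v2^2, with elevation terms dropped. The coolant density and flow values below are invented placeholders, not data from the internship.

```python
RHO = 1200.0  # assumed coolant density, kg/m^3

def downstream_pressure(p1, v1, v2):
    """Pressure after the drier from upstream pressure and velocities (Pa)."""
    return p1 + 0.5 * RHO * (v1 ** 2 - v2 ** 2)

# velocity rises through the restriction, so the pressure drops
p2 = downstream_pressure(p1=300000.0, v1=1.0, v2=2.0)
print(p2)  # 298200.0
```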

  12. The Kitchen Is Your Laboratory: A Research-Based Term-Paper Assignment in a Science Writing Course

    ERIC Educational Resources Information Center

    Jones, Clinton D.

    2011-01-01

    A term-paper assignment that encompasses the full scientific method has been developed and implemented in an undergraduate science writing and communication course with no laboratory component. Students are required to develop their own hypotheses, design experiments to test their hypotheses, and collect empirical data as independent scientists in…

  13. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

    DeltaSA is an R package and a Java online tool developed at the EC Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in this kind of study: the assignment of a factor to a source in factor analytical models (source identification) and the evaluation of model performance. The source identification is based on the similarity between a given factor and source chemical profiles from public databases. The model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical-profile and time-series similarity. In this study, a sensitivity analysis of the model performance criteria is accomplished using the results of a synthetic dataset where "a priori" references are available. The consensus-modulated standard deviation p_unc gives the best choice for the model performance evaluation when a conservative approach is adopted.
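The benchmarking arithmetic can be sketched as follows: the reference is the ensemble average of the screened participants, and a candidate result is scored by a standardised deviation from that reference. The two-unit acceptance rule and the numbers below are illustrative assumptions, not DeltaSA's exact criteria.

```python
def ensemble_reference(values):
    """Reference value = average of participants that passed screening."""
    return sum(values) / len(values)

def z_indicator(candidate, reference, uncertainty):
    """Standardised deviation of a candidate result from the reference."""
    return (candidate - reference) / uncertainty

passed = [4.1, 3.9, 4.4, 4.0]        # accepted participants' contributions
ref = ensemble_reference(passed)      # 4.1
z = z_indicator(4.9, ref, uncertainty=0.5)
print(round(ref, 2), round(z, 2), abs(z) <= 2)  # 4.1 1.6 True
```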

  14. Constructing cardiovascular fitness knowledge in physical education

    PubMed Central

    Zhang, Tan; Chen, Ang; Chen, Senlin; Hong, Deockki; Loflin, Jerry; Ennis, Catherine

    2015-01-01

    In physical education, it has become necessary for children to learn kinesiological knowledge for understanding the benefits of physical activity and developing a physically active lifestyle. This study was conducted to determine the extent to which cognitive assignments about healthful living and fitness contributed to knowledge growth on cardiorespiratory fitness and health. Fourth grade students (N = 616) from 15 randomly sampled urban elementary schools completed 34 cognitive assignments related to the cardiorespiratory physical activities they were engaged in across 10 lessons. Performance on the assignments was analyzed in relation to their knowledge gain measured using a standardized knowledge test. A multivariate discriminant analysis revealed that the cognitive assignments contributed to knowledge gain but the contribution varied assignment by assignment. A multiple regression analysis indicated that students’ assignment performance by lesson contributed positively to their knowledge growth scores. A content analysis based on the constructivist learning framework showed that observing–reasoning assignments contributed the most to knowledge growth. Analytical and analytical–application assignments contributed less than the constructivist theories would predict. PMID:25995702

  15. Genetic programming for evolving due-date assignment models in job shop environments.

    PubMed

    Nguyen, Su; Zhang, Mengjie; Johnston, Mark; Tan, Kay Chen

    2014-01-01

    Due-date assignment plays an important role in scheduling systems and strongly influences the delivery performance of job shops. Because of the stochastic and dynamic nature of job shops, the development of general due-date assignment models (DDAMs) is complicated. In this study, two genetic programming (GP) methods are proposed to evolve DDAMs for job shop environments. The experimental results show that the evolved DDAMs can make more accurate estimates than other existing dynamic DDAMs with promising reusability. In addition, the evolved operation-based DDAMs show better performance than the evolved DDAMs employing aggregate information of jobs and machines.
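A concrete baseline for the dynamic DDAMs the evolved rules compete with is the classic total-work-content (TWK) rule: the due date is the job's arrival time plus a multiple of its total processing time. The allowance factor and job data below are invented; the point of the paper's GP approach is to evolve such expressions automatically instead of fixing their form.

```python
def twk_due_date(arrival_time, processing_times, allowance=3.0):
    """Due date = arrival + allowance * total work content of the job."""
    return arrival_time + allowance * sum(processing_times)

job_ops = [4.0, 2.5, 1.5]            # processing time of each operation
due = twk_due_date(arrival_time=10.0, processing_times=job_ops)
lateness = 36.0 - due                # actual completion time minus due date
print(due, lateness)  # 34.0 2.0
```

An evolved operation-based DDAM would replace the fixed `allowance * sum(...)` expression with a GP tree over job and machine attributes, which is how the paper obtains more accurate estimates.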

  16. The Effects of a Flipped Classroom Model of Instruction on Students' Performance and Attitudes towards Chemistry

    ERIC Educational Resources Information Center

    Olakanmi, Eunice Eyitayo

    2017-01-01

    This study establishes the effects of a flipped classroom model of instruction on academic performance and attitudes of 66 first-year secondary school students towards chemistry. A pre-test and post-test experimental design was employed to assign students randomly into either the experimental or control group. In order to assess the suitability of…

  17. Developing learning community model with soft skill integration for the building engineering apprenticeship programme in vocational high school

    NASA Astrophysics Data System (ADS)

    Sutrisno, Dardiri, Ahmad; Sugandi, R. Machmud

    2017-09-01

    This study aimed to address the procedure, effectiveness, and problems in the implementation of the learning model for the Building Engineering Apprenticeship Training Programme. This study was carried out through a survey method and an experiment. The data were collected using a questionnaire, tests, and assessment sheets. The collected data were examined through description, t-tests, and covariance analysis. The results of the study showed that (1) the model's procedure covered preparation course, readiness assessment, assignment distribution, handing over students to apprenticeship instructors, task completion, assisting, field assessment, report writing, and follow-up examination, (2) the Learning Community model could significantly improve students' active learning, but not students' hard skills and soft skills, and (3) the problems emerging in the implementation of the model were (a) students' difficulties in finding apprenticeship places and qualified instructors, and asking for relevant tasks, (b) teachers' difficulties in determining relevant tasks and monitoring students, and (c) apprenticeship instructors' difficulties in assigning, monitoring, and assessing students.

  18. System engineering of aerospace and advanced technology programs at an astronautics company (record of study)

    NASA Astrophysics Data System (ADS)

    Kennedy, Mike O.

    An internship with the Martin Marietta Astronautics Group that was performed in partial fulfillment of the requirements for the Doctor of Engineering degree is documented. The internship included assignments with two Martin Marietta companies, on three different programs and in four areas of engineering. A first-hand look is taken at system engineering, SDI and advanced program management, and the way Martin Marietta conducts business. The five internship objectives were related to assignments in system modeling, system integration, engineering analysis and technical management: (1) The effects of thermally and mechanically induced mirror surface distortions upon the wavefront intensity field of a high energy laser beam passing through the optical train of a space-based laser system were modeled. (2) The restrictive as opposed to the broad interpretation of the 1972 ABM Treaty, and the capability of the Strategic Defense Initiative Zenith Star Program to comply with the Treaty were evaluated. (3) The capability of Martin Marietta to develop an automated analysis system to integrate and analyze Superconducting Super Collider detector designs was investigated. (4) The thermal models that were developed in support of the Small Intercontinental Ballistic Missile flight tests were described. (5) The technical management role of the Product Integrity Engineer assigned to the Zenith Star spacecraft's Beam Control and Transfer Subsystem was discussed. The relationships between the engineering, business, security and social concerns associated with the practice of engineering and the management of programs by a major defense contractor are explored.

  19. Principles for designing randomized preventive trials in mental health: an emerging developmental epidemiology paradigm.

    PubMed

    Brown, C H; Liao, J

    1999-10-01

    An emerging population-based paradigm is now being used to guide the design of preventive trials used to test developmental models. We discuss elements of the designs of several ongoing randomized preventive trials involving reduction of risk for children of divorce, for children who exhibit behavioral or learning problems, and for children whose parents are being treated for depression. To test developmental models using this paradigm, we introduce three classes of design issues: design for prerandomization, design for intervention, and design for postintervention. For each of these areas, we present quantitative results from power calculations. Both scientific and cost implications of these power calculations are discussed in terms of variation among subjects on preintervention measures, unit of intervention, assignment, balancing, number of pretest and posttest measures, and the examination of moderation effects.
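The kind of power calculation the design discussion refers to can be sketched with the normal approximation for a two-arm randomized trial: power as a function of a standardized effect size d and the number of subjects per arm. The effect size and sample sizes below are illustrative, not values from the cited trials.

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def power_two_arm(d, n_per_arm):
    """Approximate power of a two-sided, alpha = 0.05 two-sample z-test."""
    z_alpha = 1.959964              # two-sided 5% critical value
    z = d * math.sqrt(n_per_arm / 2) - z_alpha
    return normal_cdf(z)

# doubling the arm size raises power for a modest effect (d = 0.3)
print(round(power_two_arm(0.3, 100), 2), round(power_two_arm(0.3, 200), 2))
```

Calculations of this shape make the cost trade-offs in the text concrete: pre-intervention covariates and balanced assignment shrink the residual variance, which acts like an increase in the effective d.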

  20. Simple Queueing Model Applied to the City of Portland

    NASA Astrophysics Data System (ADS)

    Simon, Patrice M.; Esser, Jörg; Nagel, Kai

    We use a simple traffic micro-simulation model based on queueing dynamics as introduced by Gawron [IJMPC, 9(3):393, 1998] in order to simulate traffic in Portland/Oregon. Links have a flow capacity, that is, they do not release more vehicles per second than is possible according to their capacity. This leads to queue built-up if demand exceeds capacity. Links also have a storage capacity, which means that once a link is full, vehicles that want to enter the link need to wait. This leads to queue spill-back through the network. The model is compatible with route-plan-based approaches such as TRANSIMS, where each vehicle attempts to follow its pre-computed path. Yet, both the data requirements and the computational requirements are considerably lower than for the full TRANSIMS microsimulation. Indeed, the model uses standard emme/2 network data, and runs about eight times faster than real time with more than 100 000 vehicles simultaneously in the simulation on a single Pentium-type CPU. We derive the model's fundamental diagrams and explain them. The simulation is used to simulate traffic on the emme/2 network of the Portland (Oregon) metropolitan region (20 000 links). Demand is generated by a simplified home-to-work destination assignment which generates about half a million trips for the morning peak. Route assignment is done by iterative feedback between micro-simulation and router. An iterative solution of the route assignment for the above problem can be achieved within about half a day of computing time on a desktop workstation. We compare results with field data and with results of traditional assignment runs by the Portland Metropolitan Planning Organization. Thus, with a model such as this one, it is possible to use a dynamic, activities-based approach to transportation simulation (such as in TRANSIMS) with affordable data and hardware. This should enable systematic research about the coupling of demand generation, route assignment, and micro-simulation output.
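The queueing dynamics described above can be sketched with two toy links: each link releases at most `flow_capacity` vehicles per time step, and accepts new vehicles only while its queue is below `storage_capacity`, so congestion spills back upstream. The capacities and demand are invented for illustration.

```python
from collections import deque

class Link:
    def __init__(self, flow_capacity, storage_capacity):
        self.flow_capacity = flow_capacity
        self.storage_capacity = storage_capacity
        self.queue = deque()

    def can_enter(self):
        """A vehicle may enter only while the link is not full."""
        return len(self.queue) < self.storage_capacity

    def step(self, downstream):
        """Move up to flow_capacity vehicles onto the downstream link."""
        moved = 0
        while self.queue and moved < self.flow_capacity and downstream.can_enter():
            downstream.queue.append(self.queue.popleft())
            moved += 1
        return moved

a = Link(flow_capacity=2, storage_capacity=10)
b = Link(flow_capacity=1, storage_capacity=3)
a.queue.extend(range(8))           # 8 vehicles waiting on link a
for _ in range(3):
    a.step(b)
print(len(a.queue), len(b.queue))  # b fills to its storage capacity: 5 3
```

Once link b is full, vehicles stay queued on link a regardless of a's flow capacity, which is exactly the spill-back mechanism the abstract describes.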

  1. A Computer Model for the Transmission Characteristics of Dielectric Radomes

    DTIC Science & Technology

    1992-03-01


  2. A Spiking Neural Network in sEMG Feature Extraction.

    PubMed

    Lobov, Sergey; Mironov, Vasiliy; Kastalskiy, Innokentiy; Kazantsev, Victor

    2015-11-03

    We have developed a novel algorithm for sEMG feature extraction and classification. It is based on a hybrid network composed of spiking and artificial neurons. The spiking neuron layer with mutual inhibition was assigned as the feature extractor. We demonstrate that the classification accuracy of the proposed model can reach high values comparable with existing sEMG interface systems. Moreover, the algorithm's sensitivity to the characteristics of different sEMG collection systems was estimated. Results showed nearly equal accuracy despite a significant difference in sampling rate. The proposed algorithm was successfully tested for mobile robot control.
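The mutual-inhibition idea can be sketched with rate-based toy dynamics standing in for the paper's spiking neurons: feature units accumulate input-driven activity while inhibiting each other, so the unit best matched to the sEMG pattern wins. All numbers below are invented.

```python
def winner_take_all(inputs, inhibition=0.3, steps=50, dt=0.1):
    """Rate units with mutual inhibition; returns the index of the winner."""
    rates = [0.0] * len(inputs)
    for _ in range(steps):
        total = sum(rates)
        new = []
        for i, drive in enumerate(inputs):
            inhib = inhibition * (total - rates[i])   # inhibition from rivals
            r = rates[i] + dt * (drive - inhib - rates[i])
            new.append(max(r, 0.0))                   # rates stay non-negative
        rates = new
    return max(range(len(rates)), key=lambda i: rates[i])

# channel 2 carries the strongest muscle activity
print(winner_take_all([0.2, 0.5, 1.0, 0.4]))  # 2
```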

  3. A physics-based probabilistic forecasting model for rainfall-induced shallow landslides at regional scale

    NASA Astrophysics Data System (ADS)

    Zhang, Shaojie; Zhao, Luqiang; Delgado-Tellez, Ricardo; Bao, Hongjun

    2018-03-01

    Conventional outputs of physics-based landslide forecasting models are presented as deterministic warnings by calculating the safety factor (Fs) of potentially dangerous slopes. However, these models are highly dependent on variables such as cohesion force and internal friction angle which are affected by a high degree of uncertainty especially at a regional scale, resulting in unacceptable uncertainties of Fs. Under such circumstances, the outputs of physical models are more suitable if presented in the form of landslide probability values. In order to develop such models, a method to link the uncertainty of soil parameter values with landslide probability is devised. This paper proposes the use of Monte Carlo methods to quantitatively express uncertainty by assigning random values to physical variables inside a defined interval. The inequality Fs < 1 is tested for each pixel in n simulations which are integrated in a unique parameter. This parameter links the landslide probability to the uncertainties of soil mechanical parameters and is used to create a physics-based probabilistic forecasting model for rainfall-induced shallow landslides. The prediction ability of this model was tested in a case study, in which simulated forecasting of landslide disasters associated with heavy rainfalls on 9 July 2013 in the Wenchuan earthquake region of Sichuan province, China, was performed. The proposed model successfully forecasted landslides in 159 of the 176 disaster points registered by the geo-environmental monitoring station of Sichuan province. Such testing results indicate that the new model can be operated in a highly efficient way and show more reliable results, attributable to its high prediction accuracy. Accordingly, the new model can be potentially packaged into a forecasting system for shallow landslides providing technological support for the mitigation of these disasters at regional scale.
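The Monte Carlo step described above can be sketched with an infinite-slope safety factor: cohesion and friction angle are drawn from intervals, Fs is computed for each draw, and the landslide probability is the fraction of simulations with Fs < 1. The parameter ranges and slope geometry below are invented placeholders, and the dry-slope Fs form is a toy stand-in for the model's physics.

```python
import math
import random

random.seed(7)

def safety_factor(cohesion, phi_deg, slope_deg, gamma=19.0, depth=2.0):
    """Infinite-slope Fs for a dry slope (toy form of the physical model)."""
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    resisting = cohesion + gamma * depth * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * depth * math.sin(beta) * math.cos(beta)
    return resisting / driving

def landslide_probability(n=5000):
    """Fraction of random parameter draws for which the slope fails."""
    failures = 0
    for _ in range(n):
        c = random.uniform(2.0, 10.0)        # cohesion, kPa (assumed interval)
        phi = random.uniform(20.0, 35.0)     # friction angle, deg (assumed)
        if safety_factor(c, phi, slope_deg=40.0) < 1.0:
            failures += 1
    return failures / n

p = landslide_probability()
print(0.0 < p < 1.0)  # True; p is the pixel's failure probability
```

Reporting p instead of a single deterministic Fs is what lets the forecasting model absorb the parameter uncertainty the abstract emphasizes.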

  4. Version Control in Project-Based Learning

    ERIC Educational Resources Information Center

    Milentijevic, Ivan; Ciric, Vladimir; Vojinovic, Oliver

    2008-01-01

    This paper deals with the development of a generalized model for version control systems application as a support in a range of project-based learning methods. The model is given as UML sequence diagram and described in detail. The proposed model encompasses a wide range of different project-based learning approaches by assigning a supervisory…

  5. Ride quality evaluation. IV - Models of subjective reaction to aircraft motion

    NASA Technical Reports Server (NTRS)

    Jacobson, I. D.; Richards, L. G.

    1978-01-01

    The paper examines models of human reaction to the motions typically experienced on short-haul aircraft flights. Data are taken on the regularly scheduled flights of four commercial airlines - three airplanes and one helicopter. The data base consists of: (1) a series of motion recordings distributed over each flight, each including all six degrees of freedom of motion; temperature, pressure, and noise are also recorded; (2) ratings of perceived comfort and satisfaction from the passengers on each flight; (3) moment-by-moment comfort ratings from a test subject assigned to each airplane; and (4) overall comfort ratings for each flight from the test subjects. Regression models are obtained for prediction of rated comfort from rms values for six degrees of freedom of motion. It is shown that the model C = 2.1 + 17.1 T + 17.2 V (T = transverse acceleration, V = vertical acceleration) gives a good fit to the airplane data but is less acceptable for the helicopter data.
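The fitted airplane model quoted above, C = 2.1 + 17.1 T + 17.2 V, can be evaluated directly; the sample rms accelerations below are invented, and their units (assumed g) follow the usual convention rather than anything stated in the abstract.

```python
def comfort_rating(transverse_rms_g, vertical_rms_g):
    """Predicted passenger comfort from rms accelerations (airplane fit)."""
    return 2.1 + 17.1 * transverse_rms_g + 17.2 * vertical_rms_g

print(round(comfort_rating(0.02, 0.05), 3))  # 3.302
```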

  6. Subject specific finite element modeling of periprosthetic femoral fracture using element deactivation to simulate bone failure.

    PubMed

    Miles, Brad; Kolos, Elizabeth; Walter, William L; Appleyard, Richard; Shi, Angela; Li, Qing; Ruys, Andrew J

    2015-06-01

    Subject-specific finite element (FE) modeling methodology could predict peri-prosthetic femoral fracture (PFF) for cementless hip arthroplasty in the early postoperative period. This study develops methodology for subject-specific finite element modeling by using the element deactivation technique to simulate bone failure and validate with experimental testing, thereby predicting peri-prosthetic femoral fracture in the early postoperative period. Material assignments for biphasic and triphasic models were undertaken. Failure modeling with the element deactivation feature available in ABAQUS 6.9 was used to simulate crack initiation and propagation in the bony tissue based upon a threshold of fracture strain. The crack mode for the biphasic models was very similar to the experimental testing crack mode, with a similar shape and path of the crack. The fracture load is sensitive to the friction coefficient at the implant-bone interface. The development of a novel technique to simulate bone failure by element deactivation of subject-specific finite element models could aid prediction of fracture load in addition to fracture risk characterization for PFF. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.

  7. Diagnostic, Predictive and Compositional Modeling with Data Mining in Integrated Learning Environments

    ERIC Educational Resources Information Center

    Lee, Chien-Sing

    2007-01-01

    Models represent a set of generic patterns to test hypotheses. This paper presents the CogMoLab student model in the context of an integrated learning environment. Three aspects are discussed: diagnostic and predictive modeling with respect to the issues of credit assignment and scalability and compositional modeling of the student profile in the…

  8. Using probability modelling and genetic parentage assignment to test the role of local mate availability in mating system variation.

    PubMed

    Blyton, Michaela D J; Banks, Sam C; Peakall, Rod; Lindenmayer, David B

    2012-02-01

    The formal testing of mating system theories with empirical data is important for evaluating the relative importance of different processes in shaping mating systems in wild populations. Here, we present a generally applicable probability modelling framework to test the role of local mate availability in determining a population's level of genetic monogamy. We provide a significance test for detecting departures in observed mating patterns from model expectations based on mate availability alone, allowing the presence and direction of behavioural effects to be inferred. The assessment of mate availability can be flexible and in this study it was based on population density, sex ratio and spatial arrangement. This approach provides a useful tool for (1) isolating the effect of mate availability in variable mating systems and (2) in combination with genetic parentage analyses, gaining insights into the nature of mating behaviours in elusive species. To illustrate this modelling approach, we have applied it to investigate the variable mating system of the mountain brushtail possum (Trichosurus cunninghami) and compared the model expectations with the outcomes of genetic parentage analysis over an 18-year study. The observed level of monogamy was higher than predicted under the model. Thus, behavioural traits, such as mate guarding or selective mate choice, may increase the population level of monogamy. We show that combining genetic parentage data with probability modelling can facilitate an improved understanding of the complex interactions between behavioural adaptations and demographic dynamics in driving mating system variation. © 2011 Blackwell Publishing Ltd.

  9. Assigning king eiders to wintering regions in the Bering Sea using stable isotopes of feathers and claws

    USGS Publications Warehouse

    Oppel, S.; Powell, A.N.

    2008-01-01

    Identification of wintering regions for birds sampled during the breeding season is crucial to understanding how events outside the breeding season may affect populations. We assigned king eiders captured on breeding grounds in northern Alaska to 3 broad geographic wintering regions in the Bering Sea using stable carbon and nitrogen isotopes obtained from head feathers. Using a discriminant function analysis of feathers obtained from birds tracked with satellite transmitters, we estimated that 88% of feathers were assigned to the region in which they were grown. We then assigned 84 birds of unknown origin to wintering regions based on their head feather isotope ratios, and tested the utility of claws for geographic assignment. Based on the feather results, we estimated that similar proportions of birds in our study area use each of the 3 wintering regions in the Bering Sea. These results are in close agreement with estimates from satellite telemetry and show the usefulness of stable isotope signatures of feathers in assigning marine birds to geographic regions. The use of claws is currently limited by incomplete understanding of claw growth rates. Data presented here will allow managers of eiders, other marine birds, and marine mammals to assign animals to regions in the Bering Sea based on stable isotope signatures of body tissues. © Inter-Research 2008.

  10. Determinants of Default in P2P Lending

    PubMed Central

    2015-01-01

    This paper studies P2P lending and the factors explaining loan default. This is an important issue because in P2P lending individual investors bear the credit risk, instead of financial institutions, which are experts in dealing with this risk. P2P lenders suffer a severe problem of information asymmetry, because they are at a disadvantage facing the borrower. For this reason, P2P lending sites provide potential lenders with information about borrowers and their loan purpose. They also assign a grade to each loan. The empirical study is based on loans’ data collected from Lending Club (N = 24,449) from 2008 to 2014 that are first analyzed by using univariate means tests and survival analysis. Factors explaining default are loan purpose, annual income, current housing situation, credit history and indebtedness. Secondly, a logistic regression model is developed to predict defaults. The grade assigned by the P2P lending site is the most predictive factor of default, but the accuracy of the model is improved by adding other information, especially the borrower’s debt level. PMID:26425854

  11. Determinants of Default in P2P Lending.

    PubMed

    Serrano-Cinca, Carlos; Gutiérrez-Nieto, Begoña; López-Palacios, Luz

    2015-01-01

    This paper studies P2P lending and the factors explaining loan default. This is an important issue because in P2P lending individual investors bear the credit risk, instead of financial institutions, which are experts in dealing with this risk. P2P lenders suffer a severe problem of information asymmetry, because they are at a disadvantage facing the borrower. For this reason, P2P lending sites provide potential lenders with information about borrowers and their loan purpose. They also assign a grade to each loan. The empirical study is based on loans' data collected from Lending Club (N = 24,449) from 2008 to 2014 that are first analyzed by using univariate means tests and survival analysis. Factors explaining default are loan purpose, annual income, current housing situation, credit history and indebtedness. Secondly, a logistic regression model is developed to predict defaults. The grade assigned by the P2P lending site is the most predictive factor of default, but the accuracy of the model is improved by adding other information, especially the borrower's debt level.
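
    A logistic model of the kind described can be sketched as below. The intercept and coefficients are hypothetical placeholders, not the fitted values from the Lending Club study; only the functional form (loan grade plus borrower debt level improving the prediction) follows the abstract.

```python
import math

# Hypothetical coefficients for illustration only (not the paper's fit).
INTERCEPT = -3.0
COEFS = {"grade": 0.55, "debt_to_income": 0.08}  # riskier grade / more debt -> higher risk

def default_probability(grade, debt_to_income):
    """Predicted probability of loan default under the assumed logistic model."""
    z = INTERCEPT + COEFS["grade"] * grade + COEFS["debt_to_income"] * debt_to_income
    return 1.0 / (1.0 + math.exp(-z))
```

    Holding the grade fixed, a more indebted borrower receives a strictly higher predicted default probability, mirroring the paper's finding that debt level adds predictive power beyond the grade.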

  12. Computer-based coding of free-text job descriptions to efficiently identify occupations in epidemiological studies

    PubMed Central

    Russ, Daniel E.; Ho, Kwan-Yuet; Colt, Joanne S.; Armenti, Karla R.; Baris, Dalsu; Chow, Wong-Ho; Davis, Faith; Johnson, Alison; Purdue, Mark P.; Karagas, Margaret R.; Schwartz, Kendra; Schwenn, Molly; Silverman, Debra T.; Johnson, Calvin A.; Friesen, Melissa C.

    2016-01-01

    Background Mapping job titles to standardized occupation classification (SOC) codes is an important step in identifying occupational risk factors in epidemiologic studies. Because manual coding is time-consuming and has moderate reliability, we developed an algorithm called SOCcer (Standardized Occupation Coding for Computer-assisted Epidemiologic Research) to assign SOC-2010 codes based on free-text job description components. Methods Job title and task-based classifiers were developed by comparing job descriptions to multiple sources linking job and task descriptions to SOC codes. An industry-based classifier was developed based on the SOC prevalence within an industry. These classifiers were used in a logistic model trained using 14,983 jobs with expert-assigned SOC codes to obtain empirical weights for an algorithm that scored each SOC/job description. We assigned the highest scoring SOC code to each job. SOCcer was validated in two occupational data sources by comparing SOC codes obtained from SOCcer to expert assigned SOC codes and lead exposure estimates obtained by linking SOC codes to a job-exposure matrix. Results For 11,991 case-control study jobs, SOCcer-assigned codes agreed with 44.5% and 76.3% of manually assigned codes at the 6- and 2-digit level, respectively. Agreement increased with the score, providing a mechanism to identify assignments needing review. Good agreement was observed between lead estimates based on SOCcer and manual SOC assignments (kappa: 0.6–0.8). Poorer performance was observed for inspection job descriptions, which included abbreviations and worksite-specific terminology. Conclusions Although some manual coding will remain necessary, using SOCcer may improve the efficiency of incorporating occupation into large-scale epidemiologic studies. PMID:27102331
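
    The agreement statistic quoted above (kappa 0.6-0.8 between SOCcer-based and manually based lead estimates) is Cohen's kappa; a minimal self-contained version for two raters, written here as a generic illustration rather than the study's exact computation:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters.

    Assumes the two label lists are aligned item-by-item and that the
    raters are not in perfect chance agreement (expected < 1).
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    observed = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    expected = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels)
    return (observed - expected) / (1 - expected)
```

    Kappa is 1 for perfect agreement and 0 when agreement is no better than chance given each rater's label frequencies.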

  13. Global Optimization of Emergency Evacuation Assignments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, Lee; Yuan, Fang; Chin, Shih-Miao

    2006-01-01

    Conventional emergency evacuation plans often assign evacuees to fixed routes or destinations based mainly on geographic proximity. Such approaches can be inefficient if the roads are congested, blocked, or otherwise dangerous because of the emergency. By not constraining evacuees to prespecified destinations, a one-destination evacuation approach provides flexibility in the optimization process. We present a framework for the simultaneous optimization of evacuation-traffic distribution and assignment. Based on the one-destination evacuation concept, we can obtain the optimal destination and route assignment by solving a one-destination traffic-assignment problem on a modified network representation. In a county-wide, large-scale evacuation case study, the one-destination model yields substantial improvement over the conventional approach, with the overall evacuation time reduced by more than 60 percent. More importantly, emergency planners can easily implement this framework by instructing evacuees to go to destinations that the one-destination optimization process selects.
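
    The network modification behind the one-destination concept can be illustrated by linking every admissible shelter to a virtual "super-sink" with zero-cost arcs, so a single shortest-path run picks both route and destination. This is a simplified sketch under assumed data structures; the actual model solves a full traffic-assignment problem with congestion, not just shortest paths.

```python
import heapq

def one_destination_route(graph, origin, safe_nodes):
    """Best evacuation route when the destination is left open: connect all
    safe destinations to a virtual super-sink with zero-cost links, then run
    Dijkstra once from the origin."""
    SINK = "_SINK_"
    g = {node: dict(edges) for node, edges in graph.items()}  # copy, leave input intact
    g[SINK] = {}
    for s in safe_nodes:
        g.setdefault(s, {})[SINK] = 0.0
    dist, prev = {origin: 0.0}, {}
    pq = [(0.0, origin)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == SINK:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in g.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [], prev[SINK]  # prev[SINK] is the chosen real destination
    while True:
        path.append(node)
        if node == origin:
            break
        node = prev[node]
    return list(reversed(path)), dist[SINK]
```

    Because the sink absorbs all shelters, the algorithm automatically routes around a congested or blocked shelter toward whichever destination is cheapest to reach.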

  14. Autonomous Guidance Strategy for Spacecraft Formations and Reconfiguration Maneuvers

    NASA Astrophysics Data System (ADS)

    Wahl, Theodore P.

    A guidance strategy for autonomous spacecraft formation reconfiguration maneuvers is presented. The guidance strategy is presented as an algorithm that solves the linked assignment and delivery problems. The assignment problem is the task of assigning the member spacecraft of the formation to their new positions in the desired formation geometry. The guidance algorithm uses an auction process (also called an "auction algorithm"), presented in the dissertation, to solve the assignment problem. The auction uses the estimated maneuver and time of flight costs between the spacecraft and targets to create assignments which minimize a specific "expense" function for the formation. The delivery problem is the task of delivering the spacecraft to their assigned positions, and it is addressed through one of two guidance schemes described in this work. The first is a delivery scheme based on artificial potential function (APF) guidance. APF guidance uses the relative distances between the spacecraft, targets, and any obstacles to design maneuvers based on gradients of potential fields. The second delivery scheme is based on model predictive control (MPC); this method uses a model of the system dynamics to plan a series of maneuvers designed to minimize a unique cost function. The guidance algorithm uses an analytic linearized approximation of the relative orbital dynamics, the Yamanaka-Ankersen state transition matrix, in the auction process and in both delivery methods. The proposed guidance strategy is successful, in simulations, in autonomously assigning the members of the formation to new positions and in delivering the spacecraft to these new positions safely using both delivery methods. This guidance algorithm can serve as the basis for future autonomous guidance strategies for spacecraft formation missions.
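
    The auction step can be illustrated with a Bertsekas-style epsilon auction on a small cost matrix. This is a generic sketch, not the dissertation's expense function or orbital dynamics; with integer costs and eps < 1/n the result is an exactly optimal assignment.

```python
def auction_assign(cost, eps=0.01):
    """Assign n agents to n targets minimizing total cost via an epsilon auction.

    Each unassigned agent bids for its best target; the bid raises that
    target's price by the margin over the second-best option plus eps,
    possibly evicting the previous holder.
    """
    n = len(cost)
    value = [[-cost[i][j] for j in range(n)] for i in range(n)]  # maximize -cost
    prices = [0.0] * n
    owner = [None] * n       # owner[j] = agent currently holding target j
    assigned = [None] * n    # assigned[i] = target held by agent i
    unassigned = list(range(n))
    while unassigned:
        i = unassigned.pop()
        net = [value[i][j] - prices[j] for j in range(n)]
        j_best = max(range(n), key=lambda j: net[j])
        w1 = net[j_best]
        w2 = max((net[j] for j in range(n) if j != j_best), default=w1)
        prices[j_best] += w1 - w2 + eps   # bid increment keeps prices rising
        if owner[j_best] is not None:     # evict previous holder, if any
            assigned[owner[j_best]] = None
            unassigned.append(owner[j_best])
        owner[j_best] = i
        assigned[i] = j_best
    return assigned
```

    In the spacecraft setting the matrix entries would be the estimated maneuver and time-of-flight costs between each spacecraft and each target slot.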

  15. Summer School Effects in a Randomized Field Trial

    ERIC Educational Resources Information Center

    Zvoch, Keith; Stevens, Joseph J.

    2013-01-01

    This field-based randomized trial examined the effect of assignment to and participation in summer school for two moderately at-risk samples of struggling readers. Application of multiple regression models to difference scores capturing the change in summer reading fluency revealed that kindergarten students randomly assigned to summer school…

  16. The Ohio Patient Navigation Research Program: does the American Cancer Society patient navigation model improve time to resolution in patients with abnormal screening tests?

    PubMed

    Paskett, Electra D; Katz, Mira L; Post, Douglas M; Pennell, Michael L; Young, Gregory S; Seiber, Eric E; Harrop, J Phil; DeGraffinreid, Cecilia R; Tatum, Cathy M; Dean, Julie A; Murray, David M

    2012-10-01

    Patient navigation (PN) has been suggested as a way to reduce cancer health disparities; however, many models of PN exist and most have not been carefully evaluated. The goal of this study was to test the Ohio American Cancer Society model of PN as it relates to reducing time to diagnostic resolution among persons with abnormal breast, cervical, or colorectal cancer screening tests or symptoms. A total of 862 patients from 18 clinics participated in this group-randomized trial. Chart review documented the date of the abnormality and the date of resolution. The primary analysis used shared frailty models to test for the effect of PN on time to resolution. Crude HRs were reported, as there was no evidence of confounding. HRs became significant at 6 months; conditional on the random clinic effect, the resolution rate at 15 months was 65% higher in the PN arm (P = 0.012 for difference in resolution rate across arms; P = 0.009 for an increase in the HR over time). Participants with abnormal cancer screening tests or symptoms resolved faster if assigned to PN compared with those not assigned to PN. The effect of PN became apparent beginning six months after detection of the abnormality. PN may help address health disparities by reducing time to resolution after an abnormal cancer screening test. ©2012 AACR.

  17. Isolated development of inner (wall) caries like lesions in a bacterial-based in vitro model.

    PubMed

    Diercke, K; Lussi, A; Kersten, T; Seemann, R

    2009-12-01

    The study conducted in a bacterial-based in vitro caries model aimed to determine whether typical inner secondary caries lesions can be detected at cavity walls of restorations with selected gap widths when the development of outer lesions is inhibited. Sixty bovine tooth specimens were randomly assigned to the following groups: test group 50 (TG50; gap, 50 microm), test group 100 (TG100; gap, 100 microm), test group 250 (TG250; gap, 250 microm) and a control group (CG; gap, 250 microm). The outer tooth surface of the test group specimens was covered with an acid-resistant varnish to inhibit the development of an outer caries lesion. After incubation in the caries model, the area of demineralization at the cavity wall was determined by confocal laser scanning microscopy. All test group specimens demonstrated only wall lesions. The CG specimens developed outer and wall lesions. The TG250 specimens showed significantly less wall lesion area compared to the CG (p < 0.05). In the test groups, a statistically significant increase (p < 0.05) in lesion area could be detected in enamel between TG50 and TG250 and in dentine between TG50 and TG100. In conclusion, the inner wall lesions of secondary caries can develop without the presence of outer lesions and therefore can be regarded as an entity on their own. The extent of independently developed wall lesions increased with gap width in the present setting.

  18. What is a species? A new universal method to measure differentiation and assess the taxonomic rank of allopatric populations, using continuous variables

    PubMed Central

    Donegan, Thomas M.

    2018-01-01

    Existing models for assigning species, subspecies, or no taxonomic rank to populations which are geographically separated from one another were analyzed. This was done by subjecting over 3,000 pairwise comparisons of vocal or biometric data based on birds to a variety of statistical tests that have been proposed as measures of differentiation. One current model which aims to test diagnosability (Isler et al. 1998) is highly conservative, applying a hard cut-off, which excludes from consideration differentiation below diagnosis. It also includes non-overlap as a requirement, a measure which penalizes increases in sample size. The “species scoring” model of Tobias et al. (2010) involves less drastic cut-offs, but unlike Isler et al. (1998), does not control adequately for sample size and attributes scores in many cases to differentiation which is not statistically significant. Four different models of assessing effect sizes were analyzed: using both pooled and unpooled standard deviations and controlling for sample size using t-distributions or omitting to do so. Pooled standard deviations produced more conservative effect sizes when uncontrolled for sample size but less conservative effect sizes when so controlled. Pooled models require assumptions to be made that are typically elusive or unsupported for taxonomic studies. Modifications for improving these frameworks are proposed, including: (i) introducing statistical significance as a gateway to attributing any weighting to findings of differentiation; (ii) abandoning non-overlap as a test; (iii) recalibrating Tobias et al. (2010) scores based on effect sizes controlled for sample size using t-distributions. A new universal method is proposed for measuring differentiation in taxonomy using continuous variables and a formula is proposed for ranking allopatric populations.
This is based first on calculating effect sizes using unpooled standard deviations, controlled for sample size using t-distributions, for a series of different variables. All non-significant results are excluded by scoring them as zero. Distance between any two populations is calculated using Euclidian summation of non-zeroed effect size scores. If the score of an allopatric pair exceeds that of a related sympatric pair, then the allopatric population can be ranked as species and, if not, then at most subspecies rank should be assigned. A spreadsheet has been programmed and is being made available which allows this and other tests of differentiation and rank studied in this paper to be rapidly analyzed. PMID:29780266
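
    The distance formula described above (zero out non-significant variables, then take the Euclidean sum of per-variable effect sizes) can be sketched from summary statistics. Note one simplification: the significance gateway below uses a fixed normal-approximation critical value (1.96) instead of the exact t-distribution quantiles the paper calls for, so treat it as an illustration of the structure, not the published method.

```python
import math

def taxonomic_distance(pairs, t_crit=1.96):
    """Euclidean distance over per-variable effect sizes between two populations.

    Each element of `pairs` is ((mean, sd, n), (mean, sd, n)) for one variable.
    Variables whose Welch t statistic falls below t_crit are scored zero
    before the Euclidean summation, implementing the significance gateway.
    """
    total = 0.0
    for (m1, s1, n1), (m2, s2, n2) in pairs:
        se = math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)
        t = (m1 - m2) / se
        if abs(t) < t_crit:
            continue  # non-significant: contributes zero
        d = (m1 - m2) / math.sqrt((s1 ** 2 + s2 ** 2) / 2)  # unpooled effect size
        total += d * d
    return math.sqrt(total)
```

    Under the proposed ranking rule, an allopatric pair whose distance exceeds that of a related sympatric pair would be ranked as species.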

  19. Modeling aesthetics to support an ecosystem services approach for natural resource management decision making.

    PubMed

    Booth, Pieter N; Law, Sheryl A; Ma, Jane; Buonagurio, John; Boyd, James; Turnley, Jessica

    2017-09-01

    This paper reviews literature on aesthetics and describes the development of vista and landscape aesthetics models. Spatially explicit variables were chosen to represent physical characteristics of natural landscapes that are important to aesthetic preferences. A vista aesthetics model evaluates the aesthetics of natural landscapes viewed from distances of more than 1000 m, and a landscape aesthetics model evaluates the aesthetic value of wetlands and forests within 1000 m from the viewer. Each of the model variables is quantified using spatially explicit metrics on a pixel-specific basis within EcoAIM™, a geographic information system (GIS)-based ecosystem services (ES) decision analysis support tool. Pixel values are "binned" into ranked categories, and weights are assigned to select variables to represent stakeholder preferences. The final aesthetic score is the weighted sum of all variables and is assigned ranked values from 1 to 10. Ranked aesthetic values are displayed on maps by patch type and integrated within EcoAIM. The response of the aesthetic scoring in the models was tested by comparing current conditions in a discrete area of the facility with a Development scenario in the same area. The Development scenario consisted of two 6-story buildings and a trail replacing natural areas. The results of the vista aesthetic model indicate that the viewshed area variable had the greatest effect on the aesthetics overall score. Results from the landscape aesthetics model indicate a 10% increase in overall aesthetics value, attributed to the increase in landscape diversity. The models are sensitive to the weights assigned to certain variables by the user, and these weights should be set to reflect regional landscape characteristics as well as stakeholder preferences. 
This demonstration project shows that natural landscape aesthetics can be evaluated as part of a nonmonetary assessment of ES, and a scenario-building exercise provides end users with a tradeoff analysis in support of natural resource management decisions. Integr Environ Assess Manag 2017;13:926-938. © 2017 SETAC.

  20. Aircraft flight test trajectory control

    NASA Technical Reports Server (NTRS)

    Menon, P. K. A.; Walker, R. A.

    1988-01-01

    Two design techniques for linear flight test trajectory controllers (FTTCs) are described: Eigenstructure assignment and the minimum error excitation technique. The two techniques are used to design FTTCs for an F-15 aircraft model for eight different maneuvers at thirty different flight conditions. An evaluation of the FTTCs is presented.

  1. Secrets in the eyes of Black Oystercatchers: A new sexing technique

    USGS Publications Warehouse

    Guzzetti, B.M.; Talbot, S.L.; Tessler, D.F.; Gill, V.A.; Murphy, E.C.

    2008-01-01

    Sexing oystercatchers in the field is difficult because males and females have identical plumage and are similar in size. Although Black Oystercatchers (Haematopus bachmani) are sexually dimorphic, using morphology to determine sex requires either capturing both pair members for comparison or using discriminant analyses to assign sex probabilistically based on morphometric traits. All adult Black Oystercatchers have bright yellow eyes, but some of them have dark specks, or eye flecks, in their irides. We hypothesized that this easily observable trait was sex-linked and could be used as a novel diagnostic tool for identifying sex. To test this, we compared data for oystercatchers from genetic molecular markers (CHD-W/CHD-Z and HINT-W/HINT-Z), morphometric analyses, and eye-fleck category (full eye flecks, slight eye flecks, and no eye flecks). Compared to molecular markers, we found that discriminant analyses based on morphological characteristics yielded variable results that were confounded by geographical differences in morphology. However, we found that eye flecks were sex-linked. Using an eye-fleck model where all females have full eye flecks and males have either slight eye flecks or no eye flecks, we correctly assigned the sex of 117 of 125 (94%) oystercatchers. Using discriminant analysis based on morphological characteristics, we correctly assigned the sex of 105 of 119 (88%) birds. Using the eye-fleck technique for sexing Black Oystercatchers may be preferable for some investigators because it is as accurate as discriminant analysis based on morphology and does not require capturing the birds. ©2008 Association of Field Ornithologists.

  2. Text Summarization Model based on Facility Location Problem

    NASA Astrophysics Data System (ADS)

    Takamura, Hiroya; Okumura, Manabu

    We propose a novel multi-document generic summarization model based on the budgeted median problem, which is a facility location problem. The summarization method based on our model is an extractive method, which selects sentences from the given document cluster and generates a summary. Each sentence in the document cluster is assigned to one of the selected sentences, where the former sentence is represented by the latter. Our method selects sentences to generate a summary that yields a good sentence assignment and hence covers the whole content of the document cluster. An advantage of this method is that it can incorporate asymmetric relations between sentences such as textual entailment. Through experiments, we showed that the proposed method yields good summaries on the dataset of DUC'04.
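
    Facility-location objectives of this kind are typically approximated greedily; the sketch below selects sentences by coverage gain per unit cost under a length budget, where coverage is each sentence's similarity to its nearest selected sentence. The similarity matrix and costs are assumed inputs, and this greedy rule is a generic heuristic, not necessarily the authors' solver.

```python
def greedy_budgeted_median(sim, costs, budget):
    """Greedy sentence selection for a budgeted-median-style objective.

    sim[i][j] is the (possibly asymmetric) similarity of sentence i to
    candidate representative j; costs[j] is sentence j's length cost.
    """
    n = len(sim)

    def coverage(selected):
        # Each sentence is represented by its best selected sentence.
        if not selected:
            return 0.0
        return sum(max(sim[i][j] for j in selected) for i in range(n))

    selected, spent = [], 0
    while True:
        best, best_gain = None, 0.0
        for j in range(n):
            if j in selected or spent + costs[j] > budget:
                continue
            gain = (coverage(selected + [j]) - coverage(selected)) / costs[j]
            if gain > best_gain:
                best, best_gain = j, gain
        if best is None:
            break
        selected.append(best)
        spent += costs[best]
    return selected
```

    Because `sim` need not be symmetric, asymmetric relations such as textual entailment can be encoded directly in the matrix.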

  3. Evacuee Compliance Behavior Analysis using High Resolution Demographic Information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Wei; Han, Lee; Liu, Cheng

    2014-01-01

    The purpose of this study is to examine whether evacuee compliance behavior with route assignments from different resolutions of demographic data would impact the evacuation performance. Most existing evacuation strategies assume that travelers will follow evacuation instructions, while in reality a certain percent of evacuees do not comply with prescribed instructions. In this paper, a comparison study of evacuation assignment based on Traffic Analysis Zones (TAZ) and high resolution LandScan USA Population Cells (LPC) was conducted for the detailed road network representing Alexandria, Virginia. A revised platform for evacuation modeling built on high resolution demographic data and activity-based microscopic traffic simulation is proposed. The results indicate that evacuee compliance behavior affects evacuation efficiency with traditional TAZ assignment, but it does not significantly compromise the efficiency with high resolution LPC assignment. The TAZ assignment also underestimates the real travel time during evacuation, especially for high compliance simulations. This suggests that conventional evacuation studies based on TAZ assignment might not be effective at providing efficient guidance to evacuees. From the high resolution data perspective, traveler compliance behavior is an important factor but it does not impact the system performance significantly. Evacuee compliance behavior analysis should therefore emphasize individual-level route and shelter assignments rather than whole-system performance.

  4. 3D micro-crack propagation simulation at enamel/adhesive interface using FE submodeling and element death techniques.

    PubMed

    Liu, Heng-Liang; Lin, Chun-Li; Sun, Ming-Tsung; Chang, Yen-Hsiang

    2010-06-01

    This study investigates micro-crack propagation at the enamel/adhesive interface using finite element (FE) submodeling and element death techniques. A three-dimensional (3D) FE macro-model of the enamel/adhesive/ceramic subjected to shear bond testing was generated and analyzed. A 3D micro-model with interfacial bonding structure was constructed at the upper enamel/adhesive interface where the stress concentration was found from the macro-model results. The morphology of this interfacial bonding structure (i.e., resin tag) was assigned based on resin tag geometry and enamel rod arrangement from a scanning electron microscopy micrograph. The boundary conditions for the micro-model were determined from the macro-model results. A custom iterative code combined with the element death technique was used to calculate the micro-crack propagation. Parallel experiments were performed to validate this FE simulation. The stress concentration within the adhesive occurred mainly at the upper corner near the enamel/adhesive interface and the resin tag base. A simulated fracture path was found at the resin tag base along the enamel/adhesive interface. A morphological observation of the fracture patterns obtained from in vitro testing corresponded with the simulation results. This study shows that the FE submodeling and element death techniques could be used to simulate the 3D micro-stress pattern and the crack propagation noted at the enamel/adhesive interface.

  5. PROMOTING SUPPORTIVE PARENTING IN NEW MOTHERS WITH SUBSTANCE-USE PROBLEMS: A PILOT RANDOMIZED TRIAL OF RESIDENTIAL TREATMENT PLUS AN ATTACHMENT-BASED PARENTING PROGRAM

    PubMed Central

    BERLIN, LISA J.; SHANAHAN, MEGHAN; CARMODY, KAREN APPLEYARD

    2015-01-01

    This pilot randomized trial tested the feasibility and efficacy of supplementing residential substance-abuse treatment for new mothers with a brief, yet rigorous, attachment-based parenting program. Twenty-one predominantly (86%) White mothers and their infants living together in residential substance-abuse treatment were randomly assigned to the program (n = 11) or control (n = 10) group. Program mothers received 10 home-based sessions of Dozier’s Attachment and Biobehavioral Catch-up (ABC) intervention. Postintervention observations revealed more supportive parenting behaviors among the randomly assigned ABC mothers. PMID:25424409

  6. Pod nursing on a medical/surgical unit: implementation and outcomes evaluation.

    PubMed

    Friese, Christopher R; Grunawalt, Julie C; Bhullar, Sara; Bihlmeyer, Karen; Chang, Robert; Wood, Winnie

    2014-04-01

    A medical/surgical unit at the University of Michigan Health System implemented a pod nursing model of care to improve efficiency and patient and staff satisfaction. One centralized station was replaced with 4 satellites and supplies were relocated next to patient rooms. Patients were assigned to 2 nurses who worked as partners. Three patient (satisfaction, call lights, and falls) and nurse (satisfaction and overtime) outcomes improved after implementation. Efforts should be focused on addressing patient acuity imbalances across assignments and strengthening communication among the healthcare team. Studies are needed to test the model in larger and more diverse settings.

  7. Earth surface modeling for education: How effective is it? Latest classroom tests with Web-based Interactive Landform Simulation Model - Grand Canyon (WILSIM-GC)

    NASA Astrophysics Data System (ADS)

    Luo, W.; Pelletier, J. D.; Smith, T.; Whalley, K.; Shelhamer, A.; Darling, A.; Ormand, C. J.; Duffin, K.; Hung, W. C.; Iverson, E. A. R.; Shernoff, D.; Zhai, X.; Chiang, J. L.; Lotter, N.

    2016-12-01

    The Web-based Interactive Landform Simulation Model - Grand Canyon (WILSIM-GC, http://serc.carleton.edu/landform/) is a simplified version of a physically-based model that simulates bedrock channel erosion, cliff retreat, and base level change. Students can observe the landform evolution in animation under different scenarios by changing parameter values. In addition, cross-sections and profiles at different time intervals can be displayed and saved for further quantitative analysis. Students were randomly assigned to a treatment group (using WILSIM-GC simulation) or a control group (using traditional paper-based material). Pre- and post-tests were administered to measure students' understanding of the concepts and processes related to Grand Canyon formation and evolution. Results from the ANOVA showed that for both groups there was statistically significant growth in scores from pre-test to post-test [F(1, 47) = 25.82, p < .001], but the growth in scores between the two groups was not statistically significant [F(1, 47) = 0.08, p =.774]. In semester 1, the WILSIM-GC group showed greater growth, while in semester 2, the paper-based group showed greater growth. Additionally, a significant time × group × gender × semester interaction effect was observed [F(1, 47) = 4.76, p =.034]. Here, in semester 1 female students were more strongly advantaged by the WILSIM-GC intervention than male students, while in semester 2, female students were less strongly advantaged than male students. The new results are consistent with our initial findings (Luo et al., 2016) and others reported in the literature, i.e., the simulation approach is at least as effective as the traditional paper-based method in teaching students about landform evolution. Survey data indicate that students favor the simulation approach. Further study is needed to investigate the reasons for the difference by gender.

  8. The effect of reading assignments in guided inquiry learning on students’ critical thinking skills

    NASA Astrophysics Data System (ADS)

    Syarkowi, A.

    2018-05-01

    The purpose of this study was to determine the effect of reading assignments in guided inquiry learning on senior high school students’ critical thinking skills. The method used was a quasi-experimental design, with a reading task as the treatment. The topic of the inquiry process was Kirchhoff’s laws. The instrument used was 25 multiple-choice interpretive exercises with justification, divided into three categories: basic clarification, the bases for a decision, and inference skills. The significance test showed that the improvement in critical thinking skills in the experimental class was significantly higher than in the control class, so it can be concluded that reading assignments can improve students’ critical thinking skills.

  9. Radiometry simulation within the end-to-end simulation tool SENSOR

    NASA Astrophysics Data System (ADS)

    Wiest, Lorenz; Boerner, Anko

    2001-02-01

    An end-to-end simulation is a valuable tool for sensor system design, development, optimization, testing, and calibration. This contribution describes the radiometry module of the end-to-end simulation tool SENSOR. It features MODTRAN 4.0-based look-up tables in conjunction with a cache-based multilinear interpolation algorithm to speed up radiometry calculations. It employs a linear reflectance parameterization to reduce look-up table size, considers effects due to the topography of the digital elevation model (surface slope, sky view factor), and uses a reflectance class feature map to assign Lambertian and BRDF reflectance properties to the digital elevation model. The overall consistency of the radiometry part is demonstrated by good agreement between ATCOR 4-retrieved reflectance spectra of a simulated digital image cube and the original reflectance spectra used to simulate this image data cube.

  10. Development of a web-based liver cancer prediction model for type II diabetes patients by using an artificial neural network.

    PubMed

    Rau, Hsiao-Hsien; Hsu, Chien-Yeh; Lin, Yu-An; Atique, Suleman; Fuad, Anis; Wei, Li-Ming; Hsu, Ming-Huei

    2016-03-01

    Diabetes mellitus is associated with an increased risk of liver cancer, and these two diseases are among the most common and important causes of morbidity and mortality in Taiwan. To use data mining techniques to develop a model for predicting the development of liver cancer within 6 years of diagnosis with type II diabetes. Data were obtained from the National Health Insurance Research Database (NHIRD) of Taiwan, which covers approximately 22 million people. In this study, we selected patients who were newly diagnosed with type II diabetes during the 2000-2003 period, with no prior cancer diagnosis. We then used encrypted personal IDs to perform data linkage with the cancer registry database to identify whether these patients were diagnosed with liver cancer. Finally, we identified 2060 cases and assigned them to a case group (patients diagnosed with liver cancer after diabetes) and a control group (patients with diabetes but no liver cancer). Candidate risk factors were identified from a literature review and physicians' suggestions; a chi-square test was then conducted on each independent variable (potential risk factor) to compare patients with liver cancer and those without, and the variables found to be significant were selected as predictors. We subsequently performed data training and testing to construct artificial neural network (ANN) and logistic regression (LR) prediction models. The dataset was randomly divided into 2 groups: a training group and a test group. The training group consisted of 1442 cases (70% of the entire dataset), and the prediction model was developed on the basis of the training group. The remaining 30% (618 cases) were assigned to the test group for model validation. 
The following 10 variables were used to develop the ANN and LR models: sex, age, alcoholic cirrhosis, nonalcoholic cirrhosis, alcoholic hepatitis, viral hepatitis, other types of chronic hepatitis, alcoholic fatty liver disease, other types of fatty liver disease, and hyperlipidemia. The performance of the ANN was superior to that of LR, according to the sensitivity (0.757), specificity (0.755), and the area under the receiver operating characteristic curve (0.873). After developing the optimal prediction model, we used it to construct a web-based application system for liver cancer prediction, which can provide support to physicians during consultations with diabetes patients. In the original dataset (n=2060), 33% of diabetes patients were diagnosed with liver cancer (n=515). After using 70% of the original data to train the model and the other 30% for testing, the sensitivity and specificity of our model were 0.757 and 0.755, respectively; this means that 75.7% of diabetes patients who will receive a future liver cancer diagnosis can be predicted correctly, and 75.5% of those who will not be diagnosed with liver cancer can be predicted correctly. These results indicate that the model can serve as an effective predictor of liver cancer for diabetes patients. The physicians consulted agreed that the model can assist them in advising patients at risk of liver cancer and may also help to decrease future costs incurred in cancer treatment. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
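    The validation metrics reported above (sensitivity, specificity, AUC) can be sketched directly from predicted risk scores on a held-out test split. The labels and scores below are tiny hypothetical stand-ins, not the NHIRD test group.

```python
# Sensitivity, specificity, and AUC from predicted risk scores; a minimal
# sketch of the test-group evaluation. Data are hypothetical.

def sens_spec(labels, scores, threshold=0.5):
    """Sensitivity and specificity at a fixed decision threshold."""
    tp = sum(1 for y, s in zip(labels, scores) if y == 1 and s >= threshold)
    fn = sum(1 for y, s in zip(labels, scores) if y == 1 and s < threshold)
    tn = sum(1 for y, s in zip(labels, scores) if y == 0 and s < threshold)
    fp = sum(1 for y, s in zip(labels, scores) if y == 0 and s >= threshold)
    return tp / (tp + fn), tn / (tn + fp)

def auc(labels, scores):
    """AUC as the probability a positive case outranks a negative one."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0, 1, 0]          # 1 = liver cancer (hypothetical)
scores = [0.9, 0.8, 0.4, 0.3, 0.6, 0.2, 0.7, 0.1]  # model risk scores

se, sp = sens_spec(labels, scores)
```

    The same three numbers are what the abstract uses to compare the ANN against the LR model.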

  11. Tolerance assignment in optical design

    NASA Astrophysics Data System (ADS)

    Youngworth, Richard Neil

    2002-09-01

    Tolerance assignment is necessary in any engineering endeavor because fabricated systems---due to the stochastic nature of manufacturing and assembly processes---necessarily deviate from the nominal design. This thesis addresses the problem of optical tolerancing. The work can logically be split into three different components that all play an essential role. The first part addresses the modeling of manufacturing errors in contemporary fabrication and assembly methods. The second component is derived from the design aspect---the development of a cost-based tolerancing procedure. The third part addresses the modeling of image quality in an efficient manner that is conducive to the tolerance assignment process. The purpose of the first component, modeling manufacturing errors, is twofold---to determine the most critical tolerancing parameters and to understand better the effects of fabrication errors. Specifically, mid-spatial-frequency errors, typically introduced in sub-aperture grinding and polishing fabrication processes, are modeled. The implication is that improving process control and understanding better the effects of the errors makes the task of tolerance assignment more manageable. Conventional tolerancing methods do not directly incorporate cost. Consequently, tolerancing approaches tend to focus more on image quality. The goal of the second part of the thesis is to develop cost-based tolerancing procedures that facilitate optimum system fabrication by generating the loosest acceptable tolerances. This work has the potential to impact a wide range of optical designs. The third element, efficient modeling of image quality, is directly related to the cost-based optical tolerancing method. Cost-based tolerancing requires efficient and accurate modeling of the effects of errors on the performance of optical systems. 
Thus it is important to be able to compute the gradient and the Hessian, with respect to the parameters that need to be toleranced, of the figure of merit that measures the image quality of a system. An algebraic method for computing the gradient and the Hessian is developed using perturbation theory.
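    The thesis develops an algebraic method for these derivatives; purely as a sketch of what is being computed, the gradient and Hessian of a made-up quadratic merit function can be approximated numerically with central finite differences.

```python
# Central-difference gradient and Hessian of a merit function, a numerical
# stand-in for the algebraic method described in the thesis. The merit
# function below is a hypothetical quadratic, not an image-quality metric.

def gradient(f, x, h=1e-5):
    """Central-difference gradient of f at x."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def hessian(f, x, h=1e-4):
    """Central-difference Hessian of f at x."""
    n = len(x)
    H = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            xpp, xpm, xmp, xmm = list(x), list(x), list(x), list(x)
            xpp[i] += h; xpp[j] += h
            xpm[i] += h; xpm[j] -= h
            xmp[i] -= h; xmp[j] += h
            xmm[i] -= h; xmm[j] -= h
            H[i][j] = (f(xpp) - f(xpm) - f(xmp) + f(xmm)) / (4 * h * h)
    return H

# Hypothetical quadratic merit function of two tolerance parameters
merit = lambda x: 2 * x[0] ** 2 + 3 * x[0] * x[1] + x[1] ** 2
g = gradient(merit, [1.0, 1.0])   # exact gradient here is [7, 5]
H = hessian(merit, [1.0, 1.0])    # exact Hessian here is [[4, 3], [3, 2]]
```

    An algebraic (perturbation-theory) derivation replaces these repeated merit evaluations, which is exactly why it is attractive for cost-based tolerancing.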

  12. Integration of Error Compensation of Coordinate Measuring Machines into Feature Measurement: Part II—Experimental Implementation

    PubMed Central

    Calvo, Roque; D’Amato, Roberto; Gómez, Emilio; Domingo, Rosario

    2016-01-01

    Coordinate measuring machines (CMMs) are primary measurement instruments in laboratories and in industrial quality control. A compensation error model has been formulated (Part I). It integrates error and uncertainty in the feature measurement model. Experimental implementation for the verification of this model is carried out based on direct testing on a moving bridge CMM. The regression results by axis are quantified and compared to the CMM indication with respect to the assigned values of the measurand. Next, testing of selected measurements of length, flatness, dihedral angle, and roundness features is accomplished. The measurement of calibrated gauge blocks for length or angle, flatness verification of the CMM granite table, and roundness of a precision glass hemisphere are presented under a setup of repeatability conditions. The results are analysed and compared with alternative methods of estimation. The overall performance of the model is endorsed through experimental verification, as well as the practical use and the model's capability to contribute to improving current standard CMM measuring capabilities. PMID:27754441
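    The per-axis regression of CMM indications against assigned (calibrated) measurand values can be sketched as an ordinary least-squares fit. The gauge values and indications below are hypothetical.

```python
# Ordinary least-squares fit of CMM indications vs assigned gauge values,
# a minimal sketch of the per-axis regression step. Data are hypothetical.

def least_squares(x, y):
    """Slope and intercept of y ~ a*x + b by ordinary least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Assigned gauge-block lengths [mm] vs hypothetical CMM indications [mm]
assigned  = [10.0, 20.0, 50.0, 100.0]
indicated = [10.001, 20.001, 50.003, 100.006]

slope, intercept = least_squares(assigned, indicated)
# A slope near 1 and an intercept near 0 indicate a well-compensated axis
```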

  13. Finding models to detect Alzheimer's disease by fusing structural and neuropsychological information

    NASA Astrophysics Data System (ADS)

    Giraldo, Diana L.; García-Arteaga, Juan D.; Velasco, Nelson; Romero, Eduardo

    2015-12-01

    Alzheimer's disease (AD) is a neurodegenerative disease that affects higher brain functions. Initial diagnosis of AD is based on the patient's clinical history and a battery of neuropsychological tests. The accuracy of the diagnosis is highly dependent on the examiner's skills and on the evolution of a variable clinical frame. This work presents an automatic strategy that learns probabilistic brain models for different stages of the disease, reducing the complexity, parameter adjustment, and computational costs. The proposed method starts by setting a probabilistic class description using the information stored in the neuropsychological test, followed by constructing the different structural class models using membership values from the learned probabilistic functions. These models are then used as a reference frame for the classification problem: a new case is assigned to a particular class simply by projecting it onto the different models. The validation was performed using leave-one-out cross-validation with two classes: Normal Control (NC) subjects and patients diagnosed with mild AD. In this experiment it was possible to achieve a sensitivity and specificity of 80% and 79%, respectively.
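    The leave-one-out validation scheme can be sketched with a minimal stand-in classifier that assigns each held-out case to the class with the nearest mean feature vector. The feature vectors below are invented; they are not the study's probabilistic brain models.

```python
# Leave-one-out cross-validation with a nearest-class-mean classifier, a
# minimal sketch of the "project onto class models" scheme. Data are invented.

def nearest_mean_class(train, label_of, case):
    """Assign `case` to the class whose mean feature vector is closest."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    means = {}
    for c in set(label_of.values()):
        members = [train[k] for k in train if label_of[k] == c]
        means[c] = [sum(col) / len(members) for col in zip(*members)]
    return min(means, key=lambda c: dist2(means[c], case))

# Hypothetical subjects: id -> feature vector, id -> class ("NC" or "AD")
features = {1: [0.9, 0.1], 2: [0.8, 0.2], 3: [0.85, 0.15],
            4: [0.2, 0.9], 5: [0.3, 0.8], 6: [0.25, 0.85]}
labels = {1: "NC", 2: "NC", 3: "NC", 4: "AD", 5: "AD", 6: "AD"}

correct = 0
for held_out in features:
    train = {k: v for k, v in features.items() if k != held_out}
    train_labels = {k: labels[k] for k in train}
    if nearest_mean_class(train, train_labels, features[held_out]) == labels[held_out]:
        correct += 1
accuracy = correct / len(features)
```

    Sensitivity and specificity per class follow from the same held-out predictions by counting hits within each class separately.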

  14. Preparing Computing Students for Culturally Diverse E-Mediated IT Projects

    ERIC Educational Resources Information Center

    Conrad, Marc; French, Tim; Maple, Carsten; Zhang, Sijing

    2006-01-01

    In this paper we present an account of an undergraduate team-based assignment designed to facilitate, exhibit and record team-working skills in an e-mediated environment. By linking the student feedback received to Hofstede's classic model of cultural dimensions we aim to show the assignment's suitability in revealing the student's multi-cultural…

  15. Proceedings of a workshop on fish habitat suitability index models

    USGS Publications Warehouse

    Terrell, James W.

    1984-01-01

    One of the habitat-based methodologies for impact assessment currently in use by the U.S. Fish and Wildlife Service is the Habitat Evaluation Procedures (HEP) (U.S. Fish and Wildlife Service 1980). HEP is based on the assumption that the quality of an area as wildlife habitat at a specified target year can be described by a single number, called a Habitat Suitability Index (HSI). An HSI of 1.0 represents optimum habitat; an HSI of 0.0 represents unsuitable habitat. The verbal or mathematical rules by which an HSI is assigned to an area are called an HSI model. A series of Habitat Suitability Index (HSI) models, described by Schamberger et al. (1982), have been published to assist users in applying HEP. HSI model building approaches are described in U.S. Fish and Wildlife Service (1981). One type of HSI model described in detail requires the development of Suitability Index (SI) graphs for habitat variables believed to be important for the growth, survival, standing crop, or other measure of well-being for a species. Suitability indices range from 0 to 1.0, with 1.0 representing optimum conditions for the variable. When HSI models based on suitability indices are used, habitat variable values are measured, or estimated, and converted to SIs through the use of a Suitability Index graph for each variable. Individual SIs are aggregated into an HSI. Standard methods for testing this type of HSI model did not exist at the time the studies reported in this document were performed. A workshop was held in Fort Collins, Colorado, February 14-15, 1983, that brought together biologists experienced in the use, development, and testing of aquatic HSI models, in an effort to address the following objectives: (1) review the needs of HSI model users; (2) discuss and document the results of aquatic HSI model tests; and (3) provide recommendations for the future development, testing, modification, and use of HSI models. 
Individual presentations, group discussions, and group decision techniques were used to develop and present information at the meeting. A synthesis of the resulting concepts, results, and recommendations follows this preface. Subsequent papers describe individual tests of selected HSI models. Most of the tests involved comparison of values from HSI models or Suitability Index (SI) curves with standing crop, as required contractually. Time and budget constraints generally limited tests to the use of data previously collected for other purposes. These proceedings are intended to help persons responsible for the development, testing, or use of HSI models by increasing their understanding of potential uses and limitations of testing procedures and models based on aggregated Suitability Indices. Problems encountered when testing HSI models are described, model performance during tests is documented, and recommendations for future model development and testing presented by the participants are listed and interpreted.
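    The SI-to-HSI aggregation described above can be sketched as follows. The SI graph breakpoints and the geometric-mean aggregation rule are illustrative choices, not a published model.

```python
# Converting measured habitat variables to Suitability Indices via
# piecewise-linear SI "graphs", then aggregating into an HSI. The
# breakpoints and aggregation rule below are hypothetical.

def si_from_graph(graph, value):
    """Piecewise-linear suitability index from (variable value, SI) points."""
    pts = sorted(graph)
    if value <= pts[0][0]:
        return pts[0][1]
    if value >= pts[-1][0]:
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= value <= x1:
            return y0 + (y1 - y0) * (value - x0) / (x1 - x0)

# Hypothetical SI graphs: water temperature [C] and percent cover
si_temp  = [(5, 0.0), (15, 1.0), (25, 1.0), (30, 0.0)]
si_cover = [(0, 0.1), (40, 1.0), (100, 0.6)]

def hsi(temp, cover):
    """Aggregate individual SIs into a single HSI with a geometric mean."""
    s1 = si_from_graph(si_temp, temp)
    s2 = si_from_graph(si_cover, cover)
    return (s1 * s2) ** 0.5
```

    A geometric mean is one common aggregation choice because any unsuitable variable (SI = 0) drives the whole HSI to 0; published HSI models use various rules.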

  16. A modeling process to understand complex system architectures

    NASA Astrophysics Data System (ADS)

    Robinson, Santiago Balestrini

    2009-12-01

    In recent decades, several tools have been developed by the armed forces, and their contractors, to test the capability of a force. These campaign-level analysis tools, oftentimes characterized as constructive simulations, are generally expensive to create and execute, and at best they are extremely difficult to verify and validate. This central observation, that analysts are relying more and more on constructive simulations to predict the performance of future networks of systems, leads to the two central objectives of this thesis: (1) to enable the quantitative comparison of architectures in terms of their ability to satisfy a capability without resorting to constructive simulations, and (2) when constructive simulations must be created, to quantitatively determine how to spend the modeling effort amongst the different system classes. The first objective led to Hypothesis A, the first of the main hypotheses, which states that by studying the relationships between the entities that compose an architecture, one can infer how well it will perform a given capability. The method used to test the hypothesis is based on two assumptions: (1) that the capability can be defined as a cycle of functions, and (2) that it is possible to estimate the probability that a function-based relationship occurs between any two types of entities. If these two requirements are met, then by creating random functional networks, different architectures can be compared in terms of their ability to satisfy a capability. In order to test this hypothesis, a novel process for creating representative functional networks of large-scale system architectures was developed. The process, named the Digraph Modeling for Architectures (DiMA), was tested by comparing its results to those of complex constructive simulations. 
Results indicate that if the inputs assigned to DiMA are correct (in the tests they were based on time-averaged data obtained from the ABM), DiMA is able to identify which of any two architectures is better more than 98% of the time. The second objective led to Hypothesis B, the second of the main hypotheses. This hypothesis stated that by studying the functional relations, the most critical entities composing the architecture could be identified. The critical entities are those for which a slight variation in their behavior makes the behavior of the overall architecture vary greatly. These are the entities that must be modeled more carefully and where modeling effort should be expended. This hypothesis was tested by simplifying agent-based models to the non-trivial minimum and executing a large number of different simulations in order to obtain statistically significant results. The tests were conducted by evolving the complex model without any error induced, and then evolving the model once again for each ranking, assigning error to any of the nodes with a probability inversely proportional to the ranking. The results from this hypothesis test indicate that, depending on the structural characteristics of the functional relations, it is useful to use one of the two intelligent rankings tested, or it is best to expend effort equally amongst all the entities. Random ranking always performed worse than uniform ranking, indicating that if modeling effort is to be prioritized amongst the entities composing the large-scale system architecture, it should be prioritized intelligently. The benefit threshold between intelligent prioritization and no prioritization lies at the large-scale system's chaotic boundary. If the large-scale system behaves chaotically, small variations in any of the entities tend to have a great impact on the behavior of the entire system. 
Therefore, even low-ranking entities can still affect the behavior of the model greatly, and error should not be concentrated in any one entity. It was discovered that the threshold can be identified from studying the structure of the networks, in particular the cyclicity, the Off-diagonal Complexity, and the Digraph Algebraic Connectivity. (Abstract shortened by UMI.)

  17. F-100A on lakebed

    NASA Technical Reports Server (NTRS)

    1955-01-01

    North American F-100A (52-5778) Super Sabre is parked on the Rogers Dry Lakebed at Edwards Air Force Base, California, 1955. This photo shows the large tail on the F-100A. When the basic research on this F-100A was completed, another program was assigned. On March 5, 1957, two aeronautical engineers and a test pilot from the NACA High-Speed Flight Station took the airplane to participate in a Gunnery Operations program at Nellis Air Force Base, Nevada. When the program was completed, the aircraft returned to NACA at Edwards, California, for other assignments.

  18. Stereochemical analysis of (+)-limonene using theoretical and experimental NMR and chiroptical data

    NASA Astrophysics Data System (ADS)

    Reinscheid, F.; Reinscheid, U. M.

    2016-02-01

    Using limonene as a test molecule, the successes and the limitations of three chiroptical methods (optical rotatory dispersion (ORD), electronic and vibrational circular dichroism, ECD and VCD) could be demonstrated. At quite low levels of theory (mpw1pw91/cc-pvdz, IEFPCM (integral equation formalism polarizable continuum model)), the experimental ORD values differ by less than 10 units from the calculated values. Modelling in the condensed phase still represents a challenge, so experimental NMR data were used to test for aggregation and solvent-solute interactions. After establishing a reasonable structural model, only the ECD spectra prediction showed a decisive dependence on the basis set: only augmented (in the case of Dunning's basis sets) or diffuse (in the case of Pople's basis sets) basis sets predicted the position and shape of the ECD bands correctly. Based on these results, we propose a procedure to assign the absolute configuration (AC) of an unknown compound using the comparison between experimental and calculated chiroptical data.

  19. Video as an Effective Method to Deliver Pre-Test Information for Rapid HIV Testing

    PubMed Central

    Clark, Melissa A.; Mayer, Kenneth H.; Seage, George R.; DeGruttola, Victor G.; Becker, Bruce M.

    2008-01-01

    Objectives Video-based delivery of HIV pre-test information might assist in streamlining HIV screening and testing efforts in the emergency department (ED). The objectives of this study were to determine whether the video “Do you know about rapid HIV testing?” is an acceptable alternative to an in-person information session on rapid HIV pre-test information, with regard to comprehension of rapid HIV pre-test fundamentals, and to identify patients who might have difficulties in comprehending pre-test information. Methods This was a non-inferiority trial of 574 participants in an ED opt-in rapid HIV screening program who were randomly assigned to receive identical pre-test information from either an animated and live-action 9.5-minute video or an in-person information session. Pre-test information comprehension was assessed using a questionnaire. The video would be accepted as not inferior to the in-person information session if the 95% confidence interval (CI) of the difference (Δ) in mean scores on the questionnaire between the two information groups was less than a 10% decrease in the in-person information session arm's mean score. Linear regression models were constructed to identify patients with lower mean scores based upon study arm assignment, demographic characteristics, and history of prior HIV testing. Results The questionnaire mean scores were 20.1 (95% CI = 19.7 to 20.5) for the video arm and 20.8 (95% CI = 20.4 to 21.2) for the in-person information session arm. The difference in mean scores compared to the mean score for the in-person information session met the non-inferiority criterion for this investigation (Δ = 0.68; 95% CI = 0.18 to 1.26). In a multivariable linear regression model, Blacks/African Americans, Hispanics, and those with Medicare and Medicaid insurance exhibited slightly lower mean scores, regardless of the pre-test information delivery format. 
There was a strong relationship between fewer years of formal education and lower mean scores on the questionnaire. Age, gender, type of insurance, partner/marital status, and history of prior HIV testing were not predictive of scores on the questionnaire. Conclusions In terms of patient comprehension of rapid HIV pre-test information fundamentals, the video was an acceptable substitute for pre-test information delivered by an HIV test counselor. Both the video and the in-person information session were less effective in providing pre-test information to patients with fewer years of formal education. PMID:19120050
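    The non-inferiority decision rule can be written out directly from the numbers in the abstract (the in-person arm's mean score and the upper CI bound of the difference); only the final comparison is computed here.

```python
# Non-inferiority check: the video arm is accepted if the worst-case score
# loss (upper CI bound of the difference) stays below 10% of the in-person
# arm's mean. Numbers are taken from the abstract; the CI itself is the
# reported one, not recomputed from raw data.

def non_inferior(in_person_mean, delta_ci_upper, margin_fraction=0.10):
    """True if the worst-case score loss stays within the allowed margin."""
    return delta_ci_upper < margin_fraction * in_person_mean

in_person_mean = 20.8   # in-person arm mean questionnaire score
delta_ci_upper = 1.26   # upper 95% CI bound of the difference (video lower)

video_is_acceptable = non_inferior(in_person_mean, delta_ci_upper)
```

    With a margin of 0.10 × 20.8 = 2.08 points, the reported upper bound of 1.26 falls inside the margin, matching the paper's conclusion.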

  20. Single-pass beam measurements for the verification of the LHC magnetic model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calaga, R.; Giovannozzi, M.; Redaelli, S.

    2010-05-23

    During the 2009 LHC injection tests, the polarities and effects of specific quadrupole and higher-order magnetic circuits were investigated. A set of magnet circuits had been selected for detailed investigation based on a number of criteria. On- or off-momentum difference trajectories launched via appropriate orbit correctors for varying strength settings of the magnet circuits under study - e.g. main, trim and skew quadrupoles; sextupole families and spool piece correctors; skew sextupoles, octupoles - were compared with predictions from various optics models. These comparisons allowed confirming or updating the relative polarity conventions used in the optics model and the accelerator control system, as well as verifying the correct powering and assignment of magnet families. Results from measurements in several LHC sectors are presented.

  1. The effect of occlusion on the semantics of projective spatial terms: a case study in grounding language in perception.

    PubMed

    Kelleher, John D; Ross, Robert J; Sloan, Colm; Mac Namee, Brian

    2011-02-01

    Although data-driven spatial template models provide a practical and cognitively motivated mechanism for characterizing spatial term meaning, the influence of perceptual rather than solely geometric and functional properties has yet to be systematically investigated. In light of this, we investigate in this paper the effects of the perceptual phenomenon of object occlusion on the semantics of projective terms. We did this by conducting a study to test whether object occlusion had a noticeable effect on the acceptance values assigned to projective terms with respect to a 2.5-dimensional visual stimulus. Based on the data collected, a regression model was constructed and presented. Subsequent analysis showed that the regression model that included the occlusion factor outperformed an adaptation of Regier & Carlson's well-regarded AVS model for that same spatial configuration.

  2. Enabling full-field physics-based optical proximity correction via dynamic model generation

    NASA Astrophysics Data System (ADS)

    Lam, Michael; Clifford, Chris; Raghunathan, Ananthan; Fenger, Germain; Adam, Kostas

    2017-07-01

    As extreme ultraviolet lithography comes closer to reality for high-volume production, its peculiar modeling challenges related to both inter- and intrafield effects have necessitated building an optical proximity correction (OPC) infrastructure that operates with field position dependency. Previous state-of-the-art approaches to modeling field dependency used piecewise constant models, where static input models are assigned to specific x/y-positions within the field. OPC and simulation could assign the proper static model based on simulation-level placement. However, in the realm of 7 and 5 nm feature sizes, small discontinuities in OPC from piecewise constant model changes can cause unacceptable levels of edge placement errors. The introduction of dynamic model generation (DMG) can be shown to effectively avoid these dislocations by providing unique mask and optical models per simulation region, allowing a near continuum of models through the field. DMG allows unique models for electromagnetic field, apodization, aberrations, etc. to vary through the entire field and provides a capability to precisely and accurately model systematic field signatures.
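    The contrast between piecewise constant models and DMG's near continuum can be illustrated in one dimension, with each "model" reduced to a single scalar parameter. The calibration positions and values below are invented.

```python
# Piecewise-constant model assignment vs a continuous blend between
# neighbouring models, a 1-D stand-in for static field models vs DMG.
# Calibration positions [mm] and model parameters are hypothetical.

CALIBRATED = [(0.0, 1.00), (5.0, 1.04), (10.0, 1.10)]

def piecewise_constant(x):
    """Assign the static model of the nearest calibration position."""
    return min(CALIBRATED, key=lambda p: abs(p[0] - x))[1]

def dynamic(x):
    """Linearly blend the two neighbouring models (near-continuum stand-in)."""
    pts = sorted(CALIBRATED)
    if x <= pts[0][0]:
        return pts[0][1]
    if x >= pts[-1][0]:
        return pts[-1][1]
    for (x0, m0), (x1, m1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            return m0 + (m1 - m0) * (x - x0) / (x1 - x0)

# Near a tile boundary the static scheme jumps while the blend stays smooth
jump   = abs(piecewise_constant(2.49) - piecewise_constant(2.51))
smooth = abs(dynamic(2.49) - dynamic(2.51))
```

    The discontinuity `jump` is the kind of model dislocation that shows up as edge placement error in OPC, while the blended scheme changes negligibly across the same boundary.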

  3. How best to obtain consent to thrombolysis: Individualized decision-making.

    PubMed

    Gong, Jingjing; Zhang, Yan; Feng, Jun; Zhang, Weiwei; Yin, Weimin; Wu, Xinhuai; Hou, Yanhong; Huang, Yonghua; Liu, Hongyun; Miao, Danmin

    2016-03-15

    To investigate the factors that influence the preferences of patients and their proxies concerning thrombolytic therapy and to determine how best to convey information. A total of 613 participants were randomly assigned to a positively or negatively framed group. Each participant completed a series of surveys. We applied latent class analysis (LCA) to explore participants' patterns of choices of thrombolysis and to classify the participants into different subgroups. Then we performed regression analyses to investigate predictors of classification of the participants into each subgroup and to establish a thrombolytic decision-making model. LCA indicated an optimal 3-subgroup model comprising intermediate, favorable to thrombolysis, and aversion to thrombolysis subgroups. Multiple regression analysis revealed that 10 factors predicted assignment to the intermediate subgroup and 4 factors predicted assignment to the aversion to thrombolysis subgroup compared with the favorable to thrombolysis subgroup. The χ² tests indicated that the information presentation format and the context of thrombolysis influenced participants' choices of thrombolysis and revealed a framing effect in different subgroups. The preference for thrombolysis was influenced by the positive vs negative framing scenarios, the format of item presentation, the context of thrombolysis, and individual characteristics. Inconsistent results may be due to participant heterogeneity and the evaluation of limited factors in previous studies. Based on a decision model of thrombolysis, physicians should consider the effects of positive vs negative framing and should seek a neutral tone when presenting the facts, providing an important reference point for health persuasion in other clinical domains. © 2016 American Academy of Neurology.

  4. Editorial--Avoiding Unethical Helicobacter pylori Clinical Trials: Susceptibility-Based Studies and Probiotics as Adjuvants.

    PubMed

    Graham, David Y

    2015-10-01

    As a general rule, any clinical study where the result is already known or when the investigator(s) compares an assigned treatment against another assigned treatment known to be ineffective in the study population (e.g., in a population with known clarithromycin resistance) is unethical. As susceptibility-based therapy will always be superior to empiric therapy in any population with a prevalence of antimicrobial resistance >0%, any trial that randomizes susceptibility-based therapy with empiric therapy would be unethical. The journal Helicobacter welcomes susceptibility or culture-guided studies, studies of new therapies, and studies of adjuvants and probiotics. However, the journal will not accept for review any study we judge to be lacking clinical equipoise or which assigns subjects to a treatment known to be ineffective, such as a susceptibility-based clinical trial with an empiric therapy comparator. To assist authors, we provide examples and suggestions regarding trial design for comparative studies, for susceptibility-based studies, and for studies testing adjuvants or probiotics. © 2015 John Wiley & Sons Ltd.

  5. Editorial - Avoiding unethical Helicobacter pylori clinical trials: Susceptibility-based studies and probiotics as adjuvants

    PubMed Central

    Graham, David Y.

    2016-01-01

    As a general rule, any clinical study where the result is already known or when the investigator(s) compares an assigned treatment against another assigned treatment known to be ineffective in the study population (e.g. in a population with known clarithromycin resistance) is unethical. Since susceptibility-based therapy will always be superior to empiric therapy in any population with a prevalence of antimicrobial resistance greater than 0%, any trial that randomizes susceptibility-based therapy with empiric therapy would be unethical. The journal Helicobacter welcomes susceptibility or culture-guided studies, studies of new therapies, and studies of adjuvants and probiotics. However, the Journal will not accept for review any study we judge to be lacking clinical equipoise or which assigns subjects to a treatment known to be ineffective, such as a susceptibility-based clinical trial with an empiric therapy comparator. To assist authors, we provide examples and suggestions regarding trial design for comparative studies, for susceptibility-based studies, and for studies testing adjuvants or probiotics. PMID:26123529

  6. Acid-base titrations of functional groups on the surface of the thermophilic bacterium Anoxybacillus flavithermus: comparing a chemical equilibrium model with ATR-IR spectroscopic data.

    PubMed

    Heinrich, Hannah T M; Bremer, Phil J; Daughney, Christopher J; McQuillan, A James

    2007-02-27

    Acid-base functional groups at the surface of Anoxybacillus flavithermus (AF) were assigned from the modeling of batch titration data of bacterial suspensions and compared with those determined from in situ infrared spectroscopic titration analysis. The computer program FITMOD was used to generate a two-site Donnan model (site 1: pKa = 3.26, wet concn = 2.46 x 10^-4 mol g^-1; site 2: pKa = 6.12, wet concn = 6.55 x 10^-5 mol g^-1), which was able to describe data for whole exponential phase cells from both batch acid-base titrations at 0.01 M ionic strength and electrophoretic mobility measurements over a range of different pH values and ionic strengths. In agreement with information on the composition of bacterial cell walls and a considerable body of modeling literature, site 1 of the model was assigned to carboxyl groups, and site 2 was assigned to amino groups. pH difference IR spectra acquired by in situ attenuated total reflection infrared (ATR-IR) spectroscopy confirmed the presence of carboxyl groups. The spectra appear to show a carboxyl pKa in the 3.3-4.0 range. Further peaks were assigned to phosphodiester groups, which deprotonated at slightly lower pH. The presence of amino groups could not be confirmed or discounted by IR spectroscopy, but a positively charged group corresponding to site 2 was implicated by electrophoretic mobility data. Carboxyl group speciation over a pH range of 2.3-10.3 at two different ionic strengths was further compared to modeling predictions. While model predictions were strongly influenced by the ionic strength change, pH difference IR data showed no significant change. This meant that modeling predictions agreed reasonably well with the IR data for 0.5 M ionic strength but not for 0.01 M ionic strength.
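The two fitted sites lend themselves to a quick speciation sketch. The Python below computes protonated-site fractions as a function of pH using plain Henderson-Hasselbalch behavior; it deliberately ignores the Donnan electrostatic correction that FITMOD applies, so it is an illustrative simplification. Only the pKa and wet-concentration values come from the record; the function names are invented.

```python
def protonated_fraction(pH, pKa):
    """Henderson-Hasselbalch: fraction of a monoprotic site still protonated at a given pH."""
    return 1.0 / (1.0 + 10 ** (pH - pKa))

# Two-site model fitted in the abstract (site 1: carboxyl, site 2: amino).
SITES = [
    {"pKa": 3.26, "conc": 2.46e-4},  # mol per g wet cells
    {"pKa": 6.12, "conc": 6.55e-5},
]

def protons_bound(pH):
    """Total protonated-site concentration (mol/g) at a given pH."""
    return sum(s["conc"] * protonated_fraction(pH, s["pKa"]) for s in SITES)
```

At pH equal to a site's pKa, that site is exactly half protonated, which is why the carboxyl band in the difference IR spectra is most informative near pH 3.3-4.0.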

  7. Predicting wetland plant community responses to proposed water-level-regulation plans for Lake Ontario: GIS-based modeling

    USGS Publications Warehouse

    Wilcox, D.A.; Xie, Y.

    2007-01-01

    Integrated, GIS-based, wetland predictive models were constructed to assist in predicting the responses of wetland plant communities to proposed new water-level regulation plans for Lake Ontario. The modeling exercise consisted of four major components: 1) building individual site wetland geometric models; 2) constructing generalized wetland geometric models representing specific types of wetlands (rectangle model for drowned river mouth wetlands, half ring model for open embayment wetlands, half ellipse model for protected embayment wetlands, and ellipse model for barrier beach wetlands); 3) assigning wetland plant profiles to the generalized wetland geometric models that identify associations between past flooding / dewatering events and the regulated water-level changes of a proposed water-level-regulation plan; and 4) predicting relevant proportions of wetland plant communities and the time durations during which they would be affected under proposed regulation plans. Based on this conceptual foundation, the predictive models were constructed using bathymetric and topographic wetland models and technical procedures operating on the platform of ArcGIS. An example of the model processes and outputs for the drowned river mouth wetland model using a test regulation plan illustrates the four components and, when compared against other test regulation plans, provided results that met ecological expectations. The model results were also compared to independent data collected by photointerpretation. Although data collections were not directly comparable, the predicted extent of meadow marsh in years in which photographs were taken was significantly correlated with extent of mapped meadow marsh in all but barrier beach wetlands. The predictive model for wetland plant communities provided valuable input into International Joint Commission deliberations on new regulation plans and was also incorporated into faunal predictive models used for that purpose.

  8. Fault detection using a two-model test for changes in the parameters of an autoregressive time series

    NASA Technical Reports Server (NTRS)

    Scholtz, P.; Smyth, P.

    1992-01-01

    This article describes an investigation of a statistical hypothesis testing method for detecting changes in the characteristics of an observed time series. The work is motivated by the need for practical automated methods for on-line monitoring of Deep Space Network (DSN) equipment to detect failures and changes in behavior. In particular, on-line monitoring of the motor current in a DSN 34-m beam waveguide (BWG) antenna is used as an example. The algorithm is based on a measure of the information-theoretic distance between two autoregressive models: one estimated with data from a dynamic reference window and one estimated with data from a sliding test window. The Hinkley cumulative sum stopping rule is utilized to detect a change in the mean of this distance measure, corresponding to the detection of a change in the underlying process. The basic theory behind this two-model test is presented, and the problem of practical implementation is addressed, examining windowing methods, model estimation, and detection parameter assignment. Results from five fault-transition simulations are presented to show the possible limitations of the detection method, and suggestions for future implementation are given.
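The two-window scheme can be sketched in a few lines: fit a model per window, feed the resulting distance series to a Hinkley cumulative-sum detector, and alarm when the drift-corrected sum crosses a threshold. The sketch below uses a least-squares AR(1) fit and a synthetic scalar distance series standing in for the information-theoretic distance between the two models; the drift and threshold values are illustrative, not the paper's tuned parameters.

```python
def ar1_coeff(x):
    """Least-squares AR(1) coefficient of a series; fit one per window
    (reference window vs. sliding window) to compare the two models."""
    return sum(a * b for a, b in zip(x[1:], x[:-1])) / sum(v * v for v in x[:-1])

def hinkley_alarm(distances, drift, threshold):
    """Hinkley cumulative-sum stopping rule: alarm at the first index where
    the drift-corrected cumulative sum of the distance series exceeds the
    threshold; returns None if no change is detected."""
    g = 0.0
    for k, d in enumerate(distances):
        g = max(0.0, g + d - drift)
        if g > threshold:
            return k
    return None

# A distance series that jumps when the underlying process changes at index 20:
d = [0.1] * 20 + [1.0] * 20
alarm = hinkley_alarm(d, drift=0.3, threshold=2.0)  # fires shortly after index 20
```

The drift term suppresses alarms from the small distances measured while the two models agree, trading detection delay for false-alarm rate.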

  9. Treatment Approaches for Presurgical Anxiety: A Health Care Concern.

    ERIC Educational Resources Information Center

    Keogh, Nancy Jones; And Others

    To test the differential effectiveness of preoperative instruction (factual information, emotional expression, and trust relationship), mastery modeling, and coping modeling, 100 children, aged 7-12, were studied. Subjects from two hospitals were randomly assigned to four experimental groups and one control group: alone (the control group, N=20);…

  10. From statistical proofs of the Kochen-Specker theorem to noise-robust noncontextuality inequalities

    NASA Astrophysics Data System (ADS)

    Kunjwal, Ravi; Spekkens, Robert W.

    2018-05-01

    The Kochen-Specker theorem rules out models of quantum theory wherein projective measurements are assigned outcomes deterministically and independently of context. This notion of noncontextuality is not applicable to experimental measurements because these are never free of noise and thus never truly projective. For nonprojective measurements, therefore, one must drop the requirement that an outcome be assigned deterministically in the model and merely require that it be assigned a distribution over outcomes in a manner that is context-independent. By demanding context independence in the representation of preparations as well, one obtains a generalized principle of noncontextuality that also supports a quantum no-go theorem. Several recent works have shown how to derive inequalities on experimental data which, if violated, demonstrate the impossibility of finding a generalized-noncontextual model of this data. That is, these inequalities do not presume quantum theory and, in particular, they make sense without requiring an operational analog of the quantum notion of projectiveness. We here describe a technique for deriving such inequalities starting from arbitrary proofs of the Kochen-Specker theorem. It significantly extends previous techniques that worked only for logical proofs, which are based on sets of projective measurements that fail to admit of any deterministic noncontextual assignment, to the case of statistical proofs, which are based on sets of projective measurements that do admit of some deterministic noncontextual assignments, but not enough to explain the quantum statistics.

  11. MILP model for integrated balancing and sequencing mixed-model two-sided assembly line with variable launching interval and assignment restrictions

    NASA Astrophysics Data System (ADS)

    Azmi, N. I. L. Mohd; Ahmad, R.; Zainuddin, Z. M.

    2017-09-01

    This research explores the Mixed-Model Two-Sided Assembly Line (MMTSAL). There are two interrelated problems in MMTSAL: line balancing and model sequencing. In previous studies, many researchers considered these problems separately, and only a few studied them simultaneously for a one-sided line. In this study, the two problems are solved simultaneously to obtain a more efficient solution. A Mixed Integer Linear Programming (MILP) model with the objectives of minimizing total utility work and idle time is formulated, considering a variable launching interval and assignment restriction constraints. The problem is analysed using small-size test cases to validate the integrated model. Numerical experiments were conducted using the General Algebraic Modelling System (GAMS) with the CPLEX solver. Experimental results indicate that integrating model sequencing and line balancing helps to minimise the proposed objective functions.

  12. Audio-Visual Speaker Diarization Based on Spatiotemporal Bayesian Fusion.

    PubMed

    Gebru, Israel D; Ba, Sileye; Li, Xiaofei; Horaud, Radu

    2018-05-01

    Speaker diarization consists of assigning speech signals to people engaged in a dialogue. An audio-visual spatiotemporal diarization model is proposed. The model is well suited for challenging scenarios that consist of several participants engaged in multi-party interaction while they move around and turn their heads towards the other participants rather than facing the cameras and the microphones. Multiple-person visual tracking is combined with multiple speech-source localization in order to tackle the speech-to-person association problem. The latter is solved within a novel audio-visual fusion method on the following grounds: binaural spectral features are first extracted from a microphone pair, then a supervised audio-visual alignment technique maps these features onto an image, and finally a semi-supervised clustering method assigns binaural spectral features to visible persons. The main advantage of this method over previous work is that it processes in a principled way speech signals uttered simultaneously by multiple persons. The diarization itself is cast into a latent-variable temporal graphical model that infers speaker identities and speech turns, based on the output of an audio-visual association process, executed at each time slice, and on the dynamics of the diarization variable itself. The proposed formulation yields an efficient exact inference procedure. A novel dataset that contains audio-visual training data as well as a number of scenarios involving several participants engaged in formal and informal dialogue is introduced. The proposed method is thoroughly tested and benchmarked with respect to several state-of-the-art diarization algorithms.

  13. VITAL NMR: Using Chemical Shift Derived Secondary Structure Information for a Limited Set of Amino Acids to Assess Homology Model Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brothers, Michael C; Nesbitt, Anna E; Hallock, Michael J

    2011-01-01

    Homology modeling is a powerful tool for predicting protein structures, whose success depends on obtaining a reasonable alignment between a given structural template and the protein sequence being analyzed. In order to leverage greater predictive power for proteins with few structural templates, we have developed a method to rank homology models based upon their compliance to secondary structure derived from experimental solid-state NMR (SSNMR) data. Such data is obtainable in a rapid manner by simple SSNMR experiments (e.g., 13C-13C 2D correlation spectra). To test our homology model scoring procedure for various amino acid labeling schemes, we generated a library of 7,474 homology models for 22 protein targets culled from the TALOS+/SPARTA+ training set of protein structures. Using subsets of amino acids that are plausibly assigned by SSNMR, we discovered that pairs of the residues Val, Ile, Thr, Ala and Leu (VITAL) emulate an ideal dataset where all residues are site specifically assigned. Scoring the models with a predicted VITAL site-specific dataset and calculating secondary structure with the Chemical Shift Index resulted in a Pearson correlation coefficient (-0.75) commensurate to the control (-0.77), where secondary structure was scored site specifically for all amino acids (ALL 20) using STRIDE. This method promises to accelerate structure procurement by SSNMR for proteins with unknown folds through guiding the selection of remotely homologous protein templates and assessing model quality.
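The figure of merit above is an ordinary Pearson correlation coefficient between model-compliance scores and a model-quality measure. A minimal stdlib sketch, with invented score lists (lower model error paired with higher compliance, giving the negative correlation reported in the record):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-model compliance scores vs. an error measure where
# lower error is better, so the correlation comes out negative:
compliance = [0.9, 0.7, 0.5, 0.3]
model_error = [1.0, 2.1, 2.9, 4.2]
r = pearson(compliance, model_error)
```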

  14. A new proposal for greenhouse gas emissions responsibility allocation: best available technologies approach.

    PubMed

    Berzosa, Álvaro; Barandica, Jesús M; Fernández-Sánchez, Gonzalo

    2014-01-01

    In recent years, several methodologies have been developed for the quantification of greenhouse gas (GHG) emissions. However, determining who is responsible for these emissions is also quite challenging. The most common approach is to assign emissions to the producer (based on the Kyoto Protocol), but proposals also exist for its allocation to the consumer (based on an ecological footprint perspective) and for a hybrid approach called shared responsibility. In this study, the existing proposals and standards regarding the allocation of GHG emissions responsibilities are analyzed, focusing on their main advantages and problems. A new model of shared responsibility that overcomes some of the existing problems is also proposed. This model is based on applying the best available technologies (BATs). This new approach allocates the responsibility between the producers and the final consumers based on the real capacity of each agent to reduce emissions. The proposed approach is demonstrated using a simple case study of a 4-step life cycle of ammonia nitrate (AN) fertilizer production. The proposed model has the characteristics that the standards and publications for assignment of GHG emissions responsibilities demand. This study presents a new way to assign responsibilities that pushes all the actors in the production chain, including consumers, to reduce pollution. © 2013 SETAC.
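One plausible reading of the BAT-based rule can be sketched as follows: the avoidable excess over the best-available-technology emission level is the producer's responsibility (it reflects the producer's real capacity to reduce), while the BAT-level emissions that no producer could avoid pass to the final consumer. This is an illustrative interpretation with invented numbers, not the paper's exact formulation.

```python
def allocate(actual, bat):
    """Split one life-cycle step's emissions (t CO2e) into a producer share
    (the avoidable excess over the best-available-technology level) and a
    consumer share (the BAT-level emissions no producer could avoid)."""
    producer = max(actual - bat, 0.0)
    return producer, actual - producer

# Illustrative 4-step life cycle: (actual, BAT) emissions per step.
steps = [(120.0, 90.0), (60.0, 60.0), (40.0, 25.0), (10.0, 8.0)]
producer_total = sum(allocate(a, b)[0] for a, b in steps)
consumer_total = sum(allocate(a, b)[1] for a, b in steps)
```

Note that the two shares always sum to the actual emissions, so responsibility is fully allocated with no double counting, one of the properties a shared-responsibility scheme needs.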

  15. 48 CFR 242.202 - Assignment of contract administration.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., charting, and geodesy services; (F) Base, post, camp, and station purchases; (G) Operation or maintenance... installation, test, and checkout of the missiles and associated equipment); (Q) Operation and maintenance of, or installation of equipment at, military test ranges, facilities, and installations; and (R) The...

  16. 48 CFR 242.202 - Assignment of contract administration.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., charting, and geodesy services; (F) Base, post, camp, and station purchases; (G) Operation or maintenance... installation, test, and checkout of the missiles and associated equipment); (Q) Operation and maintenance of, or installation of equipment at, military test ranges, facilities, and installations; and (R) The...

  17. 48 CFR 242.202 - Assignment of contract administration.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., charting, and geodesy services; (F) Base, post, camp, and station purchases; (G) Operation or maintenance... installation, test, and checkout of the missiles and associated equipment); (Q) Operation and maintenance of, or installation of equipment at, military test ranges, facilities, and installations; and (R) The...

  18. 48 CFR 242.202 - Assignment of contract administration.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., charting, and geodesy services; (F) Base, post, camp, and station purchases; (G) Operation or maintenance... installation, test, and checkout of the missiles and associated equipment); (Q) Operation and maintenance of, or installation of equipment at, military test ranges, facilities, and installations; and (R) The...

  19. 48 CFR 242.202 - Assignment of contract administration.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., charting, and geodesy services; (F) Base, post, camp, and station purchases; (G) Operation or maintenance... installation, test, and checkout of the missiles and associated equipment); (Q) Operation and maintenance of, or installation of equipment at, military test ranges, facilities, and installations; and (R) The...

  20. Knowledge-Based Runway Assignment for Arrival Aircraft in the Terminal Area

    DOT National Transportation Integrated Search

    1997-01-01

    A knowledge-based system for scheduling arrival traffic in the terminal area, referred to as the Final Approach Spacing Tool (FAST), has been implemented and operationally tested at the Dallas/Fort Worth Terminal Radar Approach Control (TRACON)...

  1. Automatic transfer function design for medical visualization using visibility distributions and projective color mapping.

    PubMed

    Cai, Lile; Tay, Wei-Liang; Nguyen, Binh P; Chui, Chee-Kong; Ong, Sim-Heng

    2013-01-01

    Transfer functions play a key role in volume rendering of medical data, but transfer function manipulation is unintuitive and can be time-consuming; achieving an optimal visualization of patient anatomy or pathology is difficult. To overcome this problem, we present a system for automatic transfer function design based on visibility distribution and projective color mapping. Instead of assigning opacity directly based on voxel intensity and gradient magnitude, the opacity transfer function is automatically derived by matching the observed visibility distribution to a target visibility distribution. An automatic color assignment scheme based on projective mapping is proposed to assign colors that allow for the visual discrimination of different structures, while also reflecting the degree of similarity between them. When our method was tested on several medical volumetric datasets, the key structures within the volume were clearly visualized with minimal user intervention. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. NMR data-driven structure determination using NMR-I-TASSER in the CASD-NMR experiment

    PubMed Central

    Jang, Richard; Wang, Yan

    2015-01-01

    NMR-I-TASSER, an adaptation of the I-TASSER algorithm combining NMR data for protein structure determination, recently joined the second round of the CASD-NMR experiment. Unlike many molecular dynamics-based methods, NMR-I-TASSER takes a molecular replacement-like approach to the problem by first threading the target through the PDB to identify structural templates which are then used for iterative NOE assignments and fragment structure assembly refinements. The employment of multiple templates allows NMR-I-TASSER to sample different topologies while convergence to a single structure is not required. Retroactive and blind tests of the CASD-NMR targets from Rounds 1 and 2 demonstrate that even without using NOE peak lists I-TASSER can generate correct structure topology with 15 of 20 targets having a TM-score above 0.5. With the addition of NOE-based distance restraints, NMR-I-TASSER significantly improved the I-TASSER models, with all models having a TM-score above 0.5. The average RMSD was reduced from 5.29 to 2.14 Å in Round 1 and 3.18 to 1.71 Å in Round 2. There is no obvious difference in the modeling results between using raw and refined peak lists, indicating robustness of the pipeline to NOE assignment errors. Overall, despite the low-resolution modeling, the current NMR-I-TASSER pipeline provides a coarse-grained structure folding approach complementary to traditional molecular dynamics simulations, which can produce fast near-native frameworks for atomic-level structural refinement. PMID:25737244

  3. Ready to learn physics: a team-based learning model for first year university

    NASA Astrophysics Data System (ADS)

    Parappilly, Maria; Schmidt, Lisa; De Ritter, Samantha

    2015-09-01

    Team-based learning (TBL) is an established model of group work which aims to improve students' ability to apply discipline-related content. TBL consists of a readiness assurance process (RAP), student groups and application activities. While TBL has not been implemented widely in science, technology, engineering and mathematics disciplines, it has been effective in improving student learning in other disciplines. This paper describes the incorporation of TBL activities into a non-calculus-based introductory-level physics topic, Physics for the Modern World. Students were given pre-class preparation materials and an individual RAP online test before the workshops. The pre-workshop individual RAP test ensured that all students were exposed to concept-based questions before their workshops and motivated them to use the preparatory materials in readiness for the workshop. The students were placed into random teams and during the first part of the workshop, the teams went through a subset of the quiz questions (team RAP test) and in the remaining time, teams completed an in-class assignment. After the workshop students were allowed another attempt at the individual RAP test to see if their knowledge had improved. The ability of TBL to promote student learning of key concepts was evaluated experimentally using pre- and post-testing. The students' perception of TBL was monitored by discussion posts and survey responses. Finally, the ability of TBL to support peer-peer interaction was evaluated by video analysis of the class. We found that the TBL process improved student learning; students did interact with each other in class; and the students had a positive view of TBL. To assess the transferability of this model to other topics, we conducted a comparison study with an environmental science topic which produced similar results. Our study supports the use of this TBL model in science topics.
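Pre-/post-test comparisons like the one above are often summarized with Hake's normalized gain, the fraction of the available headroom a student actually gained. The abstract does not say which statistic the authors used, so the following is a generic sketch with invented scores:

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: (post - pre) / (max_score - pre)."""
    return (post - pre) / (max_score - pre)

# Hypothetical individual RAP scores before and after the workshop:
pre_scores = [40.0, 55.0, 62.0]
post_scores = [70.0, 75.0, 80.0]
gains = [normalized_gain(a, b) for a, b in zip(pre_scores, post_scores)]
```

Normalizing by headroom lets students who start at different levels be compared on the same scale, which matters when teams are formed at random.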

  4. Use of classification trees to apportion single echo detections to species: Application to the pelagic fish community of Lake Superior

    USGS Publications Warehouse

    Yule, Daniel L.; Adams, Jean V.; Hrabik, Thomas R.; Vinson, Mark R.; Woiak, Zebadiah; Ahrenstorff, Tyler D.

    2013-01-01

    Acoustic methods are used to estimate the density of pelagic fish in large lakes with results of midwater trawling used to assign species composition. Apportionment in lakes having mixed species can be challenging because only a small fraction of the water sampled acoustically is sampled with trawl gear. Here we describe a new method where single echo detections (SEDs) are assigned to species based on classification tree models developed from catch data that separate species based on fish size and the spatial habitats they occupy. During the summer of 2011, we conducted a spatially-balanced lake-wide acoustic and midwater trawl survey of Lake Superior. A total of 51 sites in four bathymetric depth strata (0–30 m, 30–100 m, 100–200 m, and >200 m) were sampled. We developed classification tree models for each stratum and found fish length was the most important variable for separating species. To apply these trees to the acoustic data, we needed to identify a target strength to length (TS-to-L) relationship appropriate for all abundant Lake Superior pelagic species. We tested performance of 7 general (i.e., multi-species) relationships derived from three published studies. The best-performing relationship was identified by comparing predicted and observed catch compositions using a second independent Lake Superior data set. Once identified, the relationship was used to predict lengths of SEDs from the lake-wide survey, and the classification tree models were used to assign each SED to a species. Exotic rainbow smelt (Osmerus mordax) were the most common species at bathymetric depths <100 m (384 million; 6.0 kt). Cisco (Coregonus artedi) were widely distributed over all strata with their population estimated at 182 million (44 kt). The apportionment method we describe should be transferable to other large lakes provided fish are not tightly aggregated, and an appropriate TS-to-L relationship for abundant pelagic fish species can be determined.
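The apportionment pipeline (invert a TS-to-L relationship to predict each echo's fish length, then run the echo through the fitted tree) can be sketched as below. The 20*log10(L) shape is a common form for such relationships, but the intercept, the length threshold, and the depth split here are invented stand-ins, not the relationship or tree selected in the study.

```python
# Generic target-strength-to-length form: TS = 20*log10(L) + B (TS in dB, L in cm).
# The intercept below is illustrative only.
B = -64.0

def length_from_ts(ts_db):
    """Invert TS = 20*log10(L) + B to predict fish length in cm."""
    return 10 ** ((ts_db - B) / 20.0)

def classify_sed(ts_db, depth_m):
    """Toy stand-in for a fitted classification tree: split first on
    predicted length, then on the depth stratum the echo came from."""
    length = length_from_ts(ts_db)
    if length < 15.0:
        return "rainbow smelt"
    return "cisco" if depth_m < 100.0 else "other coregonine"
```

Because the tree consumes predicted lengths, the choice of TS-to-L relationship propagates directly into the species totals, which is why the study benchmarked seven candidate relationships against independent trawl data.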

  5. An Evaluation of a Behaviorally Based Social Skills Group for Individuals Diagnosed with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Leaf, Justin B.; Leaf, Jeremy A.; Milne, Christine; Taubman, Mitchell; Oppenheim-Leaf, Misty; Torres, Norma; Townley-Cochran, Donna; Leaf, Ronald; McEachin, John; Yoder, Paul

    2017-01-01

    In this study we evaluated a social skills group which employed a progressive applied behavior analysis model for individuals diagnosed with autism spectrum disorder. A randomized control trial was utilized; eight participants were randomly assigned to a treatment group and seven participants were randomly assigned to a waitlist control group. The…

  6. Connecting Psychological Science with Climate Change: A Persuasion and Social Influence Assignment

    ERIC Educational Resources Information Center

    Munro, Geoffrey D.; Behlen, Margaret M.

    2017-01-01

    Students often have little understanding of the role psychological science plays in informing us about the impact of human behavior when addressing climate change. We designed an assignment for a social psychology course based on Frantz and Mayer's use of the decision tree model of helping behavior to identify the psychological barriers that…

  7. Pod Nursing on a Medical/Surgical Unit: Implementation and Outcomes Evaluation

    PubMed Central

    Friese, Christopher R.; Grunawalt, Julie C.; Bhullar, Sara; Bihlmeyer, Karen; Chang, Robert; Wood, Winnie

    2014-01-01

    A medical/surgical unit at the University of Michigan Health System implemented a pod nursing model of care to improve efficiency and patient and staff satisfaction. One centralized station was replaced with 4 satellites and supplies were relocated next to patient rooms. Patients were assigned to 2 nurses who worked as partners. Three patient (satisfaction, call lights, and falls) and nurse (satisfaction and overtime) outcomes improved after implementation. Efforts should be focused on addressing patient acuity imbalances across assignments and strengthening communication among the health care team. Studies are needed to test the model in larger and more diverse settings. PMID:24662689

  8. Reinforcing Abstinence and Treatment Participation among Offenders in a Drug Diversion Program: Are Vouchers Effective?

    PubMed Central

    Hall, Elizabeth A.; Prendergast, Michael L.; Roll, John M.; Warda, Umme

    2010-01-01

    This study assessed a 26-week voucher-based intervention to reinforce abstinence and participation in treatment-related activities among substance-abusing offenders court referred to outpatient treatment under drug diversion legislation (California's Substance Abuse and Crime Prevention Act). Standard treatment consisted of criminal justice supervision and an evidence-based model for treating stimulant abuse. Participants were randomly assigned to four groups, standard treatment (ST) only, ST plus vouchers for testing negative, ST plus vouchers for performing treatment plan activities, and ST plus vouchers for testing negative and/or performing treatment plan activities. Results indicate that voucher-based reinforcement of negative urines and of treatment plan tasks (using a flat reinforcement schedule) showed no statistically significant effects on measures of retention or drug use relative to the standard treatment protocol. It is likely that criminal justice contingencies had a stronger impact on participants' treatment retention and drug use than the relatively low-value vouchers awarded as part of the treatment protocol. PMID:20463918

  9. Reading and Thinking through Writing in General Education.

    ERIC Educational Resources Information Center

    Bennet, James R.; Hodges, Karen

    1986-01-01

    Describes a writing-based course in freshman world literature and summarizes tests, writing assignments, and class activities used in teaching "The Odyssey," "Metamorphoses," "Hamlet," and other works. (JG)

  10. Integrated consensus-based frameworks for unmanned vehicle routing and targeting assignment

    NASA Astrophysics Data System (ADS)

    Barnawi, Waleed T.

    Unmanned aerial vehicles (UAVs) are increasingly deployed in complex and dynamic environments to perform multiple tasks cooperatively with other UAVs that contribute to overarching mission effectiveness. Studies by the Department of Defense (DoD) indicate future operations may include anti-access/area-denial (A2AD) environments which limit human teleoperator decision-making and control. This research addresses the problem of decentralized vehicle re-routing and task reassignment through consensus-based UAV decision-making. An Integrated Consensus-Based Framework (ICF) is formulated as a solution to the combined single-task-assignment and vehicle-routing problem; the multiple-assignment and vehicle-routing problem is solved with the Integrated Consensus-Based Bundle Framework (ICBF). The frameworks are hierarchically decomposed into two levels. The bottom layer utilizes the renowned Dijkstra's Algorithm. The top layer addresses task assignment with two methods. The single-assignment approach, the Caravan Auction (CarA) Algorithm, extends the Consensus-Based Auction Algorithm (CBAA) so that agents are aware of task completion and can adopt abandoned tasks. The multiple-assignment approach, the Caravan Auction Bundle (CarAB) Algorithm, extends the Consensus-Based Bundle Algorithm (CBBA) by providing awareness of lost resources, prioritizing remaining tasks, and adopting abandoned tasks. Research questions regarding the novelty and performance of the proposed frameworks are investigated through hypothesis testing, with Monte Carlo simulations providing the supporting evidence. The approach addresses current and future military operations for unmanned aerial vehicles; however, the general framework is adaptable to any unmanned vehicle. Civil missions where human observability is limited, such as exploration and fire surveillance, could also benefit from independent UAV task assignment.
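The single-assignment auction idea that CBAA-style methods build on can be sketched in a few lines: each unassigned agent bids on the best task it can still outbid, and the highest bid per task wins. Real CBAA is fully decentralized, with per-agent winning-bid lists reconciled by message passing; the shared state below is a deliberate simplification for illustration.

```python
def cbaa_assign(scores, rounds=None):
    """Toy consensus-based auction: scores[i][j] is agent i's value for
    task j. Unassigned agents bid on the best task they can outbid; the
    highest bid per task wins. Returns each agent's task index (or None)."""
    n_agents, n_tasks = len(scores), len(scores[0])
    winning_bid = [0.0] * n_tasks
    winning_agent = [None] * n_tasks
    for _ in range(rounds or n_agents):
        for i in range(n_agents):
            if i in winning_agent:
                continue  # agent already holds a task
            beatable = [j for j in range(n_tasks) if scores[i][j] > winning_bid[j]]
            if beatable:
                j = max(beatable, key=lambda t: scores[i][t])
                winning_bid[j], winning_agent[j] = scores[i][j], i
    return [winning_agent.index(i) if i in winning_agent else None
            for i in range(n_agents)]
```

Because bids only ever increase, the auction settles after at most a few sweeps; an outbid agent simply rebids on its next-best beatable task.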

  11. Combining automated peak tracking in SAR by NMR with structure-based backbone assignment from 15N-NOESY

    PubMed Central

    2012-01-01

    Background Chemical shift mapping is an important technique in NMR-based drug screening for identifying the atoms of a target protein that potentially bind to a drug molecule upon the molecule's introduction in increasing concentrations. The goal is to obtain a mapping of peaks with known residue assignment from the reference spectrum of the unbound protein to peaks with unknown assignment in the target spectrum of the bound protein. Although a series of perturbed spectra help to trace a path from reference peaks to target peaks, a one-to-one mapping generally is not possible, especially for large proteins, due to errors such as noise peaks, missing peaks, peaks that disappear and later reappear, overlapped peaks, and new peaks not associated with any peak in the reference. Due to these difficulties, the mapping is typically done manually or semi-automatically, which is not efficient for high-throughput drug screening. Results We present PeakWalker, a novel peak-walking algorithm for fast-exchange systems that models the errors explicitly and performs many-to-one mapping. On the proteins hBclXL, UbcH5B, and histone H1, it achieves an average accuracy of over 95% with less than 1.5 residues predicted per target peak. Given these mappings as input, we present PeakAssigner, a novel combined structure-based backbone resonance and NOE assignment algorithm that uses just 15N-NOESY, while avoiding TOCSY experiments and 13C-labeling, to resolve the ambiguities for a one-to-one mapping. On the three proteins, it achieves an average accuracy of 94% or better. Conclusions Our mathematical programming approach for modeling chemical shift mapping as a graph problem, while modeling the errors directly, is potentially a time- and cost-effective first step for high-throughput drug screening based on limited NMR data and homologous 3D structures. PMID:22536902

  12. Application of visible and near-infrared spectroscopy to classification of Miscanthus species

    DOE PAGES

    Jin, Xiaoli; Chen, Xiaoling; Xiao, Liang; ...

    2017-04-03

    Here, the feasibility of visible and near-infrared (NIR) spectroscopy as a tool to classify Miscanthus samples was explored in this study. Three types of Miscanthus plants, namely M. sinensis, M. sacchariflorus and M. floridulus, were analyzed using a NIR spectrophotometer. Several classification models based on the NIR spectral data were developed using linear discriminant analysis (LDA), partial least squares (PLS), least squares support vector machine regression (LSSVR), radial basis function (RBF) and neural network (NN) methods. Principal component analysis (PCA) gave only a rough classification with overlapping samples, while the Lin_LSSVR, RBF_LSSVR and RBF_NN models produced almost the same calibration and validation results. Because Lin_LSSVR is faster than RBF_LSSVR and RBF_NN, we selected the Lin_LSSVR model as a representative. In our study, the Lin_LSSVR model showed higher accuracy than the LDA and PLS models. Total correct classification rates of 87.79% and 96.51% were observed for the LDA and PLS models in the testing set, respectively, while Lin_LSSVR achieved a total correct classification rate of 99.42%. In the testing set, the Lin_LSSVR model showed correct classification rates of 100%, 100% and 96.77% for M. sinensis, M. sacchariflorus and M. floridulus, respectively, assigning 99.42% of samples to the right groups; the only error was one M. floridulus sample. The results demonstrated that NIR spectra combined with a preliminary morphological classification could be an effective and reliable procedure for the classification of Miscanthus species.

  13. Application of visible and near-infrared spectroscopy to classification of Miscanthus species

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Xiaoli; Chen, Xiaoling; Xiao, Liang

    Here, the feasibility of visible and near-infrared (NIR) spectroscopy as a tool to classify Miscanthus samples was explored in this study. Three types of Miscanthus plants, namely M. sinensis, M. sacchariflorus and M. floridulus, were analyzed using a NIR spectrophotometer. Several classification models based on the NIR spectral data were developed using linear discriminant analysis (LDA), partial least squares (PLS), least squares support vector machine regression (LSSVR), radial basis function (RBF) and neural network (NN) methods. Principal component analysis (PCA) gave only a rough classification with overlapping samples, while the Lin_LSSVR, RBF_LSSVR and RBF_NN models produced almost the same calibration and validation results. Because Lin_LSSVR is faster than RBF_LSSVR and RBF_NN, we selected the Lin_LSSVR model as a representative. In our study, the Lin_LSSVR model showed higher accuracy than the LDA and PLS models. Total correct classification rates of 87.79% and 96.51% were observed for the LDA and PLS models in the testing set, respectively, while Lin_LSSVR achieved a total correct classification rate of 99.42%. In the testing set, the Lin_LSSVR model showed correct classification rates of 100%, 100% and 96.77% for M. sinensis, M. sacchariflorus and M. floridulus, respectively, assigning 99.42% of samples to the right groups; the only error was one M. floridulus sample. The results demonstrated that NIR spectra combined with a preliminary morphological classification could be an effective and reliable procedure for the classification of Miscanthus species.

  14. Application of visible and near-infrared spectroscopy to classification of Miscanthus species.

    PubMed

    Jin, Xiaoli; Chen, Xiaoling; Xiao, Liang; Shi, Chunhai; Chen, Liang; Yu, Bin; Yi, Zili; Yoo, Ji Hye; Heo, Kweon; Yu, Chang Yeon; Yamada, Toshihiko; Sacks, Erik J; Peng, Junhua

    2017-01-01

    The feasibility of visible and near-infrared (NIR) spectroscopy as a tool to classify Miscanthus samples was explored in this study. Three types of Miscanthus plants, namely M. sinensis, M. sacchariflorus and M. floridulus, were analyzed using a NIR spectrophotometer. Several classification models based on the NIR spectral data were developed using linear discriminant analysis (LDA), partial least squares (PLS), least squares support vector machine regression (LSSVR), radial basis function (RBF) and neural network (NN) methods. Principal component analysis (PCA) gave only a rough classification with overlapping samples, while the Lin_LSSVR, RBF_LSSVR and RBF_NN models produced almost the same calibration and validation results. Because Lin_LSSVR is faster than RBF_LSSVR and RBF_NN, we selected the Lin_LSSVR model as a representative. In our study, the Lin_LSSVR model showed higher accuracy than the LDA and PLS models. Total correct classification rates of 87.79% and 96.51% were observed for the LDA and PLS models in the testing set, respectively, while Lin_LSSVR achieved a total correct classification rate of 99.42%. In the testing set, the Lin_LSSVR model showed correct classification rates of 100%, 100% and 96.77% for M. sinensis, M. sacchariflorus and M. floridulus, respectively, assigning 99.42% of samples to the right groups; the only error was one M. floridulus sample. The results demonstrated that NIR spectra combined with a preliminary morphological classification could be an effective and reliable procedure for the classification of Miscanthus species.

  15. Application of visible and near-infrared spectroscopy to classification of Miscanthus species

    PubMed Central

    Shi, Chunhai; Chen, Liang; Yu, Bin; Yi, Zili; Yoo, Ji Hye; Heo, Kweon; Yu, Chang Yeon; Yamada, Toshihiko; Sacks, Erik J.; Peng, Junhua

    2017-01-01

    The feasibility of visible and near-infrared (NIR) spectroscopy as a tool to classify Miscanthus samples was explored in this study. Three types of Miscanthus plants, namely M. sinensis, M. sacchariflorus and M. floridulus, were analyzed using a NIR spectrophotometer. Several classification models based on the NIR spectral data were developed using linear discriminant analysis (LDA), partial least squares (PLS), least squares support vector machine regression (LSSVR), radial basis function (RBF) and neural network (NN) methods. Principal component analysis (PCA) gave only a rough classification with overlapping samples, while the Lin_LSSVR, RBF_LSSVR and RBF_NN models produced almost the same calibration and validation results. Because Lin_LSSVR is faster than RBF_LSSVR and RBF_NN, we selected the Lin_LSSVR model as a representative. In our study, the Lin_LSSVR model showed higher accuracy than the LDA and PLS models. Total correct classification rates of 87.79% and 96.51% were observed for the LDA and PLS models in the testing set, respectively, while Lin_LSSVR achieved a total correct classification rate of 99.42%. In the testing set, the Lin_LSSVR model showed correct classification rates of 100%, 100% and 96.77% for M. sinensis, M. sacchariflorus and M. floridulus, respectively, assigning 99.42% of samples to the right groups; the only error was one M. floridulus sample. The results demonstrated that NIR spectra combined with a preliminary morphological classification could be an effective and reliable procedure for the classification of Miscanthus species. PMID:28369059
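
    As a rough illustration of spectra-based species classification, a nearest-centroid stand-in can be sketched (this is not the paper's LDA or LSSVR models, and the three-band "spectra" below are made up, not real NIR data):

```python
# Nearest-centroid classification sketch: each species gets the mean of its
# training spectra; a new sample is assigned to the closest centroid.

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(sample, centroids):
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))

training = {  # invented three-band intensities per species
    "M. sinensis":       [[0.9, 0.2, 0.1], [0.8, 0.3, 0.2]],
    "M. sacchariflorus": [[0.2, 0.9, 0.3], [0.3, 0.8, 0.2]],
}
centroids = {label: centroid(vs) for label, vs in training.items()}
print(classify([0.85, 0.25, 0.15], centroids))
```

    Real discriminant or kernel models replace the Euclidean centroid distance with learned projections, but the assignment step (pick the best-scoring class) has the same shape.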

  16. 12 CFR 345.28 - Assigned ratings.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...,” “satisfactory,” “needs to improve,” or “substantial noncompliance” based on the bank's performance under the lending, investment and service tests, the community development test, the small bank performance... evaluation of a bank's CRA performance is adversely affected by evidence of discriminatory or other illegal...

  17. Computer-based coding of free-text job descriptions to efficiently identify occupations in epidemiological studies.

    PubMed

    Russ, Daniel E; Ho, Kwan-Yuet; Colt, Joanne S; Armenti, Karla R; Baris, Dalsu; Chow, Wong-Ho; Davis, Faith; Johnson, Alison; Purdue, Mark P; Karagas, Margaret R; Schwartz, Kendra; Schwenn, Molly; Silverman, Debra T; Johnson, Calvin A; Friesen, Melissa C

    2016-06-01

    Mapping job titles to standardised occupation classification (SOC) codes is an important step in identifying occupational risk factors in epidemiological studies. Because manual coding is time-consuming and has moderate reliability, we developed an algorithm called SOCcer (Standardized Occupation Coding for Computer-assisted Epidemiologic Research) to assign SOC-2010 codes based on free-text job description components. Job title and task-based classifiers were developed by comparing job descriptions to multiple sources linking job and task descriptions to SOC codes. An industry-based classifier was developed based on the SOC prevalence within an industry. These classifiers were used in a logistic model trained using 14 983 jobs with expert-assigned SOC codes to obtain empirical weights for an algorithm that scored each SOC/job description. We assigned the highest scoring SOC code to each job. SOCcer was validated in 2 occupational data sources by comparing SOC codes obtained from SOCcer to expert assigned SOC codes and lead exposure estimates obtained by linking SOC codes to a job-exposure matrix. For 11 991 case-control study jobs, SOCcer-assigned codes agreed with 44.5% and 76.3% of manually assigned codes at the 6-digit and 2-digit level, respectively. Agreement increased with the score, providing a mechanism to identify assignments needing review. Good agreement was observed between lead estimates based on SOCcer and manual SOC assignments (κ 0.6-0.8). Poorer performance was observed for inspection job descriptions, which included abbreviations and worksite-specific terminology. Although some manual coding will remain necessary, using SOCcer may improve the efficiency of incorporating occupation into large-scale epidemiological studies. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
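
    The scoring idea, a logistic combination of job-title, task, and industry classifier outputs with the highest-scoring SOC code assigned, can be sketched as follows. The weights, feature values, and bias are invented for illustration, not the published SOCcer model.

```python
import math

# Sketch of logistic scoring of candidate SOC codes (illustrative weights):
# each candidate gets a score from three classifier outputs; the job is
# assigned the highest-scoring code, and low scores can flag manual review.

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def score(features, weights, bias=-2.0):
    return logistic(bias + sum(weights[k] * v for k, v in features.items()))

weights = {"title_match": 3.0, "task_match": 2.0, "industry_prevalence": 1.0}
candidates = {  # hypothetical classifier outputs for one job description
    "47-2061": {"title_match": 0.9, "task_match": 0.7, "industry_prevalence": 0.4},
    "53-7062": {"title_match": 0.2, "task_match": 0.3, "industry_prevalence": 0.6},
}
scored = {soc: score(f, weights) for soc, f in candidates.items()}
best = max(scored, key=scored.get)
print(best)
```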

  18. Storage assignment optimization in a multi-tier shuttle warehousing system

    NASA Astrophysics Data System (ADS)

    Wang, Yanyan; Mou, Shandong; Wu, Yaohua

    2016-03-01

    The current mathematical models for the storage assignment problem are generally established based on the traveling salesman problem (TSP), which has been widely applied in the conventional automated storage and retrieval system (AS/RS). However, the previous mathematical models for conventional AS/RS do not match multi-tier shuttle warehousing systems (MSWS) because the characteristics of parallel retrieval in multiple tiers and progressive vertical movement destroy the foundation of the TSP. In this study, a two-stage open queuing network model in which shuttles and a lift are regarded as servers at different stages is proposed to analyze system performance in terms of shuttle waiting period (SWP) and lift idle period (LIP) during the transaction cycle time. A mean arrival time difference matrix for pairwise stock keeping units (SKUs) is presented to determine the mean waiting time and queue length, and thereby to optimize the storage assignment problem on the basis of SKU correlation. The decomposition method is applied to analyze the interactions among outbound task time, SWP, and LIP. An ant colony clustering algorithm is designed to determine storage partitions by clustering items. In addition, goods are assigned to storage according to the rearranging permutation and the combination of storage partitions in a 2D plane, derived from the analysis results of the queuing network model and from three basic principles. The storage assignment method and the entire optimization algorithm, as applied in an MSWS, are verified through a practical engineering project in the tobacco industry. The results show that the total SWP and LIP can be reduced effectively, improving the utilization rates of all devices and increasing the throughput of the distribution center.
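
    The pairwise mean-arrival-time-difference matrix can be sketched directly; the greedy grouping below is a simple stand-in for the paper's ant colony clustering, and all arrival times and the threshold are invented.

```python
# Sketch: build the pairwise mean-arrival-time-difference matrix for SKUs,
# then group SKUs whose orders tend to arrive close together in time
# (candidates for the same storage partition). Illustrative values only.

mean_arrival = {"SKU_A": 10.0, "SKU_B": 12.0, "SKU_C": 40.0}  # minutes

skus = sorted(mean_arrival)
diff = {a: {b: abs(mean_arrival[a] - mean_arrival[b]) for b in skus}
        for a in skus}

threshold = 5.0  # hypothetical correlation cutoff
groups = []
for s in skus:
    for g in groups:
        if all(diff[s][t] < threshold for t in g):  # close to every member
            g.append(s)
            break
    else:
        groups.append([s])
print(groups)
```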

  19. A computational framework for converting textual clinical diagnostic criteria into the quality data model.

    PubMed

    Hong, Na; Li, Dingcheng; Yu, Yue; Xiu, Qiongying; Liu, Hongfang; Jiang, Guoqian

    2016-10-01

    Constructing standard and computable clinical diagnostic criteria is an important but challenging research field in the clinical informatics community. The Quality Data Model (QDM) is emerging as a promising information model for standardizing clinical diagnostic criteria. Our objective was to develop and evaluate automated methods for converting textual clinical diagnostic criteria into a structured format using QDM. We used a clinical Natural Language Processing (NLP) tool known as cTAKES to detect sentences and annotate events in diagnostic criteria. We developed a rule-based approach for assigning the QDM datatype(s) to an individual criterion, whereas we invoked a machine learning algorithm based on Conditional Random Fields (CRFs) for annotating attributes belonging to each particular QDM datatype. We manually developed an annotated corpus as the gold standard and used standard measures (precision, recall and f-measure) for the performance evaluation. We harvested 267 individual criteria with the datatypes Symptom and Laboratory Test from 63 textual diagnostic criteria. We manually annotated attributes and values in 142 individual Laboratory Test criteria. The average performance of our rule-based approach was a precision of 0.84, recall of 0.86, and f-measure of 0.85; the CRFs-based classification achieved a precision of 0.95, recall of 0.88 and f-measure of 0.91. We also implemented a web-based tool that automatically translates textual Laboratory Test criteria into the QDM XML template format. The results indicated that our approaches leveraging cTAKES and CRFs are effective in facilitating diagnostic criteria annotation and classification. Our NLP-based computational framework is a feasible and useful solution for developing diagnostic criteria representation and computerization. Copyright © 2016 Elsevier Inc. All rights reserved.
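
    A rule-based datatype assignment of the kind described can be sketched with a few regular expressions; the rules and keywords below are illustrative, not the paper's rule set.

```python
import re

# Minimal regex rule sketch for assigning a QDM datatype to a criterion:
# the first rule whose pattern matches wins; unmatched criteria fall through.

RULES = [
    (re.compile(r"\b(serum|plasma|level|titer|count|mg/dL)\b", re.I),
     "Laboratory Test"),
    (re.compile(r"\b(pain|fever|fatigue|nausea|rash)\b", re.I),
     "Symptom"),
]

def assign_datatype(criterion):
    for pattern, datatype in RULES:
        if pattern.search(criterion):
            return datatype
    return "Unclassified"

print(assign_datatype("Serum ferritin level > 500 mg/dL"))
print(assign_datatype("Persistent fever for more than 5 days"))
```

    In practice such rules would be derived from the annotated corpus, and a sequence model (like the CRFs in the abstract) would then label the attributes and values inside each matched criterion.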

  20. Navigating complex decision spaces: Problems and paradigms in sequential choice

    PubMed Central

    Walsh, Matthew M.; Anderson, John R.

    2015-01-01

    To behave adaptively, we must learn from the consequences of our actions. Doing so is difficult when the consequences of an action follow a delay. This introduces the problem of temporal credit assignment. When feedback follows a sequence of decisions, how should the individual assign credit to the intermediate actions that comprise the sequence? Research in reinforcement learning provides two general solutions to this problem: model-free reinforcement learning and model-based reinforcement learning. In this review, we examine connections between stimulus-response and cognitive learning theories, habitual and goal-directed control, and model-free and model-based reinforcement learning. We then consider a range of problems related to temporal credit assignment. These include second-order conditioning and secondary reinforcers, latent learning and detour behavior, partially observable Markov decision processes, actions with distributed outcomes, and hierarchical learning. We ask whether humans and animals, when faced with these problems, behave in a manner consistent with reinforcement learning techniques. Throughout, we seek to identify neural substrates of model-free and model-based reinforcement learning. The former class of techniques is understood in terms of the neurotransmitter dopamine and its effects in the basal ganglia. The latter is understood in terms of a distributed network of regions including the prefrontal cortex, medial temporal lobes, cerebellum, and basal ganglia. Not only do reinforcement learning techniques have a natural interpretation in terms of human and animal behavior, but they also provide a useful framework for understanding neural reward valuation and action selection. PMID:23834192
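
    The temporal credit assignment problem that model-free techniques solve can be illustrated on a toy chain task (an invented example, not one from the review): only the final transition is rewarded, and repeated one-step value backups propagate credit to the earlier states.

```python
# Model-free credit assignment in miniature: a 4-state chain with a single
# "move right" action and a reward only on the last transition. One-step
# bootstrapped updates gradually pass credit back along the chain.

n_states, alpha, gamma = 4, 0.5, 0.9
V = [0.0] * n_states  # value of moving right from each state

for _ in range(60):  # repeated episodes
    for s in range(n_states - 1):
        r = 1.0 if s == n_states - 2 else 0.0            # delayed reward
        nxt = V[s + 1] if s + 1 < n_states - 1 else 0.0  # terminal value 0
        V[s] += alpha * (r + gamma * nxt - V[s])         # TD(0) backup

print([round(v, 2) for v in V])
```

    The converged values fall off geometrically with distance from the reward (1.0, 0.9, 0.81), which is the signature of discounted credit propagating backwards; a model-based learner would instead compute these values by planning over a learned transition model.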

  1. Comparison of Breast Density Between Synthesized Versus Standard Digital Mammography.

    PubMed

    Haider, Irfanullah; Morgan, Matthew; McGow, Anna; Stein, Matthew; Rezvani, Maryam; Freer, Phoebe; Hu, Nan; Fajardo, Laurie; Winkler, Nicole

    2018-06-12

    To evaluate the perceptual difference in breast density classification using synthesized mammography (SM) compared with standard or full-field digital mammography (FFDM) for screening. This institutional review board-approved, retrospective, multireader study evaluated breast density in 200 patients who underwent a baseline screening mammogram during which both SM and FFDM were obtained contemporaneously from June 1, 2016, through November 30, 2016. Qualitative breast density was independently assigned by seven readers initially evaluating FFDM alone. Then, in a separate session, these same readers assigned breast density using synthetic views alone on the same 200 patients. The readers were again blinded to each other's assignments. Qualitative density assessment was based on the BI-RADS fifth edition. Interreader agreement was evaluated with the κ statistic using 95% confidence intervals. Testing for homogeneity in paired proportions was performed using McNemar's test with a level of significance of .05. Across the SM and standard 2-D data sets, McNemar's test (P = 0.32) demonstrated that the minimal density transitions between FFDM and SM were not statistically significant shifts. Taking clinical significance into account, only 8 of 200 (4%) patients had a clinically significant transition (dense versus not dense). There was substantial interreader agreement, with an overall κ for FFDM of 0.71 (minimum 0.53, maximum 0.81) and an overall SM κ average of 0.63 (minimum 0.56, maximum 0.87). Overall subjective breast density assignment by radiologists on SM is similar to density assignment on a standard 2-D mammogram. Copyright © 2018 American College of Radiology. Published by Elsevier Inc. All rights reserved.
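
    Interreader agreement of the kind reported is commonly computed with Cohen's κ; a minimal sketch for a two-category call (dense versus not dense), with invented 2x2 counts rather than the study's data:

```python
# Cohen's kappa for two raters: observed agreement corrected for the
# agreement expected by chance from each rater's marginal frequencies.

def cohens_kappa(table):
    """table[i][j]: count of samples rated category i by reader 1 and
    category j by reader 2."""
    n = sum(sum(row) for row in table)
    po = sum(table[i][i] for i in range(len(table))) / n          # observed
    pe = sum((sum(table[i]) / n) * (sum(row[i] for row in table) / n)
             for i in range(len(table)))                          # chance
    return (po - pe) / (1 - pe)

table = [[80, 10],    # hypothetical counts: rows = reader 1, cols = reader 2
         [6, 104]]
print(round(cohens_kappa(table), 2))
```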

  2. Modelling Cognitive Style in a Peer Help Network.

    ERIC Educational Resources Information Center

    Bull, Susan; McCalla, Gord

    2002-01-01

    Explains I-Help, a computer-based peer help network where students can ask and answer questions about assignments and courses based on the metaphor of a help desk. Highlights include cognitive style; user modeling in I-Help; matching helpers to helpees; and types of questions. (Contains 64 references.) (LRW)

  3. The effects of a dynamic graphical model during simulation-based training of console operation skill

    NASA Technical Reports Server (NTRS)

    Farquhar, John D.; Regian, J. Wesley

    1993-01-01

    LOADER is a Windows-based simulation of a complex procedural task. The task requires subjects to execute long sequences of console-operation actions (e.g., button presses, switch actuations, dial rotations) to accomplish specific goals. The LOADER interface is a graphical computer-simulated console which controls railroad cars, tracks, and cranes in a fictitious railroad yard. We hypothesized that acquisition of LOADER performance skill would be supported by the representation of a dynamic graphical model linking console actions to goal and goal states in the 'railroad yard'. Twenty-nine subjects were randomly assigned to one of two treatments (i.e., dynamic model or no model). During training, both groups received identical text-based instruction in an instructional-window above the LOADER interface. One group, however, additionally saw a dynamic version of the bird's-eye view of the railroad yard. After training, both groups were tested under identical conditions. They were asked to perform the complete procedure without guidance and without access to either type of railroad yard representation. Results indicate that rather than becoming dependent on the animated rail yard model, subjects in the dynamic model condition apparently internalized the model, as evidenced by their performance after the model was removed.

  4. Simultaneous fecal microbial and metabolite profiling enables accurate classification of pediatric irritable bowel syndrome.

    PubMed

    Shankar, Vijay; Reo, Nicholas V; Paliy, Oleg

    2015-12-09

    We previously showed that stool samples of pre-adolescent and adolescent US children diagnosed with diarrhea-predominant IBS (IBS-D) had different compositions of microbiota and metabolites compared to healthy age-matched controls. Here we explored whether the observed fecal microbiota and metabolite differences between these two adolescent populations can be used to discriminate between IBS and health. We constructed individual microbiota- and metabolite-based sample classification models based on partial least squares multivariate analysis and then applied a Bayesian approach to integrate the individual models into a single classifier. The resulting combined classification achieved 84% accuracy of correct sample group assignment and 86% prediction for IBS-D in cross-validation tests. The performance of the cumulative classification model was further validated by the de novo analysis of stool samples from a small independent IBS-D cohort. High-throughput microbial and metabolite profiling of subject stool samples can be used to facilitate IBS diagnosis.
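
    The Bayesian integration of two individual classifiers can be sketched under a conditional-independence assumption; this is a generic naive-Bayes fusion with invented probabilities, not the paper's exact model.

```python
# Fuse per-sample IBS-D posteriors from two classifiers (e.g. microbiota-
# and metabolite-based) into one posterior: convert each posterior back to
# a likelihood ratio against the shared prior, multiply, and renormalize.

def combine(p_prior, p_micro, p_metab):
    prior_odds = p_prior / (1 - p_prior)
    odds = prior_odds
    for p in (p_micro, p_metab):
        odds *= (p / (1 - p)) / prior_odds  # implied likelihood ratio
    return odds / (1 + odds)

# Two moderately confident classifiers reinforce each other:
print(round(combine(0.5, 0.7, 0.8), 3))
```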

  5. Evaluating the importance of characterizing soil structure and horizons in parameterizing a hydrologic process model

    USGS Publications Warehouse

    Mirus, Benjamin B.

    2015-01-01

    Incorporating the influence of soil structure and horizons into parameterizations of distributed surface water/groundwater models remains a challenge. Often, only a single soil unit is employed, and soil-hydraulic properties are assigned based on textural classification, without evaluating the potential impact of these simplifications. This study uses a distributed physics-based model to assess the influence of soil horizons and structure on effective parameterization. This paper tests the viability of two established and widely used hydrogeologic methods for simulating runoff and variably saturated flow through layered soils: (1) accounting for vertical heterogeneity by combining hydrostratigraphic units with contrasting hydraulic properties into homogeneous, anisotropic units and (2) use of established pedotransfer functions based on soil texture alone to estimate water retention and conductivity, without accounting for the influence of pedon structures and hysteresis. The viability of this latter method for capturing the seasonal transition from runoff-dominated to evapotranspiration-dominated regimes is also tested here. For cases tested here, event-based simulations using simplified vertical heterogeneity did not capture the state-dependent anisotropy and complex combinations of runoff generation mechanisms resulting from permeability contrasts in layered hillslopes with complex topography. Continuous simulations using pedotransfer functions that do not account for the influence of soil structure and hysteresis generally over-predicted runoff, leading to propagation of substantial water balance errors. Analysis suggests that identifying a dominant hydropedological unit provides the most acceptable simplification of subsurface layering and that modified pedotransfer functions with steeper soil-water retention curves might adequately capture the influence of soil structure and hysteresis on hydrologic response in headwater catchments.
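
    Pedotransfer functions of the kind tested typically supply parameters for a closed-form water retention curve; a common choice is the van Genuchten (1980) model, sketched here with loam-like parameter values that are illustrative, not taken from the study.

```python
# van Genuchten retention curve: volumetric water content as a function of
# suction head psi, with residual/saturated contents theta_r/theta_s, shape
# parameters alpha and n, and m = 1 - 1/n as in the standard formulation.

def van_genuchten_theta(psi, theta_r, theta_s, alpha, n):
    """psi >= 0 in the same length units as 1/alpha."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * psi) ** n) ** m

# loam-like parameters (hypothetical): theta_r=0.078, theta_s=0.43,
# alpha=0.036 1/cm, n=1.56
for psi in (0.0, 100.0, 1000.0):
    print(psi, round(van_genuchten_theta(psi, 0.078, 0.43, 0.036, 1.56), 3))
```

    The study's point is that texture-based parameters for this curve can miss structural macropores and hysteresis; steeper retention curves (larger n) are one suggested correction.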

  6. How "mere" is the mere ownership effect in memory? Evidence for semantic organization processes.

    PubMed

    Englert, Julia; Wentura, Dirk

    2016-11-01

    Memory is better for items arbitrarily assigned to the self than for items assigned to another person (mere ownership effect, MOE). In a series of six experiments, we investigated the role of semantic processes for the MOE. Following successful replication, we investigated whether the MOE was contingent upon semantic processing: For meaningless stimuli, there was no MOE. Testing for a potential role of semantic elaboration using meaningful stimuli in an encoding task without verbal labels, we found evidence of spontaneous semantic processing irrespective of self- or other-assignment. When semantic organization was manipulated, the MOE vanished if a semantic classification task was added to the self/other assignment but persisted for a perceptual classification task. Furthermore, we found greater clustering of self-assigned than of other-assigned items in free recall. Taken together, these results suggest that the MOE could be based on the organizational principle of a "me" versus "not-me" categorization. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Effect of an Individual Readiness Assurance Test on a Team Readiness Assurance Test in the Team-Based Learning of Physiology

    ERIC Educational Resources Information Center

    Gopalan, Chaya; Fox, Dainielle J.; Gaebelein, Claude J.

    2013-01-01

    We examined whether requiring an individual readiness assurance test (iRAT) before a team readiness assurance test (tRAT) would benefit students in becoming better problem solvers in physiology. It was tested in the form of tRAT scores, the time required to complete the tRAT assignment, and individual performance on the unit examinations. Students…

  8. Writing Essays on a Laptop or a Desktop Computer: Does It Matter?

    ERIC Educational Resources Information Center

    Ling, Guangming; Bridgeman, Brent

    2013-01-01

    To explore the potential effect of computer type on the Test of English as a Foreign Language-Internet-Based Test (TOEFL iBT) Writing Test, a sample of 444 international students was used. The students were randomly assigned to either a laptop or a desktop computer to write two TOEFL iBT practice essays in a simulated testing environment, followed…

  9. Automatic Assignment of Methyl-NMR Spectra of Supramolecular Machines Using Graph Theory.

    PubMed

    Pritišanac, Iva; Degiacomi, Matteo T; Alderson, T Reid; Carneiro, Marta G; Ab, Eiso; Siegal, Gregg; Baldwin, Andrew J

    2017-07-19

    Methyl groups are powerful probes for the analysis of structure, dynamics and function of supramolecular assemblies, using both solution- and solid-state NMR. Widespread application of the methodology has been limited due to the challenges associated with assigning spectral resonances to specific locations within a biomolecule. Here, we present Methyl Assignment by Graph Matching (MAGMA) for the automatic assignment of methyl resonances. A graph matching protocol examines all possibilities for each resonance in order to determine an exact assignment that includes a complete description of any ambiguity. MAGMA gives 100% accuracy in confident assignments when tested against both synthetic data and 9 cross-validated examples using both solution- and solid-state NMR data. We show that this remarkable accuracy enables a user to distinguish between alternative protein structures. In a drug discovery application on HSP90, we show the method can rapidly and efficiently distinguish between possible ligand binding modes. By providing an exact and robust solution to methyl resonance assignment, MAGMA can facilitate significantly accelerated studies of supramolecular machines using methyl-based NMR spectroscopy.
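
    The graph-matching idea can be illustrated with a brute-force toy (not the MAGMA algorithm; the residues, contacts, and resonance names are invented): find every assignment of resonances to structural methyls such that each observed NOE edge maps onto a contact edge of the structure, so any remaining ambiguity is enumerated exactly.

```python
from itertools import permutations

# Toy exact graph matching: NOE connectivity among observed resonances must
# embed into the methyl-methyl contact graph computed from a 3D structure.

structure_methyls = ["L10", "V23", "I47"]          # hypothetical methyls
contact_edges = {("L10", "V23"), ("V23", "I47")}   # from the structure
noe_edges = {("r1", "r2"), ("r2", "r3")}           # observed NOEs

contacts = {tuple(sorted(e)) for e in contact_edges}

def consistent(mapping):
    return all(tuple(sorted((mapping[a], mapping[b]))) in contacts
               for a, b in noe_edges)

resonances = ["r1", "r2", "r3"]
assignments = [dict(zip(resonances, perm))
               for perm in permutations(structure_methyls)
               if consistent(dict(zip(resonances, perm)))]
print(assignments)
```

    Here two assignments survive, both placing the hub resonance r2 on V23: exactly the "complete description of any ambiguity" the abstract refers to, with the chain direction left undetermined by the data.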

  10. A model of tailoring effects: A randomized controlled trial examining the mechanisms of tailoring in a web-based STD screening intervention.

    PubMed

    Lustria, Mia Liza A; Cortese, Juliann; Gerend, Mary A; Schmitt, Karla; Kung, Ying Mai; McLaughlin, Casey

    2016-11-01

    This study explores the mechanisms of tailoring within the context of RU@Risk a brief Web-based intervention designed to promote sexually transmitted disease (STD) testing among young adults. This is one of a few studies to empirically examine theorized message processing mechanisms of tailoring and persuasion outcomes in a single model. Sexually active college students (N = 1065) completed a pretest, were randomly assigned to explore a tailored or nontailored website, completed a posttest, and were offered the opportunity to order a free at-home STD test kit. As intervention effects were hypothesized to work via increases in perceived risk, change in perceived risk from pretest to posttest by condition was examined. Hypothesized mechanisms of tailoring (perceived personal relevance, attention, and elaboration) were examined using structural equation modeling (SEM). All analyses controlled for demographic variables and sexual history. As predicted, perceived risk of STDs increased from pretest to posttest, but only in the tailored condition. Results revealed that exposure to the tailored (vs. nontailored) website increased perceived personal relevance, attention to, and elaboration of the message. These effects in turn were associated with greater perceived risk of STDs and intentions to get tested. Additionally, participants in the tailored condition were more likely to order a test kit. Findings provide insight into the mechanisms of tailoring with important implications for optimizing message design. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  11. Power calculations for likelihood ratio tests for offspring genotype risks, maternal effects, and parent-of-origin (POO) effects in the presence of missing parental genotypes when unaffected siblings are available.

    PubMed

    Rampersaud, E; Morris, R W; Weinberg, C R; Speer, M C; Martin, E R

    2007-01-01

    Genotype-based likelihood-ratio tests (LRT) of association that examine maternal and parent-of-origin effects have been previously developed in the framework of log-linear and conditional logistic regression models. In the situation where parental genotypes are missing, the expectation-maximization (EM) algorithm has been incorporated in the log-linear approach to allow incomplete triads to contribute to the LRT. We present an extension to this model which we call the Combined_LRT that incorporates additional information from the genotypes of unaffected siblings to improve assignment of incompletely typed families to mating type categories, thereby improving inference of missing parental data. Using simulations involving a realistic array of family structures, we demonstrate the validity of the Combined_LRT under the null hypothesis of no association and provide power comparisons under varying levels of missing data and using sibling genotype data. We demonstrate the improved power of the Combined_LRT compared with the family-based association test (FBAT), another widely used association test. Lastly, we apply the Combined_LRT to a candidate gene analysis in Autism families, some of which have missing parental genotypes. We conclude that the proposed log-linear model will be an important tool for future candidate gene studies, for many complex diseases where unaffected siblings can often be ascertained and where epigenetic factors such as imprinting may play a role in disease etiology.
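
    The generic shape of such a likelihood-ratio test can be sketched as follows (the log-likelihood values are hypothetical, and this is the general LRT recipe, not the paper's log-linear model); for one degree of freedom the chi-square survival function has the closed form erfc(sqrt(x/2)).

```python
import math

# Generic likelihood-ratio test: the statistic 2*(ll_alt - ll_null) is
# referred to a chi-square distribution with df equal to the difference in
# free parameters. For df = 1, P(X > x) = erfc(sqrt(x/2)).

def lrt_pvalue_df1(ll_null, ll_alt):
    stat = 2.0 * (ll_alt - ll_null)
    return stat, math.erfc(math.sqrt(stat / 2.0))

# hypothetical maximized log-likelihoods under the null and alternative
stat, p = lrt_pvalue_df1(ll_null=-520.3, ll_alt=-517.1)
print(round(stat, 2), round(p, 4))
```

    For more than one degree of freedom one would use a chi-square survival function (e.g. `scipy.stats.chi2.sf`) instead of the df = 1 closed form.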

  12. Hypermedia and Research Methodology: An Associative Research Model. Final Report.

    ERIC Educational Resources Information Center

    City Univ. of New York, NY. Bernard Baruch Coll.

    The purpose of the project described in this three-part report was to test the value of hypermedia for library instruction. The project included the development of a hypermedia program built around a marketing topic; implementation and testing of the program with one group of students assigned to use the new program; evaluation of the project by…

  13. Statistical Refinement of the Q-Matrix in Cognitive Diagnosis

    ERIC Educational Resources Information Center

    Chiu, Chia-Yi

    2013-01-01

    Most methods for fitting cognitive diagnosis models to educational test data and assigning examinees to proficiency classes require the Q-matrix that associates each item in a test with the cognitive skills (attributes) needed to answer it correctly. In most cases, the Q-matrix is not known but is constructed from the (fallible) judgments of…
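
    To make the role of the Q-matrix concrete, the toy sketch below builds a small item-by-attribute matrix and computes DINA-style ideal responses (an examinee answers an item correctly iff every required attribute is mastered); the matrix entries and the conjunctive rule are illustrative, not taken from the article:

```python
# Toy Q-matrix: rows = items, columns = attributes (1 = attribute required)
Q = [
    [1, 0, 0],  # item 1 requires attribute A only
    [1, 1, 0],  # item 2 requires attributes A and B
    [0, 1, 1],  # item 3 requires attributes B and C
]

def ideal_response(alpha, q_row):
    """DINA-style ideal response: 1 iff the examinee masters every required attribute."""
    return int(all(a >= q for a, q in zip(alpha, q_row)))

alpha = [1, 1, 0]  # examinee masters A and B but not C
pattern = [ideal_response(alpha, row) for row in Q]
```

    A misspecified Q-matrix entry changes these ideal patterns, which is precisely why statistical refinement of the judgment-based matrix matters.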

  14. Dimensions of Intelligent Systems

    DTIC Science & Technology

    2002-08-01

    Keywords: IS, Intelligent Systems, Turing Test, Cognitive Model, situated cognition, BDI, Deep Blue, constructionism 1: Introduction Investigation of...Our social experience provides an implicit observer bias to assign mentality and intentions to the system in a test, and many would argue that...extended the intentional notions of Belief, Desire, and Intention (BDI) to include social “properties” of Value

  15. Two-year outcome of team-based intensive case management for patients with schizophrenia.

    PubMed

    Aberg-Wistedt, A; Cressell, T; Lidberg, Y; Liljenberg, B; Osby, U

    1995-12-01

    Two-year outcomes of patients with schizophrenic disorders who were assigned to an intensive, team-based case management program and patients who received standard psychiatric services were assessed. The case management model featured increased staff contact time with patients, rehabilitation plans based on patients' expressed needs, and patients' attendance at team meetings where their rehabilitation plan was discussed. Forty patients were randomly assigned to either the case management group or the control group that received standard services. Patients' use of emergency and inpatient services, their quality of life, the size of their social networks, and their relatives' burden of care were assessed at assignment to the study groups and at two-year follow-up. Patients in the case management group had significantly fewer emergency visits compared with the two years before the study, and their relatives reported significantly reduced burden of care associated with relationships with psychiatric services over the two-year period. The size of patients' social networks increased for the case management group and decreased for the control group. A team-based intensive case management model is an effective intervention in the rehabilitation of patients with chronic schizophrenia.

  16. Traffic and Driving Simulator Based on Architecture of Interactive Motion.

    PubMed

    Paz, Alexander; Veeramisti, Naveen; Khaddar, Romesh; de la Fuente-Mella, Hanns; Modorcea, Luiza

    2015-01-01

    This study proposes an architecture for an interactive motion-based traffic simulation environment. In order to enhance modeling realism involving actual human beings, the proposed architecture integrates multiple types of simulation, including: (i) motion-based driving simulation, (ii) pedestrian simulation, (iii) motorcycling and bicycling simulation, and (iv) traffic flow simulation. The architecture has been designed to enable the simulation of the entire network; as a result, the actual driver, pedestrian, and bike rider can navigate anywhere in the system. In addition, the background traffic interacts with the actual human beings. This is accomplished by using a hybrid mesomicroscopic traffic flow simulation modeling approach. The mesoscopic traffic flow simulation model loads the results of a user equilibrium traffic assignment solution and propagates the corresponding traffic through the entire system. The microscopic traffic flow simulation model provides background traffic around the vicinities where actual human beings are navigating the system. The two traffic flow simulation models interact continuously to update system conditions based on the interactions between actual humans and the fully simulated entities. Implementation efforts are currently in progress and some preliminary tests of individual components have been conducted. The implementation of the proposed architecture faces significant challenges ranging from multiplatform and multilanguage integration to multievent communication and coordination.
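
    One way to picture the hybrid meso-microscopic hand-off is a distance rule: background vehicles near a human-controlled entity are handled by the microscopic model, and the rest by the mesoscopic model. The 1-D sketch below illustrates that idea; the rule and the radius are assumptions for illustration, as the paper does not specify the coupling logic at this level:

```python
def simulation_mode(vehicle_pos, human_positions, radius=500.0):
    """Hypothetical hand-off rule: simulate microscopically near any human driver."""
    near_human = any(abs(vehicle_pos - h) <= radius for h in human_positions)
    return "microscopic" if near_human else "mesoscopic"

humans = [1200.0, 4800.0]  # positions (m) of human-controlled vehicles on a corridor
modes = [simulation_mode(p, humans) for p in (1000.0, 3000.0, 5000.0)]
```

    In the architecture described above, the mesoscopic side would additionally propagate the user-equilibrium assignment flows that the microscopic vicinity inherits as boundary conditions.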

  17. Traffic and Driving Simulator Based on Architecture of Interactive Motion

    PubMed Central

    Paz, Alexander; Veeramisti, Naveen; Khaddar, Romesh; de la Fuente-Mella, Hanns; Modorcea, Luiza

    2015-01-01

    This study proposes an architecture for an interactive motion-based traffic simulation environment. In order to enhance modeling realism involving actual human beings, the proposed architecture integrates multiple types of simulation, including: (i) motion-based driving simulation, (ii) pedestrian simulation, (iii) motorcycling and bicycling simulation, and (iv) traffic flow simulation. The architecture has been designed to enable the simulation of the entire network; as a result, the actual driver, pedestrian, and bike rider can navigate anywhere in the system. In addition, the background traffic interacts with the actual human beings. This is accomplished by using a hybrid mesomicroscopic traffic flow simulation modeling approach. The mesoscopic traffic flow simulation model loads the results of a user equilibrium traffic assignment solution and propagates the corresponding traffic through the entire system. The microscopic traffic flow simulation model provides background traffic around the vicinities where actual human beings are navigating the system. The two traffic flow simulation models interact continuously to update system conditions based on the interactions between actual humans and the fully simulated entities. Implementation efforts are currently in progress and some preliminary tests of individual components have been conducted. The implementation of the proposed architecture faces significant challenges ranging from multiplatform and multilanguage integration to multievent communication and coordination. PMID:26491711

  18. Holistic model-based monitoring of the human health status in an urban environment system: pilot study in Verona city, Italy.

    PubMed

    Tarocco, S; Amoruso, I; Caravello, G

    2011-06-01

    In recent decades the global health paradigm has acquired an increasingly systemic characterization. The ecosystem health theory states that a healthy ecosystem, whether natural or artificial, significantly contributes to the good health of the human population. The present study describes an interdisciplinary monitoring model that retrospectively analyzes the intersection between the urban environment and citizens. The model analyzes both the biophysical and the anthropic subsystems through the application of landscape ecology and environmental quality indexes along with human health indicators. In particular, the ecological quality of the landscape pattern, atmospheric pollution, outdoor noise levels and local health indicators were assessed. Verona municipality was chosen as the study area to test the preliminary efficiency of the model. The territory was split into two superimposed layers of land units, which were further geo-referenced with Geographical Information System (GIS) technology. Interdependence among the analyzed traits was further investigated with Fisher's exact test. Landscape composition was assessed and an Average Ecological Quality (AEQ) score was assigned to each land unit. A direct proportionality emerged between concentrations of the considered air pollutants and traffic levels: a spatial model of the atmospheric pollution was drawn. A map depicting the distribution of traffic-related noise levels was also drawn. From the chosen indicators, a quality class score was assigned to every minor and major land unit. Age-standardised hospitalization rates for the municipal population and specific rates per 1000 inhabitants for the over-65s were calculated. The quality class assignment for each health indicator was graphically rendered. After direct standardisation of rates for the population sample, data were compared with two reference populations, the Regional population and the Local Socio-sanitary Unit (ULSS20) population. Standardised hospitalization rates for the whole municipal population were consistently lower than the ULSS20 rates, except for auditory pathologies. Notably, hospitalization rates for cancerous diseases in the Verona municipal population were four times and two times lower than those of the ULSS20 and the Regional population, respectively. Contingency tables were constructed for the main health indicator (specific rates per 1000 inhabitants for the over-65s) and the key environmental quality factors of landscape ecological quality, outdoor noise level and air pollution. The H0 of independence was rejected (α = 0.05) for respiratory pathologies and air pollution, and for the triad of cardiocirculatory pathologies, air pollution and landscape ecological quality. Fisher's exact test confirmed the non-independence of cardiocirculatory diseases and the biophysical environment, and the analogous association for respiratory pathologies when the comparison was made with the global environmental quality index. The first testing of the model suggests some possible elements of implementation and integration that could further enhance it; among them, the subjective investigation of health status assumes a primary role. On the whole, the monitoring model seems to represent effectively the real complexity of urban environment systems and should be regarded as an important contribution to this new approach to health research.
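
    Independence tests of the kind applied to these contingency tables can be reproduced mechanically with SciPy's fisher_exact; the 2x2 counts below are invented for illustration and are not the Verona data:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: land units cross-classified by air-pollution class
# (rows: high/low) and respiratory-hospitalization class (columns: high/low)
table = [[8, 2],
         [1, 5]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
reject_h0 = p_value < 0.05  # H0: the two classifications are independent
```

    Fisher's exact test is the natural choice here because the per-unit counts in such a study are small enough that a chi-square approximation would be unreliable.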

  19. A multi-objective approach to improve SWAT model calibration in alpine catchments

    NASA Astrophysics Data System (ADS)

    Tuo, Ye; Marcolini, Giorgia; Disse, Markus; Chiogna, Gabriele

    2018-04-01

    Multi-objective hydrological model calibration can represent a valuable solution to reduce model equifinality and parameter uncertainty. The Soil and Water Assessment Tool (SWAT) model is widely applied to investigate water quality and water management issues in alpine catchments. However, the model calibration is generally based on discharge records only, and most of the previous studies have defined a unique set of snow parameters for an entire basin. Only a few studies have considered snow observations to validate model results or have taken into account the possible variability of snow parameters for different subbasins. This work presents and compares three possible calibration approaches. The first two procedures are single-objective calibration procedures, for which all parameters of the SWAT model were calibrated according to river discharge alone. Procedures I and II differ from each other by the assumption used to define snow parameters: The first approach assigned a unique set of snow parameters to the entire basin, whereas the second approach assigned different subbasin-specific sets of snow parameters to each subbasin. The third procedure is a multi-objective calibration, in which we considered snow water equivalent (SWE) information at two different spatial scales (i.e. subbasin and elevation band), in addition to discharge measurements. We tested these approaches in the Upper Adige river basin where a dense network of snow depth measurement stations is available. Only the set of parameters obtained with this multi-objective procedure provided an acceptable prediction of both river discharge and SWE. These findings offer the large community of SWAT users a strategy to improve SWAT modeling in alpine catchments.
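
    A multi-objective calibration of this kind ultimately scores each candidate parameter set against more than one observation type. The sketch below combines discharge and SWE goodness-of-fit through the Nash-Sutcliffe efficiency with equal weights; the metric and the weights are plausible illustrative choices, not necessarily those used by the authors:

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; below 0 is worse than the mean."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

def multi_objective(q_obs, q_sim, swe_obs, swe_sim, w_q=0.5, w_swe=0.5):
    """Weighted combination of discharge and SWE efficiencies (weights illustrative)."""
    return w_q * nse(q_obs, q_sim) + w_swe * nse(swe_obs, swe_sim)

# Hypothetical short series: discharge (m^3/s) and subbasin SWE (mm)
score = multi_objective([10, 20, 30], [12, 18, 31], [50, 80, 60], [48, 83, 61])
```

    Maximizing a combined score like this is what prevents the optimizer from compensating a poor snow simulation with an artificially good discharge fit.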

  20. Self-regulated learning in a dynamic coaching model for supporting college students with traumatic brain injury: two case reports.

    PubMed

    Kennedy, Mary R T; Krause, Miriam O

    2011-01-01

    To describe a program that integrates self-regulated learning theory with supported education for college students with traumatic brain injury using a dynamic coaching model; to demonstrate the feasibility of developing and implementing such a program; and to identify individualized outcomes. Case study comparisons. University setting. Two severely injured students with cognitive impairments. A dynamic coaching model of supported education which incorporated self-regulated learning was provided for students with traumatic brain injury while attending college. Outcomes were both short and long term including decontextualized standardized test scores, self-reported academic challenges, number and specificity of reported strategies, grades on assignments, number of credits completed versus attempted, and changes in academic status and campus life. Students improved on graded assignments after strategy instruction and reported using more strategies by the end of the year. Students completed most of the credits they attempted, were in good academic standing, and made positive academic decisions. Performance on decontextualized tests pre- and postintervention was variable. It is feasible to deliver a hybrid supported education program that is dynamically responsive to individual students' needs and learning styles. Reasons for including both functional and standardized test outcomes are discussed.

  1. Consultation-based academic interventions for children with ADHD: effects on reading and mathematics achievement.

    PubMed

    DuPaul, George J; Jitendra, Asha K; Volpe, Robert J; Tresco, Katy E; Lutz, J Gary; Vile Junod, Rosemary E; Cleary, Kristi S; Flammer, Lizette M; Mannella, Mark C

    2006-10-01

    The purpose of this investigation was to evaluate the relative efficacy of two consultation-based models for designing academic interventions to enhance the educational functioning of children with attention-deficit/hyperactivity disorder (ADHD). Children (N=167) meeting DSM-IV criteria for ADHD were randomly assigned to one of two consultation groups: Individualized Academic Intervention (IAI; interventions designed using a data-based decision-making model that involved ongoing feedback to teachers) and Generic Academic Intervention (GAI; interventions designed based on consultant-teacher collaboration, representing "consultation as usual"). Teachers implemented academic interventions over 15 months. Academic outcomes (e.g., standardized achievement test, and teacher ratings of academic skills) were assessed on four occasions (baseline, 3 months, 12 months, 15 months). Hierarchical linear modeling analyses indicated significant positive growth for 8 of the 14 dependent variables; however, trajectories did not differ significantly across consultation groups. Interventions in the IAI group were delivered with significantly greater integrity; however, groups did not differ with respect to teacher ratings of treatment acceptability. The results of this study provide partial support for the effectiveness of consultation-based academic interventions in enhancing educational functioning in children with ADHD; however, the relative advantages of an individualized model over "consultation as usual" have yet to be established.

  2. Functional insights from proteome-wide structural modeling of Treponema pallidum subspecies pallidum, the causative agent of syphilis.

    PubMed

    Houston, Simon; Lithgow, Karen Vivien; Osbak, Kara Krista; Kenyon, Chris Richard; Cameron, Caroline E

    2018-05-16

    Syphilis continues to be a major global health threat with 11 million new infections each year, and a global burden of 36 million cases. The causative agent of syphilis, Treponema pallidum subspecies pallidum, is a highly virulent bacterium, however the molecular mechanisms underlying T. pallidum pathogenesis remain to be definitively identified. This is due to the fact that T. pallidum is currently uncultivatable, inherently fragile and thus difficult to work with, and phylogenetically distinct with no conventional virulence factor homologs found in other pathogens. In fact, approximately 30% of its predicted protein-coding genes have no known orthologs or assigned functions. Here we employed a structural bioinformatics approach using Phyre2-based tertiary structure modeling to improve our understanding of T. pallidum protein function on a proteome-wide scale. Phyre2-based tertiary structure modeling generated high-confidence predictions for 80% of the T. pallidum proteome (780/978 predicted proteins). Tertiary structure modeling also inferred the same function as primary structure-based annotations from genome sequencing pipelines for 525/605 proteins (87%), which represents 54% (525/978) of all T. pallidum proteins. Of the 175 T. pallidum proteins modeled with high confidence that were not assigned functions in the previously annotated published proteome, 167 (95%) were able to be assigned predicted functions. Twenty-one of the 175 hypothetical proteins modeled with high confidence were also predicted to exhibit significant structural similarity with proteins experimentally confirmed to be required for virulence in other pathogens. Phyre2-based structural modeling is a powerful bioinformatics tool that has provided insight into the potential structure and function of the majority of T. pallidum proteins and helped validate the primary structure-based annotation of more than 50% of all T. pallidum proteins with high confidence. 
This work represents the first T. pallidum proteome-wide structural modeling study and is one of few studies to apply this approach for the functional annotation of a whole proteome.

  3. Basic research on design analysis methods for rotorcraft vibrations

    NASA Technical Reports Server (NTRS)

    Hanagud, S.

    1991-01-01

    The objective of the present work was to develop a method for identifying physically plausible finite element system models of airframe structures from test data. The assumed models were based on linear elastic behavior with general (nonproportional) damping. Physical plausibility of the identified system matrices was insured by restricting the identification process to designated physical parameters only and not simply to the elements of the system matrices themselves. For example, in a large finite element model the identified parameters might be restricted to the moduli for each of the different materials used in the structure. In the case of damping, a restricted set of damping values might be assigned to finite elements based on the material type and on the fabrication processes used. In this case, different damping values might be associated with riveted, bolted and bonded elements. The method itself is developed first, and several approaches are outlined for computing the identified parameter values. The method is applied first to a simple structure for which the 'measured' response is actually synthesized from an assumed model. Both stiffness and damping parameter values are accurately identified. The true test, however, is the application to a full-scale airframe structure. In this case, a NASTRAN model and actual measured modal parameters formed the basis for the identification of a restricted set of physically plausible stiffness and damping parameters.

  4. Does Head Start differentially benefit children with risks targeted by the program’s service model?☆

    PubMed Central

    Miller, Elizabeth B.; Farkas, George; Duncan, Greg J.

    2015-01-01

    Data from the Head Start Impact Study (N = 3540) were used to test for differential benefits of Head Start after one program year and after kindergarten on pre-academic and behavior outcomes for children at risk in the domains targeted by the program’s comprehensive services. Although random assignment to Head Start produced positive treatment main effects on children’s pre-academic skills and behavior problems, residualized growth models showed that random assignment to Head Start did not differentially benefit the pre-academic skills of children with risk factors targeted by the Head Start service model. The models showed detrimental impacts of Head Start for maternal-reported behavior problems of high-risk children, but slightly more positive impacts for teacher-reported behavior. Policy implications for Head Start are discussed. PMID:26379369

  5. Mobile diabetes intervention study: testing a personalized treatment/behavioral communication intervention for blood glucose control.

    PubMed

    Quinn, Charlene C; Gruber-Baldini, Ann L; Shardell, Michelle; Weed, Kelly; Clough, Suzanne S; Peeples, Malinda; Terrin, Michael; Bronich-Hall, Lauren; Barr, Erik; Lender, Dan

    2009-07-01

    National data find glycemic control is within target (A1c < 7.0%) for 37% of patients with diabetes, and only 7% meet recommended glycemic, lipid, and blood pressure goals. To compare active interventions and usual care for glucose control in a randomized clinical trial (RCT) among persons with diabetes cared for by primary care physicians (PCPs) over the course of 1 year. Physician practices (n=36) in 4 geographic areas are randomly assigned to 1 of 4 study groups. The intervention is a diabetes communication system, using mobile phones and patient/physician portals to allow patient-specific treatment and communication. All physicians receive American Diabetes Association (ADA) Guidelines for diabetes care. Patients with poor diabetes control (A1c ≥ 7.5%) at baseline (n=260) are enrolled in study groups based on PCP randomization. All study patients receive blood glucose (BG) meters and a year's supply of testing materials. Patients in the three treatment groups select one of two mobile phone models, receive a one-year unlimited mobile phone data and service plan, register on the web-based individual patient portal and receive study treatment phone software based on study assignment. Control group patients receive usual care from their PCP. The primary outcome is mean change in A1c over a 12-month intervention period. Traditional methods of disease management have not achieved adequate control for BG and other conditions important to persons with diabetes. Tools to improve communication between patients and PCPs may improve patient outcomes and be satisfactory to patients and physicians. This RCT is ongoing.

  6. A Statistical Discrimination Experiment for Eurasian Events Using a Twenty-Seven-Station Network

    DTIC Science & Technology

    1980-07-08

    to test the effectiveness of a multivariate method of analysis for distinguishing earthquakes from explosions. The data base for the experiment...the weight assigned to each variable whenever a new one is added. Jennrich, R. I. (1977). Stepwise discriminant analysis, in Statistical Methods for

  7. System Engineering of Aerospace and Advanced Technology Programs at an Astronautics Company

    NASA Astrophysics Data System (ADS)

    Kennedy, Mike O.

    The purpose of this Record of Study is to document an internship with the Martin Marietta Astronautics Group in Denver, Colorado that was performed in partial fulfillment of the requirements for the Doctor of Engineering degree at Texas A&M University, and to demonstrate that the internship objectives have been met. The internship included assignments with two Martin Marietta companies, on three different programs and in four areas of engineering. The Record of Study takes a first-hand look at system engineering, SDI and advanced program management, and the way Martin Marietta conducts business. The five internship objectives were related to assignments in system modeling, system integration, engineering analysis and technical management. In support of the first objective, the effects of thermally and mechanically induced mirror surface distortions upon the wavefront intensity field of a high energy laser beam passing through the optical train of a space-based laser system were modeled. To satisfy the second objective, the restrictive as opposed to the broad interpretation of the 1972 ABM Treaty, and the capability of the Strategic Defense Initiative Zenith Star Program to comply with the Treaty were evaluated. For the third objective, the capability of Martin Marietta to develop an automated analysis system to integrate and analyze Superconducting Super Collider detector designs was investigated. For the fourth objective, the thermal models that were developed in support of the Small Intercontinental Ballistic Missile flight tests were described. And in response to the fifth objective, the technical management role of the Product Integrity Engineer assigned to the Zenith Star spacecraft's Beam Control and Transfer Subsystem was discussed. This Record of Study explores the relationships between the engineering, business, security and social concerns associated with the practice of engineering and the management of programs by a major defense contractor.

  8. Teaching Composition Skills with Weekly Multiple Choice Tests in Lieu of Theme Writing. Final Report.

    ERIC Educational Resources Information Center

    Scannell, Dale P.; Haugh, Oscar M.

    The purpose of the study was to compare the effectiveness with which composition skills could be taught by the traditional theme-assignment approach and by an experimental method using weekly multiple-choice composition tests in lieu of theme writing. The weekly tests were based on original but typical first-draft compositions and covered problems…

  9. Modeling Teaching with a Computer-Based Concordancer in a TESL Preservice Teacher Education Program.

    ERIC Educational Resources Information Center

    Gan, Siowck-Lee; And Others

    1996-01-01

    This study modeled teaching with a computer-based concordancer in a Teaching English-as-a-Second-Language program. Preservice teachers were randomly assigned to work with computer concordancing software or vocabulary exercises to develop word attack skills. Pretesting and posttesting indicated that computer concordancing was more effective in…

  10. Implementing a Learning Model for a Practical Subject in Distance Education.

    ERIC Educational Resources Information Center

    Weller, M. J.; Hopgood, A. A.

    1997-01-01

    Artificial Intelligence for Technology, a distance learning course at the Open University, is based on a learning model that combines conceptualization, construction, and dialog. This allows a practical emphasis which has been difficult to implement in distance education. The course uses commercial software, real-world-based assignments, and a…

  11. Investigating the Relationships among Metacognitive Strategy Training, Willingness to Read English Medical Texts, and Reading Comprehension Ability Using Structural Equation Modeling

    ERIC Educational Resources Information Center

    Hassanpour, Masoumeh; Ghonsooly, Behzad; Nooghabi, Mehdi Jabbari; Shafiee, Mohammad Naser

    2017-01-01

    This quasi-experimental study examined the relationship between students' metacognitive awareness and willingness to read English medical texts. So, a model was proposed and tested using structural equation modeling (SEM) with R software. Participants included 98 medical students of two classes. One class was assigned as the control group and the…

  12. “If It’s Not Working, Why Would They Be Testing It?”: mental models of HIV vaccine trials and preventive misconception among men who have sex with men in India

    PubMed Central

    2013-01-01

    Background Informed consent based on comprehension of potential risks and benefits is fundamental to the ethical conduct of clinical research. We explored mental models of candidate HIV vaccines and clinical trials that may impact on the feasibility and ethics of biomedical HIV prevention trials among men who have sex with men (MSM) in India. Methods A community-based research project was designed and implemented in partnership with community-based organizations serving MSM in Chennai and Mumbai. We conducted 12 focus groups (n = 68) with diverse MSM and 14 key informant interviews with MSM community leaders/service providers using a semi-structured interview guide to explore knowledge and beliefs about HIV vaccines and clinical trials. Focus groups (60–90 minutes) and interviews (45–60 minutes) were conducted in participants’ native language (Tamil in Chennai; Marathi or Hindi in Mumbai), audio-taped, transcribed and translated into English. We explored focus group and interview data using thematic analysis and a constant comparative method, with a focus on mental models of HIV vaccines and clinical trials. Results A mental model of HIV vaccine-induced seropositivity as “having HIV” resulted in fears of vaccine-induced infection and HIV stigma. Some participants feared inactivated vaccines might “drink blood” and “come alive”. Pervasive preventive misconception was based on a mental model of prevention trials as interventions, overestimation of likely efficacy of candidate vaccines and likelihood of being assigned to the experimental group, with expectations of protective benefits and decreased condom use. Widespread misunderstanding and lack of acceptance of placebo and random assignment supported perceptions of clinical trials as “cheating”. Key informants expressed concerns that volunteers from vulnerable Indian communities were being used as “experimental rats” to benefit high-income countries. 
Conclusions Evidence-informed interventions that engage with shared mental models among potential trial volunteers, along with policies and funding mechanisms that ensure local access to products that demonstrate efficacy in trials, may support the safe and ethical implementation of HIV vaccine trials in India. PMID:23919283

  13. Assessing the skeletal age from a hand radiograph: automating the Tanner-Whitehouse method

    NASA Astrophysics Data System (ADS)

    Niemeijer, Meindert; van Ginneken, Bram; Maas, Casper A.; Beek, Frederik J. A.; Viergever, Max A.

    2003-05-01

    The skeletal maturity of children is usually assessed from a standard radiograph of the left hand and wrist. An established clinical method to determine the skeletal maturity is the Tanner-Whitehouse (TW2) method. This method divides the skeletal development into several stages (labelled A, B, ..., I). We are developing an automated system based on this method. In this work we focus on assigning a stage to one region of interest (ROI), the middle phalanx of the third finger. We classify each ROI as follows. A number of ROIs which have been assigned a certain stage by a radiologist are used to construct a mean image for that stage. For a new input ROI, landmarks are detected by using an Active Shape Model. These are used to align the mean images with the input image. Subsequently the correlation between each transformed mean stage image and the input is calculated. The input ROI can be assigned to the stage with the highest correlation directly, or the values can be used as features in a classifier. The method was tested on 71 cases ranging from stage E to I. The ROI was staged correctly in 73.2% of all cases, and in 97.2% of all incorrectly staged cases the error was not more than one stage.
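
    The direct stage-assignment step described above reduces to picking the mean-image template with the highest correlation. A stdlib-Python sketch follows; the 4-pixel "images" are invented for illustration, and the alignment step via the Active Shape Model is omitted:

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length pixel vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def assign_stage(roi, mean_images):
    """Assign the stage whose (already aligned) mean image best correlates with the ROI."""
    return max(mean_images, key=lambda stage: pearson(roi, mean_images[stage]))

# Toy 4-pixel mean images for three stages (values illustrative only)
mean_images = {"E": [0.9, 0.1, 0.1, 0.1],
               "F": [0.1, 0.9, 0.1, 0.1],
               "G": [0.1, 0.1, 0.9, 0.1]}
roi = [0.2, 0.8, 0.15, 0.1]  # most similar in shape to the stage-F template
```

    As the abstract notes, the per-stage correlation values can also serve as a feature vector for a downstream classifier instead of being argmax-ed directly.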

  14. The UCERF3 grand inversion: Solving for the long‐term rate of ruptures in a fault system

    USGS Publications Warehouse

    Page, Morgan T.; Field, Edward H.; Milner, Kevin; Powers, Peter M.

    2014-01-01

    We present implementation details, testing, and results from a new inversion‐based methodology, known colloquially as the “grand inversion,” developed for the Uniform California Earthquake Rupture Forecast (UCERF3). We employ a parallel simulated annealing algorithm to solve for the long‐term rate of all ruptures that extend through the seismogenic thickness on major mapped faults in California while simultaneously satisfying available slip‐rate, paleoseismic event‐rate, and magnitude‐distribution constraints. The inversion methodology enables the relaxation of fault segmentation and allows for the incorporation of multifault ruptures, which are needed to remove magnitude‐distribution misfits that were present in the previous model, UCERF2. The grand inversion is more objective than past methodologies, as it eliminates the need to prescriptively assign rupture rates. It also provides a means to easily update the model as new data become available. In addition to UCERF3 model results, we present verification of the grand inversion, including sensitivity tests, tuning of equation set weights, convergence metrics, and a synthetic test. These tests demonstrate that while individual rupture rates are poorly resolved by the data, integrated quantities such as magnitude–frequency distributions and, most importantly, hazard metrics, are much more robust.
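
    Simulated annealing of the kind used by the grand inversion can be sketched in a few lines. The misfit below is a toy stand-in for UCERF3's full constraint set (slip rates, paleoseismic event rates, magnitude distribution), and every number in it is illustrative:

```python
import math
import random

def misfit(rates, slip_target):
    """Toy data misfit: squared error between summed rupture rates and one target."""
    return (sum(rates) - slip_target) ** 2

def anneal(n_ruptures, slip_target, steps=20000, seed=0):
    """Minimal simulated annealing over non-negative rupture rates (illustrative)."""
    rng = random.Random(seed)
    rates = [0.0] * n_ruptures
    energy = misfit(rates, slip_target)
    for step in range(steps):
        temp = 1.0 / (1 + step)  # simple cooling schedule
        cand = rates[:]
        i = rng.randrange(n_ruptures)
        cand[i] = max(0.0, cand[i] + rng.uniform(-0.1, 0.1))  # keep rates non-negative
        e = misfit(cand, slip_target)
        # Accept improvements always; accept worsenings with Boltzmann probability
        if e < energy or rng.random() < math.exp((energy - e) / temp):
            rates, energy = cand, e
    return rates, energy

rates, energy = anneal(n_ruptures=4, slip_target=1.0)
```

    This also illustrates the non-uniqueness the authors report: many rate vectors share the same summed misfit, so individual rates are poorly resolved even when integrated quantities fit well.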

  15. US hydropower resource assessment for Hawaii

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Francfort, J.E.

    1996-09-01

US DOE is developing an estimate of the undeveloped hydropower potential in the US. The Hydropower Evaluation Software (HES) is a computer model developed by INEL for this purpose. HES measures the undeveloped hydropower resources available in the US using uniform evaluation criteria. The software was tested using hydropower information and data provided by the Southwestern Power Administration. It is a menu-driven program that allows the PC user to assign environmental attributes to potential hydropower sites, calculate development suitability factors for each site based on those attributes, and generate reports. This report describes the resource assessment results for the State of Hawaii.

  16. Treatment of Obsessive Compulsive Disorder in Young Children: An Intervention Model and Case Series

    ERIC Educational Resources Information Center

    Ginsburg, Golda S.; Burstein, Marcy; Becker, Kimberly D.; Drake, Kelly L.

    2011-01-01

    This article presents an intervention model for young children with obsessive-compulsive disorder (OCD). The intervention, designed to reduce compulsive behavior and improve parenting practices, was tested using a multiple baseline design with 7 children (M = 6 years old; 57% female) in which participants were randomly assigned to 1, 2, or 3 weeks…

  17. Characteristics of HIV-infected U.S. Army soldiers linked in molecular transmission clusters, 2001-2012

    PubMed Central

    Jagodzinski, Linda L.; Liu, Ying; Pham, Peter T.; Kijak, Gustavo H.; Tovanabutra, Sodsai; McCutchan, Francine E.; Scoville, Stephanie L.; Cersovsky, Steven B.; Michael, Nelson L.; Scott, Paul T.; Peel, Sheila A.

    2017-01-01

Objective Recent surveillance data suggest the United States (U.S.) Army HIV epidemic is concentrated among men who have sex with men. To identify potential targets for HIV prevention strategies, the relationships between demographic and clinical factors and membership within transmission clusters based on baseline pol sequences of HIV-infected Soldiers from 2001 through 2012 were analyzed. Methods We conducted a retrospective analysis of the baseline partial pol sequences and the demographic and clinical characteristics available for all Soldiers in active service and newly diagnosed with HIV-1 infection from January 1, 2001 through December 31, 2012. HIV-1 subtype designations and transmission clusters were identified from phylogenetic analysis of the sequences. Univariate and multivariate logistic regression models were used to evaluate and adjust for the association between characteristics and cluster membership. Results Among 518 of 995 HIV-infected Soldiers with available partial pol sequences, 29% were members of a transmission cluster. Assignment to a southern U.S. region at diagnosis and year of diagnosis were independently associated with cluster membership after adjustment for other significant characteristics (p<0.10) of age, race, year of diagnosis, region of duty assignment, sexually transmitted infections, last negative HIV test, antiretroviral therapy, and transmitted drug resistance. Subtyping of the pol fragment indicated that HIV-1 subtype B infection predominated (94%) among HIV-infected Soldiers. Conclusion These findings identify areas to explore as HIV prevention targets in the U.S. Army. An increased frequency of current force testing may be justified, especially among Soldiers assigned to duty in installations with high local HIV prevalence, such as southern U.S. states. PMID:28759645

  18. Whole Protein Native Fitness Potentials

    NASA Astrophysics Data System (ADS)

    Faraggi, Eshel; Kloczkowski, Andrzej

    2013-03-01

Protein structure prediction can be separated into two tasks: sample the configuration space of the protein chain, and assign a fitness between these hypothetical models and the native structure of the protein. One of the more promising developments in this area is that of knowledge-based energy functions. However, standard approaches using pair-wise interactions have shown shortcomings, demonstrated by the superiority of multi-body potentials. These shortcomings arise because residue pair-wise interactions depend on other residues along the chain. We developed a method that uses whole-protein information filtered through machine learners to score protein models based on their likeness to native structures. For all models we calculated parameters associated with the distance to the solvent and with distances between residues. These parameters, in addition to energy estimates obtained using a four-body potential, DFIRE, and RWPlus, were used as training for machine learners to predict the fitness of the models. Testing on CASP 9 targets showed that our method is superior to DFIRE, RWPlus, and the four-body potential, which are considered standards in the field.

  19. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, J.; Polly, B.; Collis, J.

    2013-09-01

This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al. 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960s-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  20. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Robertson, Joseph; Polly, Ben; Collis, Jon

    2013-09-01

This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al. 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960s-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define "explicit" input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  1. Image categorization for marketing purposes

    NASA Astrophysics Data System (ADS)

    Almishari, Mishari I.; Lee, Haengju; Gnanasambandam, Nathan

    2011-03-01

Images meant for marketing and promotional purposes (i.e., coupons) represent a basic component in incentivizing customers to visit shopping outlets and purchase discounted commodities. They also help department stores attract more customers and, potentially, speed up their cash flow. While coupons are available from various sources (print, web, etc.), categorizing these monetary instruments is a benefit to users. We are interested in an automatic categorizer system that aggregates coupons from different sources (web, digital coupons, paper coupons, etc.) and assigns a type to each of them in an efficient manner. While there are several dimensions to this problem, in this paper we study the problem of accurately categorizing/classifying the coupons. We propose and evaluate four different categorization techniques, namely a word-based model, an n-gram-based model, an externally weighting model, and a weight-decaying model, which take advantage of known machine learning algorithms. These techniques achieve high accuracies in the range of 73.1% to 93.2%. We provide various examples of accuracy optimizations that can be performed and show a progressive increase in categorization accuracy for our test dataset.
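The flavor of an n-gram-based categorizer can be sketched with a toy character-trigram overlap classifier. The paper's exact features, learner, and data are not given here, so this is only an assumed illustration of the general idea, with invented example coupons and categories.

```python
from collections import Counter

def ngrams(text, n=3):
    """Character n-grams of the coupon text (case-folded)."""
    t = text.lower()
    return Counter(t[i:i + n] for i in range(len(t) - n + 1))

def train(labelled):
    """Build one aggregate n-gram profile per coupon category."""
    profiles = {}
    for text, cat in labelled:
        profiles.setdefault(cat, Counter()).update(ngrams(text))
    return profiles

def categorize(text, profiles):
    """Assign the category whose profile shares the most n-gram mass
    with the input coupon text."""
    grams = ngrams(text)
    def overlap(p):
        return sum(min(c, p[g]) for g, c in grams.items())
    return max(profiles, key=lambda cat: overlap(profiles[cat]))

# Invented toy training coupons with two categories.
data = [("20% off all shoes and boots", "apparel"),
        ("buy one pizza get one free", "food"),
        ("half price jeans this weekend", "apparel"),
        ("free drink with any burger", "food")]
profiles = train(data)
label = categorize("two for one burger deal", profiles)
```

A real system would normalize text from OCR'd paper coupons and web scrapes before n-gram extraction.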

  2. Writing Assignments with a Metacognitive Component Enhance Learning in a Large Introductory Biology Course

    PubMed Central

    Mynlieff, Michelle; Manogaran, Anita L.; St. Maurice, Martin

    2014-01-01

    Writing assignments, including note taking and written recall, should enhance retention of knowledge, whereas analytical writing tasks with metacognitive aspects should enhance higher-order thinking. In this study, we assessed how certain writing-intensive “interventions,” such as written exam corrections and peer-reviewed writing assignments using Calibrated Peer Review and including a metacognitive component, improve student learning. We designed and tested the possible benefits of these approaches using control and experimental variables across and between our three-section introductory biology course. Based on assessment, students who corrected exam questions showed significant improvement on postexam assessment compared with their nonparticipating peers. Differences were also observed between students participating in written and discussion-based exercises. Students with low ACT scores benefited equally from written and discussion-based exam corrections, whereas students with midrange to high ACT scores benefited more from written than discussion-based exam corrections. Students scored higher on topics learned via peer-reviewed writing assignments relative to learning in an active classroom discussion or traditional lecture. However, students with low ACT scores (17–23) did not show the same benefit from peer-reviewed written essays as the other students. These changes offer significant student learning benefits with minimal additional effort by the instructors. PMID:26086661

  3. Topic modeling for cluster analysis of large biological and medical datasets

    PubMed Central

    2014-01-01

Background The big data moniker is nowhere better deserved than to describe the ever-increasing prodigiousness and complexity of biological and medical datasets. New methods are needed to generate and test hypotheses, foster biological interpretation, and build validated predictors. Although multivariate techniques such as cluster analysis may allow researchers to identify groups, or clusters, of related variables, the accuracies and effectiveness of traditional clustering methods diminish for large and hyper-dimensional datasets. Topic modeling is an active research field in machine learning and has been mainly used as an analytical tool to structure large textual corpora for data mining. Its ability to reduce high dimensionality to a small number of latent variables makes it suitable as a means for clustering or overcoming clustering difficulties in large biological and medical datasets. Results In this study, three topic model-derived clustering methods, highest probable topic assignment, feature selection, and feature extraction, are proposed and tested on the cluster analysis of three large datasets: a Salmonella pulsed-field gel electrophoresis (PFGE) dataset, a lung cancer dataset, and a breast cancer dataset, which represent various types of large biological or medical datasets. All three methods are shown to improve the efficacy/effectiveness of clustering results on the three datasets in comparison to traditional methods. A preferable cluster analysis method emerged for each of the three datasets on the basis of replicating known biological truths. Conclusion Topic modeling could be advantageously applied to the large datasets of biological or medical research. The three proposed topic model-derived clustering methods, highest probable topic assignment, feature selection, and feature extraction, yield clustering improvements for the three different data types. Clusters more efficaciously represent truthful groupings and subgroupings in the data than traditional methods, suggesting that topic model-based methods could provide an analytic advancement in the analysis of large biological or medical datasets. PMID:25350106
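The first of the three methods, highest probable topic assignment, reduces to an argmax over the fitted document-topic distribution: each sample's cluster is its most probable latent topic. A minimal sketch, with a small hypothetical matrix standing in for the output of a fitted topic model:

```python
import numpy as np

def cluster_by_top_topic(doc_topic):
    """Highest probable topic assignment: each sample's cluster label is
    the index of its most probable latent topic."""
    return np.argmax(doc_topic, axis=1)

# Hypothetical document-topic matrix (rows: samples, columns: topic
# probabilities summing to 1), as a fitted topic model would produce.
theta = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.2, 0.2, 0.6],
                  [0.6, 0.3, 0.1]])
clusters = cluster_by_top_topic(theta)
```

The feature-selection and feature-extraction variants instead use the topic distributions as reduced-dimension inputs to a conventional clustering algorithm.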

  4. Topic modeling for cluster analysis of large biological and medical datasets.

    PubMed

    Zhao, Weizhong; Zou, Wen; Chen, James J

    2014-01-01

The big data moniker is nowhere better deserved than to describe the ever-increasing prodigiousness and complexity of biological and medical datasets. New methods are needed to generate and test hypotheses, foster biological interpretation, and build validated predictors. Although multivariate techniques such as cluster analysis may allow researchers to identify groups, or clusters, of related variables, the accuracies and effectiveness of traditional clustering methods diminish for large and hyper-dimensional datasets. Topic modeling is an active research field in machine learning and has been mainly used as an analytical tool to structure large textual corpora for data mining. Its ability to reduce high dimensionality to a small number of latent variables makes it suitable as a means for clustering or overcoming clustering difficulties in large biological and medical datasets. In this study, three topic model-derived clustering methods, highest probable topic assignment, feature selection, and feature extraction, are proposed and tested on the cluster analysis of three large datasets: a Salmonella pulsed-field gel electrophoresis (PFGE) dataset, a lung cancer dataset, and a breast cancer dataset, which represent various types of large biological or medical datasets. All three methods are shown to improve the efficacy/effectiveness of clustering results on the three datasets in comparison to traditional methods. A preferable cluster analysis method emerged for each of the three datasets on the basis of replicating known biological truths. Topic modeling could be advantageously applied to the large datasets of biological or medical research. The three proposed topic model-derived clustering methods, highest probable topic assignment, feature selection, and feature extraction, yield clustering improvements for the three different data types. Clusters more efficaciously represent truthful groupings and subgroupings in the data than traditional methods, suggesting that topic model-based methods could provide an analytic advancement in the analysis of large biological or medical datasets.

  5. Decision support for hospital bed management using adaptable individual length of stay estimations and shared resources.

    PubMed

    Schmidt, Robert; Geisler, Sandra; Spreckelsen, Cord

    2013-01-07

Elective patient admission and assignment planning is an important task of the strategic and operational management of a hospital and early on became a central topic of clinical operations research. The management of hospital beds is an important subtask. Various approaches have been proposed, involving the computation of efficient assignments with regard to the patients' condition, the necessity of the treatment, and the patients' preferences. However, these approaches are mostly based on static, unadaptable estimates of the length of stay and, thus, do not take into account the uncertainty of the patient's recovery. Furthermore, the effect of aggregated bed capacities has not been investigated in this context. Computer-supported bed management, combining an adaptable length of stay estimation with the treatment of shared resources (aggregated bed capacities), has not yet been sufficiently investigated. The aim of our work is: 1) to define a cost function for patient admission taking into account adaptable length of stay estimations and aggregated resources, 2) to define a mathematical program formally modeling the assignment problem and an architecture for decision support, 3) to investigate four algorithmic methodologies addressing the assignment problem and one baseline approach, and 4) to evaluate these methodologies w.r.t. cost outcome, performance, and dismissal ratio. The expected free ward capacity is calculated based on individual length of stay estimates, introducing Bernoulli distributed random variables for the ward occupation states and approximating the probability densities. The assignment problem is represented as a binary integer program. Four strategies for solving the problem are applied and compared: an exact approach, using the mixed integer programming solver SCIP; and three heuristic strategies, namely the longest expected processing time, the shortest expected processing time, and random choice.
A baseline approach serves to compare these optimization strategies with a simple model of the status quo. All the approaches are evaluated by a realistic discrete event simulation: the outcomes are the ratio of successful assignments and dismissals, the computation time, and the model's cost factors. A discrete event simulation of 226,000 cases shows a reduction of the dismissal rate compared to the baseline by more than 30 percentage points (from a mean dismissal ratio of 74.7% to 40.06%, comparing the status quo with the optimization strategies). Each of the optimization strategies leads to an improved assignment. The exact approach has only a marginal advantage over the heuristic strategies in the model's cost factors (≤3%). Moreover, this marginal advantage was only achieved at the price of a computational time fifty times that of the heuristic models (an average computing time of 141 s using the exact method, vs. 2.6 s for the heuristic strategy). In terms of its performance and the quality of its solution, the heuristic strategy RAND is the preferred method for bed assignment in the case of shared resources. Future research is needed to investigate whether an equally marked improvement can be achieved in a large-scale clinical application study, ideally one comprising all the departments involved in admission and assignment planning.
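A greedy version of the heuristic strategies (ordering patients by expected processing time, then choosing the ward with the most free capacity) can be sketched as follows. This is an illustrative simplification, not the paper's simulation model; all names, data, and the ward-choice rule are hypothetical.

```python
def assign_beds(patients, free_beds, rule="LEPT"):
    """Greedy heuristic bed assignment.

    patients  : list of (patient_id, expected_length_of_stay)
    free_beds : dict mapping ward -> number of currently free beds
    rule      : 'LEPT' (longest expected processing time first) or
                'SEPT' (shortest first)
    Returns (assignments, dismissed).
    """
    order = sorted(patients, key=lambda p: p[1], reverse=(rule == "LEPT"))
    beds = dict(free_beds)
    assignments, dismissed = {}, []
    for pid, los in order:
        ward = max(beds, key=beds.get)      # ward with most free capacity
        if beds[ward] > 0:
            assignments[pid] = ward
            beds[ward] -= 1
        else:
            dismissed.append(pid)           # no capacity anywhere: dismissal
    return assignments, dismissed

# Toy instance: three elective patients, two wards with one free bed each.
assigned, dismissed = assign_beds(
    [("p1", 5.0), ("p2", 2.5), ("p3", 7.0)],
    {"ward_a": 1, "ward_b": 1})
```

The paper's exact approach would instead encode the same assignment decisions as a binary integer program and solve it with SCIP.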

  6. Working with Sparse Data in Rated Language Tests: Generalizability Theory Applications

    ERIC Educational Resources Information Center

    Lin, Chih-Kai

    2017-01-01

    Sparse-rated data are common in operational performance-based language tests, as an inevitable result of assigning examinee responses to a fraction of available raters. The current study investigates the precision of two generalizability-theory methods (i.e., the rating method and the subdividing method) specifically designed to accommodate the…

  7. The incremental validity of a computerised assessment added to clinical rating scales to differentiate adult ADHD from autism spectrum disorder.

    PubMed

    Groom, Madeleine J; Young, Zoe; Hall, Charlotte L; Gillott, Alinda; Hollis, Chris

    2016-09-30

    There is a clinical need for objective evidence-based measures that are sensitive and specific to ADHD when compared with other neurodevelopmental disorders. This study evaluated the incremental validity of adding an objective measure of activity and computerised cognitive assessment to clinical rating scales to differentiate adult ADHD from Autism spectrum disorders (ASD). Adults with ADHD (n=33) or ASD (n=25) performed the QbTest, comprising a Continuous Performance Test with motion-tracker to record physical activity. QbTest parameters measuring inattention, impulsivity and hyperactivity were combined to provide a summary score ('QbTotal'). Binary stepwise logistic regression measured the probability of assignment to the ADHD or ASD group based on scores on the Conners Adult ADHD Rating Scale-subscale E (CAARS-E) and Autism Quotient (AQ10) in the first step and then QbTotal added in the second step. The model fit was significant at step 1 (CAARS-E, AQ10) with good group classification accuracy. These predictors were retained and QbTotal was added, resulting in a significant improvement in model fit and group classification accuracy. All predictors were significant. ROC curves indicated superior specificity of QbTotal. The findings present preliminary evidence that adding QbTest to clinical rating scales may improve the differentiation of ADHD and ASD in adults. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  8. From the Kochen-Specker theorem to noncontextuality inequalities without assuming determinism.

    PubMed

    Kunjwal, Ravi; Spekkens, Robert W

    2015-09-11

The Kochen-Specker theorem demonstrates that it is not possible to reproduce the predictions of quantum theory in terms of a hidden variable model where the hidden variables assign a value to every projector deterministically and noncontextually. A noncontextual value assignment to a projector is one that does not depend on which other projectors (the context) are measured together with it. Using a generalization of the notion of noncontextuality that applies to both measurements and preparations, we propose a scheme for deriving inequalities that test whether a given set of experimental statistics is consistent with a noncontextual model. Unlike previous inequalities inspired by the Kochen-Specker theorem, we do not assume that the value assignments are deterministic; therefore, in the face of a violation of our inequality, the possibility of salvaging noncontextuality by abandoning determinism is no longer an option. Our approach is operational in the sense that it does not presume quantum theory: a violation of our inequality implies the impossibility of a noncontextual model for any operational theory that can account for the experimental observations, including any successor to quantum theory.

  9. Three-dimensional finite element models of the human pubic symphysis with viscohyperelastic soft tissues.

    PubMed

    Li, Zuoping; Alonso, Jorge E; Kim, Jong-Eun; Davidson, James S; Etheridge, Brandon S; Eberhardt, Alan W

    2006-09-01

    Three-dimensional finite element (FE) models of human pubic symphyses were constructed from computed tomography image data of one male and one female cadaver pelvis. The pubic bones, interpubic fibrocartilaginous disc and four pubic ligaments were segmented semi-automatically and meshed with hexahedral elements using automatic mesh generation schemes. A two-term viscoelastic Prony series, determined by curve fitting results of compressive creep experiments, was used to model the rate-dependent effects of the interpubic disc and the pubic ligaments. Three-parameter Mooney-Rivlin material coefficients were calculated for the discs using a heuristic FE approach based on average experimental joint compression data. Similarly, a transversely isotropic hyperelastic material model was applied to the ligaments to capture average tensile responses. Linear elastic isotropic properties were assigned to bone. The applicability of the resulting models was tested in bending simulations in four directions and in tensile tests of varying load rates. The model-predicted results correlated reasonably with the joint bending stiffnesses and rate-dependent tensile responses measured in experiments, supporting the validity of the estimated material coefficients and overall modeling approach. This study represents an important and necessary step in the eventual development of biofidelic pelvis models to investigate symphysis response under high-energy impact conditions, such as motor vehicle collisions.
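A two-term Prony series of the kind used for the rate-dependent soft tissues has the form G(t) = G_inf + sum_i g_i * exp(-t / tau_i). A minimal evaluation sketch follows; the coefficients are hypothetical placeholders for illustration, not the values fitted from the creep experiments.

```python
import math

def prony_modulus(t, g_inf, terms):
    """Relaxation modulus of an n-term Prony series:
    G(t) = g_inf + sum_i g_i * exp(-t / tau_i)."""
    return g_inf + sum(g * math.exp(-t / tau) for g, tau in terms)

# Hypothetical two-term coefficients (g_i, tau_i) for illustration only.
terms = [(0.4, 1.0), (0.2, 10.0)]
g0 = prony_modulus(0.0, 0.4, terms)      # instantaneous modulus: g_inf + sum g_i
g_long = prony_modulus(1e6, 0.4, terms)  # long-term plateau: approaches g_inf
```

The decay from the instantaneous modulus toward the long-term plateau is what captures the rate-dependent (creep/relaxation) behavior of the interpubic disc and ligaments.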

  10. Forecasting the impact of transport improvements on commuting and residential choice

    NASA Astrophysics Data System (ADS)

    Elhorst, J. Paul; Oosterhaven, Jan

    2006-03-01

This paper develops a probabilistic, competing-destinations, assignment model that predicts changes in the spatial pattern of the working population as a result of transport improvements. The choice of residence is explained by a new non-parametric model, which represents an alternative to the popular multinomial logit model. Travel times between zones are approximated by a normal distribution function with a different mean and variance for each pair of zones, whereas previous models only use average travel times. The model’s forecast error of the spatial distribution of the Dutch working population is 7% when tested on 1998 base-year data. To incorporate endogenous changes in its causal variables, an almost ideal demand system is estimated to explain the choice of transport mode, and a new economic geography inter-industry model (RAEM) is estimated to explain the spatial distribution of employment. In the application, the model is used to forecast the impact of six mutually exclusive Dutch core-periphery railway proposals in the projection year 2020.

  11. Smoke detection

    DOEpatents

    Warmack, Robert J. Bruce; Wolf, Dennis A.; Frank, Steven Shane

    2016-09-06

    Various apparatus and methods for smoke detection are disclosed. In one embodiment, a method of training a classifier for a smoke detector comprises inputting sensor data from a plurality of tests into a processor. The sensor data is processed to generate derived signal data corresponding to the test data for respective tests. The derived signal data is assigned into categories comprising at least one fire group and at least one non-fire group. Linear discriminant analysis (LDA) training is performed by the processor. The derived signal data and the assigned categories for the derived signal data are inputs to the LDA training. The output of the LDA training is stored in a computer readable medium, such as in a smoke detector that uses LDA to determine, based on the training, whether present conditions indicate the existence of a fire.
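The training step described above pairs derived signal data with fire/non-fire categories and runs linear discriminant analysis. A minimal two-class Fisher LDA sketch on synthetic "derived signal" features follows; the feature names and data are invented for illustration and are not from the patent.

```python
import numpy as np

def lda_train(X, y):
    """Two-class Fisher linear discriminant: returns a projection vector w
    and threshold c; classify as fire (1) when x @ w > c."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter matrix, lightly regularized for stability.
    Sw = np.cov(X0.T) * (len(X0) - 1) + np.cov(X1.T) * (len(X1) - 1)
    Sw += 1e-6 * np.eye(X.shape[1])
    w = np.linalg.solve(Sw, m1 - m0)
    c = 0.5 * (m0 + m1) @ w          # threshold at the midpoint projection
    return w, c

def lda_predict(X, w, c):
    return (X @ w > c).astype(int)

# Synthetic derived signals: [smoke density, temperature rise];
# category 1 = fire tests, category 0 = non-fire (nuisance) tests.
rng = np.random.default_rng(0)
fire = rng.normal([3.0, 2.0], 0.3, size=(50, 2))
nonfire = rng.normal([0.5, 0.3], 0.3, size=(50, 2))
X = np.vstack([nonfire, fire])
y = np.array([0] * 50 + [1] * 50)
w, c = lda_train(X, y)
```

In the patent, the trained output (here, w and c) is what gets stored in the smoke detector and applied to present sensor conditions.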

  12. Smoke detection

    DOEpatents

    Warmack, Robert J. Bruce; Wolf, Dennis A.; Frank, Steven Shane

    2015-10-27

    Various apparatus and methods for smoke detection are disclosed. In one embodiment, a method of training a classifier for a smoke detector comprises inputting sensor data from a plurality of tests into a processor. The sensor data is processed to generate derived signal data corresponding to the test data for respective tests. The derived signal data is assigned into categories comprising at least one fire group and at least one non-fire group. Linear discriminant analysis (LDA) training is performed by the processor. The derived signal data and the assigned categories for the derived signal data are inputs to the LDA training. The output of the LDA training is stored in a computer readable medium, such as in a smoke detector that uses LDA to determine, based on the training, whether present conditions indicate the existence of a fire.

  13. Understanding evidence-based diagnosis.

    PubMed

    Kohn, Michael A

    2014-01-01

    The real meaning of the word "diagnosis" is naming the disease that is causing a patient's illness. The cognitive process of assigning this name is a mysterious combination of pattern recognition and the hypothetico-deductive approach that is only remotely related to the mathematical process of using test results to update the probability of a disease. What I refer to as "evidence-based diagnosis" is really evidence-based use of medical tests to guide treatment decisions. Understanding how to use test results to update the probability of disease can help us interpret test results more rationally. Also, evidence-based diagnosis reminds us to consider the costs and risks of testing and the dangers of over-diagnosis and over-treatment, in addition to the costs and risks of missing serious disease.
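The probability-updating step referred to above is Bayes' rule, conveniently expressed with likelihood ratios: post-test odds = pre-test odds × LR. A small sketch (the test characteristics and pre-test probability are illustrative numbers, not from the article):

```python
def post_test_probability(pretest_p, sensitivity, specificity, positive=True):
    """Update the probability of disease given a test result, using the
    likelihood-ratio form of Bayes' rule."""
    if positive:
        lr = sensitivity / (1 - specificity)       # LR+ for a positive result
    else:
        lr = (1 - sensitivity) / specificity       # LR- for a negative result
    odds = pretest_p / (1 - pretest_p)             # probability -> odds
    post_odds = odds * lr
    return post_odds / (1 + post_odds)             # odds -> probability

# Example: 10% pre-test probability, a test with 90% sensitivity and
# 95% specificity; a positive result raises the probability to 2/3.
p = post_test_probability(0.10, 0.90, 0.95, positive=True)
```

The same machinery makes the article's warning concrete: at low pre-test probability, even a fairly specific test yields many false positives, which is where over-diagnosis risk enters.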

  14. Parallelization of a Fully-Distributed Hydrologic Model using Sub-basin Partitioning

    NASA Astrophysics Data System (ADS)

    Vivoni, E. R.; Mniszewski, S.; Fasel, P.; Springer, E.; Ivanov, V. Y.; Bras, R. L.

    2005-12-01

    A primary obstacle towards advances in watershed simulations has been the limited computational capacity available to most models. The growing trend of model complexity, data availability and physical representation has not been matched by adequate developments in computational efficiency. This situation has created a serious bottleneck which limits existing distributed hydrologic models to small domains and short simulations. In this study, we present novel developments in the parallelization of a fully-distributed hydrologic model. Our work is based on the TIN-based Real-time Integrated Basin Simulator (tRIBS), which provides continuous hydrologic simulation using a multiple resolution representation of complex terrain based on a triangulated irregular network (TIN). While the use of TINs reduces computational demand, the sequential version of the model is currently limited over large basins (>10,000 km2) and long simulation periods (>1 year). To address this, a parallel MPI-based version of the tRIBS model has been implemented and tested using high performance computing resources at Los Alamos National Laboratory. Our approach utilizes domain decomposition based on sub-basin partitioning of the watershed. A stream reach graph based on the channel network structure is used to guide the sub-basin partitioning. Individual sub-basins or sub-graphs of sub-basins are assigned to separate processors to carry out internal hydrologic computations (e.g. rainfall-runoff transformation). Routed streamflow from each sub-basin forms the major hydrologic data exchange along the stream reach graph. Individual sub-basins also share subsurface hydrologic fluxes across adjacent boundaries. We demonstrate how the sub-basin partitioning provides computational feasibility and efficiency for a set of test watersheds in northeastern Oklahoma. 
We compare the performance of the sequential and parallelized versions to highlight the efficiency gained as the number of processors increases. We also discuss how the coupled use of TINs and parallel processing can lead to feasible long-term simulations in regional watersheds while preserving basin properties at high-resolution.

  15. Prioritizing CD4 Count Monitoring in Response to ART in Resource-Constrained Settings: A Retrospective Application of Prediction-Based Classification

    PubMed Central

    Liu, Yan; Li, Xiaohong; Johnson, Margaret; Smith, Collette; Kamarulzaman, Adeeba bte; Montaner, Julio; Mounzer, Karam; Saag, Michael; Cahn, Pedro; Cesar, Carina; Krolewiecki, Alejandro; Sanne, Ian; Montaner, Luis J.

    2012-01-01

    Background: Global programs of anti-HIV treatment depend on sustained laboratory capacity to assess treatment initiation thresholds and treatment response over time. Currently, there is no valid alternative to CD4 count testing for monitoring immunologic responses to treatment, but laboratory cost and capacity limit access to CD4 testing in resource-constrained settings. Thus, methods to prioritize patients for CD4 count testing could improve treatment monitoring by optimizing resource allocation. Methods and Findings: Using a prospective cohort of HIV-infected patients (n = 1,956) monitored upon antiretroviral therapy initiation in seven clinical sites with distinct geographical and socio-economic settings, we retrospectively apply a novel prediction-based classification (PBC) modeling method. The model uses repeatedly measured biomarkers (white blood cell count and lymphocyte percent) to predict CD4+ T cell outcome through first-stage modeling and subsequent classification based on clinically relevant thresholds (CD4+ T cell count of 200 or 350 cells/µl). The algorithm correctly classified 90% (cross-validation estimate = 91.5%, standard deviation [SD] = 4.5%) of CD4 count measurements <200 cells/µl in the first year of follow-up; if laboratory testing is applied only to patients predicted to be below the 200-cells/µl threshold, we estimate a potential savings of 54.3% (SD = 4.2%) in CD4 testing capacity. A capacity savings of 34% (SD = 3.9%) is predicted using a CD4 threshold of 350 cells/µl. Similar results were obtained over the 3 y of follow-up available (n = 619). Limitations include a need for future economic healthcare outcome analysis, a need for assessment of extensibility beyond the 3-y observation time, and the need to assign a false-positive threshold. 
Conclusions: Our results support the use of PBC modeling as a triage point at the laboratory, lessening the need for laboratory-based CD4+ T cell count testing; implementation of this tool could help optimize the use of laboratory resources, directing CD4 testing towards higher-risk patients. However, further prospective studies and economic analyses are needed to demonstrate that the PBC model can be effectively applied in clinical settings. PMID:22529752
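    The triage logic that PBC modeling enables can be sketched in a few lines. The sketch below is illustrative only: `predict_cd4` stands in for the paper's first-stage longitudinal model, and its coefficients are invented, not fitted values from the cohort.

```python
# Illustrative sketch of prediction-based triage for CD4 testing.
# predict_cd4 is a placeholder for the paper's first-stage model;
# the 0.255 factor (assumed CD4 share of lymphocytes) is invented.

def predict_cd4(wbc, lymph_pct):
    """Crude CD4 proxy from white blood cell count (10^3 cells/ul)
    and lymphocyte percent."""
    abs_lymph = wbc * 1000 * lymph_pct / 100.0  # lymphocytes per ul
    return abs_lymph * 0.255                    # assumed CD4 fraction

def triage(patients, threshold=200):
    """Flag patients predicted below the threshold for confirmatory
    laboratory testing; report the fraction of lab tests saved."""
    flagged = [i for i, (wbc, lp) in enumerate(patients)
               if predict_cd4(wbc, lp) < threshold]
    savings = 1.0 - len(flagged) / len(patients)
    return flagged, savings
```

    Only patients flagged by the first-stage prediction would be sent for an actual CD4 count, which is where the reported capacity savings come from.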

  16. Student Perceptions of a Form-Based Approach to Reflective Journaling

    ERIC Educational Resources Information Center

    Mabrouk, Patricia Ann

    2015-01-01

    The author describes the principal findings of a survey study looking at student perceptions of a new form-based approach to reflective journaling. A form-based journal assignment was developed for use in introductory lecture courses and tested over a two-year period in an Honors General Chemistry course for engineers with a total of 157…

  17. A Study of the Effectiveness of Web-Based Homework in Teaching Undergraduate Business Statistics

    ERIC Educational Resources Information Center

    Palocsay, Susan W.; Stevens, Scott P.

    2008-01-01

    Web-based homework (WBH) technology can simplify the creation and grading of assignments as well as provide a feasible platform for assessment testing, but its effect on student learning in business statistics is unknown. This is particularly true of the latest software development of Web-based tutoring agents that dynamically evaluate individual…

  18. Combining In-School and Community-Based Media Efforts: Reducing Marijuana and Alcohol Uptake among Younger Adolescents

    ERIC Educational Resources Information Center

    Slater, Michael D.; Kelly, Kathleen J.; Edwards, Ruth W.; Thurman, Pamela J.; Plested, Barbara A.; Keefe, Thomas J.; Lawrence, Frank R.; Henry, Kimberly L.

    2006-01-01

    This study tests the impact of an in-school mediated communication campaign based on social marketing principles, in combination with a participatory, community-based media effort, on marijuana, alcohol and tobacco uptake among middle-school students. Eight media treatment and eight control communities throughout the US were randomly assigned to…

  19. Locating, characterizing and minimizing sources of error for a paper case-based structured oral examination in a multi-campus clerkship.

    PubMed

    Kumar, A; Bridgham, R; Potts, M; Gushurst, C; Hamp, M; Passal, D

    2001-01-01

    To determine the consistency of assessment in a new paper case-based structured oral examination in a multi-community pediatrics clerkship, and to identify correctable problems in the administration of the examination and the assessment process. Nine paper case-based oral examinations were audio-taped. From the audio-tapes, five community coordinators scored examiner behaviors and graded student performance. Correlations among examiner behavior scores were examined. Graphs identified grading patterns of evaluators. The effect of exam-giving on evaluators was assessed by t-test. Reliability of grades was calculated and the effect of reducing assessment problems was modeled. Exam-givers differed most in their "teaching-guiding" behavior, and this correlated negatively with student grades. Exam reliability was lowered mainly by evaluator differences in leniency and grading pattern; less important was the absence of standardization in cases. While grade reliability was low in early use of the paper case-based oral examination, modeling of plausible effects of training and monitoring for greater uniformity in administering the examination and assigning scores suggests that more adequate reliabilities can be attained.

  20. Video as an effective method to deliver pretest information for rapid human immunodeficiency testing.

    PubMed

    Merchant, Roland C; Clark, Melissa A; Mayer, Kenneth H; Seage III, George R; DeGruttola, Victor G; Becker, Bruce M

    2009-02-01

    Video-based delivery of human immunodeficiency virus (HIV) pretest information might assist in streamlining HIV screening and testing efforts in the emergency department (ED). The objectives of this study were to determine if the video "Do you know about rapid HIV testing?" is an acceptable alternative to an in-person information session on rapid HIV pretest information, in regard to comprehension of rapid HIV pretest fundamentals, and to identify patients who might have difficulties in comprehending pretest information. This was a noninferiority trial of 574 participants in an ED opt-in rapid HIV screening program who were randomly assigned to receive identical pretest information from either an animated and live-action 9.5-minute video or an in-person information session. Pretest information comprehension was assessed using a questionnaire. The video would be accepted as not inferior to the in-person information session if the 95% confidence interval (CI) of the difference (Delta) in mean scores on the questionnaire between the two information groups was less than a 10% decrease in the in-person information session arm's mean score. Linear regression models were constructed to identify patients with lower mean scores based upon study arm assignment, demographic characteristics, and history of prior HIV testing. The questionnaire mean scores were 20.1 (95% CI = 19.7 to 20.5) for the video arm and 20.8 (95% CI = 20.4 to 21.2) for the in-person information session arm. The difference in mean scores compared to the mean score for the in-person information session met the noninferiority criterion for this investigation (Delta = 0.68; 95% CI = 0.18 to 1.26). In a multivariable linear regression model, Blacks/African Americans, Hispanics, and those with Medicare and Medicaid insurance exhibited slightly lower mean scores, regardless of the pretest information delivery format. 
There was a strong relationship between fewer years of formal education and lower mean scores on the questionnaire. Age, gender, type of insurance, partner/marital status, and history of prior HIV testing were not predictive of scores on the questionnaire. In terms of patient comprehension of rapid HIV pretest information fundamentals, the video was an acceptable substitute for pretest information delivered by an HIV test counselor. Both the video and the in-person information session were less effective in providing pretest information for patients with fewer years of formal education.
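    The trial's acceptance rule (the video is non-inferior if the 95% CI for the score difference stays below 10% of the control-arm mean) is easy to state in code. A minimal sketch, with function and argument names assumed:

```python
def noninferior(mean_control, mean_video, se_diff, margin_frac=0.10, z=1.96):
    """Non-inferiority check: accept the video arm if the upper bound of
    the 95% CI for the (control - video) mean difference is smaller than
    margin_frac times the control-arm mean."""
    delta = mean_control - mean_video
    upper = delta + z * se_diff
    return upper < margin_frac * mean_control
```

    With the reported means (20.8 vs. 20.1) and a CI half-width near 0.54 points, the upper bound (~1.24) sits well below the 2.08-point margin, matching the paper's conclusion.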

  1. Impact of Interactive Energy-Balance Modeling on Student Learning in a Core-Curriculum Earth Science Course

    NASA Astrophysics Data System (ADS)

    Mandock, R. L.

    2008-12-01

    An interactive instructional module has been developed to study energy balance at the earth's surface. The module uses a graphical interface to model each of the major energy components involved in the partitioning of energy at this surface: net radiation, sensible and latent heat fluxes, ground heat flux, heat storage, anthropogenic heat, and advective heat transport. The graphical interface consists of an energy-balance diagram composed of sky elements, a line or box representing the air or sea surface, and arrows which indicate magnitude and direction of each of the energy fluxes. In April 2005 an energy-balance project and laboratory assignment were developed for a core-curriculum earth science course at Clark Atlanta University. The energy-balance project analyzes surface weather data from an assigned station of the Georgia Automated Environmental Monitoring Network (AEMN). The first part of the project requires the student to print two observations of the "Current Conditions" web page for the assigned station: one between the hours of midnight and 5:00 a.m., and the other between the hours of 3:00 and 5:00 p.m. A satellite image of the southeastern United States must accompany each of these printouts. The second part of the project can be completed only after the student has modeled the 4 environmental scenarios taught in the energy-balance laboratory assignment. The student uses the energy-balance model to determine the energy-flux components for each of the printed weather conditions at the assigned station. On successful completion of the project, the student has become familiar with: (1) how weather observations can be used to constrain parameters in a microclimate model, (2) one common type of error in measurement made by weather sensors, (3) some of the uses and limitations of environmental models, and (4) fundamentals of the distribution of energy at the earth's surface. 
The project and laboratory assignment tie together many of the earth science concepts taught in the course: geology (soils), oceanography (surface mixed layer), and atmospheric science (meteorology of the lowest part of the atmosphere). Details of the project and its impact on student assessment tests and surveys will be presented.
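    The partitioning the module teaches can be written as a single balance. The sketch below assumes the simple form Rn + QF = H + LE + G + ΔS + A and solves for the ground heat flux; the function name and sign conventions are ours, not the module's.

```python
def ground_heat_flux(net_radiation, sensible, latent,
                     storage=0.0, anthropogenic=0.0, advective=0.0):
    """Residual ground heat flux G (W/m^2) from the surface energy balance
    Rn + Qf = H + LE + G + dS + Adv, taking H, LE and G as positive when
    directed away from the surface."""
    return (net_radiation + anthropogenic
            - sensible - latent - storage - advective)
```

    For a simple daytime land case with 500 W/m^2 of net radiation split into 200 W/m^2 sensible and 250 W/m^2 latent flux, the residual 50 W/m^2 goes into the ground.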

  2. Process-Based Development of Competence Models to Computer Science Education

    ERIC Educational Resources Information Center

    Zendler, Andreas; Seitz, Cornelia; Klaudt, Dieter

    2016-01-01

    A process model ("cpm.4.CSE") is introduced that allows the development of competence models in computer science education related to curricular requirements. It includes eight subprocesses: (a) determine competence concept, (b) determine competence areas, (c) identify computer science concepts, (d) assign competence dimensions to…

  3. Improving the quality of depression and pain care in multiple sclerosis using collaborative care: The MS-care trial protocol.

    PubMed

    Ehde, Dawn M; Alschuler, Kevin N; Sullivan, Mark D; Molton, Ivan P; Ciol, Marcia A; Bombardier, Charles H; Curran, Mary C; Gertz, Kevin J; Wundes, Annette; Fann, Jesse R

    2018-01-01

    Evidence-based pharmacological and behavioral interventions are often underutilized or inaccessible to persons with multiple sclerosis (MS) who have chronic pain and/or depression. Collaborative care is an evidence-based, patient-centered, integrated, system-level approach to improving the quality and outcomes of depression care. We describe the development of, and a randomized controlled trial testing, a novel intervention, MS Care, which uses a collaborative care model to improve the care of depression and chronic pain in an MS specialty care setting. We describe a 16-week randomized controlled trial comparing the MS Care collaborative care intervention to usual care in an outpatient MS specialty center. Eligible participants with chronic pain of at least moderate intensity (≥3/10) and/or major depressive disorder are randomly assigned to MS Care or usual care. MS Care utilizes a care manager to implement and coordinate guideline-based medical and behavioral treatments with the patient, clinic providers, and pain/depression treatment experts. We will compare outcomes at post-treatment and 6-month follow-up. We hypothesize that participants randomly assigned to MS Care will demonstrate significantly greater control of both pain and depression at post-treatment (primary endpoint) relative to those assigned to usual care. Secondary analyses will examine quality of care, patient satisfaction, adherence to MS care, and quality of life. Study findings will aid patients, clinicians, healthcare system leaders, and policy makers in making decisions about effective care for pain and depression in MS healthcare systems. (PCORI-IH-1304-6379; clinicaltrials.gov: NCT02137044). This trial is registered at ClinicalTrials.gov, protocol NCT02137044. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Performance of different SNP panels for parentage testing in two East Asian cattle breeds.

    PubMed

    Strucken, E M; Gudex, B; Ferdosi, M H; Lee, H K; Song, K D; Gibson, J P; Kelly, M; Piper, E K; Porto-Neto, L R; Lee, S H; Gondro, C

    2014-08-01

    The International Society for Animal Genetics (ISAG) proposed a panel of single nucleotide polymorphisms (SNPs) for parentage testing in cattle (a core panel of 100 SNPs and an additional list of 100 SNPs). However, markers specific to East Asian taurine cattle breeds were not included, and no information is available as to whether the ISAG panel performs adequately for these breeds. We tested ISAG's core (100 SNP) and full (200 SNP) panels on two East Asian taurine breeds: the Korean Hanwoo and the Japanese Wagyu, the latter from the Australian herd. Even though the power of exclusion was high at 0.99 for both ISAG panels, the core panel performed poorly with 3.01% false-positive assignments in the Hanwoo population and 3.57% in the Wagyu. The full ISAG panel identified all sire-offspring relations correctly in both populations with 0.02% of relations wrongly excluded in the Hanwoo population. Based on these results, we created and tested two population-specific marker panels: one for the Wagyu population, which showed no false-positive assignments with either 100 or 200 SNPs, and a second panel for the Hanwoo, which still had some false-positive assignments with 100 SNPs but no false positives using 200 SNPs. In conclusion, for parentage assignment in East Asian cattle breeds, only the full ISAG panel is adequate for parentage testing. If fewer markers should be used, it is advisable to use population-specific markers rather than the ISAG panel. © 2014 Stichting International Foundation for Animal Genetics.
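    SNP parentage panels of this kind are typically scored by counting opposing homozygotes: a sire-calf pair is excluded when the two are homozygous for different alleles at a locus. A minimal sketch of that logic (the genotype coding and the error tolerance are our assumptions, not ISAG's):

```python
def opposing_homozygotes(sire, calf):
    """Count loci where sire and calf are homozygous for different alleles.
    Genotypes are coded as alternate-allele counts 0/1/2; None marks a
    missing call and is skipped."""
    return sum(1 for s, c in zip(sire, calf)
               if s is not None and c is not None and {s, c} == {0, 2})

def qualifies(sire, calf, max_conflicts=1):
    """Accept the pairing while conflicts stay within a genotyping-error
    tolerance; larger panels support stricter per-locus error models."""
    return opposing_homozygotes(sire, calf) <= max_conflicts
```

    False-positive assignments of the kind reported for the core panel arise when an unrelated bull happens to show no (or few) opposing homozygotes, which is why panel size and population-specific allele frequencies matter.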

  5. Intensity vs. Duration: Comparing the Effects of a Fluency-Based Reading Intervention Program, in After-School vs. Summer School Settings

    ERIC Educational Resources Information Center

    Katzir, Tami; Goldberg, Alyssa; Aryeh, Terry Joffe Ben; Donnelley, Katharine; Wolf, Maryanne

    2013-01-01

    Two versions of RAVE-O, a fluency-based reading intervention, were examined over two intervention periods: a 9-month, 44-hour after-school program and a month-long, 44-hour summer program. Eighty children in grades 1-3 were tested on the two subtests of the Test of Word-Reading Efficiency and were assigned to one of 6 groups…

  6. Tracing Asian Seabass Individuals to Single Fish Farms Using Microsatellites

    PubMed Central

    Yue, Gen Hua; Xia, Jun Hong; Liu, Peng; Liu, Feng; Sun, Fei; Lin, Grace

    2012-01-01

    Traceability through physical labels is well established, but it is not highly reliable as physical labels can be easily changed or lost. Application of DNA markers to the traceability of food plays an increasingly important role for consumer protection and confidence building. In this study, we tested the efficiency of 16 polymorphic microsatellites and their combinations for tracing 368 fish to four populations where they originated. Using the maximum likelihood and Bayesian methods, three most efficient microsatellites were required to assign over 95% of fish to the correct populations. Selection of markers based on the assignment score estimated with the software WHICHLOCI was most effective in choosing markers for individual assignment, followed by the selection based on the allele number of individual markers. By combining rapid DNA extraction, and high-throughput genotyping of selected microsatellites, it is possible to conduct routine genetic traceability with high accuracy in Asian seabass. PMID:23285169
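    The maximum-likelihood assignment used here follows the standard recipe: score each candidate population by the Hardy-Weinberg likelihood of the fish's multilocus genotype under that population's allele frequencies. A hedged sketch (the data layout is ours; real tools such as WHICHLOCI and Bayesian assignment programs add leave-one-out corrections and handle rare alleles more carefully):

```python
import math

def assign(genotype, pop_freqs, eps=1e-6):
    """Pick the population maximizing the log-likelihood of a multilocus
    genotype under Hardy-Weinberg equilibrium.
    genotype:  [(allele_a, allele_b), ...], one pair per locus.
    pop_freqs: {pop_name: [{allele: freq, ...} per locus]}."""
    best, best_ll = None, -math.inf
    for pop, loci in pop_freqs.items():
        ll = 0.0
        for (a, b), freqs in zip(genotype, loci):
            pa = freqs.get(a, eps)  # floor for alleles unseen in this pop
            pb = freqs.get(b, eps)
            ll += math.log(pa * pa if a == b else 2 * pa * pb)
        if ll > best_ll:
            best, best_ll = pop, ll
    return best
```

    Marker selection then amounts to choosing the few loci whose allele-frequency differences between farms contribute the largest likelihood separation.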

  7. Thermal response to firefighting activities in residential structure fires: impact of job assignment and suppression tactic.

    PubMed

    Horn, Gavin P; Kesler, Richard M; Kerber, Steve; Fent, Kenneth W; Schroeder, Tad J; Scott, William S; Fehling, Patricia C; Fernhall, Bo; Smith, Denise L

    2018-03-01

    Firefighters' thermal burden is generally attributed to high heat loads from the fire and metabolic heat generation, which may vary between job assignments and the suppression tactic employed. Working within a full-sized residential structure, firefighters were deployed in six job assignments utilising two attack tactics: (1) water applied from the interior, or (2) exterior water application before transitioning to the interior. Environmental temperatures decreased after water application, but more rapidly with transitional attack. Local ambient temperatures for inside operation firefighters were higher than other positions (average ~10-30 °C). Rapid elevations in skin temperature were found for all job assignments other than outside command. Neck skin temperatures for inside attack firefighters were ~0.5 °C lower when the transitional tactic was employed. Significantly higher core temperatures were measured for the outside ventilation and overhaul positions than the inside positions (~0.6-0.9 °C). Firefighters working at all fireground positions must be monitored and relieved based on intensity and duration. Practitioner Summary: Testing was done to characterise the thermal burden experienced by firefighters in different job assignments who responded to controlled residential fires (with typical furnishings) using two tactics. Ambient, skin and core temperatures varied based on job assignment and tactic employed, with rapid elevations in core temperature in many roles.

  8. "A Cellular Encounter": Constructing the Cell as a Whole System Using Illustrative Models

    ERIC Educational Resources Information Center

    Cohen, Joel I.

    2014-01-01

    A standard part of biology curricula is a project-based assessment of cell structure and function. However, these are often individual assignments that promote little problem-solving or group learning and avoid the subject of organelle chemical interactions. I evaluate a model-based cell project designed to foster group and individual guided…

  9. From nationwide standardized testing to school-based alternative embedded assessment in Israel: Students' performance in the matriculation 2000 project

    NASA Astrophysics Data System (ADS)

    Dori, Yehudit J.

    2003-01-01

    Matriculation 2000 was a 5-year project aimed at moving from the nationwide traditional examination system in Israel to a school-based alternative embedded assessment. Encompassing 22 high schools from various communities in the country, the Project aimed at fostering deep understanding, higher-order thinking skills, and students' engagement in learning through alternative teaching and embedded assessment methods. This article describes research conducted during the fifth year of the Project at 2 experimental and 2 control schools. The research objective was to investigate students' learning outcomes in chemistry and biology in the Matriculation 2000 Project. The assumption was that alternative embedded assessment has some effect on students' performance. The experimental students scored significantly higher than their control group peers on low-level assignments and more so on assignments that required higher-order thinking skills. The findings indicate that given adequate support and teachers' consent and collaboration, schools can transfer from nationwide or statewide standardized testing to school-based alternative embedded assessment.

  10. Promoting fruit and vegetable consumption. Testing an intervention based on the theory of planned behaviour.

    PubMed

    Kothe, E J; Mullan, B A; Butow, P

    2012-06-01

    This study evaluated the efficacy of a theory of planned behaviour (TPB) based intervention to increase fruit and vegetable consumption. The extent to which fruit and vegetable consumption and change in intake could be explained by the TPB was also examined. Participants were randomly assigned to two levels of intervention frequency matched for intervention content (low frequency n=92, high frequency n=102). Participants received TPB-based email messages designed to increase fruit and vegetable consumption; the messages targeted attitude, subjective norm and perceived behavioural control (PBC). Baseline and post-intervention measures of TPB variables and behaviour were collected. Across the entire study cohort, fruit and vegetable consumption increased by 0.83 servings/day between baseline and follow-up. Intention, attitude, subjective norm and PBC also increased (p<.05). The TPB successfully modelled fruit and vegetable consumption at both time points but not behaviour change. The increase in fruit and vegetable consumption is a promising preliminary finding for those primarily interested in increasing fruit and vegetable consumption. However, those interested in theory development may have concerns about the use of this model to explain behaviour change in this context. More high quality experimental tests of the theory are needed to confirm this result. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Evaluative conditioning makes slim models less desirable as standards for comparison and increases body satisfaction.

    PubMed

    Martijn, Carolien; Sheeran, Paschal; Wesseldijk, Laura W; Merrick, Hannah; Webb, Thomas L; Roefs, Anne; Jansen, Anita

    2013-04-01

    The present research tested whether an evaluative conditioning intervention makes thin-ideal models less enviable as standards for appearance-based social comparisons (Study 1), and increases body satisfaction (Study 2). Female participants were randomly assigned to intervention versus control conditions in both studies (ns = 66 and 39). Intervention participants learned to associate thin-ideal models with synonyms of fake whereas control participants completed an equivalent task that did not involve learning this association. The dependent variable in Study 1 was an implicit measure of idealization of slim models assessed via a modified Implicit Association Test (IAT). Study 2 used a validated, self-report measure of body satisfaction as the outcome variable. Intervention participants showed significantly less implicit idealization of slim models on the IAT compared to controls (Study 1). In Study 2, participants who undertook the intervention exhibited an increase in body satisfaction scores whereas no such increase was observed for control participants. The present research indicates that it is possible to overcome the characteristic impact of thin-ideal models on women's judgments of their bodies. An evaluative conditioning intervention made it less likely that slim models were perceived as targets to be emulated, and enhanced body satisfaction. 2013 APA, all rights reserved

  12. U.S. hydropower resource assessment for Idaho

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conner, A.M.; Francfort, J.E.

    1998-08-01

    The US Department of Energy is developing an estimate of the undeveloped hydropower potential in the US. The Hydropower Evaluation Software (HES) is a computer model that was developed by the Idaho National Engineering and Environmental Laboratory for this purpose. HES measures the undeveloped hydropower resources available in the US, using uniform criteria for measurement. The software was developed and tested using hydropower information and data provided by the Southwestern Power Administration. It is a menu-driven program that allows the personal computer user to assign environmental attributes to potential hydropower sites, calculate development suitability factors for each site based on the environmental attributes present, and generate reports based on these suitability factors. This report describes the resource assessment results for the State of Idaho.
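    The attribute-to-factor step can be imagined as combining per-attribute suitability weights into a single multiplier on a site's raw capacity. The sketch below is a guess at the shape of that computation: the attribute names and weights are invented, and the real HES weighting scheme may combine factors differently.

```python
def suitability_factor(attribute_weights):
    """Combine per-attribute suitability weights (0-1; lower means more
    environmentally sensitive) into one site factor by multiplication.
    Attribute names and values here are hypothetical, not HES's."""
    factor = 1.0
    for weight in attribute_weights.values():
        factor *= weight
    return factor

def adjusted_capacity_mw(raw_mw, attribute_weights):
    """Scale a site's raw undeveloped capacity by its suitability factor."""
    return raw_mw * suitability_factor(attribute_weights)
```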

  13. Novel selective TOCSY method enables NMR spectral elucidation of metabolomic mixtures

    NASA Astrophysics Data System (ADS)

    MacKinnon, Neil; While, Peter T.; Korvink, Jan G.

    2016-11-01

    Complex mixture analysis is routinely encountered in NMR-based investigations. With the aim of component identification, spectral complexity may be addressed chromatographically or spectroscopically, the latter being favored to reduce sample handling requirements. An attractive experiment is selective total correlation spectroscopy (sel-TOCSY), which is capable of providing tremendous spectral simplification and thereby enhancing assignment capability. Unfortunately, isolating a well-resolved resonance is increasingly difficult as the complexity of the mixture increases and the assumption of single spin system excitation is no longer robust. We present TOCSY optimized mixture elucidation (TOOMIXED), a technique capable of performing spectral assignment particularly in the case where the assumption of single spin system excitation is relaxed. Key to the technique is the collection of a series of 1D sel-TOCSY experiments as a function of the isotropic mixing time (τm), resulting in a series of resonance intensities indicative of the underlying molecular structure. By comparing these τm-dependent intensity patterns with a library of pre-determined component spectra, one is able to regain assignment capability. After consideration of the technique's robustness, we tested TOOMIXED first on a model mixture. As a benchmark we were able to assign a molecule with high confidence in the case of selectively exciting an isolated resonance. Assignment confidence was not compromised when performing TOOMIXED on a resonance known to contain multiple overlapping signals, and in the worst case the method suggested a follow-up sel-TOCSY experiment to confirm an ambiguous assignment. TOOMIXED was then demonstrated on two realistic samples (whisky and urine), where under our conditions an approximate limit of detection of 0.6 mM was determined. 
Taking into account literature reports for the sel-TOCSY limit of detection, the technique should reach on the order of 10 μM sensitivity. We anticipate this technique will be highly attractive to various analytical fields facing mixture analysis, including metabolomics, foodstuff analysis, pharmaceutical analysis, and forensics.
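    Matching mixing-time intensity patterns against a library reduces, at its simplest, to scoring each candidate by profile similarity. A sketch using cosine similarity (the actual TOOMIXED scoring function and any acceptance thresholds are not specified here, so treat the details as assumptions):

```python
def rank_candidates(observed, library):
    """Rank library compounds by cosine similarity between their
    mixing-time intensity profiles and the observed profile.
    observed: list of intensities, one per mixing time.
    library:  {compound_name: profile of the same length}."""
    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        norm = (sum(a * a for a in u) ** 0.5) * (sum(b * b for b in v) ** 0.5)
        return dot / norm
    return sorted(((name, cosine(observed, prof))
                   for name, prof in library.items()),
                  key=lambda item: -item[1])
```

    A near-tie between the top two candidates would correspond to the ambiguous case in which the method recommends a follow-up sel-TOCSY experiment.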

  14. CLUSTERnGO: a user-defined modelling platform for two-stage clustering of time-series data.

    PubMed

    Fidaner, Işık Barış; Cankorur-Cetinkaya, Ayca; Dikicioglu, Duygu; Kirdar, Betul; Cemgil, Ali Taylan; Oliver, Stephen G

    2016-02-01

    Simple bioinformatic tools are frequently used to analyse time-series datasets regardless of their ability to deal with transient phenomena, limiting the meaningful information that may be extracted from them. This situation requires the development and exploitation of tailor-made, easy-to-use and flexible tools designed specifically for the analysis of time-series datasets. We present a novel statistical application called CLUSTERnGO, which uses a model-based clustering algorithm that fulfils this need. This algorithm involves two components of operation. Component 1 constructs a Bayesian non-parametric model (Infinite Mixture of Piecewise Linear Sequences) and Component 2, which applies a novel clustering methodology (Two-Stage Clustering). The software can also assign biological meaning to the identified clusters using an appropriate ontology. It applies multiple hypothesis testing to report the significance of these enrichments. The algorithm has a four-phase pipeline. The application can be executed using either command-line tools or a user-friendly Graphical User Interface. The latter has been developed to address the needs of both specialist and non-specialist users. We use three diverse test cases to demonstrate the flexibility of the proposed strategy. In all cases, CLUSTERnGO not only outperformed existing algorithms in assigning unique GO term enrichments to the identified clusters, but also revealed novel insights regarding the biological systems examined, which were not uncovered in the original publications. The C++ and QT source codes, the GUI applications for Windows, OS X and Linux operating systems and user manual are freely available for download under the GNU GPL v3 license at http://www.cmpe.boun.edu.tr/content/CnG. sgo24@cam.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
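    The multiple-hypothesis-testing step used when reporting enrichment significance is commonly a Benjamini-Hochberg false-discovery-rate correction; whether CLUSTERnGO uses exactly this procedure is an assumption here. A self-contained sketch:

```python
def benjamini_hochberg(pvals):
    """Return BH-adjusted q-values (FDR), in the input order."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    qvals = [0.0] * m
    running_min = 1.0
    # Walk from the largest p-value down, enforcing monotone q-values.
    for steps_from_end, i in enumerate(reversed(order)):
        rank = m - steps_from_end
        running_min = min(running_min, pvals[i] * m / rank)
        qvals[i] = running_min
    return qvals
```

    Reporting q-values rather than raw p-values keeps the expected fraction of false enrichments among the reported GO terms below the chosen cutoff.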

  15. Clinical and functional outcomes after 2 years in the early detection and intervention for the prevention of psychosis multisite effectiveness trial.

    PubMed

    McFarlane, William R; Levin, Bruce; Travis, Lori; Lucas, F Lee; Lynch, Sarah; Verdi, Mary; Williams, Deanna; Adelsheim, Steven; Calkins, Roderick; Carter, Cameron S; Cornblatt, Barbara; Taylor, Stephan F; Auther, Andrea M; McFarland, Bentson; Melton, Ryan; Migliorati, Margaret; Niendam, Tara; Ragland, J Daniel; Sale, Tamara; Salvador, Melina; Spring, Elizabeth

    2015-01-01

    To test the effectiveness of the Early Detection, Intervention, and Prevention of Psychosis Program in preventing the onset of severe psychosis and improving functioning in a national sample of at-risk youth. In a risk-based allocation study design, 337 youth (age 12-25) at risk of psychosis were assigned to treatment groups based on severity of positive symptoms. Those at clinically higher risk (CHR) or having an early first episode of psychosis (EFEP) were assigned to receive Family-aided Assertive Community Treatment (FACT); those at clinically lower risk (CLR) were assigned to receive community care. Between-groups differences on outcome variables were adjusted statistically according to regression-discontinuity procedures and evaluated using the Global Test Procedure that combined all symptom and functional measures. A total of 337 young people (mean age: 16.6) were assigned to the treatment group (CHR + EFEP, n = 250) or comparison group (CLR, n = 87). On the primary variable, positive symptoms, after 2 years FACT was superior to community care (2 df, p < .0001) for both CHR (p = .0034) and EFEP (p < .0001) subgroups. Rates of conversion (6.3% CHR vs 2.3% CLR) and first negative event (25% CHR vs 22% CLR) were low but did not differ. FACT was superior in the Global Test (p = .0007; p = .024 for CHR and p = .0002 for EFEP, vs CLR) and in improvement in participation in work and school (p = .025). FACT is effective in improving positive, negative, disorganized and general symptoms, Global Assessment of Functioning, work and school participation and global outcome in youth at risk for, or experiencing very early, psychosis. © The Author 2014. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center.

  16. An evaluation of the PCR-RFLP technique to aid molecular-based monitoring of felids and canids in India

    PubMed Central

    2010-01-01

    Background The order Carnivora is well represented in India, with 58 of the 250 species found globally, occurring here. However, small carnivores figure very poorly in research and conservation policies in India. This is mainly due to the dearth of tested and standardized techniques that are both cost effective and conducive to small carnivore studies in the field. In this paper we present a non-invasive genetic technique standardized for the study of Indian felids and canids with the use of PCR amplification and restriction enzyme digestion of scat collected in the field. Findings Using existing sequences of felids and canids from GenBank, we designed primers from the 16S rRNA region of the mitochondrial genome and tested these on ten species of felids and five canids. We selected restriction enzymes that would cut the selected region differentially for various species within each family. We produced a restriction digestion profile for the potential differentiation of species based on fragment patterns. To test our technique, we used felid PCR primers on scats collected from various habitats in India, representing varied environmental conditions. Amplification success with field collected scats was 52%, while 86% of the products used for restriction digestion could be accurately assigned to species. We verified this through sequencing. A comparison of costs across the various techniques currently used for scat assignment showed that this technique was the most practical and cost effective. Conclusions The species-specific key developed in this paper provides a means for detailed investigations in the future that focus on elusive carnivores in India and this approach provides a model for other studies in areas of Asia where many small carnivores co-occur. PMID:20525407
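    The fragment-pattern key behind a PCR-RFLP approach can be sketched by predicting digestion products in silico. The sketch below cuts at the start of each recognition site for simplicity (real enzymes cut at a defined offset within the site) and uses an invented sequence; the site shown is the classic EcoRI motif as an example only.

```python
def digest(seq, site):
    """Fragment lengths from cutting a linear sequence at every occurrence
    of a recognition site; cut position = site start, for simplicity."""
    cuts, pos = [], seq.find(site)
    while pos != -1:
        cuts.append(pos)
        pos = seq.find(site, pos + 1)
    fragments, prev = [], 0
    for cut in cuts:
        if cut > prev:
            fragments.append(cut - prev)
        prev = cut
    fragments.append(len(seq) - prev)
    return fragments

def same_pattern(frags_a, frags_b, tol=0):
    """Compare two fragment-length patterns (gel bands), within a
    band-sizing tolerance in base pairs."""
    return (len(frags_a) == len(frags_b) and
            all(abs(a - b) <= tol
                for a, b in zip(sorted(frags_a), sorted(frags_b))))
```

    Species assignment then amounts to matching the observed band pattern against the pre-computed pattern for each candidate felid or canid.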

  17. Anticipation of Personal Genomics Data Enhances Interest and Learning Environment in Genomics and Molecular Biology Undergraduate Courses

    PubMed Central

    Weber, K. Scott; Jensen, Jamie L.; Johnson, Steven M.

    2015-01-01

    An important discussion at colleges is centered on determining more effective models for teaching undergraduates. As personalized genomics has become more common, we hypothesized it could be a valuable tool to make science education more hands-on, personal, and engaging for college undergraduates. We hypothesized that providing students with personal genome testing kits would enhance the learning experience of students in two undergraduate courses at Brigham Young University: Advanced Molecular Biology and Genomics. These courses have an emphasis on personal genomics during the last two weeks of the semester. Students taking these courses were given the option to receive personal genomics kits in 2014, whereas in 2015 they were not. Students sent their personal genomics samples in on their own and received the data after the course ended. We surveyed students in these courses before and after the two-week emphasis on personal genomics to collect data on whether anticipation of obtaining their own personal genomic data impacted undergraduate student learning. We also tested to see if specific personal genomic assignments improved the learning experience by analyzing the data from the undergraduate students who completed both the pre- and post-course surveys. Anticipation of personal genomic data significantly enhanced student interest and the learning environment based on the time students spent researching personal genomic material and their self-reported attitudes compared to those who did not anticipate getting their own data. Personal genomics homework assignments significantly enhanced the undergraduate student interest and learning based on the same criteria and a personal genomics quiz. We found that for the undergraduate students in both molecular biology and genomics courses, incorporation of personal genomic testing can be an effective educational tool in undergraduate science education. PMID:26241308

  18. Anticipation of Personal Genomics Data Enhances Interest and Learning Environment in Genomics and Molecular Biology Undergraduate Courses.

    PubMed

    Weber, K Scott; Jensen, Jamie L; Johnson, Steven M

    2015-01-01

    An important discussion at colleges is centered on determining more effective models for teaching undergraduates. As personalized genomics has become more common, we hypothesized it could be a valuable tool to make science education more hands-on, personal, and engaging for college undergraduates. We hypothesized that providing students with personal genome testing kits would enhance the learning experience of students in two undergraduate courses at Brigham Young University: Advanced Molecular Biology and Genomics. These courses have an emphasis on personal genomics during the last two weeks of the semester. Students taking these courses were given the option to receive personal genomics kits in 2014, whereas in 2015 they were not. Students sent their personal genomics samples in on their own and received the data after the course ended. We surveyed students in these courses before and after the two-week emphasis on personal genomics to collect data on whether anticipation of obtaining their own personal genomic data impacted undergraduate student learning. We also tested to see if specific personal genomic assignments improved the learning experience by analyzing the data from the undergraduate students who completed both the pre- and post-course surveys. Anticipation of personal genomic data significantly enhanced student interest and the learning environment based on the time students spent researching personal genomic material and their self-reported attitudes compared to those who did not anticipate getting their own data. Personal genomics homework assignments significantly enhanced the undergraduate student interest and learning based on the same criteria and a personal genomics quiz. We found that for the undergraduate students in both molecular biology and genomics courses, incorporation of personal genomic testing can be an effective educational tool in undergraduate science education.

  19. TH-A-9A-01: Active Optical Flow Model: Predicting Voxel-Level Dose Distribution in Spine SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, J; Wu, Q.J.; Yin, F

    2014-06-15

    Purpose: To predict voxel-level dose distribution and enable effective evaluation of cord dose sparing in spine SBRT. Methods: We present an active optical flow model (AOFM) to statistically describe cord dose variations and train a predictive model to represent correlations between AOFM and PTV contours. Thirty clinically accepted spine SBRT plans are evenly divided into training and testing datasets. The development of the predictive model consists of 1) collecting a sequence of dose maps including PTV and OAR (spinal cord) as well as a set of associated PTV contours adjacent to the OAR from the training dataset, 2) classifying data into five groups based on the PTV's location relative to the OAR: two “Top”s, “Left”, “Right”, and “Bottom”, 3) randomly selecting a dose map as the reference in each group and applying rigid registration and optical flow deformation to match all other maps to the reference, 4) building the AOFM by importing optical flow vectors and dose values into principal component analysis (PCA), 5) applying another PCA to features of PTV and OAR contours to generate an active shape model (ASM), and 6) computing a linear regression model of correlations between AOFM and ASM. When predicting the dose distribution of a new case in the testing dataset, the PTV is first assigned to a group based on its contour characteristics. Contour features are then transformed into the ASM's principal coordinates of the selected group. Finally, voxel-level dose distribution is determined by mapping from the ASM space to the AOFM space using the predictive model. Results: The DVHs predicted by the AOFM-based model and those in clinical plans are comparable in training and testing datasets. At 2% volume the dose difference between predicted and clinical plans is 4.2±4.4% and 3.3±3.5% in the training and testing datasets, respectively. Conclusion: The AOFM is effective in predicting voxel-level dose distribution for spine SBRT. Partially supported by NIH/NCI under grant #R21CA161389 and a master research grant by Varian Medical Systems.
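
    Steps 4) to 6) above amount to fitting a PCA on each feature set and a linear regression between their principal coordinates. The sketch below illustrates that pipeline on synthetic data; the random matrices merely stand in for real optical-flow and contour features and are not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_fit(X, k):
    """Return (mean, top-k principal components) for the rows of X."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:k]

# 15 synthetic "training plans" sharing a 3-dimensional latent structure:
# contour features play the ASM role, dose features the AOFM role.
latent = rng.normal(size=(15, 3))
contours = latent @ rng.normal(size=(3, 6))
doses = latent @ rng.normal(size=(3, 8)) + 0.01 * rng.normal(size=(15, 8))

asm_mean, asm_pc = pca_fit(contours, k=3)     # active shape model side
aofm_mean, aofm_pc = pca_fit(doses, k=3)      # active optical flow model side
A_scores = (contours - asm_mean) @ asm_pc.T   # ASM principal coordinates
D_scores = (doses - aofm_mean) @ aofm_pc.T    # AOFM principal coordinates

# Step 6: linear regression from ASM space to AOFM space.
W, *_ = np.linalg.lstsq(A_scores, D_scores, rcond=None)

# Predict a plan's dose features from its contour features alone.
pred_dose = ((contours[:1] - asm_mean) @ asm_pc.T) @ W @ aofm_pc + aofm_mean
```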

  20. Link failure detection in a parallel computer

    DOEpatents

    Archer, Charles J.; Blocksome, Michael A.; Megerian, Mark G.; Smith, Brian E.

    2010-11-09

    Methods, apparatus, and products are disclosed for link failure detection in a parallel computer including compute nodes connected in a rectangular mesh network, each pair of adjacent compute nodes in the rectangular mesh network connected together using a pair of links, that includes: assigning each compute node to either a first group or a second group such that adjacent compute nodes in the rectangular mesh network are assigned to different groups; sending, by each of the compute nodes assigned to the first group, a first test message to each adjacent compute node assigned to the second group; determining, by each of the compute nodes assigned to the second group, whether the first test message was received from each adjacent compute node assigned to the first group; and notifying a user, by each of the compute nodes assigned to the second group, whether the first test message was received.
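
    The checkerboard grouping described above can be simulated in a few lines. This is an illustrative sketch of the idea, not the patented implementation; the mesh size and the failed link are arbitrary.

```python
def group_of(x, y):
    """Checkerboard assignment: adjacent mesh nodes always land in different groups."""
    return 1 if (x + y) % 2 == 0 else 2

def detect_failed_links(width, height, failed_links):
    """Group-1 nodes send a test message over each link to their group-2
    neighbors; a receiver reports any link on which no message arrived.
    failed_links is a set of frozenset node pairs."""
    missing = []
    for x in range(width):
        for y in range(height):
            if group_of(x, y) != 1:
                continue
            for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if 0 <= nx < width and 0 <= ny < height:
                    link = frozenset({(x, y), (nx, ny)})
                    if link in failed_links:
                        missing.append(link)  # receiver would notify the user
    return missing

# A 4x4 mesh with one broken link between nodes (1, 1) and (1, 2):
print(detect_failed_links(4, 4, {frozenset({(1, 1), (1, 2)})}))
```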

  1. Modeling the Psychometric Properties of Complex Performance Assessment Tasks Using Confirmatory Factor Analysis: A Multistage Model for Calibrating Tasks

    ERIC Educational Resources Information Center

    Kahraman, Nilufer; De Champlain, Andre; Raymond, Mark

    2012-01-01

    Item-level information, such as difficulty and discrimination are invaluable to the test assembly, equating, and scoring practices. Estimating these parameters within the context of large-scale performance assessments is often hindered by the use of unbalanced designs for assigning examinees to tasks and raters because such designs result in very…

  2. Development of a database for chemical mechanism assignments for volatile organic emissions.

    PubMed

    Carter, William P L

    2015-10-01

    The development of a database for making model species assignments when preparing total organic gas (TOG) emissions input for atmospheric models is described. This database currently has assignments of model species for 12 different gas-phase chemical mechanisms for over 1700 chemical compounds and covers over 3000 chemical categories used in five different anthropogenic TOG profile databases or output by two different biogenic emissions models. This involved developing a unified chemical classification system, assigning compounds to mixtures, assigning model species for the mechanisms to the compounds, and making assignments for unknown, unassigned, and nonvolatile mass. The comprehensiveness of the assignments, the contributions of various types of speciation categories to current profile and total emissions data, inconsistencies with existing undocumented model species assignments, and remaining speciation issues and areas of needed work are also discussed. The use of the system to prepare input for SMOKE, the Speciation Tool, and for biogenic models is described in the supplementary materials. The database, associated programs and files, and a user's manual are available online at http://www.cert.ucr.edu/~carter/emitdb . Assigning air quality model species to the hundreds of emitted chemicals is a necessary link between emissions data and modeling the effects of emissions on air quality. This is not easy, which makes it difficult to implement new and more chemically detailed mechanisms in models. If done incorrectly, the effect is similar to that of errors in emissions speciation or in the chemical mechanism itself. Nevertheless, making such assignments is often an afterthought in chemical mechanism development and emissions processing, and existing assignments are usually undocumented and have errors and inconsistencies. This work is designed to address some of these problems.
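
    The core lookup such a database enables can be sketched as follows; the mechanism name, model-species names, and mass fractions are invented examples, not entries from the actual database.

```python
# compound -> model-species assignments per mechanism (invented examples).
assignments = {
    "MECH_A": {"ethanol": "ALC", "toluene": "ARO1", "isoprene": "ISOP"},
}

def speciate(profile, mechanism):
    """profile maps compounds to mass fractions; returns model-species totals,
    with an explicit UNASSIGNED bucket for compounds the mechanism lacks."""
    totals = {}
    for compound, mass in profile.items():
        species = assignments.get(mechanism, {}).get(compound, "UNASSIGNED")
        totals[species] = totals.get(species, 0.0) + mass
    return totals

profile = {"ethanol": 0.5, "toluene": 0.3, "mystery_compound": 0.2}
print(speciate(profile, "MECH_A"))  # → {'ALC': 0.5, 'ARO1': 0.3, 'UNASSIGNED': 0.2}
```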

  3. Game theoretic sensor management for target tracking

    NASA Astrophysics Data System (ADS)

    Shen, Dan; Chen, Genshe; Blasch, Erik; Pham, Khanh; Douville, Philip; Yang, Chun; Kadar, Ivan

    2010-04-01

    This paper develops and evaluates a game-theoretic approach to distributed sensor-network management for target tracking via sensor-based negotiation. We present a distributed sensor-based negotiation game model for sensor management in multi-sensor multi-target tracking situations. In our negotiation framework, each negotiation agent represents a sensor, and each sensor maximizes its utility using a game approach. The greediness of each sensor is limited by the fact that sensor-to-target assignment efficiency decreases if too many sensor resources are assigned to the same target. This is similar to real-world market mechanisms, such as agreements between buyers and sellers in an auction market. Sensors are willing to switch targets so that they can obtain their highest utility and the most efficient way of applying their resources. Our sub-game perfect equilibrium-based negotiation strategies assign sensors to targets dynamically and in a distributed manner. Numerical simulations are performed to demonstrate our sensor-based negotiation approach for distributed sensor management.
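
    A toy version of the congestion-limited negotiation: each sensor repeatedly picks the target with the highest marginal utility, where a target's value is discounted by the number of other sensors already covering it. The utilities and the discount rule are invented for illustration; the paper's actual strategies are sub-game perfect equilibria.

```python
def negotiate(sensors, base_utility, rounds=10):
    """sensors: list of ids; base_utility[s][t]: value of target t to sensor s.
    Each sensor repeatedly picks the target with the best congestion-discounted
    utility until no sensor wants to switch (a simple equilibrium)."""
    assignment = {s: None for s in sensors}
    for _ in range(rounds):
        changed = False
        for s in sensors:
            load = {}
            for other, t in assignment.items():
                if other != s and t is not None:
                    load[t] = load.get(t, 0) + 1
            # Marginal utility shrinks as more sensors pile onto a target.
            best = max(base_utility[s],
                       key=lambda t: base_utility[s][t] / (1 + load.get(t, 0)))
            if assignment[s] != best:
                assignment[s], changed = best, True
        if not changed:
            break
    return assignment

utility = {"s1": {"t1": 10.0, "t2": 6.0},
           "s2": {"t1": 9.0, "t2": 6.0}}
print(negotiate(["s1", "s2"], utility))  # → {'s1': 't1', 's2': 't2'}
```

    Note how s2 prefers t1 in isolation (9 > 6) but settles on t2 once s1 covers t1, because 9/2 < 6.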

  4. Effects of a decision support intervention on decisional conflict associated with microsatellite instability testing.

    PubMed

    Hall, Michael J; Manne, Sharon L; Winkel, Gary; Chung, Daniel S; Weinberg, David S; Meropol, Neal J

    2011-02-01

    Decision support to facilitate informed consent is increasingly important for complicated medical tests. Here, we test a theoretical model of factors influencing decisional conflict in a study examining the effects of a decision support aid designed to assist patients at high risk for hereditary nonpolyposis colorectal cancer (CRC) in deciding whether to pursue the microsatellite instability (MSI) test. Participants were 239 CRC patients at high familial risk for a genetic mutation who completed surveys before and after exposure to the intervention. Half of the sample was assigned to the CD-ROM aid and half received a brief description of the test. Structural equation modeling was employed to examine associations among the intervention, knowledge, pros and cons of having MSI testing, self-efficacy, preparedness, and decisional conflict. The goodness of fit for the model was acceptable [FIML, full information maximum likelihood, χ²(df = 280) = 392.24; P = 0.00]. As expected, the paths to decisional conflict were significant for postintervention pros of MSI testing (t = -2.43; P < 0.05), cons of MSI testing (t = 2.78; P < 0.05), and preparedness (t = -7.27; P < 0.01). The intervention impacted decisional conflict by increasing knowledge about the MSI test, and knowledge exerted its effects on decisional conflict by increasing preparedness to make a decision about the test and by increasing perceived benefits of having the test. Increasing knowledge, preparedness, and perceived benefits of undergoing the MSI test facilitate informed decision making for this test. Understanding the mechanisms underlying health decisions is critical for improving decisional support. Individuals with Lynch syndrome have an elevated lifetime risk of CRC. Risk of Lynch syndrome may be assessed with a tumor-based screening test (MSI testing or immunohistochemical tissue staining). ©2011 AACR.

  5. Utility Estimates of Disease-Specific Health States in Prostate Cancer from Three Different Perspectives.

    PubMed

    Gries, Katharine S; Regier, Dean A; Ramsey, Scott D; Patrick, Donald L

    2017-06-01

    To develop a statistical model generating utility estimates for prostate cancer-specific health states, using preference weights derived from the perspectives of prostate cancer patients, men at risk for prostate cancer, and society. Utility estimate values were calculated using standard gamble (SG) methodology. Study participants valued 18 prostate-specific health states with five attributes: sexual function, urinary function, bowel function, pain, and emotional well-being. The appropriateness of each candidate model (linear regression, mixed effects, or generalized estimating equation) for generating prostate cancer utility estimates was determined by paired t-tests comparing observed and predicted values. Mixed-corrected standard SG utility estimates to account for loss aversion were calculated based on prospect theory. A total of 132 study participants assigned values to the health states (n = 40 men at risk for prostate cancer; n = 43 men with prostate cancer; n = 49 general population). In total, 792 valuations were elicited (six health states for each of the 132 participants). The most appropriate model for the classification system was a mixed effects model; correlations between the mean observed and predicted utility estimates were greater than 0.80 for each perspective. Developing a health-state classification system with preference weights for three different perspectives demonstrates the relative importance of main effects between populations. The predicted values for men with prostate cancer support the hypothesis that patients experiencing the disease state assign higher utility estimates to health states, and there is a difference in the valuations made by patients and the general population.

  6. Measuring the Impact of Haptic Feedback Using the SOLO Taxonomy

    ERIC Educational Resources Information Center

    Minogue, James; Jones, Gail

    2009-01-01

    The application of Biggs' and Collis' Structure of Observed Learning Outcomes taxonomy in the evaluation of student learning about cell membrane transport via a computer-based learning environment is described in this study. Pre-test-post-test comparisons of student outcome data (n = 80) were made across two groups of randomly assigned students:…

  7. A Comparison of Reliability and Precision of Subscore Reporting Methods for a State English Language Proficiency Assessment

    ERIC Educational Resources Information Center

    Longabach, Tanya; Peyton, Vicki

    2018-01-01

    K-12 English language proficiency tests that assess multiple content domains (e.g., listening, speaking, reading, writing) often have subsections based on these content domains; scores assigned to these subsections are commonly known as subscores. Testing programs face increasing customer demands for the reporting of subscores in addition to the…

  8. Effectiveness of a Classroom Mindfulness Coloring Activity for Test Anxiety in Children

    ERIC Educational Resources Information Center

    Carsley, Dana; Heath, Nancy L.; Fajnerova, Sophia

    2015-01-01

    To evaluate the effectiveness of mindfulness-based structured versus unstructured coloring on test anxiety, 52 participants (53.8% female; mean age = 10.92 years, SD = 0.82) were randomly assigned to either a structured mandala (n = 26) or free coloring condition (n = 26), and completed a standardized anxiety measure to assess anxiety…

  9. Zoonoses action plan Salmonella monitoring programme: an investigation of the sampling protocol.

    PubMed

    Snary, E L; Munday, D K; Arnold, M E; Cook, A J C

    2010-03-01

    The Zoonoses Action Plan (ZAP) Salmonella Programme was established by the British Pig Executive to monitor Salmonella prevalence in quality-assured British pigs at slaughter by testing a sample of pigs with a meat juice enzyme-linked immunosorbent assay for antibodies against group B and C1 Salmonella. Farms were assigned a ZAP level (1 to 3) depending on the monitored prevalence, and ZAP 2 or 3 farms were required to act to reduce the prevalence. The ultimate goal was to reduce the risk of human salmonellosis attributable to British pork. A mathematical model has been developed to describe the ZAP sampling protocol. Results show that the probability of assigning a farm the correct ZAP level was high, except for farms that had a seroprevalence close to the cutoff points between different ZAP levels. Sensitivity analyses identified that the probability of assigning a farm to the correct ZAP level was dependent on the sensitivity and specificity of the test, the number of batches taken to slaughter each quarter, and the number of samples taken per batch. The variability of the predicted seroprevalence was reduced as the number of batches or samples increased and, away from the cutoff points, the probability of being assigned the correct ZAP level increased as the number of batches or samples increased. In summary, the model described here provided invaluable insight into the ZAP sampling protocol. Further work is required to understand the impact of the program for Salmonella infection in British pig farms and therefore on human health.
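
    The kind of calculation such a model performs can be sketched exactly with a binomial distribution: given a farm's true seroprevalence and an imperfect test, what is the chance that the observed positive proportion crosses a ZAP cutoff? The sensitivity, specificity, and cutoff values below are illustrative, not the programme's actual parameters.

```python
from math import comb

def apparent_prevalence(true_prev, sensitivity, specificity):
    """Probability that a sampled pig tests positive."""
    return true_prev * sensitivity + (1 - true_prev) * (1 - specificity)

def prob_at_or_above_cutoff(n_samples, p_positive, cutoff):
    """P(observed positive proportion >= cutoff) under a binomial model."""
    k_min = 0
    while k_min / n_samples < cutoff:
        k_min += 1
    return sum(comb(n_samples, k) * p_positive**k * (1 - p_positive)**(n_samples - k)
               for k in range(k_min, n_samples + 1))

p = apparent_prevalence(true_prev=0.10, sensitivity=0.80, specificity=0.95)
# Chance that a farm with 10% true seroprevalence is flagged at a 25% cutoff:
print(round(prob_at_or_above_cutoff(50, p, 0.25), 4))
```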

  10. Quantitative evaluation of expression difference in report assignments between nursing and radiologic technology departments.

    PubMed

    Nishimoto, Naoki; Yokooka, Yuki; Yagahara, Ayako; Uesugi, Masahito; Ogasawara, Katsuhiko

    2011-01-01

    Our purpose in this study was to investigate the expression differences in report assignments between students in nursing and radiologic technology departments. Faculty members can identify differences, such as word usage, when grading their students' assignments; however, there are no reports in the literature dealing with expression differences in vocabulary usage in medical informatics education based on statistical techniques or other quantitative measures. The report assignment asked for students' opinions in the event that they found a rare case of a disease in a hospital after they graduated from professional school. We processed the student report data automatically and applied the vector space model and TF/IDF (term frequency/inverse document frequency) scoring to 129 report assignments. The similarity-score distributions among the assignments for these two departments were close to normal. We focused on the sets of terms that occurred exclusively in either department. For terms such as "radiation therapy" or "communication skills" that occurred in the radiologic technology department, the TF/IDF score was 8.01. The same score was obtained for terms such as "privacy guidelines" or "consent of patients" that occurred in the nursing department. These results will help faculties to provide a better education based on expression differences identified from students' background knowledge.
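
    A minimal TF/IDF sketch of the scoring described above; the two toy word lists and the resulting scores are illustrative, not the study's data.

```python
import math

def tfidf(docs):
    """Return one term -> TF/IDF score map per document."""
    n = len(docs)
    df = {}
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    return [
        {term: (doc.count(term) / len(doc)) * math.log(n / df[term])
         for term in set(doc)}
        for doc in docs
    ]

nursing = "privacy guidelines consent of patients privacy".split()
radiology = "radiation therapy communication skills radiation patients".split()
s_nursing, s_radiology = tfidf([nursing, radiology])

# Department-exclusive terms score above zero; a term appearing in both
# documents scores log(2/2) = 0.
print(s_nursing["privacy"] > 0, s_radiology["patients"] == 0.0)  # → True True
```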

  11. Model-based morphological segmentation and labeling of coronary angiograms.

    PubMed

    Haris, K; Efstratiadis, S N; Maglaveras, N; Pappas, C; Gourassas, J; Louridas, G

    1999-10-01

    A method for extraction and labeling of the coronary arterial tree (CAT) using minimal user supervision in single-view angiograms is proposed. The CAT structural description (skeleton and borders) is produced, along with quantitative information for the artery dimensions and assignment of coded labels, based on a given coronary artery model represented by a graph. The stages of the method are: 1) CAT tracking and detection; 2) artery skeleton and border estimation; 3) feature graph creation; and 4) artery labeling by graph matching. The approximate CAT centerline and borders are extracted by recursive tracking based on circular template analysis. The accurate skeleton and borders of each CAT segment are computed, based on morphological homotopy modification and watershed transform. The approximate centerline and borders are used for constructing the artery segment enclosing area (ASEA), where the defined skeleton and border curves are considered as markers. Using the marked ASEA, an artery gradient image is constructed where all the ASEA pixels (except the skeleton ones) are assigned the gradient magnitude of the original image. The artery gradient image markers are imposed as its unique regional minima by the homotopy modification method, the watershed transform is used for extracting the artery segment borders, and the feature graph is updated. Finally, given the created feature graph and the known model graph, a graph matching algorithm assigns the appropriate labels to the extracted CAT using weighted maximal cliques on the association graph corresponding to the two given graphs. Experimental results using clinical digitized coronary angiograms are presented.

  12. Outcomes of Parent Education Programs Based on Reevaluation Counseling

    ERIC Educational Resources Information Center

    Wolfe, Randi B.; Hirsch, Barton J.

    2003-01-01

    We report two studies in which a parent education program based on Reevaluation Counseling was field-tested on mothers randomly assigned to treatment groups or equivalent, no-treatment comparison groups. The goal was to evaluate the program's viability, whether there were measurable effects, whether those effects were sustained over time, and…

  13. Investigating human geographic origins using dual-isotope (87Sr/86Sr, δ18O) assignment approaches.

    PubMed

    Laffoon, Jason E; Sonnemann, Till F; Shafie, Termeh; Hofman, Corinne L; Brandes, Ulrik; Davies, Gareth R

    2017-01-01

    Substantial progress in the application of multiple isotope analyses has greatly improved the ability to identify nonlocal individuals amongst archaeological populations over the past decades. More recently the development of large scale models of spatial isotopic variation (isoscapes) has contributed to improved geographic assignments of human and animal origins. Persistent challenges remain, however, in the accurate identification of individual geographic origins from skeletal isotope data in studies of human (and animal) migration and provenance. In an attempt to develop and test more standardized and quantitative approaches to geographic assignment of individual origins using isotopic data two methods, combining 87Sr/86Sr and δ18O isoscapes, are examined for the Circum-Caribbean region: 1) an Interval approach using a defined range of fixed isotopic variation per location; and 2) a Likelihood assignment approach using univariate and bivariate probability density functions. These two methods are tested with enamel isotope data from a modern sample of known origin from Caracas, Venezuela and further explored with two archaeological samples of unknown origin recovered from Cuba and Trinidad. The results emphasize both the potential and limitation of the different approaches. Validation tests on the known origin sample exclude most areas of the Circum-Caribbean region and correctly highlight Caracas as a possible place of origin with both approaches. The positive validation results clearly demonstrate the overall efficacy of a dual-isotope approach to geoprovenance. The accuracy and precision of geographic assignments may be further improved by better understanding of the relationships between environmental and biological isotope variation; continued development and refinement of relevant isoscapes; and the eventual incorporation of a broader array of isotope proxy data.
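
    The likelihood-assignment approach can be sketched with independent Gaussian densities per isotope system; the region means and standard deviations below are invented, whereas a real application would take them from Sr and O isoscapes.

```python
import math

def normal_pdf(x, mean, sd):
    return math.exp(-((x - mean) ** 2) / (2 * sd * sd)) / (sd * math.sqrt(2 * math.pi))

def assign_origin(sr, o18, regions):
    """Rank candidate regions by the joint (independence-assuming) likelihood
    of the measured 87Sr/86Sr and d18O values; return the best region and scores."""
    scored = {name: normal_pdf(sr, m_sr, sd_sr) * normal_pdf(o18, m_o, sd_o)
              for name, (m_sr, sd_sr, m_o, sd_o) in regions.items()}
    return max(scored, key=scored.get), scored

regions = {"region_A": (0.7090, 0.0005, -4.0, 0.5),
           "region_B": (0.7060, 0.0005, -6.0, 0.5)}
best, scores = assign_origin(0.7091, -4.2, regions)
print(best)  # → region_A
```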

  14. The Impact of the Good Behavior Game, a Universal Classroom-Based Preventive Intervention in First and Second Grades, on High-Risk Sexual Behaviors and Drug Abuse and Dependence Disorders into Young Adulthood

    PubMed Central

    Wang, Wei; Mackenzie, Amelia C. L.; Brown, C. Hendricks; Ompad, Danielle C.; Or, Flora; Ialongo, Nicholas S.; Poduska, Jeanne M.; Windham, Amy

    2013-01-01

    The Good Behavior Game (GBG), a method of teacher classroom behavior management, was tested in first- and second-grade classrooms in 19 Baltimore City Public Schools beginning in the 1985–1986 school year. The intervention was directed at the classroom as a whole to socialize children to the student role and reduce aggressive, disruptive behaviors, confirmed antecedents of a profile of externalizing problem outcomes. This article reports on the GBG impact on the courses and interrelationships among aggressive, disruptive behavior through middle school, risky sexual behaviors, and drug abuse and dependence disorders through ages 19–21. In five poor to lower-middle class, mainly African American urban areas, classrooms within matched schools were assigned randomly to either the GBG intervention or the control condition. Balanced assignment of children to classrooms was made, and teachers were randomly assigned to intervention or control. Analyses involved multilevel growth mixture modeling. By young adulthood, significant GBG impact was found in terms of reduced high-risk sexual behaviors and drug abuse and dependence disorders among males who in first grade and through middle school were more aggressive, disruptive. A replication with the next cohort of first-grade children with the same teachers occurred during the following school year, but with minimal teacher mentoring and monitoring. Findings were not significant but generally in the predicted direction. A universal classroom-based prevention intervention in first- and second-grade classrooms can reduce drug abuse and dependence disorders and risky sexual behaviors. PMID:23070695

  15. The impact of the Good Behavior Game, a universal classroom-based preventive intervention in first and second grades, on high-risk sexual behaviors and drug abuse and dependence disorders into young adulthood.

    PubMed

    Kellam, Sheppard G; Wang, Wei; Mackenzie, Amelia C L; Brown, C Hendricks; Ompad, Danielle C; Or, Flora; Ialongo, Nicholas S; Poduska, Jeanne M; Windham, Amy

    2014-02-01

    The Good Behavior Game (GBG), a method of teacher classroom behavior management, was tested in first- and second-grade classrooms in 19 Baltimore City Public Schools beginning in the 1985-1986 school year. The intervention was directed at the classroom as a whole to socialize children to the student role and reduce aggressive, disruptive behaviors, confirmed antecedents of a profile of externalizing problem outcomes. This article reports on the GBG impact on the courses and interrelationships among aggressive, disruptive behavior through middle school, risky sexual behaviors, and drug abuse and dependence disorders through ages 19-21. In five poor to lower-middle class, mainly African American urban areas, classrooms within matched schools were assigned randomly to either the GBG intervention or the control condition. Balanced assignment of children to classrooms was made, and teachers were randomly assigned to intervention or control. Analyses involved multilevel growth mixture modeling. By young adulthood, significant GBG impact was found in terms of reduced high-risk sexual behaviors and drug abuse and dependence disorders among males who in first grade and through middle school were more aggressive, disruptive. A replication with the next cohort of first-grade children with the same teachers occurred during the following school year, but with minimal teacher mentoring and monitoring. Findings were not significant but generally in the predicted direction. A universal classroom-based prevention intervention in first- and second-grade classrooms can reduce drug abuse and dependence disorders and risky sexual behaviors.

  16. Contact Prediction for Beta and Alpha-Beta Proteins Using Integer Linear Optimization and its Impact on the First Principles 3D Structure Prediction Method ASTRO-FOLD

    PubMed Central

    Rajgaria, R.; Wei, Y.; Floudas, C. A.

    2010-01-01

    An integer linear optimization model is presented to predict residue contacts in β, α + β, and α/β proteins. The total energy of a protein is expressed as the sum of a Cα–Cα distance-dependent contact energy contribution and a hydrophobic contribution. The model selects contacts that assign the lowest energy to the protein structure while satisfying a set of constraints that are included to enforce certain physically observed topological information. A new method based on hydrophobicity is proposed to find the β-sheet alignments. These β-sheet alignments are used as constraints for contacts between residues of β-sheets. This model was tested on three independent protein test sets and CASP8 test proteins consisting of β, α + β, and α/β proteins and was found to perform very well. The average accuracy of the predictions for contacts separated by at least six residues was approximately 61%. The average true positive and false positive distances were also calculated for each of the test sets, and they are 7.58 Å and 15.88 Å, respectively. Residue contact prediction can be directly used to facilitate protein tertiary structure prediction. This proposed residue contact prediction model is incorporated into the first principles protein tertiary structure prediction approach, ASTRO-FOLD. The effectiveness of the contact prediction model was further demonstrated by the improvement in the quality of the protein structure ensemble generated using the predicted residue contacts for a test set of 10 proteins. PMID:20225257
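
    A tiny brute-force analogue of the selection step: choose the subset of candidate contacts with the lowest total energy subject to a simple topological constraint (here, at most one contact per residue). The candidate energies and the constraint are illustrative; the paper solves a full integer linear programme with much richer constraints.

```python
from itertools import combinations

def best_contacts(candidates, n_pick, cap):
    """candidates: list of (res_i, res_j, energy). Minimize total energy over
    all n_pick-subsets in which no residue appears more than cap times."""
    best, best_e = None, float("inf")
    for subset in combinations(candidates, n_pick):
        counts = {}
        for i, j, _ in subset:
            counts[i] = counts.get(i, 0) + 1
            counts[j] = counts.get(j, 0) + 1
        if max(counts.values()) > cap:
            continue  # violates the per-residue constraint
        e = sum(c[2] for c in subset)
        if e < best_e:
            best, best_e = subset, e
    return best, best_e

cands = [(1, 8, -2.0), (1, 9, -1.8), (1, 10, -1.7), (2, 9, -1.0)]
picked, energy = best_contacts(cands, n_pick=2, cap=1)
print(picked, energy)
```

    With cap=1, the two strongest candidates (1, 8) and (1, 9) cannot coexist because they share residue 1, so the optimum pairs (1, 8) with (2, 9).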

  17. Development of a Dynamic Traffic Assignment Model for Northern Nevada

    DOT National Transportation Integrated Search

    2014-06-01

    The objective of this research is to build and calibrate a DTA model for Northern Nevada (RenoSparks Area) based on the network profile and travel demand information updated to date. The critical procedures include development of consistent and readi...

  18. Neural-genetic synthesis for state-space controllers based on linear quadratic regulator design for eigenstructure assignment.

    PubMed

    da Fonseca Neto, João Viana; Abreu, Ivanildo Silva; da Silva, Fábio Nogueira

    2010-04-01

    Toward the synthesis of state-space controllers, a neural-genetic model based on the linear quadratic regulator design for the eigenstructure assignment of multivariable dynamic systems is presented. The neural-genetic model represents a fusion of a genetic algorithm and a recurrent neural network (RNN) to perform the selection of the weighting matrices and the algebraic Riccati equation solution, respectively. A fourth-order electric circuit model is used to evaluate the convergence of the computational intelligence paradigms and the control design method performance. The genetic search convergence evaluation is performed in terms of the fitness function statistics and the RNN convergence, which is evaluated by landscapes of the energy and norm, as a function of the parameter deviations. The control problem solution is evaluated in the time and frequency domains by the impulse response, singular values, and modal analysis.

  19. Effectiveness of Gross Model-Based Emotion Regulation Strategies Training on Anger Reduction in Drug-Dependent Individuals and its Sustainability in Follow-up.

    PubMed

    Massah, Omid; Sohrabi, Faramarz; A'azami, Yousef; Doostian, Younes; Farhoudian, Ali; Daneshmand, Reza

    2016-03-01

    Emotion plays an important role in adapting to life changes and stressful events. Difficulty regulating emotions is one of the problems drug abusers often face, and teaching these individuals to express and manage their emotions can be effective in improving their difficult circumstances. The present study aimed to determine the effectiveness of the Gross model-based emotion regulation strategies training on anger reduction in drug-dependent individuals. The present study had a quasi-experimental design wherein pretest-posttest evaluations were applied using a control group. The population under study included addicts attending Marivan's methadone maintenance therapy centers in 2012 - 2013. Convenience sampling was used to select 30 substance-dependent individuals undergoing maintenance treatment who were then randomly assigned to the experimental and control groups. The experimental group received its training in eight two-hour sessions. Data were analyzed using analysis of co-variance and paired t-test. There was a significant reduction in anger symptoms of drug-dependent individuals after Gross model-based emotion regulation training (ERT) (P < 0.001). Moreover, the effectiveness of the training on anger was persistent in the follow-up period. Symptoms of anger in drug-dependent individuals in this study were reduced by Gross model-based emotion regulation strategies training. Based on the results of this study, we may conclude that Gross model-based emotion regulation strategies training can be applied alongside other therapies to treat drug abusers undergoing rehabilitation.

  20. ["Where there's a woman, there's a Pap smear": the meanings assigned to cervical cancer prevention among women in Salvador, Bahia State, Brazil].

    PubMed

    Rico, Ana María; Iriart, Jorge Alberto Bernstein

    2013-09-01

    This study focuses on the meanings assigned to practices for cervical cancer prevention among women from low-income neighborhoods in Salvador, Bahia State, Brazil. This was a qualitative study based on content analysis of semi-structured interviews with 15 women 24 to 68 years of age. The results showed high appreciation of the Pap smear test, performed as part of routine gynecological examination (but without the patient necessarily having biomedical knowledge of its role). Besides accessibility and quality of health services, other factors influence the way the women assign meaning to cervical cancer prevention. Moral values associated with sexuality and gender influence risk perception, adoption of preventive practices, and interpretation of cervical cytology results. The ongoing practice of the Pap smear test is part of the construction of femininity, which is associated with maturity and personal responsibility for self care in a context of medicalization of the female body.

  1. Modeling and testing miniature torsion specimens for SiC joining development studies for fusion

    DOE PAGES

    Henager, Jr., C. H.; Nguyen, Ba N.; Kurtz, Richard J.; ...

    2015-08-05

    The international fusion community has designed a miniature torsion specimen for neutron irradiation studies of joined SiC and SiC/SiC composite materials. For this research, miniature torsion joints based on this specimen design were fabricated using displacement reactions between Si and TiC to produce Ti3SiC2 + SiC joints with SiC and tested in torsion-shear prior to and after neutron irradiation. However, many miniature torsion specimens fail out-of-plane within the SiC specimen body, which makes it problematic to assign a shear strength value to the joints and makes it difficult to compare unirradiated and irradiated strengths to determine irradiation effects. Finite element elastic damage and elastic–plastic damage models of miniature torsion joints are developed that indicate shear fracture is more likely to occur within the body of the joined sample and cause out-of-plane failures for miniature torsion specimens when a certain modulus and strength ratio between the joint material and the joined material exists. The model results are compared and discussed with regard to unirradiated and irradiated test data for a variety of joint materials. The unirradiated data include Ti3SiC2 + SiC/CVD-SiC joints with tailored joint moduli, as well as steel/epoxy and CVD-SiC/epoxy joints. Finally, the implications for joint data based on this sample design are discussed.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warmack, Robert J. Bruce; Wolf, Dennis A.; Frank, Steven Shane

    Various apparatus and methods for smoke detection are disclosed. In one embodiment, a method of training a classifier for a smoke detector comprises inputting sensor data from a plurality of tests into a processor. The sensor data is processed to generate derived signal data corresponding to the test data for respective tests. The derived signal data is assigned into categories comprising at least one fire group and at least one non-fire group. Linear discriminant analysis (LDA) training is performed by the processor. The derived signal data and the assigned categories for the derived signal data are inputs to the LDA training. The output of the LDA training is stored in a computer readable medium, such as in a smoke detector that uses LDA to determine, based on the training, whether present conditions indicate the existence of a fire.
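    The LDA training step described above can be sketched with a minimal two-class Fisher discriminant. The two features (rate of rise, absolute level) and all data points below are hypothetical stand-ins for the patent's derived signals, chosen only to illustrate the computation.

```python
def train_lda(fire, non_fire):
    """Two-class Fisher discriminant on 2-D feature vectors.
    Returns weights w and threshold t; classify as 'fire' when
    w[0]*x[0] + w[1]*x[1] > t."""
    def mean(rows):
        return [sum(r[k] for r in rows) / len(rows) for k in (0, 1)]

    m_f, m_n = mean(fire), mean(non_fire)

    # Pooled within-class scatter matrix (2x2).
    s = [[0.0, 0.0], [0.0, 0.0]]
    for rows, m in ((fire, m_f), (non_fire, m_n)):
        for r in rows:
            d = [r[0] - m[0], r[1] - m[1]]
            for a in (0, 1):
                for b in (0, 1):
                    s[a][b] += d[a] * d[b]

    # w = S^-1 (m_f - m_n), inverting the 2x2 matrix by hand.
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    dm = [m_f[0] - m_n[0], m_f[1] - m_n[1]]
    w = [(s[1][1] * dm[0] - s[0][1] * dm[1]) / det,
         (-s[1][0] * dm[0] + s[0][0] * dm[1]) / det]

    # Threshold midway between the projected class means.
    t = 0.5 * sum(w[k] * (m_f[k] + m_n[k]) for k in (0, 1))
    return w, t

# Hypothetical derived signals: (rate of rise, absolute level).
fire_data = [(0.9, 0.8), (1.1, 0.9), (0.8, 1.0)]
quiet_data = [(0.1, 0.2), (0.2, 0.1), (0.0, 0.3)]
w, t = train_lda(fire_data, quiet_data)
score = lambda x: w[0] * x[0] + w[1] * x[1]
# score((1.0, 0.9)) > t, so that reading would be classified as fire
```

    Storing just w and t is what makes the approach attractive for an embedded detector: classification at run time is a dot product and a comparison.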

  3. Experimental Evaluation of the Effects of a Research-Based Preschool Mathematics Curriculum

    ERIC Educational Resources Information Center

    Clements, Douglas H.; Sarama, Julie

    2008-01-01

    A randomized-trials design was used to evaluate the effectiveness of a preschool mathematics program based on a comprehensive model of research-based curricula development. Thirty-six preschool classrooms were assigned to experimental (Building Blocks), comparison (a different preschool mathematics curriculum), or control conditions. Children were…

  4. Measuring homework completion in behavioral activation.

    PubMed

    Busch, Andrew M; Uebelacker, Lisa A; Kalibatseva, Zornitsa; Miller, Ivan W

    2010-07-01

    The aim of this study was to develop and validate an observer-based coding system for the characterization and completion of homework assignments during Behavioral Activation (BA). Existing measures of homework completion are generally unsophisticated, and there is no current measure of homework completion designed to capture the particularities of BA. The tested scale sought to capture the type of assignment, realm of functioning targeted, extent of completion, and assignment difficulty. Homework assignments were drawn from 12 clients (mean age = 48; 83% female) in two trials of a 10-session BA manual targeting treatment-resistant depression in primary care. The two coders demonstrated acceptable or better reliability on most codes, and unreliable codes were dropped from the proposed scale. In addition, correlations between homework completion and outcome were strong, providing some support for construct validity. Ultimately, this line of research aims to develop a user-friendly, reliable measure of BA homework completion that can be completed by a therapist during session.

  5. The Utility of Writing Assignments in Undergraduate Bioscience

    PubMed Central

    Libarkin, Julie; Ording, Gabriel

    2012-01-01

    We tested the hypothesis that engagement in a few, brief writing assignments in a nonmajors science course can improve student ability to convey critical thought about science. A sample of three papers written by students (n = 30) was coded for presence and accuracy of elements related to scientific writing. Scores for different aspects of scientific writing were significantly correlated, suggesting that students recognized relationships between components of scientific thought. We found that students' ability to write about science topics and state conclusions based on data improved over the course of three writing assignments, while the abilities to state a hypothesis and draw clear connections between human activities and environmental impacts did not improve. Three writing assignments generated significant change in student ability to write scientifically, although our results suggest that three is an insufficient number to generate complete development of scientific writing skills. PMID:22383616

  6. Importance of Foliar Nitrogen Concentration to Predict Forest Productivity in the Mid-Atlantic Region

    Treesearch

    Yude Pan; John Hom; Jennifer Jenkins; Richard Birdsey

    2004-01-01

    To assess what difference it might make to include spatially defined estimates of foliar nitrogen in the regional application of a forest ecosystem model (PnET-II), we compared model predictions of wood production with extensive ground-based forest inventory analysis data across the Mid-Atlantic region. Spatial variation in foliar N concentration was assigned based on...

  7. Searching for moderators and mediators of pharmacological treatment effects in children and adolescents with anxiety disorders.

    PubMed

    Walkup, John T; Labellarte, Michael J; Riddle, Mark A; Pine, Daniel; Greenhill, Laurence; Klein, Rachel; Davies, Mark; Sweeney, Michael; Fu, Caifeng; Abikoff, Howard; Hack, Sabine; Klee, Brian; McCracken, James; Bergman, Lindsey; Piacentini, John; March, John; Compton, Scott; Robinson, James; O'Hara, Thomas; Baker, Sheryl; Vitiello, Benedetto; Ritz, Louise; Roper, Margaret

    2003-01-01

    To examine whether age, gender, ethnicity, type of anxiety disorder, severity of illness, comorbidity, intellectual level, family income, or parental education may function as moderators and whether treatment adherence, medication dose, adverse events, or blinded rater's guess of treatment assignment may function as mediators of pharmacological treatment effect in children and adolescents with anxiety disorders. The database of a recently reported double-blind placebo-controlled trial of fluvoxamine in 128 youths was analyzed. With a mixed-model random-effects regression analysis of the Pediatric Anxiety Rating Scale total score, moderators and mediators were searched by testing for a three-way interaction (strata by treatment by time). A two-way interaction (strata by time) identified predictors of treatment outcome. No significant moderators of efficacy were identified, except for lower baseline depression scores, based on parent's (but not child's) report, being associated with greater improvement (p < .001). Patients with social phobia (p < .05) and greater severity of illness (p < .001) were less likely to improve, independently of treatment assignment. Blinded rater's guess of treatment assignment acted as a possible mediator (p < .001), but improvement was attributed to fluvoxamine, regardless of actual treatment assignment. Treatment adherence tended to be associated (p = .05) with improvement. In this exploratory study, patient demographics, illness characteristics, family income, and parental education did not function as moderators of treatment effect. Social phobia and severity of illness predicted less favorable outcome. Attribution analyses indicated that study blindness remained intact. The presence of concomitant depressive symptoms deserves attention in future treatment studies of anxious children.

  8. Automated modal parameter estimation using correlation analysis and bootstrap sampling

    NASA Astrophysics Data System (ADS)

    Yaghoubi, Vahid; Vakilzadeh, Majid K.; Abrahamsson, Thomas J. S.

    2018-02-01

    The estimation of modal parameters from a set of noisy measured data is a highly judgmental task, with user expertise playing a significant role in distinguishing between estimated physical and noise modes of a test-piece. Various methods have been developed to automate this procedure. The common approach is to identify models with different orders and cluster similar modes together. However, most proposed methods based on this approach suffer from high-dimensional optimization problems in either the estimation or clustering step. To overcome this problem, this study presents an algorithm for autonomous modal parameter estimation in which the only required optimization is performed in a three-dimensional space. To this end, a subspace-based identification method is employed for the estimation and a non-iterative correlation-based method is used for the clustering. This clustering is at the heart of the paper. The keys to success are correlation metrics that are able to treat the problems of spatial eigenvector aliasing and nonunique eigenvectors of coalescent modes simultaneously. The algorithm commences with the identification of an excessively high-order model from frequency response function test data. The high number of modes of this model provides bases for two subspaces: one for likely physical modes of the tested system and one for its complement dubbed the subspace of noise modes. By employing the bootstrap resampling technique, several subsets are generated from the same basic dataset and for each of them a model is identified to form a set of models. Then, by correlation analysis with the two aforementioned subspaces, highly correlated modes of these models which appear repeatedly are clustered together and the noise modes are collected in a so-called Trashbox cluster. Stray noise modes attracted to the mode clusters are trimmed away in a second step by correlation analysis.
The final step of the algorithm is a fuzzy c-means clustering procedure applied to a three-dimensional feature space to assign a degree of physicalness to each cluster. The proposed algorithm is applied to two case studies: one with synthetic data and one with real test data obtained from a hammer impact test. The results indicate that the algorithm successfully clusters similar modes and gives a reasonable quantification of the extent to which each cluster is physical.
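    The closing fuzzy c-means step assigns each cluster a graded, rather than hard, membership. A minimal one-dimensional sketch of that algorithm follows; the feature values are invented (the paper works in a three-dimensional feature space) and this is the textbook update rule, not the authors' code.

```python
def fuzzy_c_means(points, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy c-means: returns cluster centres and the
    degree of membership of each point in each cluster."""
    lo, hi = min(points), max(points)
    # Deterministic init: centres evenly spaced across the data range.
    centres = [lo + k * (hi - lo) / (c - 1) for k in range(c)]
    u = [[0.0] * c for _ in points]
    for _ in range(iters):
        # Memberships from relative inverse distances to the centres.
        for i, x in enumerate(points):
            d = [abs(x - v) + 1e-12 for v in centres]
            for j in range(c):
                u[i][j] = 1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0))
                                    for k in range(c))
        # Centres as membership-weighted means.
        for j in range(c):
            wts = [u[i][j] ** m for i in range(len(points))]
            centres[j] = sum(wt * x for wt, x in zip(wts, points)) / sum(wts)
    return centres, u

# Two well-separated groups of hypothetical modal features.
pts = [0.1, 0.2, 0.15, 5.0, 5.1, 4.9]
centres, u = fuzzy_c_means(pts)
```

    The membership values u[i][j] play the role of the "degree of physicalness" in the abstract: a cluster whose members sit close to the physical-mode region of the feature space receives memberships near 1.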

  9. A novel approach for assigning levels to monkey and human lumbosacral spinal cord based on ventral horn morphology

    PubMed Central

    Gross, Cassandra; Ellison, Brian; Buchman, Aron S.; Terasawa, Ei

    2017-01-01

    Proper identification of spinal cord levels is crucial for clinical-pathological and imaging studies in humans, but can be a challenge given technical limitations. We have previously demonstrated in non-primate models that the contours of the spinal ventral horn are determined by the position of motoneuron pools. These positions are preserved within and among individuals and can be used to identify lumbosacral spinal levels. Here we tested the hypothesis that this approach can be extended to identify monkey and human spinal levels. In 7 rhesus monkeys, we retrogradely labeled motoneuron pools that represent rostral, middle and caudal landmarks of the lumbosacral enlargement. We then aligned the lumbosacral enlargements among animals using absolute length, segmental level or a relative scale based upon rostral and caudal landmarks. Inter-animal matching of labeled motoneurons across the lumbosacral enlargement was most precise when using internal landmarks. We then reconstructed 3 human lumbosacral spinal cords, and aligned these based upon homologous internal landmarks. Changes in shape of the ventral horn were consistent among human subjects using this relative scale, despite marked differences in absolute length or age. These data suggest that the relative position of spinal motoneuron pools is conserved across species, including primates. Therefore, in clinical-pathological or imaging studies in humans, one can assign spinal cord levels to even single sections by matching ventral horn shape to standardized series. PMID:28542213
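    The relative scale used above reduces to a linear rescaling between the rostral and caudal landmark motoneuron pools. The coordinates below are hypothetical, purely to show why cords of different absolute length become comparable.

```python
def to_relative_scale(position, rostral, caudal):
    """Map a longitudinal position (e.g. mm from a fixed reference)
    to a 0-1 scale anchored at the rostral (0.0) and caudal (1.0)
    landmark motoneuron pools."""
    return (position - rostral) / (caudal - rostral)

# Two hypothetical cords of different absolute length: the same pool
# lands at the same relative coordinate in both.
a = to_relative_scale(30.0, rostral=10.0, caudal=50.0)  # cord A -> 0.5
b = to_relative_scale(45.0, rostral=15.0, caudal=75.0)  # cord B -> 0.5
```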

  10. Rapid and accurate taxonomic classification of insect (class Insecta) cytochrome c oxidase subunit 1 (COI) DNA barcode sequences using a naïve Bayesian classifier

    PubMed Central

    Porter, Teresita M; Gibson, Joel F; Shokralla, Shadi; Baird, Donald J; Golding, G Brian; Hajibabaei, Mehrdad

    2014-01-01

    Current methods to identify unknown insect (class Insecta) cytochrome c oxidase (COI barcode) sequences often rely on thresholds of distances that can be difficult to define, sequence similarity cut-offs, or monophyly. Some of the most commonly used metagenomic classification methods do not provide a measure of confidence for the taxonomic assignments they provide. The aim of this study was to use a naïve Bayesian classifier (Wang et al. Applied and Environmental Microbiology, 2007; 73: 5261) to automate taxonomic assignments for large batches of insect COI sequences such as data obtained from high-throughput environmental sequencing. This method provides rank-flexible taxonomic assignments with an associated bootstrap support value, and it is faster than the BLAST-based methods commonly used in environmental sequence surveys. We have developed and rigorously tested the performance of three different training sets using leave-one-out cross-validation, two field data sets, and targeted testing of Lepidoptera, Diptera and Mantodea sequences obtained from the Barcode of Life Data system. We found that type I error rates, incorrect taxonomic assignments with a high bootstrap support, were already relatively low but could be lowered further by ensuring that all query taxa are actually present in the reference database. Choosing bootstrap support cut-offs according to query length and summarizing taxonomic assignments to more inclusive ranks can also help to reduce error while retaining the maximum number of assignments. Additionally, we highlight gaps in the taxonomic and geographic representation of insects in public sequence databases that will require further work by taxonomists to improve the quality of assignments generated using any method.
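    The classifier's core idea, per-taxon word (k-mer) frequencies plus bootstrap resampling of the query's words to attach a support value, can be sketched as follows. The sequences, taxon names, and the short word length are toy assumptions for illustration; the Wang et al. classifier uses 8-mers and real reference databases.

```python
import math
import random
from collections import Counter

K = 4  # toy word length; the Wang et al. classifier uses 8-mers

def kmers(seq):
    return [seq[i:i + K] for i in range(len(seq) - K + 1)]

def train(refs):
    """refs: {taxon: [sequences]} -> add-one smoothed word probabilities."""
    vocab = {w for seqs in refs.values() for s in seqs for w in kmers(s)}
    model = {}
    for taxon, seqs in refs.items():
        counts = Counter(w for s in seqs for w in kmers(s))
        total = sum(counts.values())
        model[taxon] = {w: (counts[w] + 1) / (total + len(vocab))
                        for w in vocab}
    return model

def log_score(probs, words):
    return sum(math.log(probs.get(w, 1e-9)) for w in words)

def classify_with_support(model, seq, trials=100, seed=1):
    """Assign the taxon with the highest naive Bayes score, plus a
    bootstrap support: the fraction of word subsamples that agree."""
    words = kmers(seq)
    call = max(model, key=lambda t: log_score(model[t], words))
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        sample = [rng.choice(words) for _ in range(max(1, len(words) // 8))]
        if max(model, key=lambda t: log_score(model[t], sample)) == call:
            hits += 1
    return call, hits / trials

refs = {"TaxonA": ["ATATATATATAT"], "TaxonB": ["GCGCGCGCGCGC"]}
call, support = classify_with_support(train(refs), "ATATATATAT")
# call == "TaxonA" with support 1.0 on this trivially separable toy
```

    On real data the support value is informative precisely because subsamples disagree for ambiguous queries, which is what motivates the rank-flexible reporting in the abstract.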

  11. A finite element model of a six-year-old child for simulating pedestrian accidents.

    PubMed

    Meng, Yunzhu; Pak, Wansoo; Guleyupoglu, Berkan; Koya, Bharath; Gayzik, F Scott; Untaroiu, Costin D

    2017-01-01

    Child pedestrian protection deserves more attention in vehicle safety design since children are the most vulnerable road users who face the highest mortality rate. Pediatric Finite Element (FE) models could be used to simulate and understand the pedestrian injury mechanisms during crashes in order to mitigate them. Thus, the objective of the study was to develop a computationally efficient (simplified) six-year-old (6YO-PS) pedestrian FE model and validate it based on the latest published pediatric data. The 6YO-PS FE model was developed by morphing the existing GHBMC adult pedestrian model. Retrospective scan data were used to locally adjust the geometry as needed for accuracy. Component test simulations focused only on the lower extremities and pelvis, which are the first body regions impacted during pedestrian accidents. Three-point bending test simulations were performed on the femur and tibia with adult material properties and then updated using child material properties. Pelvis impact and knee bending tests were also simulated. Finally, a series of pediatric Car-to-Pedestrian Collision (CPC) simulations were performed with pre-impact velocities ranging from 20 km/h up to 60 km/h. The bone models with assigned pediatric material properties showed lower stiffness and a good match in terms of fracture force to the test data (less than 6% error). The pelvis impact force predicted by the child model showed a similar trend with test data. The whole pedestrian model was stable during CPC simulations and predicted common pedestrian injuries. Overall, the 6YO-PS FE model developed in this study showed good biofidelity at the component level (lower extremity and pelvis) and stability in CPC simulations. While more validations would improve it, the current model could be used to investigate lower limb injury mechanisms and in the prediction of the impact parameters as specified in regulatory testing protocols. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Development of a new integrated local trajectory planning and tracking control framework for autonomous ground vehicles

    NASA Astrophysics Data System (ADS)

    Li, Xiaohui; Sun, Zhenping; Cao, Dongpu; Liu, Daxue; He, Hangen

    2017-03-01

    This study proposes a novel integrated local trajectory planning and tracking control (ILTPTC) framework for autonomous vehicles driving along a reference path with obstacles avoidance. For this ILTPTC framework, an efficient state-space sampling-based trajectory planning scheme is employed to smoothly follow the reference path. A model-based predictive path generation algorithm is applied to produce a set of smooth and kinematically-feasible paths connecting the initial state with the sampling terminal states. A velocity control law is then designed to assign a speed value at each of the points along the generated paths. An objective function considering both safety and comfort performance is carefully formulated for assessing the generated trajectories and selecting the optimal one. For accurately tracking the optimal trajectory while overcoming external disturbances and model uncertainties, a combined feedforward and feedback controller is developed. Both simulation analyses and vehicle testing are performed to verify the effectiveness of the proposed ILTPTC framework, and future research is also briefly discussed.

  13. Rhythmic brain stimulation reduces anxiety-related behavior in a mouse model based on meditation training.

    PubMed

    Weible, Aldis P; Piscopo, Denise M; Rothbart, Mary K; Posner, Michael I; Niell, Cristopher M

    2017-03-07

    Meditation training induces changes at both the behavioral and neural levels. A month of meditation training can reduce self-reported anxiety and other dimensions of negative affect. It also can change white matter as measured by diffusion tensor imaging and increase resting-state midline frontal theta activity. The current study tests the hypothesis that imposing rhythms in the mouse anterior cingulate cortex (ACC), by using optogenetics to induce oscillations in activity, can produce behavioral changes. Mice were randomly assigned to groups and were given twenty 30-min sessions of light pulses delivered at 1, 8, or 40 Hz over 4 wk or were assigned to a no-laser control condition. Before and after the month all mice were administered a battery of behavioral tests. In the light/dark box, mice receiving cortical stimulation had more light-side entries, spent more time in the light, and made more vertical rears than mice receiving rhythmic cortical suppression or no manipulation. These effects on light/dark box exploratory behaviors are associated with reduced anxiety and were most pronounced following stimulation at 1 and 8 Hz. No effects were seen related to basic motor behavior or exploration during tests of novel object and location recognition. These data support a relationship between lower-frequency oscillations in the mouse ACC and the expression of anxiety-related behaviors, potentially analogous to effects seen with human practitioners of some forms of meditation.

  14. Preparing for fieldwork: Students' perceptions of their readiness to provide evidence-based practice.

    PubMed

    Evenson, Mary E

    2013-01-01

    The purpose of this study was to explore students' perceptions of their confidence to use research evidence to complete a client case analysis assignment in preparation for participation in fieldwork and future practice. A convenience sample of 42 entry-level occupational therapy Masters students comprised 41 females and one male, ages 24 to 35. A quasi-experimental pretest-posttest design was used. Students participated in a problem-based learning approach supported by educational technology. Measures included a pre- and post-semester confidence survey, a post-semester satisfaction survey, and an assignment rubric. Based on paired t-tests and Wilcoxon signed-rank tests, statistically significant differences in pre- and post-test scores were noted for all 18 items on the confidence survey (p < 0.001). Significant increases in students' confidence were noted for verbal and written communication of descriptive, assessment, and intervention evidence, along with increased confidence to effectively use assessment evidence. Results suggest that problem-based learning methods were significantly associated with students' perceptions of their confidence to use research evidence to analyze a client case. These results cannot necessarily be generalized due to the limitations of using non-standardized measures with a convenience sample, without a control group, within the context of a single course as part of one academic program curriculum.

  15. Comparison of dkgB-linked intergenic sequence ribotyping to DNA microarray hybridization for assigning serotype to Salmonella enterica

    PubMed Central

    Guard, Jean; Sanchez-Ingunza, Roxana; Morales, Cesar; Stewart, Tod; Liljebjelke, Karen; Kessel, JoAnn; Ingram, Kim; Jones, Deana; Jackson, Charlene; Fedorka-Cray, Paula; Frye, Jonathan; Gast, Richard; Hinton, Arthur

    2012-01-01

    Two DNA-based methods were compared for the ability to assign serotype to 139 isolates of Salmonella enterica ssp. I. Intergenic sequence ribotyping (ISR) evaluated single nucleotide polymorphisms occurring in a 5S ribosomal gene region and flanking sequences bordering the gene dkgB. A DNA microarray hybridization method that assessed the presence and the absence of sets of genes was the second method. Serotype was assigned for 128 (92.1%) of submissions by the two DNA methods. ISR detected mixtures of serotypes within single colonies and it cost substantially less than Kauffmann–White serotyping and DNA microarray hybridization. Decreasing the cost of serotyping S. enterica while maintaining reliability may encourage routine testing and research. PMID:22998607

  16. Boron environments in Pyrex® glass--a high resolution, Double-Rotation NMR and thermodynamic modelling study.

    PubMed

    Howes, A P; Vedishcheva, N M; Samoson, A; Hanna, J V; Smith, M E; Holland, D; Dupree, R

    2011-07-07

    It is shown, using the important technological glass Pyrex® as an example, that 1D and 2D (11)B Double-Rotation (DOR) NMR experiments, in combination with thermodynamic modelling, are able to provide unique structural information about complex glasses. (11)B DOR NMR has been applied to Pyrex® glass in order to remove both dipolar and quadrupolar broadening of the NMR lines, leading to high resolution spectra that allow unambiguous, accurate peak fitting to be carried out, of particular importance in the case of the 3-coordinated [BO(3)] (B3) trigonal planar environments. The data obtained are of sufficient quality that they can be used to test the distributions of borate and borosilicate superstructural units predicted by the thermodynamics-based Model of Associated Solutions. The model predicts the dominant boron-containing chemical groupings in Pyrex® glass to be those associated with B(2)O(3) and sodium tetraborate (with smaller amounts of sodium triborate, sodium diborate, sodium pentaborate, danburite and reedmergnerite). Excellent agreement is found between model and experiment provided the (11)B peaks with isotropic chemical shifts of -1.4 ppm and 0.5 ppm are assigned to B4 species from borosilicate units ([B(OSi)(4)] and [B(OSi)(3)(OB)]) and borate superstructural units (mainly triborate rings with some pentaborate and diborate) respectively. The peaks with isotropic shifts of 14 ppm and 18.1 ppm are then assigned to B3 in borate superstructural units (mainly triborate and pentaborate along with connecting B3) and boroxol rings respectively. The assignments of the DOR NMR peaks are supported by the presence of cross-peaks in (11)B spin-diffusion DOR NMR spectra which can be used to develop a structural model in which B(2)O(3)-like regions are linked, via borate and borosilicate superstructural units, to the majority silica network.
Pyrex® is thus shown to have a heterogeneous structure, with distinct molecular groupings that are far removed from a random distribution of network polyhedra with only short-range order. This journal is © the Owner Societies 2011

  17. A Game Theoretic Optimization Method for Energy Efficient Global Connectivity in Hybrid Wireless Sensor Networks

    PubMed Central

    Lee, JongHyup; Pak, Dohyun

    2016-01-01

    For practical deployment of wireless sensor networks (WSN), WSNs construct clusters, where a sensor node communicates with other nodes in its cluster, and a cluster head supports connectivity between the sensor nodes and a sink node. In hybrid WSNs, cluster heads have cellular network interfaces for global connectivity. However, when WSNs are active and the load of cellular networks is high, the optimal assignment of cluster heads to base stations becomes critical. Therefore, in this paper, we propose a game theoretic model to find the optimal assignment of base stations for hybrid WSNs. Since the communication and energy cost differs between cellular systems, we devise two game models for TDMA/FDMA and CDMA systems employing power prices to adapt to the varying efficiency of recent wireless technologies. The proposed model is defined under the assumption of an ideal sensing field, but our evaluation shows that the proposed model is more adaptive and energy efficient than local selections. PMID:27589743
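    The flavor of such an assignment game can be illustrated with a best-response sketch of a congestion game: each cluster head repeatedly moves to the base station that minimizes its own cost (a fixed link cost plus a load-dependent power price) until no head wants to switch. All costs and the dynamics below are hypothetical simplifications, not the paper's TDMA/FDMA or CDMA models.

```python
from collections import Counter

def best_response_assignment(heads, n_stations, base_cost, price):
    """Best-response dynamics: each cluster head switches to the base
    station with the lowest cost (fixed link cost plus a load-dependent
    power price) until no one wants to move (a pure Nash equilibrium)."""
    assign = {h: 0 for h in heads}  # start everyone on station 0
    for _ in range(100):            # bounded number of rounds
        changed = False
        for h in heads:
            load = Counter(assign.values())
            def cost(s):
                # Load on s if h were assigned there.
                l = load[s] + (0 if assign[h] == s else 1)
                return base_cost[h][s] + price * l
            best = min(range(n_stations), key=cost)
            if best != assign[h]:
                assign[h] = best
                changed = True
        if not changed:
            break  # equilibrium reached
    return assign

# Four heads, two identical stations: the equilibrium balances load 2/2.
heads = ["h1", "h2", "h3", "h4"]
base_cost = {h: {0: 1.0, 1: 1.0} for h in heads}
assign = best_response_assignment(heads, 2, base_cost, price=1.0)
```

    Because each unilateral move strictly lowers the mover's cost in a congestion game of this form, the dynamics terminate at a stable assignment rather than cycling.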

  18. Neutrino masses and mixing from S4 flavor twisting

    NASA Astrophysics Data System (ADS)

    Ishimori, Hajime; Shimizu, Yusuke; Tanimoto, Morimitsu; Watanabe, Atsushi

    2011-02-01

    We discuss a neutrino mass model based on the S4 discrete symmetry where the symmetry breaking is triggered by the boundary conditions of the bulk right-handed neutrino in the fifth spatial dimension. The three generations of the left-handed lepton doublets and the right-handed neutrinos are assigned to be the triplets of S4. The magnitudes of the lepton mixing angles, especially the reactor angle, are related to the neutrino mass patterns, and the model will be tested in future neutrino experiments, e.g., an early discovery of the reactor angle favors the normal hierarchy. For the inverted hierarchy, the lepton mixing is predicted to be almost the tribimaximal mixing. The size of the extra dimension has a connection to the possible mass spectrum; a small (large) volume corresponds to the normal (inverted) mass hierarchy.

  19. Changing the approach to treatment choice in epilepsy using big data.

    PubMed

    Devinsky, Orrin; Dilley, Cynthia; Ozery-Flato, Michal; Aharonov, Ranit; Goldschmidt, Ya'ara; Rosen-Zvi, Michal; Clark, Chris; Fritz, Patty

    2016-03-01

    A UCB-IBM collaboration explored the application of machine learning to large claims databases to construct an algorithm for antiepileptic drug (AED) choice for individual patients. Claims data were collected between January 2006 and September 2011 for patients with epilepsy > 16 years of age. A subset of patient claims with a valid index date of AED treatment change (new, add, or switch) were used to train the AED prediction model by retrospectively evaluating an index date treatment for subsequent treatment change. Based on the trained model, a model-predicted AED regimen with the lowest likelihood of treatment change was assigned to each patient in the group of test claims, and outcomes were evaluated to test model validity. The model had an area under the receiver operating characteristic curve of 72%, indicating good predictive power. Patients who were given the model-predicted AED regimen had significantly longer survival times (time until a treatment change event) and lower expected health resource utilization on average than those who received another treatment. The actual prescribed AED regimen at the index date matched the model-predicted AED regimen in only 13% of cases; there were large discrepancies in the frequency of use of certain AEDs/combinations between model-predicted AED regimens and those actually prescribed. Chances of treatment success were improved if patients received the model-predicted treatment. Using the model's prediction system may enable personalized, evidence-based epilepsy care, accelerating the match between patients and their ideal therapy, thereby delivering significantly better health outcomes for patients and providing health-care savings by applying resources more efficiently. Our goal will be to strengthen the predictive power of the model by integrating diverse data sets and potentially moving to prospective data collection. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.

  20. Object memory effects on figure assignment: conscious object recognition is not necessary or sufficient.

    PubMed

    Peterson, M A; de Gelder, B; Rapcsak, S Z; Gerhardstein, P C; Bachoud-Lévi, A

    2000-01-01

    In three experiments we investigated whether conscious object recognition is necessary or sufficient for effects of object memories on figure assignment. In experiment 1, we examined a brain-damaged participant, AD, whose conscious object recognition is severely impaired. AD's responses about figure assignment do reveal effects from memories of object structure, indicating that conscious object recognition is not necessary for these effects, and identifying the figure-ground test employed here as a new implicit test of access to memories of object structure. In experiments 2 and 3, we tested a second brain-damaged participant, WG, for whom conscious object recognition was relatively spared. Nevertheless, effects from memories of object structure on figure assignment were not evident in WG's responses about figure assignment in experiment 2, indicating that conscious object recognition is not sufficient for effects of object memories on figure assignment. WG's performance sheds light on AD's performance, and has implications for the theoretical understanding of object memory effects on figure assignment.

  1. Cell transmission model of dynamic assignment for urban rail transit networks.

    PubMed

    Xu, Guangming; Zhao, Shuo; Shi, Feng; Zhang, Feilian

    2017-01-01

    For an urban rail transit network, the space-time flow distribution can play an important role in evaluating and optimizing space-time resource allocation. To obtain the space-time flow distribution without the restriction of schedules, a dynamic assignment problem is proposed based on the concept of continuous transmission. To solve the dynamic assignment problem, a cell transmission model is built for urban rail transit networks. The priority principle, queuing process, capacity constraints and congestion effects are considered in the cell transmission mechanism. An efficient method is then designed to compute shortest paths in an urban rail network, which decreases the computing cost of solving the cell transmission model. The instantaneous dynamic user-optimal state can be reached with the method of successive averages. Many passenger-flow evaluation indexes can be generated, providing effective support for the optimization of train schedules and the capacity evaluation of an urban rail transit network. Finally, the model and its potential application are demonstrated via two numerical experiments using a small-scale network and the Beijing Metro network.
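
    The method of successive averages mentioned above can be illustrated on a toy two-route network; the linear congestion functions and demand value are hypothetical and unrelated to the paper's cell-transmission dynamics.

```python
# Method of successive averages (MSA) on a toy two-route network.
# At step k, all demand is loaded onto the currently faster route
# (all-or-nothing), then the flow is averaged with step size 1/k.

def msa_two_routes(demand, t1, t2, iters=500):
    """t1, t2: functions mapping a route's flow to its travel time.
    Returns the (approximate) user-equilibrium flow on route 1."""
    x1 = demand                       # start with all flow on route 1
    for k in range(1, iters + 1):
        # all-or-nothing load onto whichever route is faster now
        y1 = demand if t1(x1) <= t2(demand - x1) else 0.0
        x1 += (y1 - x1) / k           # averaging step
    return x1

# Routes with times 10 + x and 15 + 0.5x; total demand 20.
# Equating times: 10 + x1 = 15 + 0.5 * (20 - x1) gives x1 = 10.
x1 = msa_two_routes(20.0, lambda x: 10 + x, lambda x: 15 + 0.5 * x)
```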

  2. The Internet: A Learning Environment.

    ERIC Educational Resources Information Center

    McGreal, Rory

    1997-01-01

    The Internet environment is suitable for many types of learning activities and teaching and learning styles. Every World Wide Web-based course should provide: home page; introduction; course overview; course requirements, vital information; roles and responsibilities; assignments; schedule; resources; sample tests; teacher biography; course…

  3. Contributions au probleme d'affectation des types d'avion

    NASA Astrophysics Data System (ADS)

    Belanger, Nicolas

    In this thesis, we approach the problem of assigning aircraft types to flights (what is called aircraft fleet assignment) in a strategic planning context. The literature mentions many studies considering this problem on a daily flight schedule basis, but the proposed models do not allow one to consider many elements that are either necessary to ensure the practical feasibility of the solutions, or relevant to obtaining more beneficial solutions. After describing the practical context of the problem (Chapter 1) and presenting the literature on the subject (Chapter 2), we propose new models and solution approaches to improve the quality of the solutions obtained. The general scheme of the thesis is presented in Chapter 3. We summarize here the models and solution approaches that we propose and present the main elements of our conclusions. First, in Chapter 4, we consider the problem of aircraft fleet assignment over a weekly flight schedule, integrating into the objective a homogeneity factor for driving the choice of the aircraft types for the flights with the same flight number over the week. We present an integer linear model based on a time-space multicommodity network. This model includes, among others, decision variables relative to the aircraft type assigned to each flight and to the dominant aircraft type assigned to each flight number. We present in Chapter 5 the results of a research project conducted in collaboration with Air Canada within a consulting contract. The project aimed at analyzing the relevance for the planners of using optimization software to help them first to identify non-profitable flight legs in the network, and second to efficiently establish the aircraft fleet assignment. In this chapter, we propose an iterative approach to take into account the fact that the passenger demand is not known on a leg basis, but rather on an origin-destination and departure-time basis.
Finally, in Chapter 6, we propose a model and a solution approach that aim at solving the fleet assignment problem over a periodic schedule in the case where there is a flexibility on the flight departure times and the fleet size must be minimized. Moreover, the objective of this model includes the impact on the passenger demand for each flight of the variation of the flight departure times and the closing of the departure times of consecutive flights connecting the same pairs of stations. (Abstract shortened by UMI.)

  4. The use of Multiple Representations to Enhance Student Mental Model Development of a Complex Earth System in an Introductory Geoscience Course

    NASA Astrophysics Data System (ADS)

    Sell, K. S.; Heather, M. R.; Herbert, B. E.

    2004-12-01

    Introducing earth system science (ESS) concepts into introductory geoscience courses may present new and unique cognitive learning issues for students, including understanding the role of positive and negative feedbacks in system responses to perturbations, spatial heterogeneity, and temporal dynamics, especially when systems exhibit complex behavior. Implicit learning goals of typical introductory undergraduate geoscience courses are more focused on building skill-sets and didactic knowledge in learners than on developing a deeper understanding of the dynamics and processes of complex earth systems through authentic inquiry. Didactic teaching coupled with summative assessment of factual knowledge tends to limit students' understanding of the nature of science and their belief in the relevancy of science to their lives, and encourages memorization and regurgitation; this is especially true among the non-science majors who compose the majority of students in introductory courses within the large university setting. Students organize scientific knowledge and reason about earth systems by manipulating internally constructed mental models. This pilot study focuses on characterizing the impact of inquiry-based learning with multiple representations to foster critical thinking and mental model development about authentic environmental issues of coastal systems in an introductory geoscience course. The research was conducted in nine introductory physical geology laboratory sections (N ˜ 150) at Texas A&M University as part of research connected with the Information Technology in Science (ITS) Center. Participants were randomly placed into experimental and control groups. Experimental groups were exposed to multiple representations including both web-based learning materials (i.e. technology-supported visualizations and analysis of multiple datasets) and physical models, whereas control groups were provided with the traditional "workbook-style" laboratory assignments. 
Assessment of pre- and post-test results was performed to provide indications of content knowledge and mental model expression improvements between groups. A rubric was used as the assessment instrument to evaluate student products (Cronbach's alpha: 0.84-0.98). Characterization of student performance based on a Student's t-test indicates that significant differences (p < 0.05) in pre-post achievement occurred primarily within the experimental group; this illustrates that the use of multiple representations had an impact on student learning of ESS concepts, particularly in regard to mental model constructions. Analysis of variance also suggests that student mental model constructions were significantly different (p < 0.10) between test groups. Factor analysis extracted three principal components (eigenvalue > 1) which show similar clustering of variables that influence cognition, indicating that the cognitive processes driving student understanding of geoscience do not vary among student test groups. Categories of cognition include critical thinking skills (percent variance = 22.16%), understanding of the nature of science (percent variance = 25.16%), and ability to interpret results (percent variance = 28.89%). Fewer students completed all of the required assignments of this research than expected (65.3%), restricting the quality of the results and therefore the ability to make more significant interpretations; this was likely due to the non-supportive learning environment in which the research was implemented.

  5. Kappa and Rater Accuracy: Paradigms and Parameters.

    PubMed

    Conger, Anthony J

    2017-12-01

    Drawing parallels to classical test theory, this article clarifies the difference between rater accuracy and reliability and demonstrates how category marginal frequencies affect rater agreement and Cohen's kappa (κ). Category assignment paradigms are developed: comparing raters to a standard (index) versus comparing two raters to one another (concordance), using both nonstochastic and stochastic category membership. Using a probability model to express category assignments in terms of rater accuracy and random error, it is shown that observed agreement (Po) depends only on rater accuracy and number of categories; however, expected agreement (Pe) and κ depend additionally on category frequencies. Moreover, category frequencies affect Pe and κ solely through the variance of the category proportions, regardless of the specific frequencies underlying the variance. Paradoxically, some judgment paradigms involving stochastic categories are shown to yield higher κ values than their nonstochastic counterparts. Using the stated probability model, assignments to categories were generated for 552 combinations of paradigms, rater and category parameters, category frequencies, and number of stimuli. Observed means and standard errors for Po, Pe, and κ were fully consistent with theory expectations. Guidelines for interpretation of rater accuracy and reliability are offered, along with a discussion of alternatives to the basic model.
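
    The quantities Po, Pe, and κ discussed above follow the standard Cohen's kappa formulas; a minimal sketch on illustrative rater data:

```python
# Observed agreement (Po), expected agreement (Pe), and Cohen's kappa
# for two raters' category assignments. Pe is the chance agreement
# implied by each rater's marginal category frequencies.
from collections import Counter

def cohens_kappa(r1, r2):
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    f1, f2 = Counter(r1), Counter(r2)
    cats = set(f1) | set(f2)
    pe = sum((f1[c] / n) * (f2[c] / n) for c in cats)     # chance agreement
    return po, pe, (po - pe) / (1 - pe)

# Illustrative assignments of eight stimuli to categories A and B.
rater1 = ["A", "A", "B", "B", "A", "B", "A", "B"]
rater2 = ["A", "A", "B", "A", "A", "B", "B", "B"]
po, pe, kappa = cohens_kappa(rater1, rater2)
```

    With balanced marginals (four A's and four B's for each rater), Pe is 0.5, so κ = (0.75 - 0.5) / (1 - 0.5) = 0.5; skewing the marginals would raise Pe and lower κ at the same Po, which is the dependence on category-proportion variance the abstract describes.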

  6. Dynamic Hop Service Differentiation Model for End-to-End QoS Provisioning in Multi-Hop Wireless Networks

    NASA Astrophysics Data System (ADS)

    Youn, Joo-Sang; Seok, Seung-Joon; Kang, Chul-Hee

    This paper presents a new QoS model for end-to-end service provisioning in multi-hop wireless networks. In legacy IEEE 802.11e based multi-hop wireless networks, the fixed assignment of service classes according to a flow's priority at every node causes a priority inversion problem when performing end-to-end service differentiation. Thus, this paper proposes a new QoS provisioning model called Dynamic Hop Service Differentiation (DHSD) to alleviate the problem and support effective service differentiation between end-to-end nodes. Many previous works on QoS models based on 802.11e service differentiation focus on packet scheduling over several service queues with different service rates and priorities. Our model, however, concentrates on a dynamic class selection scheme, called Per Hop Class Assignment (PHCA), in the node's MAC layer, which selects a proper service class for each packet, in accordance with queue states and service requirements, in every node along the end-to-end route of the packet. The proposed QoS solution is evaluated using the OPNET simulator. The simulation results show that the proposed model outperforms both best-effort and 802.11e based strict priority service models in mobile ad hoc environments.

  7. Inside the black box: starting to uncover the underlying decision rules used in one-by-one expert assessment of occupational exposure in case-control studies

    PubMed Central

    Wheeler, David C.; Burstyn, Igor; Vermeulen, Roel; Yu, Kai; Shortreed, Susan M.; Pronk, Anjoeka; Stewart, Patricia A.; Colt, Joanne S.; Baris, Dalsu; Karagas, Margaret R.; Schwenn, Molly; Johnson, Alison; Silverman, Debra T.; Friesen, Melissa C.

    2014-01-01

    Objectives Evaluating occupational exposures in population-based case-control studies often requires exposure assessors to review each study participant's reported occupational information job-by-job to derive exposure estimates. Although such assessments likely have underlying decision rules, they usually lack transparency, are time-consuming and have uncertain reliability and validity. We aimed to identify the underlying rules to enable documentation, review, and future use of these expert-based exposure decisions. Methods Classification and regression trees (CART, predictions from a single tree) and random forests (predictions from many trees) were used to identify the underlying rules from the questionnaire responses and an expert's exposure assignments for occupational diesel exhaust exposure for several metrics: binary exposure probability and ordinal exposure probability, intensity, and frequency. Data were split into training (n=10,488 jobs), testing (n=2,247), and validation (n=2,248) data sets. Results The CART and random forest models' predictions agreed with 92–94% of the expert's binary probability assignments. For ordinal probability, intensity, and frequency metrics, the two models extracted decision rules more successfully for unexposed and highly exposed jobs (86–90% and 57–85%, respectively) than for low or medium exposed jobs (7–71%). Conclusions CART and random forest models extracted decision rules and accurately predicted an expert's exposure decisions for the majority of jobs and identified questionnaire response patterns that would require further expert review if the rules were applied to other jobs in the same or different study. This approach makes the exposure assessment process in case-control studies more transparent and creates a mechanism to efficiently replicate exposure decisions in future studies. PMID:23155187
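
    The kind of rule extraction CART performs can be illustrated in miniature with a single Gini-impurity split; the questionnaire features and exposure labels below are hypothetical, not from the study's data:

```python
# Minimal decision-stump learner: find the single yes/no questionnaire
# response that best separates exposed from unexposed jobs, i.e., the
# simplest form of the rule extraction a CART model performs.

def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_rule(rows, labels):
    """rows: list of dicts of binary responses. Returns the feature whose
    yes/no split gives the lowest weighted Gini impurity."""
    n = len(rows)
    best_feat, best_score = None, float("inf")
    for feat in rows[0]:
        yes = [lab for r, lab in zip(rows, labels) if r[feat]]
        no = [lab for r, lab in zip(rows, labels) if not r[feat]]
        score = len(yes) / n * gini(yes) + len(no) / n * gini(no)
        if score < best_score:
            best_feat, best_score = feat, score
    return best_feat

# Hypothetical jobs and an expert's binary diesel-exposure assignments.
jobs = [
    {"drives_truck": 1, "office_work": 0},
    {"drives_truck": 1, "office_work": 0},
    {"drives_truck": 0, "office_work": 1},
    {"drives_truck": 0, "office_work": 0},
]
exposed = [1, 1, 0, 0]
rule = best_rule(jobs, exposed)
```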

  8. The Shock and Vibration Bulletin. Part 3. Shock Testing, Shock Analysis

    DTIC Science & Technology

    1974-08-01

    APPROXIMATE TRANSFORMATION C.S. O’Hearne and J.W. Shipley, Martin Marietta Aerospace, Orlando, Florida LINEAR LUMPED-MASS MODELING TECHNIQUES FOR BLAST LOADED...Leppert, B.K. Wada, Jet Propulsion Laboratory, Pasadena, California, and R. Miyakawa, Martin - Marietta Aerospace, Denver, Colorado (assigned to the Jet...Wilmington, Delaware Vibration Testing and Analysis DEVELOPMENT OF SAM-D MISSILE RANDOM VIBRATION RESPONSE LOADS P.G. Hahn, Martin Marietta Aerospace

  9. Implementation and validation of an improved allele specific stutter filtering method for electropherogram interpretation.

    PubMed

    Kalafut, Tim; Schuerman, Curt; Sutton, Joel; Faris, Tom; Armogida, Luigi; Bright, Jo-Anne; Buckleton, John; Taylor, Duncan

    2018-03-31

    Modern probabilistic genotyping (PG) software is capable of modeling stutter as part of the profile weighting statistic. This allows peaks in stutter positions to be considered as allelic, stutter, or both. However, prior to running any sample through a PG calculator, the examiner must first interpret the sample, considering such things as artifacts and the number of contributors (NOC or N). Stutter can play a major role both during the assignment of the number of contributors and during the assessment of inclusion and exclusion. If stutter peaks are not filtered when they should be, an additional contributor may be assigned, causing N contributors to be assigned as N + 1. If peaks in the stutter position of a major contributor are filtered using a threshold that is too high, true alleles of minor contributors can be lost. Until now, the stutter filters in the software used to view electropherograms have been based on a locus-specific model. Combined stutter peaks occur when a peak could be the result of both back stutter (stutter one repeat unit shorter than the allele) and forward stutter (stutter one repeat unit larger than the allele). This can challenge existing filters. We present here a novel stutter filter model in the ArmedXpert™ software package that uses a linear model based on allele for back stutter and applies an additive filter for combined stutter. We term this the allele-specific stutter model (AM). We compared AM with a traditional model based on locus-specific stutter filters (termed LM). The improved stutter model has two benefits: instances of over-filtering were reduced by 78%, from 101 for the traditional model (LM) to 22 for the allele-specific model (AM), when the two models were scored against each other; and instances of under-filtering were reduced by 80%, from 85 (LM) to 17 (AM), when scored against ground-truth mixtures. Published by Elsevier B.V.
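
    A minimal sketch of the allele-specific idea: the expected back-stutter ratio grows linearly with allele length, and a position that can receive both back and forward stutter gets an additive threshold. The slope, intercept, and forward-stutter rate below are invented for illustration, not the values used in ArmedXpert:

```python
# Allele-specific stutter filter sketch: linear back-stutter ratio in
# allele repeat number, flat forward-stutter ratio, additive threshold
# at combined-stutter positions. All parameter values are hypothetical.

BACK_SLOPE, BACK_INTERCEPT = 0.012, 0.02   # back-stutter ratio vs. repeat number
FORWARD_RATE = 0.02                        # flat forward-stutter ratio

def stutter_threshold(peaks, allele):
    """Maximum peak height explainable as stutter at position `allele`,
    given observed parent peaks {allele: height in RFU}."""
    expected = 0.0
    parent_back = peaks.get(allele + 1)    # parent one repeat longer -> back stutter
    if parent_back:
        expected += (BACK_INTERCEPT + BACK_SLOPE * (allele + 1)) * parent_back
    parent_fwd = peaks.get(allele - 1)     # parent one repeat shorter -> forward stutter
    if parent_fwd:
        expected += FORWARD_RATE * parent_fwd
    return expected

def is_filtered(peaks, allele):
    return peaks[allele] <= stutter_threshold(peaks, allele)

# Allele 12 sits between two tall parents (11 and 13): the additive
# combined-stutter threshold filters it, while a back-stutter-only
# threshold (176 RFU here) would have left it as a possible allele.
profile = {11: 1000, 12: 190, 13: 1000}
```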

  10. Using Norm-Based Appeals to Increase Response Rates in Evaluation Research: A Field Experiment

    ERIC Educational Resources Information Center

    Misra, Shalini; Stokols, Daniel; Marino, Anne Heberger

    2012-01-01

    A field experiment was conducted to test the effectiveness of norm-based persuasive messages for increasing response rates in online survey research. Participants in an interdisciplinary conference were asked to complete two successive postconference surveys and randomly assigned to one of two groups at each time point. The experimental group…

  11. Steps Counts among Middle School Students Vary with Aerobic Fitness Level

    ERIC Educational Resources Information Center

    Le Masurier, Guy C.; Corbin, Charles B.

    2006-01-01

    The purpose of this study was to examine if steps/day taken by middle school students varied based on aerobic fitness classification. Middle school students (N = 223; 112 girls, 111 boys) were assigned to three aerobic fitness categories (HIGH, MOD, LOW) based on results of the FITNESSGRAM PACER test. Four weekdays of pedometer monitoring…

  12. [Comparative study of the population structure and population assignment of sockeye salmon Oncorhynchus nerka from West Kamchatka based on RAPD-PCR and microsatellite polymorphism].

    PubMed

    Zelenina, D A; Khrustaleva, A M; Volkov, A A

    2006-05-01

    Using two types of molecular markers, a comparative analysis of the population structure of sockeye salmon from West Kamchatka as well as population assignment of each individual fish were carried out. The values of a RAPD-PCR-based population assignment test (94-100%) were somewhat higher than those based on microsatellite data (74-84%); however, the latter results still seem quite satisfactory given the high polymorphism of the microsatellite loci examined. The UPGMA dendrograms of genetic similarity of the three largest spawning populations, constructed using each of the methods, were highly reliable, as demonstrated by high bootstrap indices (100% in the case of RAPD-PCR; 84 and 100% in the case of microsatellite analysis), though the resultant trees differed from one another. The different topology of the trees, in our view, is explained by the fact that the employed methods explored different parts of the genome; hence, the obtained results, albeit valid, may not correlate. Thus, to enhance the reliability of the results, several methods of analysis should be used concurrently.
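
    A population assignment test of the kind applied here can be sketched as a genotype log-likelihood comparison under Hardy-Weinberg proportions; the populations, loci, and allele frequencies below are illustrative, not the sockeye data:

```python
# Log-likelihood population assignment from locus allele frequencies.
# An individual is assigned to the population under which its genotype
# is most probable, assuming Hardy-Weinberg proportions per population.
import math

def assign_population(genotype, pop_freqs):
    """genotype: list of (allele1, allele2) per locus.
    pop_freqs: {pop: [ {allele: frequency} per locus ]}."""
    best_pop, best_ll = None, -math.inf
    for pop, loci in pop_freqs.items():
        ll = 0.0
        for (a1, a2), freqs in zip(genotype, loci):
            p = freqs.get(a1, 1e-6)            # small floor avoids log(0)
            q = freqs.get(a2, 1e-6)
            ll += math.log(2 * p * q if a1 != a2 else p * p)
        if ll > best_ll:
            best_pop, best_ll = pop, ll
    return best_pop

# Two hypothetical spawning populations, two microsatellite loci.
freqs = {
    "river_A": [{100: 0.8, 104: 0.2}, {150: 0.7, 154: 0.3}],
    "river_B": [{100: 0.2, 104: 0.8}, {150: 0.3, 154: 0.7}],
}
fish = [(100, 100), (150, 154)]   # genotype of one individual
```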

  13. Examining the Latent Structure of the Delis-Kaplan Executive Function System.

    PubMed

    Karr, Justin E; Hofer, Scott M; Iverson, Grant L; Garcia-Barrera, Mauricio A

    2018-05-04

    The current study aimed to determine whether the Delis-Kaplan Executive Function System (D-KEFS) taps into three executive function factors (inhibition, shifting, fluency) and to assess the relationship between these factors and tests of executive-related constructs less often measured in latent variable research: reasoning, abstraction, and problem solving. Participants included 425 adults from the D-KEFS standardization sample (20-49 years old; 50.1% female; 70.1% White). Eight alternative measurement models were compared based on model fit, with test scores assigned a priori to three factors: inhibition (Color-Word Interference, Tower), shifting (Trail Making, Sorting, Design Fluency), and fluency (Verbal/Design Fluency). The Twenty Questions, Word Context, and Proverb Tests were predicted in separate structural models. The three-factor model fit the data well (CFI = 0.938; RMSEA = 0.047), although a two-factor model, with shifting and fluency merged, fit similarly well (CFI = 0.929; RMSEA = 0.048). A bifactor model fit best (CFI = 0.977; RMSEA = 0.032) and explained the most variance in shifting indicators, but rarely converged among 5,000 bootstrapped samples. When the three first-order factors simultaneously predicted the criterion variables, only shifting was uniquely predictive (p < .05; R2 = 0.246-0.408). The bifactor significantly predicted all three criterion variables (p < .001; R2 = 0.141-0.242). Results supported a three-factor D-KEFS model (i.e., inhibition, shifting, and fluency), although shifting and fluency were highly related (r = 0.696). The bifactor showed superior fit, but converged less often than other models. Shifting best predicted tests of reasoning, abstraction, and problem solving. These findings support the validity of D-KEFS scores for measuring executive-related constructs and provide a framework through which clinicians can interpret D-KEFS results.

  14. Estimating equivalence with quantile regression

    USGS Publications Warehouse

    Cade, B.S.

    2011-01-01

    Equivalence testing and corresponding confidence interval estimates are used to provide more enlightened statistical statements about parameter estimates by relating them to intervals of effect sizes deemed to be of scientific or practical importance rather than just to an effect size of zero. Equivalence tests and confidence interval estimates are based on a null hypothesis that a parameter estimate is either outside (inequivalence hypothesis) or inside (equivalence hypothesis) an equivalence region, depending on the question of interest and assignment of risk. The former approach, often referred to as bioequivalence testing, is often used in regulatory settings because it reverses the burden of proof compared to a standard test of significance, following a precautionary principle for environmental protection. Unfortunately, many applications of equivalence testing focus on establishing average equivalence by estimating differences in means of distributions that do not have homogeneous variances. I discuss how to compare equivalence across quantiles of distributions using confidence intervals on quantile regression estimates that detect differences in heterogeneous distributions missed by focusing on means. I used one-tailed confidence intervals based on inequivalence hypotheses in a two-group treatment-control design for estimating bioequivalence of arsenic concentrations in soils at an old ammunition testing site and bioequivalence of vegetation biomass at a reclaimed mining site. Two-tailed confidence intervals based both on inequivalence and equivalence hypotheses were used to examine quantile equivalence for negligible trends over time for a continuous exponential model of amphibian abundance. © 2011 by the Ecological Society of America.
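
    The two one-sided tests (TOST) logic underlying average-bioequivalence testing can be sketched with a normal approximation on a mean difference; the data and equivalence margins below are illustrative, and this mean-based sketch is exactly the approach the abstract argues should be extended to quantiles:

```python
# Two one-sided tests (TOST) for average equivalence against a
# pre-specified equivalence region [-delta, +delta], using a normal
# approximation: declare equivalence if the (1 - 2*alpha) confidence
# interval for the mean difference lies inside the region.
import math
import statistics as st

def tost_mean(diffs, delta):
    """diffs: paired treatment-control differences. alpha fixed at 0.05,
    so the interval is a 90% CI (z quantile hard-coded below)."""
    n = len(diffs)
    m = st.mean(diffs)
    se = st.stdev(diffs) / math.sqrt(n)
    z = 1.6448536269514722            # standard normal 0.95 quantile
    lo, hi = m - z * se, m + z * se
    return -delta < lo and hi < delta

# Illustrative paired differences centered near zero.
differences = [0.1, -0.2, 0.05, 0.0, 0.15, -0.1, 0.05, -0.05]
```

    With a generous margin the data are declared equivalent; shrinking the margin below the interval half-width reverses the verdict, which is the burden-of-proof reversal the abstract highlights.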

  15. Leader Positivity and Follower Creativity: An Experimental Analysis

    ERIC Educational Resources Information Center

    Avey, James B.; Richmond, F. Lynn; Nixon, Don R.

    2012-01-01

    Using an experimental research design, 191 working adults were randomly assigned to two experimental conditions in order to test a theoretical model linking leader and follower positive psychological capital (PsyCap). Multiple methods were used to gather information from the participants. We found when leader PsyCap was manipulated experimentally,…

  16. Treating Adult Marijuana Dependence: A Test of the Relapse Prevention Model.

    ERIC Educational Resources Information Center

    Stephens, Robert S.; And Others

    1994-01-01

    Randomly assigned adults (n=212) seeking treatment for marijuana use to relapse prevention (RP) or social support (SSP) group discussion intervention. Data collected at 12 months posttreatment revealed substantial reductions in frequency of marijuana use and associated problems; no significant difference between treatments on days of marijuana…

  17. Long-Term Outcomes of Early Reading Intervention

    ERIC Educational Resources Information Center

    Hurry, Jane; Sylva, Kathy

    2007-01-01

    This study explores the long-term effectiveness of two differing models of early intervention for children with reading difficulties: Reading Recovery and a specific phonological training. Approximately 400 children were pre-tested, 95 were assigned to Reading Recovery, 97 to Phonological Training and the remainder acted as controls. In the short…

  18. IMU-to-Segment Assignment and Orientation Alignment for the Lower Body Using Deep Learning

    PubMed Central

    2018-01-01

    Human body motion analysis based on wearable inertial measurement units (IMUs) receives a lot of attention from both the research community and industry, owing to its significant role in, for instance, mobile health systems, sports, and human-computer interaction. In sensor-based activity recognition, one of the major issues for obtaining reliable results is the sensor placement/assignment on the body. For inertial motion capture (joint kinematics estimation) and analysis, the IMU-to-segment (I2S) assignment and alignment are central issues in obtaining biomechanical joint angles. Existing approaches for I2S assignment usually rely on hand-crafted features and shallow classification approaches (e.g., support vector machines), with no agreement regarding the most suitable features for the assignment task. Moreover, estimating the complete orientation alignment of an IMU relative to the segment it is attached to using a machine learning approach has not been shown in the literature so far. This is likely due to the high amount of training data that would have to be recorded to suitably represent possible IMU alignment variations. In this work, we propose online approaches for solving the assignment and alignment tasks for an arbitrary number of IMUs with respect to a biomechanical lower body model using a deep learning architecture and windows of 128 gyroscope and accelerometer data samples. For this, we combine convolutional neural networks (CNNs) for local filter learning with long short-term memory (LSTM) recurrent networks as well as gated recurrent units (GRUs) for learning time dynamic features. The assignment task is cast as a classification problem, while the alignment task is cast as a regression problem. In this framework, we demonstrate the feasibility of augmenting a limited amount of real IMU training data with simulated alignment variations and IMU data for improving the recognition/estimation accuracies. 
With the proposed approaches and final models we achieved 98.57% average accuracy over all segments for the I2S assignment task (100% when excluding left/right switches) and an average median angle error over all segments and axes of 2.91° for the I2S alignment task. PMID:29351262

  19. Applications of random forest feature selection for fine-scale genetic population assignment.

    PubMed

    Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G

    2018-02-01

    Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest and guided regularized random forest) compared with FST ranking for selection of single nucleotide polymorphisms (SNP) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90%, using each method to create panels of 50-700 markers. Panels of SNPs identified using random forest-based methods performed up to 7.8 and 11.2 percentage points better than FST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy ≥90% was obtained with panels of 670 and 384 SNPs for each data set, respectively, a level of accuracy never reached for these species using FST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.
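
    The FST-ranking baseline can be sketched with the simple two-population estimator FST = (Ht - Hs) / Ht; the allele frequencies below are hypothetical:

```python
# Per-SNP FST ranking: score each SNP by between-population
# differentiation and keep the top panel_size markers. Uses the
# simple two-population estimator FST = (Ht - Hs) / Ht.

def fst(p1, p2):
    """p1, p2: frequency of the reference allele in each population."""
    p_bar = (p1 + p2) / 2
    ht = 2 * p_bar * (1 - p_bar)                      # total expected heterozygosity
    hs = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2  # mean within-pop heterozygosity
    return 0.0 if ht == 0 else (ht - hs) / ht

def rank_snps(freqs_pop1, freqs_pop2, panel_size):
    """Return indices of the `panel_size` most differentiated SNPs."""
    scores = [fst(p1, p2) for p1, p2 in zip(freqs_pop1, freqs_pop2)]
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:panel_size]

# SNP 1 is strongly differentiated, SNP 0 barely, SNP 2 not at all.
panel = rank_snps([0.50, 0.90, 0.30], [0.55, 0.10, 0.30], panel_size=2)
```

    The machine-learning methods in the abstract replace this univariate score with importance measures that account for interactions among SNPs, which is why their panels can out-perform FST-selected panels of the same size.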

  20. Patterns of Nucleotide Diversity at Photoperiod Related Genes in Norway Spruce [Picea abies (L.) Karst.]

    PubMed Central

    Källman, Thomas; De Mita, Stéphane; Larsson, Hanna; Gyllenstrand, Niclas; Heuertz, Myriam; Parducci, Laura; Suyama, Yoshihisa; Lagercrantz, Ulf; Lascoux, Martin

    2014-01-01

    The ability of plants to track seasonal changes is largely dependent on genes assigned to the photoperiod pathway, and variation in those genes is thereby important for adaptation to local day length conditions. Extensive physiological data in several temperate conifer species suggest that populations are adapted to local light conditions, but data on the genes underlying this adaptation are more limited. Here we present nucleotide diversity data from 19 genes putatively involved in photoperiodic response in Norway spruce (Picea abies). Based on similarity to model plants the genes were grouped into three categories according to their presumed position in the photoperiod pathway: photoreceptors, circadian clock genes, and downstream targets. An HKA (Hudson, Kreitman and Aguadé) test showed a significant excess of diversity at photoreceptor genes, but no departure from neutrality at circadian genes and downstream targets. Departures from neutrality were also tested with Tajima's D and Fay and Wu's H statistics under three demographic scenarios: the standard neutral model, a population expansion model, and a more complex population split model. Only one gene, the circadian clock gene PaPRR3 with a highly positive Tajima's D value, deviates significantly from all tested demographic scenarios. As the PaPRR3 gene harbours multiple non-synonymous variants it appears as an excellent candidate gene for control of photoperiod response in Norway spruce. PMID:24810273

  2. Pedagogy of the logic model: teaching undergraduates to work together to change their communities.

    PubMed

    Zimmerman, Lindsey; Kamal, Zohra; Kim, Hannah

    2013-01-01

    Undergraduate community psychology courses can empower students to address challenging problems in their local communities. Creating a logic model is an experiential way to learn course concepts by "doing." Throughout the semester, students work with peers to define a problem, develop an intervention, and plan an evaluation focused on an issue of concern to them. This report provides an overview of how to organize a community psychology course around the creation of a logic model in order for students to develop this applied skill. Two undergraduate student authors report on their experience with the logic model assignment, describing the community problem they chose to address, what they learned from the assignment, what they found challenging, and what they are doing now in their communities based on what they learned.

  3. Tuning and performance evaluation of PID controller for superheater steam temperature control of 200 MW boiler using gain phase assignment algorithm

    NASA Astrophysics Data System (ADS)

    Begum, A. Yasmine; Gireesh, N.

    2018-04-01

    In a superheater, steam temperature is controlled in a cascade control loop. The cascade control loop consists of PI and PID controllers. To improve superheater steam temperature control, the controller gains in the cascade control loop have to be tuned efficiently. The mathematical model of the superheater is derived from sets of nonlinear partial differential equations. The tuning methods taken for study here are designed for a first-order-plus-time-delay (FOPTD) transfer function model; hence, from the dynamical model of the superheater, a FOPTD model is derived using the frequency response method. Then, by using the Chien-Hrones-Reswick Tuning Algorithm and the Gain-Phase Assignment Algorithm, optimum controller gains have been found based on the least value of the integral time-weighted absolute error.
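The ITAE criterion used to compare tunings can be illustrated on a generic FOPTD model; the plant parameters and gain values below are hypothetical, not the 200 MW boiler's:

```python
# Illustrative sketch (not the paper's model): evaluating PI gains on a
# first-order-plus-time-delay (FOPTD) process by the ITAE criterion.
import numpy as np

def itae_for_gains(kp, ki, K=1.0, tau=30.0, L=5.0, dt=0.05, t_end=300.0):
    """Simulate a unit setpoint step on K*exp(-L*s)/(tau*s + 1) under PI
    control (Euler integration) and return the integral of t*|error|."""
    n = int(t_end / dt)
    delay = int(L / dt)
    u_hist = np.zeros(n + delay)   # control history realizes the dead time
    y, integ, itae = 0.0, 0.0, 0.0
    for k in range(n):
        e = 1.0 - y                       # unit setpoint
        integ += e * dt
        u_hist[k + delay] = kp * e + ki * integ
        # first-order plant driven by the delayed control signal
        y += dt * (K * u_hist[k] - y) / tau
        itae += (k * dt) * abs(e) * dt
    return itae

# Compare two candidate gain sets; lower ITAE means better tracking.
loose = itae_for_gains(kp=0.5, ki=0.01)
tight = itae_for_gains(kp=3.0, ki=0.1)
print(f"ITAE loose: {loose:.1f}, tight: {tight:.1f}")
```

A tuning algorithm such as gain-phase assignment would search the (kp, ki) plane for the minimum of exactly this kind of cost.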

  4. Modeling regional freight flow assignment through intermodal terminals

    DOT National Transportation Integrated Search

    2005-03-01

    An analytical model is developed to assign regional freight across a multimodal highway and railway network using geographic information systems. As part of the regional planning process, the model is an iterative procedure that assigns multimodal fr...

  5. Challenges in projecting clustering results across gene expression-profiling datasets.

    PubMed

    Lusa, Lara; McShane, Lisa M; Reid, James F; De Cecco, Loris; Ambrogi, Federico; Biganzoli, Elia; Gariboldi, Manuela; Pierotti, Marco A

    2007-11-21

    Gene expression microarray studies for several types of cancer have been reported to identify previously unknown subtypes of tumors. For breast cancer, a molecular classification consisting of five subtypes based on gene expression microarray data has been proposed. These subtypes have been reported to exist across several breast cancer microarray studies, and they have demonstrated some association with clinical outcome. A classification rule based on the method of centroids has been proposed for identifying the subtypes in new collections of breast cancer samples; the method is based on the similarity of the new profiles to the mean expression profile of the previously identified subtypes. Previously identified centroids of five breast cancer subtypes were used to assign 99 breast cancer samples, including a subset of 65 estrogen receptor-positive (ER+) samples, to five breast cancer subtypes based on microarray data for the samples. The effect of mean centering the genes (i.e., transforming the expression of each gene so that its mean expression is equal to 0) on subtype assignment by method of centroids was assessed. Further studies of the effect of mean centering and of class prevalence in the test set on the accuracy of method of centroids classifications of ER status were carried out using training and test sets for which ER status had been independently determined by ligand-binding assay and for which the proportion of ER+ and ER- samples were systematically varied. When all 99 samples were considered, mean centering before application of the method of centroids appeared to be helpful for correctly assigning samples to subtypes, as evidenced by the expression of genes that had previously been used as markers to identify the subtypes. However, when only the 65 ER+ samples were considered for classification, many samples appeared to be misclassified, as evidenced by an unexpected distribution of ER+ samples among the resultant subtypes. 
When genes were mean centered before classification of samples for ER status, the accuracy of the ER subgroup assignments was highly dependent on the proportion of ER+ samples in the test set; this effect of subtype prevalence was not seen when gene expression data were not mean centered. Simple corrections such as mean centering of genes aimed at microarray platform or batch effect correction can have undesirable consequences because patient population effects can easily be confused with these assay-related effects. Careful thought should be given to the comparability of the patient populations before attempting to force data comparability for purposes of assigning subtypes to independent subjects.
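A toy simulation makes the mean-centering pitfall concrete; the gene count, subtype structure, and expression values below are invented, not drawn from the breast cancer data:

```python
# Toy illustration (simulated data, not the study's) of why gene-wise mean
# centering of a test set can scramble nearest-centroid subtype calls when
# the test set's subtype mix differs from the training set's.
import numpy as np

rng = np.random.default_rng(1)
n_genes = 50

def simulate(n_a, n_b):
    # Subtype A expresses around 0, subtype B around 2, on every gene.
    a = rng.normal(0.0, 0.3, (n_a, n_genes))
    b = rng.normal(2.0, 0.3, (n_b, n_genes))
    return np.vstack([a, b]), np.array([0] * n_a + [1] * n_b)

def assign(x, centroids):
    # method of centroids: nearest centroid by Euclidean distance
    d = np.linalg.norm(x[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

# Centroids come from a balanced, gene-centered training set.
x_tr, y_tr = simulate(100, 100)
x_tr = x_tr - x_tr.mean(axis=0)
centroids = np.stack([x_tr[y_tr == k].mean(axis=0) for k in (0, 1)])

# Balanced test set: centering reproduces the training geometry.
x_bal, y_bal = simulate(50, 50)
acc_bal = (assign(x_bal - x_bal.mean(axis=0), centroids) == y_bal).mean()

# Homogeneous test set (all subtype A, like an ER+-only cohort): centering
# removes the very signal separating the subtypes, so calls split at random.
x_hom, y_hom = simulate(100, 0)
acc_hom = (assign(x_hom - x_hom.mean(axis=0), centroids) == y_hom).mean()
print(f"balanced test accuracy: {acc_bal:.2f}, all-A test accuracy: {acc_hom:.2f}")
```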

  6. A controlled trial of automated classification of negation from clinical notes

    PubMed Central

    Elkin, Peter L; Brown, Steven H; Bauer, Brent A; Husser, Casey S; Carruth, William; Bergstrom, Larry R; Wahner-Roedler, Dietlind L

    2005-01-01

    Background Identification of negation in electronic health records is essential if we are to understand the computable meaning of the records. Our objective is to compare the accuracy of an automated mechanism for assignment of Negation to clinical concepts within a compositional expression with Human Assigned Negation, and also to perform a failure analysis to identify the causes of poorly identified negation (i.e. Missed Conceptual Representation, Inaccurate Conceptual Representation, Missed Negation, Inaccurate identification of Negation). Methods 41 Clinical Documents (Medical Evaluations; sometimes outside of Mayo these are referred to as History and Physical Examinations) were parsed using the Mayo Vocabulary Server Parsing Engine. SNOMED-CT™ was used to provide concept coverage for the clinical concepts in the record. These records resulted in identification of Concepts and textual clues to Negation. These records were reviewed by an independent medical terminologist, and the results were tallied in a spreadsheet. Where questions on the review arose, Internal Medicine Faculty were employed to make a final determination. Results SNOMED-CT was used to provide concept coverage of the 14,792 Concepts in 41 Health Records from Johns Hopkins University. Of these, 1,823 Concepts were identified as negative by Human review. The sensitivity (Recall) of the assignment of negation was 97.2% (p < 0.001, Pearson Chi-Square test, when compared to a coin flip). The specificity of assignment of negation was 98.8%. The positive likelihood ratio of the negation was 81. The positive predictive value (Precision) was 91.2%. Conclusion Automated assignment of negation to concepts identified in health records based on review of the text is feasible and practical. Lexical assignment of negation is a good test of true Negativity, as judged by the high sensitivity, specificity and positive likelihood ratio of the test. 
SNOMED-CT had overall coverage of 88.7% of the concepts being negated. PMID:15876352
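The reported metrics all follow from a standard 2x2 confusion matrix. The raw counts below are approximate back-calculations from the published totals and rates (they are not published figures), shown only to illustrate the definitions:

```python
# Sketch of the evaluation metrics named in the abstract; the counts are
# approximate reconstructions from the reported totals, not published data.
def metrics(tp, fn, fp, tn):
    sens = tp / (tp + fn)                 # recall among truly negated concepts
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),            # precision
        "lr_positive": sens / (1 - spec), # positive likelihood ratio
    }

# 1,823 human-negated concepts out of 14,792; the remaining 12,969 were not.
m = metrics(tp=1772, fn=51, fp=156, tn=12813)
for name, value in m.items():
    print(f"{name}: {value:.3f}")
```

Plugging these counts in reproduces the abstract's sensitivity (~97.2%), specificity (~98.8%), precision (~91.2%), and positive likelihood ratio (~81).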

  7. A high throughput single nucleotide polymorphism multiplex assay for parentage assignment in New Zealand sheep.

    PubMed

    Clarke, Shannon M; Henry, Hannah M; Dodds, Ken G; Jowett, Timothy W D; Manley, Tim R; Anderson, Rayna M; McEwan, John C

    2014-01-01

    Accurate pedigree information is critical to animal breeding systems to ensure the highest rate of genetic gain and management of inbreeding. The abundance of available genomic data, together with development of high throughput genotyping platforms, means that single nucleotide polymorphisms (SNPs) are now the DNA marker of choice for genomic selection studies. Furthermore, the superior qualities of SNPs compared to microsatellite markers allow for standardization between laboratories; a property that is crucial for developing an international set of markers for traceability studies. The objective of this study was to develop a high throughput SNP assay for use in the New Zealand sheep industry that gives accurate pedigree assignment and will allow a reduction in breeder input over lambing. This required two phases of development--firstly, a method of extracting quality DNA from ear-punch tissue performed in a high-throughput, cost-efficient manner, and secondly, a SNP assay that has the ability to assign paternity to progeny resulting from mob mating. A likelihood-based approach to infer paternity was used, where sires with the highest LOD score (log of the ratio of the likelihood given parentage to likelihood given non-parentage) are assigned. An 84-SNP "parentage panel" was developed that assigned, on average, 99% of progeny to a sire in a problem where there were 3,000 progeny from 120 mob-mated sires that included numerous half-sib sires. In only 6% of those cases was there another sire with at least a 0.02 probability of paternity. Furthermore, dam information (either recorded, or by genotyping possible dams) was absent, highlighting the SNP test's suitability for paternity testing. Utilization of this parentage SNP assay will allow implementation of progeny testing into large commercial farms, where the improved accuracy of sire assignment and genetic evaluations will increase genetic gain in the sheep industry.
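A single-parent LOD score of this kind can be sketched as follows. The allele frequencies, genotypes, panel size, and sire count below are simulated; the study's actual likelihood model (genotyping error, half-sib structure) is richer:

```python
# Simplified single-parent LOD sketch: likelihood of the progeny genotype
# given a candidate sire (random dam) versus two random parents.
import numpy as np

rng = np.random.default_rng(7)
n_snps, n_sires = 84, 30
p = rng.uniform(0.3, 0.7, n_snps)          # population allele frequencies
sires = rng.binomial(2, p, (n_sires, n_snps))

# Simulate a progeny of sire 0 with a random (unrecorded) dam.
true_sire = 0
from_sire = rng.binomial(1, sires[true_sire] / 2)
from_dam = rng.binomial(1, p)
progeny = from_sire + from_dam

def lod(progeny, sire, p):
    t = sire / 2.0                          # P(sire transmits allele A)
    like_sire = np.where(
        progeny == 2, t * p,
        np.where(progeny == 1, t * (1 - p) + (1 - t) * p, (1 - t) * (1 - p)),
    )
    like_rand = np.where(                   # Hardy-Weinberg, random parents
        progeny == 2, p ** 2,
        np.where(progeny == 1, 2 * p * (1 - p), (1 - p) ** 2),
    )
    # Tiny floor avoids log(0) when a genotype excludes the candidate sire.
    return np.log10(np.maximum(like_sire, 1e-9) / like_rand).sum()

scores = np.array([lod(progeny, s, p) for s in sires])
print(f"assigned sire: {scores.argmax()}, LOD: {scores.max():.1f}")
```

Mendelian exclusions drive wrong candidates to strongly negative LODs, which is why the highest-LOD sire is assigned.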

  8. Differentiation of low-attenuation intracranial hemorrhage and calcification using dual-energy computed tomography in a phantom system

    PubMed Central

    Nute, Jessica L.; Roux, Lucia Le; Chandler, Adam G.; Baladandayuthapani, Veera; Schellingerhout, Dawid; Cody, Dianna D.

    2015-01-01

    Objectives Calcific and hemorrhagic intracranial lesions with attenuation levels of <100 Hounsfield Units (HU) cannot currently be reliably differentiated by single-energy computed tomography (SECT). The proper differentiation of these lesion types would have a multitude of clinical applications. A phantom model was used to test the ability of dual-energy CT (DECT) to differentiate such lesions. Materials and Methods Agar gel-bound ferric oxide and hydroxyapatite were used to model hemorrhage and calcification, respectively. Gel models were scanned using SECT and DECT and organized into SECT attenuation-matched pairs at 16 attenuation levels between 0 and 100 HU. DECT data were analyzed using 3D Gaussian mixture models (GMMs), as well as a simplified threshold plane metric derived from the 3D GMM, to assign voxels to hemorrhagic or calcific categories. Accuracy was calculated by comparing predicted voxel assignments with actual voxel identities. Results We measured 6,032 voxels from each gel model, for a total of 193,024 data points (16 matched model pairs). Both the 3D GMM and its more clinically implementable threshold plane derivative yielded similar results, with >90% accuracy at matched SECT attenuation levels ≥50 HU. Conclusions Hemorrhagic and calcific lesions with attenuation levels between 50 and 100 HU were differentiable using DECT in a clinically relevant phantom system with >90% accuracy. This method warrants further testing for potential clinical applications. PMID:25162534
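The mixture-model classification step can be illustrated with simulated two-energy attenuation values; the cluster means and covariances below are invented, not phantom measurements:

```python
# Hedged sketch: separating two voxel classes in dual-energy feature space
# with a Gaussian mixture, using simulated (not phantom) low/high-kV values.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
n = 3000
# Illustrative DECT behaviour: at similar mean attenuation, calcification
# sits differently in (low kV, high kV) space than hemorrhage.
hemorrhage = rng.multivariate_normal([70, 65], [[40, 25], [25, 40]], n)
calcification = rng.multivariate_normal([85, 55], [[40, 25], [25, 40]], n)
X = np.vstack([hemorrhage, calcification])
truth = np.array([0] * n + [1] * n)

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gmm.predict(X)
# Mixture components have arbitrary order; align them to the truth labels.
acc = max((labels == truth).mean(), (labels != truth).mean())
print(f"voxel classification accuracy: {acc:.3f}")
```

The paper's threshold-plane metric is essentially a linearized decision boundary derived from such a fitted mixture.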

  9. The psychological four-color mapping problem.

    PubMed

    Francis, Gregory; Bias, Keri; Shive, Joshua

    2010-06-01

    Mathematicians have proven that four colors are sufficient to color 2-D maps so that no neighboring regions share the same color. Here we consider the psychological 4-color problem: identifying which 4 colors should be used to make a map easy to use. We build a model of visual search for this design task and demonstrate how to apply it to the task of identifying the optimal colors for a map. We parameterized the model with a set of 7 colors using a visual search experiment in which human participants found a target region on a small map. We then used the model to predict search times for new maps and identified the color assignments that minimize or maximize average search time. The differences between these maps were predicted to be substantial. The model was then tested with a larger set of 31 colors on a map of English counties under conditions in which participants might memorize some aspects of the map. Empirical tests of the model showed that the optimally colored version of this map is searched 15% faster than the worst-colored version. Thus, the color assignment seems to affect search times in a way predicted by the model, and this effect persists even when participants might use other sources of knowledge about target location. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  10. Impact of patient-centered medical home assignment on emergency room visits among uninsured patients in a county health system.

    PubMed

    Roby, Dylan H; Pourat, Nadereh; Pirritano, Matthew J; Vrungos, Shelley M; Dajee, Himmet; Castillo, Dan; Kominski, Gerald F

    2010-08-01

    The Medical Services Initiative program--a safety net-based system of care--in Orange County included assignment of uninsured, low-income residents to a patient-centered medical home. The medical home provided case management, a team-based approach for treating disease, and increased access to primary and specialty care among other elements of a patient-centered medical home. Providers were paid an enhanced fee and pay-for-performance incentives to ensure delivery of comprehensive treatment. Medical Services Initiative enrollees who were assigned to a medical home for longer time periods were less likely to have any emergency room (ER) visits or multiple ER visits. Switching medical homes three or more times was associated with enrollees being more likely to have any ER visits or multiple ER visits. The findings provide evidence that successful implementation of the patient-centered medical home model in a county-based safety net system is possible and can reduce unnecessary ER use.

  11. Crystal Identification in Dual-Layer-Offset DOI-PET Detectors Using Stratified Peak Tracking Based on SVD and Mean-Shift Algorithm

    NASA Astrophysics Data System (ADS)

    Wei, Qingyang; Dai, Tiantian; Ma, Tianyu; Liu, Yaqiang; Gu, Yu

    2016-10-01

    An Anger-logic based pixelated PET detector block requires a crystal position map (CPM) to assign the position of each detected event to a most probable crystal index. Accurate assignments are crucial to PET imaging performance. In this paper, we present a novel automatic approach to generate the CPMs for dual-layer offset (DLO) PET detectors using a stratified peak tracking method, in which the top and bottom layers are distinguished by their intensity difference, and the peaks of the two layers are tracked in succession based on a singular value decomposition (SVD) and mean-shift algorithm. The CPM is created by classifying each pixel to its nearest peak and assigning the pixel the crystal index of that peak. A Matlab-based graphical user interface program was developed, including the automatic algorithm and a manual interaction procedure. The algorithm was tested on three DLO PET detector blocks. Results show that the proposed method exhibits good performance as well as robustness for all three blocks. Compared to existing methods, our approach can directly distinguish the layer and crystal indices using the information of intensity and the offset grid pattern.
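The peak-finding and pixel-assignment steps can be sketched with a simulated flood map; the grid size, noise level, and bandwidth below are illustrative choices, not the detector's parameters:

```python
# Illustrative sketch of the peak-tracking idea: locate flood-map peaks with
# mean shift, then assign each map pixel to its nearest peak (simulated data).
import numpy as np
from sklearn.cluster import MeanShift

rng = np.random.default_rng(5)
# Simulated crystal flood map: events cluster around a 4x4 grid of peaks.
peaks = np.array([[x, y] for x in range(4) for y in range(4)], dtype=float)
events = np.vstack([p + rng.normal(0, 0.12, (200, 2)) for p in peaks])

ms = MeanShift(bandwidth=0.5).fit(events)
centers = ms.cluster_centers_

# Build the crystal position map: each pixel takes its nearest peak's index.
grid = np.array([[i / 10, j / 10] for i in range(31) for j in range(31)])
d = np.linalg.norm(grid[:, None, :] - centers[None, :, :], axis=2)
cpm = d.argmin(axis=1)
print(f"peaks found: {len(centers)}, map entries: {cpm.size}")
```

A real DLO map interleaves two offset grids with different intensities; the layer-separation step described above would run before this clustering.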

  12. Use of simulation to compare the performance of minimization with stratified blocked randomization.

    PubMed

    Toorawa, Robert; Adena, Michael; Donovan, Mark; Jones, Steve; Conlon, John

    2009-01-01

    Minimization is an alternative method to stratified permuted block randomization, which may be more effective at balancing treatments when there are many strata. However, its use in the regulatory setting for industry trials remains controversial, primarily due to the difficulty in interpreting conventional asymptotic statistical tests under restricted methods of treatment allocation. We argue that the use of minimization should be critically evaluated when designing the study for which it is proposed. We demonstrate by example how simulation can be used to investigate whether minimization improves treatment balance compared with stratified randomization, and how much randomness can be incorporated into the minimization before any balance advantage is no longer retained. We also illustrate by example how the performance of the traditional model-based analysis can be assessed, by comparing the nominal test size with the observed test size over a large number of simulations. We recommend that the assignment probability for the minimization be selected using such simulations. Copyright (c) 2008 John Wiley & Sons, Ltd.
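A minimal simulation of biased-coin minimization, in the spirit of the comparison described above; the patient counts and factor structure are invented, and this is not the authors' simulation code:

```python
# Sketch of biased-coin minimization (Pocock-Simon style) on simulated
# binary strata, with an adjustable assignment probability.
import random

def minimize_trial(n_patients, n_factors, p_assign, seed=0):
    """Allocate patients to arms 0/1, preferring the arm that reduces
    marginal imbalance with probability p_assign; return final |imbalance|."""
    rng = random.Random(seed)
    # counts[f][level][arm]: patients per prognostic-factor level per arm
    counts = [[[0, 0] for _ in range(2)] for _ in range(n_factors)]
    totals = [0, 0]
    for _ in range(n_patients):
        levels = [rng.randrange(2) for _ in range(n_factors)]
        # Imbalance each arm would accrue across this patient's levels.
        score = [sum(counts[f][levels[f]][arm] for f in range(n_factors))
                 for arm in range(2)]
        if score[0] == score[1]:
            preferred = rng.randrange(2)
        else:
            preferred = 0 if score[0] < score[1] else 1
        # Biased coin: follow the minimizing arm with probability p_assign.
        arm = preferred if rng.random() < p_assign else 1 - preferred
        for f in range(n_factors):
            counts[f][levels[f]][arm] += 1
        totals[arm] += 1
    return abs(totals[0] - totals[1])

# Deterministic minimization balances tightly; pure randomization does not.
det = [minimize_trial(200, 3, 1.0, seed=s) for s in range(50)]
rnd = [minimize_trial(200, 3, 0.5, seed=s) for s in range(50)]
print(f"mean |arm difference|, p=1.0: {sum(det)/50:.1f}, p=0.5: {sum(rnd)/50:.1f}")
```

Repeating such runs while lowering `p_assign` from 1.0 toward 0.5 is exactly how one can ask how much randomness the minimization tolerates before its balance advantage disappears.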

  13. The solution of target assignment problem in command and control decision-making behaviour simulation

    NASA Astrophysics Data System (ADS)

    Li, Ni; Huai, Wenqing; Wang, Shaodan

    2017-08-01

    C2 (command and control) has been understood to be a critical military component to meet an increasing demand for rapid information gathering and real-time decision-making in a dynamically changing battlefield environment. In this article, to improve a C2 behaviour model's reusability and interoperability, a behaviour modelling framework was proposed to specify a C2 model's internal modules and a set of interoperability interfaces based on the C-BML (coalition battle management language). WTA (weapon target assignment) is a typical C2 autonomous decision-making behaviour modelling problem. Different from most WTA problem descriptions, here sensors were considered to be available resources of detection and the relationship constraints between weapons and sensors were also taken into account, which brought it much closer to actual application. A modified differential evolution (MDE) algorithm was developed to solve this high-dimension optimisation problem and obtained an optimal assignment plan with high efficiency. In a case study, we built a simulation system to validate the proposed C2 modelling framework and interoperability interface specification. Also, a new optimisation solution was used to solve the WTA problem efficiently and successfully.
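A plain differential-evolution loop on a toy WTA objective gives the flavor of the approach; the kill probabilities and target values are simulated, and the article's modified DE and weapon-sensor constraints are not reproduced here:

```python
# Toy sketch of a differential-evolution search for weapon-target assignment
# (simulated kill probabilities; not the article's modified-DE algorithm).
import numpy as np

rng = np.random.default_rng(11)
n_weapons, n_targets = 8, 5
kill_p = rng.uniform(0.3, 0.9, (n_weapons, n_targets))
value = rng.uniform(5, 10, n_targets)

def surviving_value(x):
    """Expected target value surviving a real-coded assignment vector."""
    assign = np.clip(x.astype(int), 0, n_targets - 1)
    surv = np.ones(n_targets)
    for w, t in enumerate(assign):
        surv[t] *= 1 - kill_p[w, t]
    return (value * surv).sum()

pop = rng.uniform(0, n_targets, (30, n_weapons))
fitness = np.array([surviving_value(x) for x in pop])
F, CR = 0.7, 0.9
for _ in range(200):
    for i in range(len(pop)):
        a, b, c = pop[rng.choice(len(pop), 3, replace=False)]
        mutant = np.clip(a + F * (b - c), 0, n_targets - 1e-9)
        cross = rng.random(n_weapons) < CR
        trial = np.where(cross, mutant, pop[i])
        f_trial = surviving_value(trial)
        if f_trial < fitness[i]:          # greedy selection (minimization)
            pop[i], fitness[i] = trial, f_trial
print(f"best expected surviving value: {fitness.min():.2f}")
```

Real-coded vectors decoded by truncation keep the search space continuous, which is one common way to apply DE to an inherently discrete assignment problem.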

  14. Biasing Influences on Test Level Assignments for Hearing Impaired Students.

    ERIC Educational Resources Information Center

    Wolk, Steve

    1985-01-01

    Possible biasing influences of student characteristics were considered for teachers' judgments of appropriate test level assignments for about 1,300 hearing impaired special education students. Analyses indicated the presence of strong influences of race and severity of handicapping condition, as well as of sex, upon change in level assignments,…

  15. 12 CFR 563e.28 - Assigned ratings.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Assigned ratings. 563e.28 Section 563e.28 Banks... for Assessing Performance § 563e.28 Assigned ratings. (a) Ratings in general. Subject to paragraphs (b... performance under the lending, investment and service tests, the community development test, the small savings...

  16. Effect of a care plan based on Roy adaptation model biological dimension on stroke patients' physiologic adaptation level.

    PubMed

    Alimohammadi, Nasrollah; Maleki, Bibi; Shahriari, Mohsen; Chitsaz, Ahmad

    2015-01-01

    Stroke is a stressful event with several functional, physical, psychological, social, and economic problems that affect individuals' different living balances. With coping strategies, patients try to control these problems and return to their natural life. The aim of this study is to investigate the effect of a care plan based on Roy adaptation model biological dimension on stroke patients' physiologic adaptation level. This study is a clinical trial in which 50 patients, affected by brain stroke and being admitted in the neurology ward of Kashani and Alzahra hospitals, were randomly assigned to control and study groups in Isfahan in 2013. Roy adaptation model care plan was administered in biological dimension in the form of four sessions and phone call follow-ups for 1 month. The forms related to Roy adaptation model were completed before and after intervention in the two groups. Chi-square test and t-test were used to analyze the data through SPSS 18. There was a significant difference in mean score of adaptation in physiological dimension in the study group after intervention (P < 0.001) compared to before intervention. Comparison of the mean scores of changes of adaptation in the patients affected by brain stroke in the study and control groups showed a significant increase in physiological dimension in the study group by 47.30 after intervention (P < 0.001). The results of study showed that Roy adaptation model biological dimension care plan can result in an increase in adaptation in patients with stroke in physiological dimension. Nurses can use this model for increasing patients' adaptation.

  17. [Training of resident physicians in the recognition and treatment of an anaphylaxis case in pediatrics with simulation models].

    PubMed

    Enríquez, Diego; Lamborizio, María J; Firenze, Lorena; Jaureguizar, María de la P; Díaz Pumará, Estanislao; Szyld, Edgardo

    2017-08-01

    To evaluate the performance of resident physicians in diagnosing and treating a case of anaphylaxis, six months after participating in simulation training exercises. Initially, a group of pediatric residents were trained using simulation techniques in the management of critical pediatric cases. Based on their performance in this exercise, participants were assigned to one of 3 groups. At six months post-training, 4 residents were randomly chosen from each group to be re-tested, using the same performance measure as previously used. During the initial training session, 56 of 72 participants (78%) correctly identified and treated the case. Six months after the initial training, all 12 (100%) resident physicians who were re-tested successfully diagnosed and treated the simulated anaphylaxis case. The training through simulation techniques allowed correction or optimization of the treatment of simulated anaphylaxis cases in resident physicians evaluated after 6 months of the initial training.

  18. Predicting neutron damage using TEM with in situ ion irradiation and computer modeling

    NASA Astrophysics Data System (ADS)

    Kirk, Marquis A.; Li, Meimei; Xu, Donghua; Wirth, Brian D.

    2018-01-01

    We have constructed a computer model of irradiation defect production closely coordinated with TEM and in situ ion irradiation of Molybdenum at 80 °C over a range of dose, dose rate and foil thickness. We have reexamined our previous ion irradiation data to assign appropriate error and uncertainty based on more recent work. The spatially dependent cascade cluster dynamics model is updated with recent Molecular Dynamics results for cascades in Mo. After a careful assignment of both ion and neutron irradiation dose values in dpa, TEM data are compared for both ion and neutron irradiated Mo from the same source material. Using the computer model of defect formation and evolution based on the in situ ion irradiation of thin foils, the defect microstructure, consisting of densities and sizes of dislocation loops, is predicted for neutron irradiation of bulk material at 80 °C and compared with experiment. Reasonable agreement between model prediction and experimental data demonstrates a promising direction in understanding and predicting neutron damage using a closely coordinated program of in situ ion irradiation experiment and computer simulation.

  19. Resident training for eclampsia and magnesium toxicity management: simulation or traditional lecture?

    PubMed

    Fisher, Nelli; Bernstein, Peter S; Satin, Andrew; Pardanani, Setul; Heo, Hye; Merkatz, Irwin R; Goffman, Dena

    2010-10-01

    To compare eclampsia and magnesium toxicity management among residents randomly assigned to lecture or simulation-based education. Stratified by year, residents (n = 38) were randomly assigned to 3 educational intervention groups: Simulation→Lecture, Simulation, and Lecture. Postintervention simulations were performed for all and scored using standardized lists. Maternal, fetal, eclampsia management, and magnesium toxicity scores were assigned. Mann-Whitney U, Wilcoxon rank sum and χ(2) tests were used for analysis. Postintervention maternal (16 and 15 vs 12; P < .05) and eclampsia (19 vs 16; P < .05) scores were significantly better in the simulation-based groups compared with the lecture group. Postintervention magnesium toxicity and fetal scores were not different among groups. Lecture added to simulation did not lead to incremental benefit when eclampsia scores were compared between Simulation→Lecture and Simulation (19 vs 19; P = nonsignificant). Simulation training is superior to traditional lecture alone for teaching crucial skills for the optimal management of both eclampsia and magnesium toxicity, 2 life-threatening obstetric emergencies. Published by Mosby, Inc.

  20. Predicting and priming thematic roles: Flexible use of verbal and nonverbal cues during relative clause comprehension.

    PubMed

    Kowalski, Alix; Huang, Yi Ting

    2017-09-01

    Relative-clause sentences (RCs) have been a key test case for psycholinguistic models of comprehension. While object-relative clauses (e.g., ORCs: "The bear that the horse . . .") are distinguished from subject-relative clauses (SRCs) after the second noun phrase (NP2; e.g., SRCs: "The bear that pushed . . ."), role assignments are often delayed until the embedded verb (e.g., ". . . pushed ate the sandwich"). This contrasts with overwhelming evidence of incremental role assignment in other garden-path sentences. The current study investigates how contextual factors modulate reliance on verbal and nonverbal cues. Using a visual-world paradigm, participants saw preceding discourse contexts that highlighted relevant roles within events (e.g., pusher, pushee). Nevertheless, role assignment for ORCs remained delayed until the embedded verb (Experiment 1). However, role assignment for ORCs occurred before the embedded verb when additional linguistic input was provided by an adverb (Experiment 2). Finally, when the likelihood of encountering RCs increased within the experimental context, immediate role assignment for ORCs was observed after NP2 (Experiment 3). Together, these findings suggest that real-time role assignment often prefers verbal cues, but can also flexibly adapt to the statistical properties of the local context. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  1. Automatic Determination of the Need for Intravenous Contrast in Musculoskeletal MRI Examinations Using IBM Watson's Natural Language Processing Algorithm.

    PubMed

    Trivedi, Hari; Mesterhazy, Joseph; Laguna, Benjamin; Vu, Thienkhai; Sohn, Jae Ho

    2018-04-01

    Magnetic resonance imaging (MRI) protocoling can be time- and resource-intensive, and protocols can often be suboptimal dependent upon the expertise or preferences of the protocoling radiologist. Providing a best-practice recommendation for an MRI protocol has the potential to improve efficiency and decrease the likelihood of a suboptimal or erroneous study. The goal of this study was to develop and validate a machine learning-based natural language classifier that can automatically assign the use of intravenous contrast for musculoskeletal MRI protocols based upon the free-text clinical indication of the study, thereby improving efficiency of the protocoling radiologist and potentially decreasing errors. We utilized a deep learning-based natural language classification system from IBM Watson, a question-answering supercomputer that gained fame after challenging the best human players on Jeopardy! in 2011. We compared this solution to a series of traditional machine learning-based natural language processing techniques that utilize a term-document frequency matrix. Each classifier was trained with 1240 MRI protocols plus their respective clinical indications and validated with a test set of 280. Ground truth of contrast assignment was obtained from the clinical record. For evaluation of inter-reader agreement, a blinded second reader radiologist analyzed all cases and determined contrast assignment based on only the free-text clinical indication. In the test set, Watson demonstrated overall accuracy of 83.2% when compared to the original protocol. This was similar to the overall accuracy of 80.2% achieved by an ensemble of eight traditional machine learning algorithms based on a term-document matrix. When compared to the second reader's contrast assignment, Watson achieved 88.6% agreement. When evaluating only the subset of cases where the original protocol and second reader were concordant (n = 251), agreement climbed further to 90.0%. 
The classifier was relatively robust to spelling and grammatical errors, which were frequent. Implementation of this automated MR contrast determination system as a clinical decision support tool may save considerable time and effort for the radiologist while potentially decreasing error rates, and would require no change in order entry or workflow.
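As a rough illustration of the term-document baseline described above (not the study's actual eight-algorithm ensemble), the sketch below classifies a free-text indication by cosine similarity of its term counts to per-class term-frequency centroids. The training indications and labels are invented for illustration, not drawn from the study's 1240 protocols:

```python
from collections import Counter
import math

def tokenize(text):
    return [t.lower().strip(".,;:!?") for t in text.split()]

def train_centroids(examples):
    # examples: (free-text indication, contrast label) pairs;
    # each class accumulates one term-frequency centroid
    centroids = {}
    for text, label in examples:
        centroids.setdefault(label, Counter()).update(tokenize(text))
    return centroids

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def assign_contrast(indication, centroids):
    # pick the class whose centroid is most similar to the indication
    query = Counter(tokenize(indication))
    return max(centroids, key=lambda lab: cosine(query, centroids[lab]))

# Invented training examples -- hypothetical indications and labels.
train = [
    ("rule out osteomyelitis with soft tissue abscess", "contrast"),
    ("evaluate for infection of the foot", "contrast"),
    ("acl tear evaluate meniscus", "no_contrast"),
    ("chronic knee pain evaluate ligament injury", "no_contrast"),
]
centroids = train_centroids(train)
```

A real system would add TF-IDF weighting and far more training data, but the nearest-centroid step above is the core of a term-document frequency classifier.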

  2. 75 FR 72611 - Assessments, Large Bank Pricing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-24

    ... the worst risk ranking and are included in the statistical analysis. Appendix 1 to the NPR describes the statistical analysis in detail. \\12\\ The percentage approximated by factors is based on the statistical model for that particular year. Actual weights assigned to each scorecard measure are largely based...

  3. Project SMART: Preliminary Results From a Test of the Efficacy of a Swedish Internet-Based HIV Risk-Reduction Intervention for Men Who Have Sex With Men.

    PubMed

    Schonnesson, Lena Nilsson; Bowen, Anne M; Williams, Mark L

    2016-08-01

    In Sweden, 57% of HIV transmission occurs among MSM, and other sexually transmitted infections are increasing, supporting the need for innovative interventions. The Internet is a potentially useful HIV-prevention platform, but there is a lack of such programs in Sweden. The purpose of this exploratory study was to test the efficacy of the Internet-based SMART intervention to decrease HIV sexual risks in Swedish MSM. The intervention was adapted from the Wyoming Rural AIDS Prevention Project to the Swedish context, was guided by the Information-Motivation-Behavioral (IMB) skills model, and consisted of six sessions. A total of 112 men responded to a pretest questionnaire and were randomly assigned to the SMART intervention or to a waitlist group. Fifty-four men dropped out, leaving a final sample of 58 participants. Twenty-five were assigned to the SMART intervention and 33 to a waitlist group. One month post-intervention, the number of casual anal sex partners significantly decreased (t = 2.19, p = .04). Compared with the waitlist group, men in the intervention group increased their HIV knowledge (β = 0.70, p = .01), their belief of condom use as an act of responsibility (β = 1.19, p = .04), their willingness to use a condom with every new partner all the time (β = 1.39, p = .03), and their confidence in using condoms in challenging situations (β = 1.65, p = .02). Condom use was not analyzed due to the small sample size. Despite the small sample, high drop-out, and short follow-up, the study provides support for the efficacy of Internet-based interventions, and of the SMART intervention specifically, for reducing the proportion of casual anal sex partners and improving the three cognitive components of the IMB model for Swedish MSM.

  4. Visual word ambiguity.

    PubMed

    van Gemert, Jan C; Veenman, Cor J; Smeulders, Arnold W M; Geusebroek, Jan-Mark

    2010-07-01

    This paper studies automatic image classification by modeling soft assignment in the popular codebook model. The codebook model describes an image as a bag of discrete visual words selected from a vocabulary, where the frequency distributions of visual words in an image allow classification. One inherent component of the codebook model is the assignment of discrete visual words to continuous image features. Despite the clear mismatch of this hard assignment with the nature of continuous features, the approach has been successfully applied for some years. In this paper, we investigate four types of soft assignment of visual words to image features. We demonstrate that explicitly modeling visual word assignment ambiguity improves classification performance compared to the hard assignment of the traditional codebook model. The traditional codebook model is compared against our method for five well-known data sets: 15 natural scenes, Caltech-101, Caltech-256, and Pascal VOC 2007/2008. We demonstrate that large codebook vocabulary sizes completely deteriorate the performance of the traditional model, whereas the proposed model performs consistently. Moreover, we show that our method profits in high-dimensional feature spaces and reaps higher benefits when increasing the number of image categories.
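The hard-versus-soft assignment contrast can be shown with a small sketch: a generic Gaussian-kernel soft assignment in the spirit of the paper's kernel-codebook variants, not the exact estimators evaluated. The hard version gives the nearest visual word all the weight; the soft version spreads weight over nearby words, so an ambiguous feature contributes to several histogram bins:

```python
import math

def hard_assign(feature, codebook):
    # traditional codebook model: the nearest visual word receives weight 1
    dists = [sum((f - c) ** 2 for f, c in zip(feature, word)) for word in codebook]
    hist = [0.0] * len(codebook)
    hist[dists.index(min(dists))] = 1.0
    return hist

def soft_assign(feature, codebook, sigma=1.0):
    # Gaussian-kernel weights over all words, normalized to sum to 1
    w = [math.exp(-sum((f - c) ** 2 for f, c in zip(feature, word))
                  / (2.0 * sigma ** 2)) for word in codebook]
    total = sum(w)
    return [x / total for x in w]

# Two-word toy codebook in a 2-D feature space (values are illustrative).
codebook = [[0.0, 0.0], [2.0, 0.0]]
```

A feature lying exactly between the two words gets weight 1 on one bin under hard assignment but an even 0.5/0.5 split under soft assignment, which is precisely the ambiguity the paper models.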

  5. Epithelial–mesenchymal transition biomarkers and support vector machine guided model in preoperatively predicting regional lymph node metastasis for rectal cancer

    PubMed Central

    Fan, X-J; Wan, X-B; Huang, Y; Cai, H-M; Fu, X-H; Yang, Z-L; Chen, D-K; Song, S-X; Wu, P-H; Liu, Q; Wang, L; Wang, J-P

    2012-01-01

    Background: Current imaging modalities are inadequate in preoperatively predicting regional lymph node metastasis (RLNM) status in rectal cancer (RC). Here, we designed a support vector machine (SVM) model to address this issue by integrating epithelial–mesenchymal-transition (EMT)-related biomarkers along with clinicopathological variables. Methods: Using tissue microarrays and immunohistochemistry, the expression of EMT-related biomarkers was measured in 193 RC patients. Of these, 74 patients were assigned to the training set to select the robust variables for designing the SVM model. The SVM model's predictive value was validated in the testing set (119 patients). Results: In the training set, eight variables, including six EMT-related biomarkers and two clinicopathological variables, were selected to devise the SVM model. In the testing set, we identified 63 patients at high risk of RLNM and 56 patients at low risk. The sensitivity, specificity and overall accuracy of the SVM in predicting RLNM were 68.3%, 81.1% and 72.3%, respectively. Importantly, multivariate logistic regression analysis showed that the SVM model was indeed an independent predictor of RLNM status (odds ratio, 11.536; 95% confidence interval, 4.113–32.361; P<0.0001). Conclusion: Our SVM-based model displayed moderately strong predictive power in defining the RLNM status in RC patients, providing an important approach to select the RLNM high-risk subgroup for neoadjuvant chemoradiotherapy. PMID:22538975
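The study's SVM was trained on immunohistochemistry variables not reproduced in the abstract. As a generic sketch of the technique only, the following implements a minimal linear SVM trained by Pegasos-style sub-gradient descent on the hinge loss; the two-feature data points are invented stand-ins for biomarker scores:

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Minimal linear SVM via Pegasos-style sub-gradient descent on the
    hinge loss; labels y must be in {-1, +1}."""
    rng = random.Random(seed)
    w = [0.0] * len(X[0])
    b = 0.0
    t = 0
    for _ in range(epochs):
        order = list(range(len(X)))
        rng.shuffle(order)
        for i in order:
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            w = [(1.0 - eta * lam) * wj for wj in w]   # regularization shrink
            if margin < 1.0:               # hinge violation: push toward the point
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
                b += eta * y[i]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0.0 else -1

# Invented two-feature training data: +1 = high risk of RLNM, -1 = low risk.
X = [[2.0, 2.0], [3.0, 1.0], [-2.0, -2.0], [-1.0, -3.0]]
y = [1, 1, -1, -1]
w, b = train_linear_svm(X, y)
```

The published model also used a kernel and eight selected variables; the linear toy above only shows the maximum-margin idea behind SVM classification.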

  6. Applying an Employee-Motivation Model to Prevent Student Plagiarism.

    ERIC Educational Resources Information Center

    Malouff, John M.; Sims, Randi L.

    1996-01-01

    A model based on Vroom's expectancy theory of employee motivation posits that instructors can prevent plagiarism by ensuring that students understand the rules of ethical writing, expect assignments to be manageable and have personal benefits, and expect plagiarism to be difficult and have important personal costs. (SK)

  7. A comparison of paper-and-pencil and computerized forms of Line Orientation and Enhanced Cued Recall Tests.

    PubMed

    Aşkar, Petek; Altun, Arif; Cangöz, Banu; Cevik, Vildan; Kaya, Galip; Türksoy, Hasan

    2012-04-01

    The purpose of this study was to assess whether a computerized battery of neuropsychological tests could produce results similar to those of the conventional forms. Comparisons on 77 volunteer undergraduates were carried out with two neuropsychological tests: the Line Orientation Test and the Enhanced Cued Recall Test. Firstly, students were assigned randomly across the test medium (paper-and-pencil versus computerized). Secondly, the groups were given the same test in the other medium after a 30-day interval between tests. Results showed that the Enhanced Cued Recall Test-Computer-based did not correlate with the Enhanced Cued Recall Test-Paper-and-pencil results. Line Orientation Test-Computer-based scores, on the other hand, did correlate significantly with the Line Orientation Test-Paper-and-pencil version. In both tests, scores were higher on paper-and-pencil tests compared to computer-based tests. The total score difference between modalities was statistically significant for both the Enhanced Cued Recall Test and the Line Orientation Test. In both computer-based tests, it took less time for participants to complete the tests.

  8. In Vitro and In Vivo Short-Term Pulmonary Toxicity of Differently Sized Colloidal Amorphous SiO2

    PubMed Central

    Wiemann, Martin; Sauer, Ursula G.; Vennemann, Antje; Bäcker, Sandra; Keller, Johannes-Georg; Ma-Hock, Lan; Wohlleben, Wendel; Landsiedel, Robert

    2018-01-01

    In vitro prediction of inflammatory lung effects of well-dispersed nanomaterials is challenging. Here, the in vitro effects of four colloidal amorphous SiO2 nanomaterials that differed only by their primary particle size (9, 15, 30, and 55 nm) were analyzed using the rat NR8383 alveolar macrophage (AM) assay. Data were compared to effects of single doses of 15 nm and 55 nm SiO2 intratracheally instilled in rat lungs. In vitro, all four elicited concentration-dependent release of lactate dehydrogenase, β-glucuronidase, and tumor necrosis factor alpha, and the two smaller materials also released H2O2. All effects were size-dependent. Since the colloidal SiO2 remained well-dispersed in serum-free in vitro conditions, effective particle concentrations reaching the cells were estimated using different models. Evaluating the effective concentration–based in vitro effects using the Decision-making framework for the grouping and testing of nanomaterials, all four nanomaterials were assigned as “active.” This assignment and the size dependency of effects were consistent with the outcomes of intratracheal instillation studies and available short-term rat inhalation data for 15 nm SiO2. The study confirms the applicability of the NR8383 AM assay to assessing colloidal SiO2 but underlines the need to estimate and consider the effective concentration of such well-dispersed test materials. PMID:29534009

  9. CoMoDo: identifying dynamic protein domains based on covariances of motion.

    PubMed

    Wieninger, Silke A; Ullmann, G Matthias

    2015-06-09

    Most large proteins are built of several domains, compact units which enable functional protein motions. Different domain assignment approaches exist, which mostly rely on concepts of stability, folding, and evolution. We describe the automatic assignment method CoMoDo, which identifies domains based on protein dynamics. Covariances of atomic fluctuations, here calculated by an Elastic Network Model, are used to group residues into domains of different hierarchical levels. The so-called dynamic domains facilitate the study of functional protein motions involved in biological processes like ligand binding and signal transduction. By applying CoMoDo to a large number of proteins, we demonstrate that dynamic domains exhibit features absent in the commonly assigned structural domains, which can deliver insight into the interactions between domains and between subunits of multimeric proteins. CoMoDo is distributed as free open source software at www.bisb.uni-bayreuth.de/CoMoDo.html.
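The grouping step can be illustrated schematically. This is not CoMoDo's actual hierarchical algorithm; it is a toy single-linkage version of the same idea: residues whose motional covariances exceed a threshold (directly or through a chain) are merged into one dynamic domain, here via union-find over an invented covariance matrix:

```python
def covariance_domains(cov, threshold):
    """Group residues into 'dynamic domains' by single-linkage union-find:
    residues i and j share a domain whenever cov[i][j] >= threshold,
    directly or transitively."""
    n = len(cov)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if cov[i][j] >= threshold:
                parent[find(j)] = find(i)

    domains = {}
    for i in range(n):
        domains.setdefault(find(i), []).append(i)
    return sorted(domains.values())

# Toy 4-residue covariance matrix with two clearly coupled pairs.
cov = [[1.0, 0.9, 0.1, 0.0],
       [0.9, 1.0, 0.2, 0.1],
       [0.1, 0.2, 1.0, 0.8],
       [0.0, 0.1, 0.8, 1.0]]
```

Lowering the threshold merges domains and raising it splits them, which mimics the hierarchy of domain levels the paper describes.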

  10. 40 CFR 799.9120 - TSCA acute dermal toxicity.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... identification number. A system to randomly assign animals to test groups and control groups is required. (E... source of test animals. (2) Method of randomization in assigning animals to test and control groups. (3... CONTROL ACT (CONTINUED) IDENTIFICATION OF SPECIFIC CHEMICAL SUBSTANCE AND MIXTURE TESTING REQUIREMENTS...

  11. 40 CFR 799.9120 - TSCA acute dermal toxicity.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... identification number. A system to randomly assign animals to test groups and control groups is required. (E... source of test animals. (2) Method of randomization in assigning animals to test and control groups. (3... CONTROL ACT (CONTINUED) IDENTIFICATION OF SPECIFIC CHEMICAL SUBSTANCE AND MIXTURE TESTING REQUIREMENTS...

  12. 40 CFR 799.9120 - TSCA acute dermal toxicity.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... identification number. A system to randomly assign animals to test groups and control groups is required. (E... source of test animals. (2) Method of randomization in assigning animals to test and control groups. (3... CONTROL ACT (CONTINUED) IDENTIFICATION OF SPECIFIC CHEMICAL SUBSTANCE AND MIXTURE TESTING REQUIREMENTS...

  13. 40 CFR 799.9120 - TSCA acute dermal toxicity.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... identification number. A system to randomly assign animals to test groups and control groups is required. (E... source of test animals. (2) Method of randomization in assigning animals to test and control groups. (3... CONTROL ACT (CONTINUED) IDENTIFICATION OF SPECIFIC CHEMICAL SUBSTANCE AND MIXTURE TESTING REQUIREMENTS...

  14. 40 CFR 799.9120 - TSCA acute dermal toxicity.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... identification number. A system to randomly assign animals to test groups and control groups is required. (E... source of test animals. (2) Method of randomization in assigning animals to test and control groups. (3... CONTROL ACT (CONTINUED) IDENTIFICATION OF SPECIFIC CHEMICAL SUBSTANCE AND MIXTURE TESTING REQUIREMENTS...

  15. Automation of block assignment planning using a diagram-based scenario modeling method

    NASA Astrophysics Data System (ADS)

    Hwang, In Hyuck; Kim, Youngmin; Lee, Dong Kun; Shin, Jong Gye

    2014-03-01

    Most shipbuilding scheduling research so far has focused on the load level on the dock plan. This is because the dock is the least extendable resource in shipyards, and its overloading is difficult to resolve. However, once dock scheduling is completed, making a plan that makes the best use of the rest of the resources in the shipyard to minimize any additional cost is also important. Block assignment planning is one of the midterm planning tasks; it assigns a block to the facility (factory/shop or surface plate) that will actually manufacture the block according to the block characteristics and current situation of the facility. It is one of the most heavily loaded midterm planning tasks and is carried out manually by experienced workers. In this study, a method of representing the block assignment rules using a diagram was suggested through analysis of the existing manual process. A block allocation program was developed which automated the block assignment process according to the rules represented by the diagram. The planning scenario was validated through a case study that compared the manual assignment and two automated block assignment results.

  16. A hybrid system identification methodology for wireless structural health monitoring systems based on dynamic substructuring

    NASA Astrophysics Data System (ADS)

    Dragos, Kosmas; Smarsly, Kay

    2016-04-01

    System identification has been employed in numerous structural health monitoring (SHM) applications. Traditional system identification methods usually rely on centralized processing of structural response data to extract information on structural parameters. However, in wireless SHM systems the centralized processing of structural response data introduces a significant communication bottleneck. Exploiting the merits of decentralization and on-board processing power of wireless SHM systems, many system identification methods have been successfully implemented in wireless sensor networks. While several system identification approaches for wireless SHM systems have been proposed, little attention has been paid to obtaining information on the physical parameters (e.g. stiffness, damping) of the monitored structure. This paper presents a hybrid system identification methodology suitable for wireless sensor networks based on the principles of component mode synthesis (dynamic substructuring). A numerical model of the monitored structure is embedded into the wireless sensor nodes in a distributed manner, i.e. the entire model is segmented into sub-models, each embedded into one sensor node corresponding to the substructure the sensor node is assigned to. The parameters of each sub-model are estimated by extracting local mode shapes and by applying the equations of the Craig-Bampton method of dynamic substructuring. The proposed methodology is validated in a laboratory test conducted on a four-story frame structure to demonstrate the ability of the methodology to yield accurate estimates of stiffness parameters. Finally, the test results are discussed and an outlook on future research directions is provided.

  17. Efficacy of an Attachment-Based Intervention Model on Health Indices in Children with Chronic Disease and Their Mothers.

    PubMed

    Dehghani-Arani, Fateme; Besharat, Mohammad Ali; Fitton, Victoria A; Aghamohammadi, Asghar

    2018-05-07

    Studies have shown a significant relationship between health conditions and attachment. This study aimed to examine the effect of an attachment-based intervention model named the mother-child-disease triangle (MCDT) on health indices in children with chronic disease and their mothers. This randomized trial included 22 volunteer children aged 12-18 years undergoing medical treatment for a chronic disease and their mothers. After evaluation by the 28-item General Health Questionnaire (GHQ-28), the Inventory of Parent and Peer Attachment (IPPA), the 28-item Child Health Questionnaire (CHQ-28) and the Illness Perception Questionnaire (IPQ), the mother-child dyads were paired on the basis of IPPA scores. These pairs were then randomly assigned to an experimental or control group. The experimental group received ten 90-min sessions of MCDT over a 7-week period. Meanwhile, the control group received ten simple conversational sessions as a dummy intervention. In accordance with this study's pre-test/post-test design, both groups were evaluated once again after completing their respective treatment. Multivariate analysis of covariance (MANCOVA) showed members of the experimental group to have significantly stronger attachment and better physiological and psychosocial health than those in the control group. These findings suggest that attachment-based interventions can be used to improve the effectiveness of treatment among children with chronic disease and their mothers.

  18. Factors affecting measured, modeled and reconstructed estimates of personal exposure to ambient ozone in southern California

    NASA Astrophysics Data System (ADS)

    Gonzales, Melissa

    To evaluate the factors that influence the assignment of ozone (O3) exposures in an epidemiologic context, a field study was conducted in the South Coast Air Basin (SoCAB) during the summer of 19% in which time, location, and activity (TLA) information and direct measurements of personal O3 exposure were concurrently collected on a group of college students. Current and past O3 exposures were modeled and evaluated as a function of ambient O3, activity and mobility patterns, indoor ventilation, and recalled TLA information collected one year later. The effect of these factors on the within- and between-subject exposure variability assigned by ecologic (EC) and microenvironment (MEV) models was examined by two-hour intervals, on weekends and weekdays, and by monitoring week, compared to personal exposures measured with a passive sampling device (PSD). The students reported spending 85% of their time inside, 7% outside and 8% in-transit. More time was spent outdoors on weekends than on weekdays. Ambient O3 levels were also higher on weekends. In the study area, where a dense O3 monitoring network and the appropriate topography exist, fixed-site monitoring accurately assigned ambient O3 levels within a 10-mile radius. The variation in the ecologic exposure assignments was low compared to the estimated variation among PSD-measured and MEV-modeled estimates due to the low spatial variation of ambient O3 levels across the SoCAB areas visited by the students. MEV and PSD exposure estimates better captured the variability of personal exposure in any given ambient spatial regimen compared to ecologic exposure assignments. MEV exposure estimates based on recalled TLA patterns were similar to the MEV estimates based on diary-recorded TLA patterns. For this study population, PSD-measured O3 exposures were estimated to average 32% lower than "true" exposure levels due to indoor/outdoor differences in the PSD collection rate.
The level of detail obtained from the TLA diary is not necessary for the assignment of current or past O3 exposures in epidemiologic studies. It may be more advantageous to characterize the locations visited, and indoor and outdoor time, with the greatest accuracy possible and to use these data to estimate exposure from nearest-monitor ambient O3 measurements and sets of indoor/outdoor O3 ratios validated to reflect personal exposure within indoor microenvironments.
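The microenvironment (MEV) approach evaluated here amounts to a time-weighted sum of ambient concentration scaled by per-microenvironment concentration ratios. A minimal sketch, with the time fractions taken from the abstract (85% indoors, 7% outdoors, 8% in transit) and the ratio values invented for illustration:

```python
def mev_exposure(ambient_o3, time_fractions, io_ratios):
    # Time-weighted microenvironment estimate:
    #   E = C_ambient * sum_m f_m * r_m
    # where f_m is the fraction of time spent in microenvironment m and
    # r_m is its (microenvironment / ambient) O3 ratio.
    assert abs(sum(time_fractions.values()) - 1.0) < 1e-9, "fractions must sum to 1"
    return ambient_o3 * sum(f * io_ratios[m] for m, f in time_fractions.items())

# Time fractions from the abstract; the ratios below are hypothetical.
fractions = {"indoor": 0.85, "outdoor": 0.07, "in_transit": 0.08}
ratios = {"indoor": 0.3, "outdoor": 1.0, "in_transit": 0.5}
exposure = mev_exposure(60.0, fractions, ratios)  # assumed ambient O3 of 60 ppb
```

With indoor ratios well below 1, most of a subject's day contributes little exposure, which is why the ecologic (ambient-only) assignment overstates personal exposure relative to the MEV and PSD estimates.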

  19. Preliminary Testing of a Program to Prevent Type 2 Diabetes among High-Risk Youth.(research Papers)

    ERIC Educational Resources Information Center

    Grey, Margaret; Berry, Diane; Davidson, Maryanne; Galasso, Pam; Gustafson, Elaine; Melkus, Gail

    2004-01-01

    Type 2 diabetes is increasing among youth, with minority youth at highest risk. This preliminary study tested the feasibility of a school-based program to prevent type 2 diabetes in youth at risk. Forty-one participants (age 12.6 [+ or -] 1.1 years; 63% female, 51% African American, 44% Hispanic, and 5% Caucasian) were randomly assigned to one of…

  20. Examination of the Diurnal Assumptions of the Test of Variables of Attention for Elementary Students

    ERIC Educational Resources Information Center

    Hurford, David P.; Lasater, Kara A.; Erickson, Sara E.; Kiesling, Nicole E.

    2013-01-01

    Objective: To examine the diurnal assumptions of the Test of Variables of Attention (TOVA). Method: The present study assessed 122 elementary students aged 5.5 to 10.0 years who were randomly assigned to one of four different groups based on time of administration (M-M: Morning-Morning, M-A: Morning-Afternoon, A-M: Afternoon-Morning, and A-A:…

  1. Competency-Based Education, Put to the Test: An Inside Look at Learning and Assessment at Western Governors University

    ERIC Educational Resources Information Center

    Marcus, Jon

    2017-01-01

    Unlike conventional colleges and universities, Western Governors doesn't require students to spend a set number of hours in a classroom, average out their performance on assignments and tests, then hand out letter grades and credits. Using a complex system of assessments developed over the two decades the university has been operating, WGU's…

  2. Teaching to the Test…or Testing to Teach: Exams Requiring Higher Order Thinking Skills Encourage Greater Conceptual Understanding

    ERIC Educational Resources Information Center

    Jensen, Jamie L.; McDaniel, Mark A.; Woodard, Steven M.; Kummer, Tyler A.

    2014-01-01

    In order to test the effect of exam-question level on fostering student conceptual understanding, low-level and high-level quizzes and exams were administered in two sections of an introductory biology course. Each section was taught in a high-level inquiry based style but was assigned either low-level questions (memory oriented) on the quizzes…

  3. A Monte Carlo approach to the inverse problem of diffuse pollution risk in agricultural catchments

    NASA Astrophysics Data System (ADS)

    Milledge, D.; Lane, S. N.; Heathwaite, A. L.; Reaney, S.

    2012-04-01

    The hydrological and biogeochemical processes that operate in catchments influence the ecological quality of freshwater systems through delivery of fine sediment, nutrients and organic matter. As an alternative to the often complex reductionist models, we outline a data-driven approach based on 'inverse modelling'. We invert SCIMAP, a parsimonious risk based model that has an explicit treatment of hydrological connectivity, and use a Bayesian approach to determine the risk that must be assigned to different land uses in a catchment in order to explain the spatial patterns of measured in-stream nutrient concentrations. First, we apply the model to a set of eleven UK catchments to show that: 1) some land use generates a consistently high or low risk of diffuse nitrate (N) and phosphate (P) pollution; but 2) the risks associated with different land uses vary both between catchments and between P and N delivery; and 3) that the dominant sources of P and N risk in the catchment are often a function of the spatial configuration of land uses. These results suggest that on a case-by-case basis, inverse modelling may be used to help prioritise the focus of interventions to reduce diffuse pollution risk for freshwater ecosystems. However, a key uncertainty in this approach is the extent to which it can recover the 'true' risks associated with a land cover given error in both the input parameters and equifinality in model outcomes. We test this using a set of synthetic scenarios in which the true risks can be pre-assigned then compared with those recovered from the inverse model. 
We use these scenarios to identify the number of simulations and observations required to optimize recovery of the true weights, then explore the conditions under which the inverse model becomes equifinal (hampering recovery of the true weights). We find that this is strongly dependent on the covariance in land covers between subcatchments, introducing the possibility that in-stream sampling could be designed or subsampled to maximize identifiability of the risks associated with a given land cover.
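The inversion described above can be caricatured with rejection sampling: a deliberately simplified stand-in for SCIMAP's actual Bayesian machinery, with all numbers synthetic. Candidate per-land-use risk weights are drawn uniformly, and only those whose predicted in-stream signal matches the observations are kept; the surviving weights approximate the recoverable risks:

```python
import random

def invert_risk_weights(land_fracs, observed, n_samples=20000, tol=0.05, seed=1):
    """Rejection-sampling caricature of the inverse problem: draw candidate
    per-land-use risk weights uniformly from [0, 1] and keep those whose
    predicted signal matches every catchment observation within tol."""
    rng = random.Random(seed)
    uses = sorted({u for fr in land_fracs for u in fr})
    kept = []
    for _ in range(n_samples):
        w = {u: rng.random() for u in uses}
        # predicted in-stream signal: land-cover fractions weighted by risk
        pred = [sum(fr[u] * w[u] for u in fr) for fr in land_fracs]
        if all(abs(p - o) <= tol for p, o in zip(pred, observed)):
            kept.append(w)
    return kept

# Two synthetic catchments; observations generated from "true" weights
# arable = 0.8, grass = 0.2.
fracs = [{"arable": 0.7, "grass": 0.3}, {"arable": 0.2, "grass": 0.8}]
observed = [0.7 * 0.8 + 0.3 * 0.2, 0.2 * 0.8 + 0.8 * 0.2]  # 0.62, 0.32
accepted = invert_risk_weights(fracs, observed)
```

When the two catchments have very similar land-cover mixes, the two constraints become nearly collinear and the accepted set spreads out along a line of equally good weight combinations, which is the equifinality effect the paper investigates.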

  4. Classifying short genomic fragments from novel lineages using composition and homology

    PubMed Central

    2011-01-01

    Background The assignment of taxonomic attributions to DNA fragments recovered directly from the environment is a vital step in metagenomic data analysis. Assignments can be made using rank-specific classifiers, which assign reads to taxonomic labels from a predetermined level such as named species or strain, or rank-flexible classifiers, which choose an appropriate taxonomic rank for each sequence in a data set. The choice of rank typically depends on the optimal model for a given sequence and on the breadth of taxonomic groups seen in a set of close-to-optimal models. Homology-based (e.g., LCA) and composition-based (e.g., PhyloPythia, TACOA) rank-flexible classifiers have been proposed, but there is at present no hybrid approach that utilizes both homology and composition. Results We first develop a hybrid, rank-specific classifier based on BLAST and Naïve Bayes (NB) that has accuracy comparable to, and a faster running time than, the current best approach, PhymmBL. By substituting LCA for BLAST or allowing the inclusion of suboptimal NB models, we obtain a rank-flexible classifier. This hybrid classifier outperforms established rank-flexible approaches on simulated metagenomic fragments of length 200 bp to 1000 bp and is able to assign taxonomic attributions to a subset of sequences with few misclassifications. We then demonstrate the performance of different classifiers on an enhanced biological phosphorus removal metagenome, illustrating the advantages of rank-flexible classifiers when representative genomes are absent from the set of reference genomes. Application to a glacier ice metagenome demonstrates that similar taxonomic profiles are obtained across a set of classifiers which are increasingly conservative in their classification. Conclusions Our NB-based classification scheme is faster than the current best composition-based algorithm, Phymm, while providing equally accurate predictions. 
The rank-flexible variant of NB, which we term ε-NB, is complementary to LCA and can be combined with it to yield conservative prediction sets of very high confidence. The simple parameterization of LCA and ε-NB allows for tuning of the balance between more predictions and increased precision, allowing the user to account for the sensitivity of downstream analyses to misclassified or unclassified sequences. PMID:21827705
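A toy version of the composition-based NB classifier and its rank-flexible ε-NB behavior, simplified from the paper's description: the k-mer size, smoothing, sequences, and labels below are all illustrative. Each taxon gets Laplace-smoothed k-mer log-probabilities; ε-NB then returns every taxon whose log-likelihood is within ε of the best, instead of committing to a single label:

```python
import math
from collections import Counter

def kmers(seq, k=3):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def train_nb(training, k=3, alpha=1.0):
    # training: {taxon_label: [sequences]}; Laplace-smoothed k-mer log-probs
    counts, vocab = {}, set()
    for label, seqs in training.items():
        c = Counter()
        for s in seqs:
            c.update(kmers(s, k))
        counts[label] = c
        vocab |= set(c)
    models = {}
    for label, c in counts.items():
        total = sum(c.values()) + alpha * len(vocab)
        models[label] = {w: math.log((c[w] + alpha) / total) for w in vocab}
        models[label]["_unk"] = math.log(alpha / total)  # unseen k-mers
    return models

def epsilon_nb(read, models, k=3, eps=5.0):
    # rank-flexible step: keep every label whose log-likelihood is
    # within eps of the best-scoring label
    scores = {label: sum(lp.get(w, lp["_unk"]) for w in kmers(read, k))
              for label, lp in models.items()}
    best = max(scores.values())
    return sorted(lab for lab, s in scores.items() if best - s <= eps)

# Two invented 'genomes' with deliberately distinct composition.
models = train_nb({"at_rich": ["ATATATATATATAT"], "gc_rich": ["GCGCGCGCGCGCGC"]})
```

A small ε yields a confident single-label call; widening ε returns the larger candidate set, mirroring the tunable precision-versus-coverage balance the authors describe.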

  5. Knowledge-Based Scheduling of Arrival Aircraft in the Terminal Area

    NASA Technical Reports Server (NTRS)

    Krzeczowski, K. J.; Davis, T.; Erzberger, H.; Lev-Ram, Israel; Bergh, Christopher P.

    1995-01-01

    A knowledge-based method for scheduling arrival aircraft in the terminal area has been implemented and tested in real-time simulation. The scheduling system automatically sequences, assigns landing times, and assigns runways to arrival aircraft by utilizing continuous updates of aircraft radar data and controller inputs. The scheduling algorithm is driven by a knowledge base which was obtained in over two thousand hours of controller-in-the-loop real-time simulation. The knowledge base contains a series of hierarchical 'rules' and decision logic that examines both performance criteria, such as delay reductions, as well as workload reduction criteria, such as conflict avoidance. The objective of the algorithm is to devise an efficient plan to land the aircraft in a manner acceptable to the air traffic controllers. This paper describes the scheduling algorithms, gives examples of their use, and presents data regarding their potential benefits to the air traffic system.
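The knowledge base itself is not reproduced in the abstract. As a minimal stand-in showing the flavor of rule-driven arrival scheduling, the sketch below applies one greedy rule: sequence by estimated arrival time and give each aircraft the runway that lets it land earliest subject to a separation constraint. The aircraft identifiers, runway names, and separation value are invented:

```python
def schedule_arrivals(aircraft, runways, separation=90):
    """Greedy rule: sequence by estimated arrival time (eta, seconds), then
    assign each aircraft the runway where it can land earliest while staying
    `separation` seconds behind the previous arrival on that runway."""
    next_free = {r: 0 for r in runways}
    plan = []
    for ac in sorted(aircraft, key=lambda a: a["eta"]):
        r = min(runways, key=lambda rw: max(next_free[rw], ac["eta"]))
        t = max(next_free[r], ac["eta"])
        next_free[r] = t + separation
        plan.append((ac["id"], r, t))
    return plan

# Hypothetical traffic: three arrivals, two runways.
plan = schedule_arrivals(
    [{"id": "B", "eta": 10}, {"id": "A", "eta": 0}, {"id": "C", "eta": 20}],
    ["27L", "27R"],
)
```

The actual system layers many such rules hierarchically and weighs controller-workload criteria such as conflict avoidance alongside delay; this sketch captures only the sequencing-and-runway-assignment core.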

  6. Knowledge-based scheduling of arrival aircraft

    NASA Technical Reports Server (NTRS)

    Krzeczowski, K.; Davis, T.; Erzberger, H.; Lev-Ram, I.; Bergh, C.

    1995-01-01

    A knowledge-based method for scheduling arrival aircraft in the terminal area has been implemented and tested in real-time simulation. The scheduling system automatically sequences, assigns landing times, and assigns runways to arrival aircraft by utilizing continuous updates of aircraft radar data and controller inputs. The scheduling algorithm is driven by a knowledge base which was obtained in over two thousand hours of controller-in-the-loop real-time simulation. The knowledge base contains a series of hierarchical 'rules' and decision logic that examines both performance criteria, such as delay reduction, as well as workload reduction criteria, such as conflict avoidance. The objective of the algorithm is to devise an efficient plan to land the aircraft in a manner acceptable to the air traffic controllers. This paper will describe the scheduling algorithms, give examples of their use, and present data regarding their potential benefits to the air traffic system.

  7. Thinking outside ISD: A management model for instructional design

    NASA Astrophysics Data System (ADS)

    Taylor, Tony Dewayne

    The purpose of this study was to examine the effectiveness of an instructional system management-level model, proposed by the author, designed to orchestrate the efficient development and implementation of customer-requested curriculum. The three phases of the systems-based model, designed to ensure delivery of high-quality and timely instruction, are: (1) the assessment and documentation of organizational training requirements; (2) project management control of curriculum development; and (3) the implementation of relevant instruction by competent instructors. The model also provides (4) measurable and quantifiable course evaluation results to justify return on investment and validate its importance with respect to the customer's organizational strategic objectives. The theoretical approach for this study was systems theory-based due to the nature of the instructional systems design model and the systematic design of the management model. The study was accomplished using a single-case study application of the qualitative style of inquiry as described by Patton (2002). Qualitative inquiry was selected to collect and analyze participants' holistic assessments of the effectiveness, relevance, and timeliness of the instructional design management model. Participants for this study included five managers, five subject matter experts, and six students assigned to a military organization responsible for the collection of hydrographic data for the U.S. Navy. Triangulation of data sources within the qualitative framework of the study drew on the three participant groups---managers, SMEs, and students---to incorporate multiple views of the course development and implementation, validate the findings, and remove researcher bias. Qualitative coding was accomplished by importing transcribed interviews into Microsoft Excel and sorting them using Auto-Filter. 
The coded interviews indicated effective functionality in the views of the model from each of the three participant groups. Results from a pre-test/post-test comparative analysis indicated a significant difference between the pre-test and post-test mean at the p < .001 for the six students. Although the subject of the case study was within a military training environment, the application of the proposed instructional systems managerial model can be applied to the design, development, delivery, and assessment of instructional material in any line of study where quantifiable effective learning is the goal.

  8. TAILORx Trial Shows Some Women with Breast Cancer May Forgo Chemotherapy

    Cancer.gov

    A summary of results from the Trial Assigning Individualized Options for Treatment, or TAILORx, finds that women with early-stage hormone receptor-positive breast cancer have a low risk of recurrence based on a test for the expression of 21 genes.

  9. Geographic origin and individual assignment of Shorea platyclados (Dipterocarpaceae) for forensic identification

    PubMed Central

    Diway, Bibian; Khoo, Eyen

    2017-01-01

The development of timber tracking methods based on genetic markers can provide scientific evidence to verify the origin of timber products and fulfill the growing requirement for sustainable forestry practices. In this study, the origin of an important Dark Red Meranti wood, Shorea platyclados, was studied using a combination of seven chloroplast DNA markers and 15 short tandem repeat (STR) markers. A total of 27 natural populations of S. platyclados were sampled throughout Malaysia to establish population-level and individual-level identification databases. A haplotype map was generated from chloroplast DNA sequencing for population identification, resulting in 29 multilocus haplotypes based on 39 informative intraspecific variable sites. Subsequently, a DNA profiling database was developed from the 15 STRs, allowing for individual identification in Malaysia. Cluster analysis divided the 27 populations into two genetic clusters, corresponding to the regions of Eastern and Western Malaysia. The conservativeness tests showed that the Malaysia database is conservative after removal of bias from population subdivision and sampling effects. Independent self-assignment tests correctly assigned individuals to the database in 60.60-94.95% of cases overall for identified populations, and in 98.99-99.23% of cases for identified regions. Both the chloroplast DNA database and the STRs appear to be useful for tracking timber originating in Malaysia. Hence, this DNA-based method could serve as an effective additional tool in the existing forensic timber identification system for ensuring the sustainable management of this species into the future. PMID:28430826
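As a rough illustration of the self-assignment test described above, the sketch below (Python; the population names, loci, and allele data are invented for illustration, and the function names are not from any published tool) assigns a multilocus STR genotype to whichever reference population gives it the highest Hardy-Weinberg likelihood:

```python
import math
from collections import defaultdict

def allele_freqs(genotypes):
    """Per-locus allele frequencies from a list of multilocus genotypes.
    Each genotype is a tuple of (allele_a, allele_b) pairs, one per locus."""
    n_loci = len(genotypes[0])
    freqs = [defaultdict(float) for _ in range(n_loci)]
    for g in genotypes:
        for locus, (a, b) in enumerate(g):
            freqs[locus][a] += 1
            freqs[locus][b] += 1
    for locus in range(n_loci):
        total = sum(freqs[locus].values())
        for allele in freqs[locus]:
            freqs[locus][allele] /= total
    return freqs

def log_likelihood(genotype, freqs, floor=1e-4):
    """Log-likelihood of a multilocus genotype under Hardy-Weinberg;
    a small floor frequency avoids log(0) for unseen alleles."""
    ll = 0.0
    for locus, (a, b) in enumerate(genotype):
        pa = max(freqs[locus].get(a, 0.0), floor)
        pb = max(freqs[locus].get(b, 0.0), floor)
        ll += math.log(2 * pa * pb) if a != b else math.log(pa * pa)
    return ll

def assign(genotype, ref_freqs):
    """Assign an individual to the most likely reference population."""
    return max(ref_freqs, key=lambda pop: log_likelihood(genotype, ref_freqs[pop]))

# Toy reference database: two populations, two STR loci
east = [((150, 152), (200, 200)), ((150, 150), (200, 202))]
west = [((160, 162), (210, 210)), ((160, 160), (210, 212))]
refs = {"East": allele_freqs(east), "West": allele_freqs(west)}
print(assign(((150, 152), (200, 202)), refs))  # -> East
```

In a real self-assignment test, each reference individual would be removed from the database before being assigned, to avoid circularity.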

  10. [Rapid assessment of critical quality attributes of Chinese materia medica (II): strategy of NIR assignment].

    PubMed

    Pei, Yan-Ling; Wu, Zhi-Sheng; Shi, Xin-Yuan; Zhou, Lu-Wei; Qiao, Yan-Jiang

    2014-09-01

The present paper first reviewed the research progress and main methods of NIR spectral assignment, together with our own research results. Principal component analysis focused on characteristic signal extraction to reflect spectral differences. The partial least squares method was concerned with variable selection to discover characteristic absorption bands. Two-dimensional correlation spectroscopy was mainly adopted for spectral assignment: autocorrelation peaks were obtained from spectral changes induced by external factors such as concentration, temperature and pressure. Density functional theory was used to calculate molecular energies from substance structure, establishing the relationship between molecular energy and spectral change. Based on the methods reviewed above, and taking the NIR spectral assignment of chlorogenic acid as an example, a reliable spectral assignment for critical quality attributes of Chinese materia medica (CMM) was established using deuterium technology and spectral variable selection. The result demonstrated the consistency of the assignment according to the spectral features of different concentrations of chlorogenic acid and the variable selection region of the online NIR model in the extraction process. Although the spectral assignment was initially demonstrated on a single active pharmaceutical ingredient, it offers a promising route to the complex components of CMM. It therefore provides a methodology for NIR spectral assignment of critical quality attributes in CMM.

  11. Knowledge-based design of generate-and-patch problem solvers that solve global resource assignment problems

    NASA Technical Reports Server (NTRS)

    Voigt, Kerstin

    1992-01-01

We present MENDER, a knowledge-based system that implements software design techniques specialized to automatically compile generate-and-patch problem solvers for global resource assignment problems. We provide empirical evidence of the superior performance of generate-and-patch over generate-and-test, even with constrained generation, for a global constraint in the domain of '2D-floorplanning'. For a second constraint in '2D-floorplanning' we show that, even when it is possible to incorporate the constraint into a constrained generator, a generate-and-patch problem solver may satisfy the constraint more rapidly. We also briefly summarize how an extended version of our system applies to a constraint in the domain of 'multiprocessor scheduling'.

  12. Comparative Protein Structure Modeling Using MODELLER

    PubMed Central

    Webb, Benjamin; Sali, Andrej

    2016-01-01

    Comparative protein structure modeling predicts the three-dimensional structure of a given protein sequence (target) based primarily on its alignment to one or more proteins of known structure (templates). The prediction process consists of fold assignment, target-template alignment, model building, and model evaluation. This unit describes how to calculate comparative models using the program MODELLER and how to use the ModBase database of such models, and discusses all four steps of comparative modeling, frequently observed errors, and some applications. Modeling lactate dehydrogenase from Trichomonas vaginalis (TvLDH) is described as an example. The download and installation of the MODELLER software is also described. PMID:27322406

  13. Investigating the Washback Effects of Task-Based Instruction on the Iranian EFL Learners' Vocabulary Learning

    ERIC Educational Resources Information Center

    Hamzeh, Alireza

    2016-01-01

    The current research was an attempt to explore the washback impact of task-based instruction (TBI) on EFL Iranian learners' vocabulary development. To this end, conducting an Oxford Placement Test (OPT), 30 out of 72 EFL Iranian learners studying in an English language institute, were randomly selected. Then, they were assigned to experimental (N…

  14. A Comparison of Team-Based Learning Formats: Can We Minimize Stress While Maximizing Results?

    ERIC Educational Resources Information Center

    Miller, Cynthia J.; Falcone, Jeff C.; Metz, Michael J.

    2015-01-01

    Team-Based Learning (TBL) is a collaborative teaching method in which students utilize course content to solve challenging problems. A modified version of TBL is used at the University of Louisville School of Medicine. Students complete questions on the Individual Readiness Assurance Test (iRAT) then gather in pre-assigned groups to retake the…

  15. The Effects of Judgment-Based Stratum Classifications on the Efficiency of Stratum Scored CATs.

    ERIC Educational Resources Information Center

    Finney, Sara J.; Smith, Russell W.; Wise, Steven L.

    Two operational item pools were used to investigate the performance of stratum computerized adaptive tests (CATs) when items were assigned to strata based on empirical estimates of item difficulty or human judgments of item difficulty. Items from the first data set consisted of 54 5-option multiple choice items from a form of the ACT mathematics…

  16. Student peer assessment in evidence-based medicine (EBM) searching skills training: an experiment

    PubMed Central

    Eldredge, Jonathan D.; Bear, David G.; Wayne, Sharon J.; Perea, Paul P.

    2013-01-01

    Background: Student peer assessment (SPA) has been used intermittently in medical education for more than four decades, particularly in connection with skills training. SPA generally has not been rigorously tested, so medical educators have limited evidence about SPA effectiveness. Methods: Experimental design: Seventy-one first-year medical students were stratified by previous test scores into problem-based learning tutorial groups, and then these assigned groups were randomized further into intervention and control groups. All students received evidence-based medicine (EBM) training. Only the intervention group members received SPA training, practice with assessment rubrics, and then application of anonymous SPA to assignments submitted by other members of the intervention group. Results: Students in the intervention group had higher mean scores on the formative test with a potential maximum score of 49 points than did students in the control group, 45.7 and 43.5, respectively (P = 0.06). Conclusions: SPA training and the application of these skills by the intervention group resulted in higher scores on formative tests compared to those in the control group, a difference approaching statistical significance. The extra effort expended by librarians, other personnel, and medical students must be factored into the decision to use SPA in any specific educational context. Implications: SPA has not been rigorously tested, particularly in medical education. Future, similarly rigorous studies could further validate use of SPA so that librarians can optimally make use of limited contact time for information skills training in medical school curricula. PMID:24163593

  17. The role of country-to-region assignments in global integrated modeling of energy, agriculture, land use, and climate

    NASA Astrophysics Data System (ADS)

    Kyle, P.; Patel, P.; Calvin, K. V.

    2014-12-01

    Global integrated assessment models used for understanding the linkages between the future energy, agriculture, and climate systems typically represent between 8 and 30 geopolitical macro-regions, balancing the benefits of geographic resolution with the costs of additional data collection, processing, analysis, and computing resources. As these models are continually being improved and updated in order to address new questions for the research and policy communities, it is worth examining the consequences of the country-to-region mapping schemes used for model results. This study presents an application of a data processing system built for the GCAM integrated assessment model that allows any country-to-region assignments, with a minimum of four geopolitical regions and a maximum of 185. We test ten different mapping schemes, including the specific mappings used in existing major integrated assessment models. We also explore the impacts of clustering nations into regions according to the similarity of the structure of each nation's energy and agricultural sectors, as indicated by multivariate analysis. Scenarios examined include a reference scenario, a low-emissions scenario, and scenarios with agricultural and buildings sector climate change impacts. We find that at the global level, the major output variables (primary energy, agricultural land use) are surprisingly similar regardless of regional assignments, but at finer geographic scales, differences are pronounced. We suggest that enhancing geographic resolution is advantageous for analysis of climate impacts on the buildings and agricultural sectors, due to the spatial heterogeneity of these drivers.
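The finding above, that global totals are largely insensitive to the country-to-region mapping while regional detail is not, can be illustrated with a toy aggregation (Python; the country values and scheme names are invented, not GCAM data):

```python
# Toy country-level primary energy demand (EJ); the values are invented
demand = {"USA": 90, "Canada": 12, "Mexico": 8, "Germany": 13, "France": 10}

# Two alternative country-to-region mapping schemes
scheme_a = {"USA": "North America", "Canada": "North America",
            "Mexico": "North America", "Germany": "Europe", "France": "Europe"}
scheme_b = {"USA": "USA", "Canada": "Other Americas",
            "Mexico": "Other Americas", "Germany": "EU", "France": "EU"}

def aggregate(values, mapping):
    """Sum country-level values into the regions defined by a mapping scheme."""
    regions = {}
    for country, value in values.items():
        regions[mapping[country]] = regions.get(mapping[country], 0) + value
    return regions

a = aggregate(demand, scheme_a)
b = aggregate(demand, scheme_b)
# The global total is invariant to the mapping; the regional detail is not.
print(sum(a.values()) == sum(b.values()))  # -> True
print(a["North America"], b["Other Americas"])  # -> 110 20
```

In a full integrated assessment model the invariance is only approximate, since regional aggregation also changes behavioral and price responses, which is why the paper reports pronounced differences at finer geographic scales.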

  18. An Efficacious Theory-Based Intervention for Stepfamilies

    ERIC Educational Resources Information Center

    Forgatch, Marion S.; DeGarmo, David S.; Beldavs, Zintars G.

    2005-01-01

    This article evaluates the efficacy of the Oregon model of Parent Management Training (PMTO) in the stepfamily context. Sixty-seven of 110 participants in the Marriage and Parenting in Stepfamilies (MAPS) program received a PMTO-based intervention. Participants in the randomly assigned experimental group displayed a large effect in benefits to…

  19. Effectiveness of Gross Model-Based Emotion Regulation Strategies Training on Anger Reduction in Drug-Dependent Individuals and its Sustainability in Follow-up

    PubMed Central

    Massah, Omid; Sohrabi, Faramarz; A’azami, Yousef; Doostian, Younes; Farhoudian, Ali; Daneshmand, Reza

    2016-01-01

Background Emotion plays an important role in adapting to life changes and stressful events. Difficulty regulating emotions is one of the problems drug abusers often face, and teaching these individuals to express and manage their emotions can be effective in improving their difficult circumstances. Objectives The present study aimed to determine the effectiveness of Gross model-based emotion regulation strategies training on anger reduction in drug-dependent individuals. Patients and Methods The present study had a quasi-experimental design wherein pretest-posttest evaluations were applied using a control group. The population under study included addicts attending Marivan's methadone maintenance therapy centers in 2012 - 2013. Convenience sampling was used to select 30 substance-dependent individuals undergoing maintenance treatment, who were then randomly assigned to the experimental and control groups. The experimental group received its training in eight two-hour sessions. Data were analyzed using analysis of covariance and paired t-tests. Results There was a significant reduction in the anger symptoms of drug-dependent individuals after Gross model-based emotion regulation training (ERT) (P < 0.001). Moreover, the effect of the training on anger persisted through the follow-up period. Conclusions Symptoms of anger in the drug-dependent individuals of this study were reduced by Gross model-based emotion regulation strategies training. Based on these results, we may conclude that Gross model-based emotion regulation strategies training can be applied alongside other therapies to treat drug abusers undergoing rehabilitation. PMID:27162759

  20. Anxiety and Health-Related Quality of Life Among Patients With Low–Tumor Burden Non-Hodgkin Lymphoma Randomly Assigned to Two Different Rituximab Dosing Regimens: Results From ECOG Trial E4402 (RESORT)

    PubMed Central

    Wagner, Lynne I.; Zhao, Fengmin; Hong, Fangxin; Williams, Michael E.; Gascoyne, Randy D.; Krauss, John C.; Advani, Ranjana H.; Go, Ronald S.; Habermann, Thomas M.; Leach, Joseph W.; O'Connor, Brian; Schuster, Stephen J.; Cella, David; Horning, Sandra J.; Kahl, Brad S.

    2015-01-01

    Purpose The purpose of this study was to compare illness-related anxiety among participants in the Rituximab Extended Schedule or Retreatment Trial (RESORT) randomly assigned to maintenance rituximab (MR) versus rituximab re-treatment (RR). A secondary objective was to examine whether the superiority of MR versus RR on anxiety depended on illness-related coping style. Patients and Methods Patients (N = 253) completed patient-reported outcome (PRO) measures at random assignment to MR or RR (baseline); at 3, 6, 12, 24, 36, and 48 months after random assignment; and at rituximab failure. PRO measures assessed illness-related anxiety and coping style, and secondary end points including general anxiety, worry and interference with emotional well-being, depression, and health-related quality of life (HRQoL). Patients were classified as using an active or avoidant illness-related coping style. Independent sample t tests and linear mixed-effects models were used to identify treatment arm differences on PRO end points and differences based on coping style. Results Illness-related anxiety was comparable between treatment arms at all time points (P > .05), regardless of coping style (active or avoidant). Illness-related anxiety and general anxiety significantly decreased over time on both arms. HRQoL scores were relatively stable and did not change significantly from baseline for both arms. An avoidant coping style was associated with significantly higher anxiety (18% and 13% exceeded clinical cutoff points at baseline and 6 months, respectively) and poorer HRQoL compared with an active coping style (P < .001), regardless of treatment arm assignment. Conclusion Surveillance until RR at progression was not associated with increased anxiety compared with MR, regardless of coping style. Avoidant coping was associated with higher anxiety and poorer HRQoL. PMID:25605841

  1. Combining in-school and community-based media efforts: reducing marijuana and alcohol uptake among younger adolescents.

    PubMed

    Slater, Michael D; Kelly, Kathleen J; Edwards, Ruth W; Thurman, Pamela J; Plested, Barbara A; Keefe, Thomas J; Lawrence, Frank R; Henry, Kimberly L

    2006-02-01

    This study tests the impact of an in-school mediated communication campaign based on social marketing principles, in combination with a participatory, community-based media effort, on marijuana, alcohol and tobacco uptake among middle-school students. Eight media treatment and eight control communities throughout the US were randomly assigned to condition. Within both media treatment and media control communities, one school received a research-based prevention curriculum and one school did not, resulting in a crossed, split-plot design. Four waves of longitudinal data were collected over 2 years in each school and were analyzed using generalized linear mixed models to account for clustering effects. Youth in intervention communities (N = 4,216) showed fewer users at final post-test for marijuana [odds ratio (OR) = 0.50, P = 0.019], alcohol (OR = 0.40, P = 0.009) and cigarettes (OR = 0.49, P = 0.039), one-tailed. Growth trajectory results were significant for marijuana (P = 0.040), marginal for alcohol (P = 0.051) and non-significant for cigarettes (P = 0.114). Results suggest that an appropriately designed in-school and community-based media effort can reduce youth substance uptake. Effectiveness does not depend on the presence of an in-school prevention curriculum.
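For readers unfamiliar with the odds ratios reported above, a minimal sketch (Python; the counts below are hypothetical, not from the study) shows how an OR such as 0.50 arises from a 2x2 table of users and non-users in intervention versus control communities:

```python
def odds_ratio(int_users, int_nonusers, ctrl_users, ctrl_nonusers):
    """Odds ratio: odds of substance uptake in intervention communities
    divided by the odds in control communities. OR < 1 favors the intervention."""
    return (int_users / int_nonusers) / (ctrl_users / ctrl_nonusers)

# Hypothetical counts: 40 of 400 youth used in intervention communities,
# 80 of 440 in control communities
print(round(odds_ratio(40, 360, 80, 360), 2))  # -> 0.5
```

The study's reported ORs were estimated from generalized linear mixed models that adjust for clustering within communities, so they are not raw-table ratios like this one.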

  2. Effects of intensive glucose lowering on brain structure and function in people with type 2 diabetes (ACCORD MIND): a randomised open-label substudy.

    PubMed

    Launer, Lenore J; Miller, Michael E; Williamson, Jeff D; Lazar, Ron M; Gerstein, Hertzel C; Murray, Anne M; Sullivan, Mark; Horowitz, Karen R; Ding, Jingzhong; Marcovina, Santica; Lovato, Laura C; Lovato, James; Margolis, Karen L; O'Connor, Patrick; Lipkin, Edward W; Hirsch, Joy; Coker, Laura; Maldjian, Joseph; Sunshine, Jeffrey L; Truwit, Charles; Davatzikos, Christos; Bryan, R Nick

    2011-11-01

    People with type 2 diabetes are at risk of cognitive impairment and brain atrophy. We aimed to compare the effects on cognitive function and brain volume of intensive versus standard glycaemic control. The Memory in Diabetes (MIND) study was done in 52 clinical sites in North America as part of Action to Control Cardiovascular Risk in Diabetes (ACCORD), a double two-by-two factorial parallel group randomised trial. Participants (aged 55-80 years) with type 2 diabetes, high glycated haemoglobin A(1c) (HbA(1c)) concentrations (>7·5%; >58 mmol/mol), and a high risk of cardiovascular events were randomly assigned to receive intensive glycaemic control targeting HbA(1c) to less than 6·0% (42 mmol/mol) or a standard strategy targeting HbA(1c) to 7·0-7·9% (53-63 mmol/mol). Randomisation was via a centralised web-based system and treatment allocation was not masked from clinic staff or participants. We assessed our cognitive primary outcome, the Digit Symbol Substitution Test (DSST) score, at baseline and at 20 and 40 months. We assessed total brain volume (TBV), our primary brain structure outcome, with MRI at baseline and 40 months in a subset of participants. We included all participants with follow-up data in our primary analyses. In February, 2008, raised mortality risk led to the end of the intensive treatment and transition of those participants to standard treatment. We tested our cognitive function hypotheses with a mixed-effects model that incorporated information from both the 20 and 40 month outcome measures. We tested our MRI hypotheses with an ANCOVA model that included intracranial volume and factors used to stratify randomisation. This study is registered with ClinicalTrials.gov, number NCT00182910. We consecutively enrolled 2977 patients (mean age 62·5 years; SD 5·8) who had been randomly assigned to treatment groups in the ACCORD study. 
Our primary cognitive analysis was of patients with a 20-month or 40-month DSST score: 1378 assigned to receive intensive treatment and 1416 assigned to receive standard treatment. Of the 614 patients with a baseline MRI, we included 230 assigned to receive intensive treatment and 273 assigned to receive standard treatment in our primary MRI analysis at 40 months. There was no significant treatment difference in mean 40-month DSST score (difference in mean 0·32, 95% CI -0·28 to 0·91; p=0·2997). The intensive-treatment group had a greater mean TBV than the standard-treatment group (4·62, 2·0 to 7·3; p=0·0007). Although significant differences in TBV favoured the intensive treatment, cognitive outcomes were not different. Combined with the non-significant effects on other ACCORD outcomes, and increased mortality in participants in the intensive treatment group, our findings do not support the use of intensive therapy to reduce the adverse effects of diabetes on the brain in patients with similar characteristics to those of our participants. US National Institute on Aging and US National Heart, Lung, and Blood Institute. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Testing the Effectiveness of Online Assignments in Theory of Finance

    ERIC Educational Resources Information Center

    Batu, Michael; Bower, Nancy; Lun, Esmond; Sadanand, Asha

    2018-01-01

    The authors investigated the effectiveness of online versus paper assignments using final examination scores in three cohorts of theory of finance. In particular, two cohorts were exposed to online assignments while another cohort was exposed to traditional assignments. The central result is that exposure to online assignments robustly leads to…

  4. Empirical Selection of Informative Microsatellite Markers within Co-ancestry Pig Populations Is Required for Improving the Individual Assignment Efficiency

    PubMed Central

    Li, Y. H.; Chu, H. P.; Jiang, Y. N.; Lin, C. Y.; Li, S. H.; Li, K. T.; Weng, G. J.; Cheng, C. C.; Lu, D. J.; Ju, Y. T.

    2014-01-01

The Lanyu is a miniature pig breed indigenous to Lanyu Island, Taiwan. It is distantly related to Asian and European pig breeds. It has been inbred to generate two breeds and crossed with Landrace and Duroc to produce two hybrids for laboratory use. Selecting sets of informative genetic markers to track the genetic qualities of laboratory animals and stud stock is an important function of genetic databases. For more than two decades, Lanyu-derived breeds of common ancestry and crossbreeds have been used to examine the effectiveness of genetic marker selection and optimal approaches for individual assignment. In this paper, these pigs and the following breeds: Berkshire, Duroc, Landrace and Yorkshire, Meishan and Taoyuan, TLRI Black Pig No. 1, and Kaohsiung Animal Propagation Station Black pig are studied to build a genetic reference database. Nineteen microsatellite markers (loci) provide information on genetic variation and differentiation among the studied breeds. High differentiation index (FST) values and Cavalli-Sforza chord distances indicate genetic differentiation among breeds, including Lanyu's inbred populations. Inbreeding values (FIS) show that Lanyu and its derived inbred breeds have significant loss of heterozygosity. Individual assignment testing of 352 animals was performed with different numbers of microsatellite markers in this study. The testing successfully assigned 99% of the animals to their correct reference populations based on 9 to 14 markers ranked by D-scores, allelic number, expected heterozygosity (HE) or FST, respectively. All misassigned individuals came from closely related Lanyu breeds. To improve individual assignment among close-lineage breeds, microsatellite markers selected from Lanyu populations with high polymorphism, heterozygosity, FST and D-scores were used. Only 6 to 8 markers ranked by HE, FST or allelic number were required to obtain 99% assignment accuracy.
This result suggests that empirical examination of assignment-error rates is required if discernible levels of co-ancestry exist. In the reference group, optimum assignment accuracy was achieved through a combination of different markers by ranking the heterozygosity, FST and allelic number of close-lineage populations. PMID:25049996
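Ranking candidate markers by expected heterozygosity (HE), one of the criteria used above, can be sketched as follows (Python; the locus names and allele data are invented for illustration):

```python
from collections import Counter

def expected_heterozygosity(alleles):
    """HE = 1 - sum(p_i^2) over allele frequencies at one locus;
    higher HE means a more informative marker."""
    counts = Counter(alleles)
    n = sum(counts.values())
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def rank_markers(locus_alleles):
    """Rank loci by HE, most informative first."""
    scored = {name: expected_heterozygosity(a) for name, a in locus_alleles.items()}
    return sorted(scored, key=scored.get, reverse=True)

# Toy data: alleles observed at three hypothetical microsatellite loci
loci = {
    "LOC_A": [150, 152, 154, 156],   # four equally common alleles -> HE = 0.75
    "LOC_B": [120, 120, 122, 122],   # two equally common alleles -> HE = 0.5
    "LOC_C": [200, 200, 200, 200],   # monomorphic locus -> HE = 0.0
}
print(rank_markers(loci))  # -> ['LOC_A', 'LOC_B', 'LOC_C']
```

The paper's point is that rankings computed from the close-lineage populations themselves, rather than from the whole reference database, gave the smallest marker panels for a fixed assignment accuracy.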

  5. Variation of Care Time Between Nursing Units in Classification-Based Nurse-to-Resident Ratios: A Multilevel Analysis

    PubMed Central

    Planer, Katarina; Hagel, Anja

    2018-01-01

    A validity test was conducted to determine how care level–based nurse-to-resident ratios compare with actual daily care times per resident in Germany. Stability across different long-term care facilities was tested. Care level–based nurse-to-resident ratios were compared with the standard minimum nurse-to-resident ratios. Levels of care are determined by classification authorities in long-term care insurance programs and are used to distribute resources. Care levels are a powerful tool for classifying authorities in long-term care insurance. We used observer-based measurement of assignable direct and indirect care time in 68 nursing units for 2028 residents across 2 working days. Organizational data were collected at the end of the quarter in which the observation was made. Data were collected from January to March, 2012. We used a null multilevel model with random intercepts and multilevel models with fixed and random slopes to analyze data at both the organization and resident levels. A total of 14% of the variance in total care time per day was explained by membership in nursing units. The impact of care levels on care time differed significantly between nursing units. Forty percent of residents at the lowest care level received less than the standard minimum registered nursing time per day. For facilities that have been significantly disadvantaged in the current staffing system, a higher minimum standard will function more effectively than a complex classification system without scientific controls. PMID:29442533

  6. Variation of Care Time Between Nursing Units in Classification-Based Nurse-to-Resident Ratios: A Multilevel Analysis.

    PubMed

    Brühl, Albert; Planer, Katarina; Hagel, Anja

    2018-01-01

    A validity test was conducted to determine how care level-based nurse-to-resident ratios compare with actual daily care times per resident in Germany. Stability across different long-term care facilities was tested. Care level-based nurse-to-resident ratios were compared with the standard minimum nurse-to-resident ratios. Levels of care are determined by classification authorities in long-term care insurance programs and are used to distribute resources. Care levels are a powerful tool for classifying authorities in long-term care insurance. We used observer-based measurement of assignable direct and indirect care time in 68 nursing units for 2028 residents across 2 working days. Organizational data were collected at the end of the quarter in which the observation was made. Data were collected from January to March, 2012. We used a null multilevel model with random intercepts and multilevel models with fixed and random slopes to analyze data at both the organization and resident levels. A total of 14% of the variance in total care time per day was explained by membership in nursing units. The impact of care levels on care time differed significantly between nursing units. Forty percent of residents at the lowest care level received less than the standard minimum registered nursing time per day. For facilities that have been significantly disadvantaged in the current staffing system, a higher minimum standard will function more effectively than a complex classification system without scientific controls.
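The "14% of the variance in total care time per day ... explained by membership in nursing units" is an intraclass correlation from the null multilevel model. A rough one-way ANOVA version of that quantity can be sketched as follows (Python; toy data, balanced groups assumed, not the study's estimator):

```python
def icc(groups):
    """One-way ANOVA intraclass correlation: the share of total variance
    attributable to between-group (here, nursing unit) differences."""
    k = len(groups)                  # number of nursing units
    n = len(groups[0])               # residents per unit (balanced design assumed)
    grand = sum(sum(g) for g in groups) / (k * n)
    means = [sum(g) / n for g in groups]
    ms_between = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    ms_within = sum((v - m) ** 2
                    for g, m in zip(groups, means) for v in g) / (k * (n - 1))
    return (ms_between - ms_within) / (ms_between + (n - 1) * ms_within)

# Toy data: daily care minutes per resident in two hypothetical units
units = [[55, 60, 65], [80, 85, 90]]
print(round(icc(units), 2))  # -> 0.92
```

The study's actual estimate came from a random-intercept model fitted to unbalanced units, but the interpretation is the same: the larger the ICC, the more care time depends on which unit a resident is in rather than on the resident's care level.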

  7. A Test of the Abstinence Violation Effect.

    ERIC Educational Resources Information Center

    Ruderman, Audrey J.

    According to the abstinence violation effect, highly controlled drinkers tend to overindulge following an initial slip. To investigate this relapse model, 47 male college students, ranging in age from 21 to 46, were assigned either to an unrestrained or a restrained drinker group according to their scores on the Restrained Drinking Scale. Subjects…

  8. Lifelong Learning, Income Inequality and Social Mobility in Singapore

    ERIC Educational Resources Information Center

    Lee, Millie; Morris, Paul

    2016-01-01

    Singapore has been assigned the role of a "model" nation state primarily for two reasons: its rapid rate of economic growth and its outstanding performance on cross-national tests of educational achievement, such as PISA. This has resulted in advocates of reform citing it as illustrating "best practices", especially in the…

  9. Comparing the Document Representations of Two IR-Systems: CLARIT and TOPIC.

    ERIC Educational Resources Information Center

    Paijmans, Hans

    1993-01-01

    Compares two information retrieval systems, CLARIT and TOPIC, in terms of assigned versus derived and precoordinate versus postcoordinate indexing. Models of information retrieval systems are discussed, and a test of the systems using a demonstration database of full-text articles from the "Wall Street Journal" is described. (Contains 21…

  10. Pedigree reconstruction from SNP data: parentage assignment, sibship clustering and beyond.

    PubMed

    Huisman, Jisca

    2017-09-01

Data on hundreds or thousands of single nucleotide polymorphisms (SNPs) provide detailed information about the relationships between individuals, but currently few tools can turn this information into a multigenerational pedigree. I present the R package sequoia, which assigns parents, clusters half-siblings sharing an unsampled parent and assigns grandparents to half-sibships. Assignments are made after consideration of the likelihoods of all possible first-, second- and third-degree relationships between the focal individuals, as well as the traditional alternative of being unrelated. This careful exploration of the local likelihood surface is implemented in a fast, heuristic hill-climbing algorithm. Distinction between the various categories of second-degree relatives is possible when likelihoods are calculated conditional on at least one parent of each focal individual. Performance was tested on simulated data sets with realistic genotyping error rates and missingness, based on three different large pedigrees (N = 1000-2000). These included a complex pedigree with overlapping generations, occasional close inbreeding and some unknown birth years. Parentage assignment was highly accurate down to about 100 independent SNPs (error rate <0.1%) and fast (<1 min), as most pairs can be excluded from being parent-offspring based on opposite homozygosity. For full pedigree reconstruction, 40% of parents were assumed nongenotyped. Reconstruction resulted in low error rates (<0.3%) and high assignment rates (>99%) in limited computation time (typically <1 h) when at least 200 independent SNPs were used. In three empirical data sets, relatedness estimated from the inferred pedigree was strongly correlated with genomic relatedness. © 2017 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.
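The opposite-homozygosity exclusion mentioned above can be sketched as follows (Python; this is an illustrative reimplementation, not the sequoia code, and the error tolerance is an assumed parameter). A parent and offspring must share an allele at every locus, so two opposite homozygotes (e.g. AA vs. aa) at even a few loci rule the pair out:

```python
def opposite_homozygous(g1, g2):
    """True if two single-locus genotypes are opposite homozygotes,
    which (barring genotyping error) excludes a parent-offspring relation."""
    return g1[0] == g1[1] and g2[0] == g2[1] and g1[0] != g2[0]

def can_be_parent_offspring(geno1, geno2, max_errors=1):
    """Screen a pair of multilocus genotypes: tolerate up to `max_errors`
    opposite-homozygous loci to allow for a realistic genotyping error rate."""
    conflicts = sum(opposite_homozygous(a, b) for a, b in zip(geno1, geno2))
    return conflicts <= max_errors

# Genotypes as (allele, allele) per SNP, alleles coded 0/1
parent = [(0, 0), (0, 1), (1, 1), (0, 0)]
child = [(0, 1), (1, 1), (1, 1), (0, 0)]
stranger = [(1, 1), (0, 0), (0, 0), (1, 1)]
print(can_be_parent_offspring(parent, child))     # -> True
print(can_be_parent_offspring(parent, stranger))  # -> False
```

Because this screen is a cheap pass over the genotypes, it lets the expensive likelihood comparison run only on the small fraction of pairs that survive it, which is why parentage assignment in the paper scales to thousands of individuals in under a minute.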

  11. Advisor-Teller Money Manager (ATM) therapy for substance use disorders.

    PubMed

    Rosen, Marc I; Rounsaville, Bruce J; Ablondi, Karen; Black, Anne C; Rosenheck, Robert A

    2010-07-01

    Patients with concomitant psychiatric and substance use disorders are commonly assigned representative payees or case managers to help manage their funds, but money management has not been conceptualized as a theory-based treatment. This randomized clinical trial was conducted to determine the effect of a money management-based therapy, advisor-teller money manager (ATM), on substance abuse or dependence. Ninety patients at a community mental health center who had a history of cocaine or alcohol abuse or dependence were assessed after random assignment to 36 weeks of ATM (N=47) or a control condition in which use of a financial workbook was reviewed (N=43). Patients assigned to ATM were encouraged to deposit their funds into a third-party account, plan weekly expenditures, and negotiate monthly budgets. Substance use calendars and urine toxicology tests were collected every other week for 36 weeks and again 52 weeks after randomization. Patients assigned to ATM had significantly more negative toxicologies for cocaine metabolite over time than patients in the control group, and treating clinicians rated ATM patients as significantly more likely to be abstinent from illicit drugs. Self-reported abstinence from alcohol did not significantly differ between groups. Unexpectedly, patients assigned to ATM were more likely to be assigned a representative payee or a conservator than control participants during the follow-up period (ten of 47 versus two of 43). One patient in ATM assaulted the therapist when his check had not arrived. ATM is an efficacious therapy for the treatment of cocaine abuse or dependence among people with concomitant psychiatric illness but requires protection of patient autonomy and staff safety.

  12. Diagnostic Performance of SRU and ATA Thyroid Nodule Classification Algorithms as Tested With a 1 Million Virtual Thyroid Nodule Model.

    PubMed

    Boehnke, Mitchell; Patel, Nayana; McKinney, Kristin; Clark, Toshimasa

    The Society of Radiologists in Ultrasound (SRU 2005) and the American Thyroid Association (ATA 2009 and ATA 2015) have published algorithms for thyroid nodule management. Kwak et al. and other groups have described models that estimate thyroid nodules' malignancy risk. The aim of our study is to use Kwak's model to evaluate the tradeoffs between sensitivity and specificity of the SRU 2005, ATA 2009 and ATA 2015 management algorithms. 1,000,000 thyroid nodules were modeled in MATLAB. Ultrasound characteristics were modeled after published data. Malignancy risk was estimated per Kwak's model and assigned as a binary variable. All nodules were then assessed using the published management algorithms. With the malignancy variable as condition positivity and the algorithms' recommendation for FNA as test positivity, diagnostic performance was calculated. Modeled nodule characteristics mimic those of Kwak et al. 12.8% of nodules were assigned as malignant (malignancy risk range of 2.0-98%). FNA was recommended for 41% of nodules by SRU 2005, 66% by ATA 2009, and 82% by ATA 2015. Sensitivity and specificity differ significantly (p < 0.0001): 49% and 60% for SRU 2005; 81% and 36% for ATA 2009; and 95% and 20% for ATA 2015. The SRU 2005, ATA 2009 and ATA 2015 algorithms are used routinely in clinical practice to determine whether thyroid nodule biopsy is indicated. We demonstrate significant differences in these algorithms' diagnostic performance, which amount to a compromise between sensitivity and specificity. Copyright © 2017 Elsevier Inc. All rights reserved.
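The simulation logic described above, risk scores turned into a binary malignancy variable and algorithms scored for sensitivity and specificity, can be sketched as follows. The risk distribution and the FNA thresholds are invented stand-ins, not Kwak's fitted model or the actual SRU/ATA criteria:

```python
import random

random.seed(42)

# Simulate nodules: each gets an illustrative malignancy risk drawn from a
# stand-in distribution (mean 12.5%), not Kwak et al.'s fitted model.
N = 100_000
risks = [random.betavariate(0.5, 3.5) for _ in range(N)]

# Malignancy assigned as a binary variable: a Bernoulli draw at each risk.
malignant = [random.random() < r for r in risks]

def evaluate(fna_threshold):
    """Treat 'recommend FNA when risk >= threshold' as test positivity;
    the malignancy variable is condition positivity."""
    tp = sum(1 for r, m in zip(risks, malignant) if m and r >= fna_threshold)
    fn = sum(1 for r, m in zip(risks, malignant) if m and r < fna_threshold)
    tn = sum(1 for r, m in zip(risks, malignant) if not m and r < fna_threshold)
    fp = sum(1 for r, m in zip(risks, malignant) if not m and r >= fna_threshold)
    return tp / (tp + fn), tn / (tn + fp)   # sensitivity, specificity

# A laxer threshold trades specificity for sensitivity, mirroring the
# SRU-versus-ATA pattern reported above.
for thr in (0.05, 0.15, 0.30):
    sens, spec = evaluate(thr)
    print(f"threshold {thr:.2f}: sensitivity {sens:.2f}, specificity {spec:.2f}")
```

Lowering the biopsy threshold can only raise sensitivity and lower specificity, which is the structural compromise the abstract quantifies for the three published algorithms.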

  13. Evaluating a technical university's placement test using the Rasch measurement model

    NASA Astrophysics Data System (ADS)

    Salleh, Tuan Salwani; Bakri, Norhayati; Zin, Zalhan Mohd

    2016-10-01

    This study discusses the process of validating a mathematics placement test at a technical university. The main objective is to produce a valid and reliable test to measure students' prerequisite knowledge for learning engineering technology mathematics. A valid and reliable test is crucial because the results are used in a critical decision: assigning students to different groups of Technical Mathematics 1. The placement test, which consists of 50 mathematics questions, was administered to 82 new Diploma in Engineering Technology students at a technical university. This study employed the Rasch measurement model to analyze the data using the Winsteps software. The results revealed ten test questions with difficulty levels below the ability of the least able students. Nevertheless, all ten questions satisfied the infit and outfit standard values. Thus, all the questions can be reused in future placement tests at the technical university.
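The Rasch model underlying this analysis predicts the probability of a correct response from the difference between person ability and item difficulty, both on a logit scale. A minimal sketch follows; the ability and difficulty values are invented, not the study's Winsteps estimates:

```python
import math

def rasch_p(theta, b):
    """Rasch model: probability that a person of ability theta (logits)
    answers an item of difficulty b (logits) correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Illustrative estimates for five students and five items.
abilities = [-1.2, -0.4, 0.1, 0.8, 1.5]
difficulties = [-2.5, -1.8, -0.3, 0.6, 1.9]

# Items with difficulty below even the least able student's ability.
min_ability = min(abilities)
too_easy = [b for b in difficulties if b < min_ability]
print("items below the least able student:", too_easy)

# Even the weakest student has a high success probability on those items.
for b in too_easy:
    print(f"b={b}: P(correct | weakest student) = {rasch_p(min_ability, b):.2f}")
```

Such items add little information for placement, but as in the study they can still show acceptable infit/outfit statistics and be retained for reuse.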

  14. Belief state representation in the dopamine system.

    PubMed

    Babayan, Benedicte M; Uchida, Naoshige; Gershman, Samuel J

    2018-05-14

    Learning to predict future outcomes is critical for driving appropriate behaviors. Reinforcement learning (RL) models have successfully accounted for such learning, relying on reward prediction errors (RPEs) signaled by midbrain dopamine neurons. It has been proposed that when sensory data provide only ambiguous information about which state an animal is in, it can predict reward based on a set of probabilities assigned to hypothetical states (called the belief state). Here we examine how dopamine RPEs and subsequent learning are regulated under state uncertainty. Mice are first trained in a task with two potential states defined by different reward amounts. During testing, intermediate-sized rewards are given in rare trials. Dopamine activity is a non-monotonic function of reward size, consistent with RL models operating on belief states. Furthermore, the magnitude of dopamine responses quantitatively predicts changes in behavior. These results establish the critical role of state inference in RL.
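The belief-state idea can be illustrated with a minimal sketch: reward is predicted as a probability-weighted average over hypothetical states, and the prediction error is taken against that belief-state value. The state values and beliefs below are illustrative, not the paper's fitted model:

```python
# Two hidden states defined by different reward amounts; the animal cannot
# observe which state it is in, so it predicts reward under a probability
# distribution over states (the belief state).

def belief_value(belief, state_values):
    """Value of a belief state: probability-weighted state values."""
    return sum(p * v for p, v in zip(belief, state_values))

def rpe(reward, belief, state_values):
    """Reward prediction error against the belief-state value."""
    return reward - belief_value(belief, state_values)

state_values = [1.0, 10.0]   # small-reward and large-reward states

# An unambiguous cue pins the belief; an ambiguous one leaves it split.
print(rpe(10.0, [0.0, 1.0], state_values))   # expected big, got big
print(rpe(10.0, [0.5, 0.5], state_values))   # ambiguous: positive RPE
print(rpe(1.0,  [0.5, 0.5], state_values))   # ambiguous: negative RPE
```

The non-monotonic dopamine responses reported above arise when the belief itself is inferred from the reward and reshapes the value baseline; this sketch only shows the belief-weighted prediction step.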

  15. Particle swarm optimization algorithm for optimizing assignment of blood in blood banking system.

    PubMed

    Olusanya, Micheal O; Arasomwan, Martins A; Adewumi, Aderemi O

    2015-01-01

    This paper reports the performance of particle swarm optimization (PSO) for the assignment of blood to meet patients' blood transfusion requests. While the drive for blood donation lingers, there is a need for effective and efficient management of available blood in blood banking systems. Moreover, the inherent danger of transfusing wrong blood types to patients, unnecessary importation of blood units from external sources, and wastage of blood products due to nonusage necessitate the development of mathematical models and techniques for effective handling of blood distribution among available blood types in order to minimize wastage and importation from external sources. This gives rise to the blood assignment problem (BAP), introduced recently in the literature. We propose queue and multiple-knapsack models with a PSO-based solution to address this challenge. Simulation is based on sets of randomly generated data that mimic real-world population distributions of blood types. Results obtained show the efficiency of the proposed algorithm for BAP, with no blood units wasted and very low importation, where necessary, from outside the blood bank. The result can therefore serve as a benchmark and basis for decision support tools for real-life deployment.
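A toy version of the assignment objective, serving transfusion requests from on-hand stock by ABO/Rh compatibility and importing only when no compatible unit remains, can be sketched greedily. The PSO/knapsack formulation in the paper is more sophisticated; the stock data here are invented:

```python
COMPATIBLE = {  # recipient type -> donor types it can receive (exact type first)
    "O-": ["O-"],
    "O+": ["O+", "O-"],
    "A-": ["A-", "O-"],
    "A+": ["A+", "A-", "O+", "O-"],
    "B-": ["B-", "O-"],
    "B+": ["B+", "B-", "O+", "O-"],
    "AB-": ["AB-", "A-", "B-", "O-"],
    "AB+": ["AB+", "AB-", "A+", "A-", "B+", "B-", "O+", "O-"],
}

def assign(requests, stock):
    """Greedily serve each request, preferring the exact blood type so
    that universal-donor units are spared for recipients who need them.
    Returns the number of imported units and the remaining stock."""
    imported = 0
    stock = dict(stock)
    for recipient in requests:
        for donor in COMPATIBLE[recipient]:
            if stock.get(donor, 0) > 0:
                stock[donor] -= 1
                break
        else:
            imported += 1  # no compatible unit on hand: import one
    return imported, stock

requests = ["A+", "A+", "O-", "AB+", "B+"]
stock = {"A+": 1, "O-": 2, "B+": 1}
imported, remaining = assign(requests, stock)
print("imported units:", imported)
print("remaining stock:", remaining)
```

A metaheuristic such as PSO searches over assignment orders and unit allocations globally, whereas this greedy pass only illustrates the objective: no wastage and minimal importation.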

  16. Geographical assignment of hospitalists in an urban teaching hospital: feasibility and impact on efficiency and provider satisfaction.

    PubMed

    Bryson, Christine; Boynton, Greta; Stepczynski, Anna; Garb, Jane; Kleppel, Reva; Irani, Farzan; Natanasabapathy, Siva; Stefan, Mihaela S

    2017-10-01

    To evaluate whether implementation of a geographic model of assigning hospitalists is feasible and sustainable in a large hospitalist program and to assess its impact on provider satisfaction, perceived efficiency and patient outcomes. Pre (3 months) - post (12 months) intervention study conducted from June 2014 through September 2015 at a tertiary care medical center with a large hospitalist program caring for patients scattered across 4 buildings and 16 floors. Hospitalists were assigned to a particular nursing unit (geographic assignment) with a goal of having over 80% of their assigned patients located on their assigned unit. Satisfaction and perceived efficiency were assessed through a survey administered before and after the intervention. The geographic assignment percentage increased from an average of 60% in the pre-intervention period to 93% post-intervention. The number of hospitalists covering a 32-bed unit decreased from 8-10 pre-intervention to 2-3 post-intervention. A majority of physicians (87%) thought that geography had a positive impact on the overall quality of care. Respondents felt that geography increased time spent with patients/caregivers to discuss the plan of care (p < 0.001), improved communication with nurses (p = 0.0009), and increased the sense of teamwork with nurses/case managers (p < 0.001). Mean length of stay (4.54 vs 4.62 days), 30-day readmission rates (16.0% vs 16.6%) and patient satisfaction (79.9 vs 77.3) did not change significantly between the pre- and post-implementation periods. The discharge-before-noon rate improved slightly (47.5% to 54.1%). Implementation of a unit-based model in a large hospitalist program is feasible and sustainable with appropriate planning and support. The geographic model of care increased provider satisfaction and perceived efficiency; it also facilitated the implementation of other key interventions such as interdisciplinary rounds.

  17. Decoding the Semantic Content of Natural Movies from Human Brain Activity

    PubMed Central

    Huth, Alexander G.; Lee, Tyler; Nishimoto, Shinji; Bilenko, Natalia Y.; Vu, An T.; Gallant, Jack L.

    2016-01-01

    One crucial test for any quantitative model of the brain is to show that the model can be used to accurately decode information from evoked brain activity. Several recent neuroimaging studies have decoded the structure or semantic content of static visual images from human brain activity. Here we present a decoding algorithm that makes it possible to decode detailed information about the object and action categories present in natural movies from human brain activity signals measured by functional MRI. Decoding is accomplished using a hierarchical logistic regression (HLR) model that is based on labels that were manually assigned from the WordNet semantic taxonomy. This model makes it possible to simultaneously decode information about both specific and general categories, while respecting the relationships between them. Our results show that we can decode the presence of many object and action categories from averaged blood-oxygen level-dependent (BOLD) responses with a high degree of accuracy (area under the ROC curve > 0.9). Furthermore, we used this framework to test whether semantic relationships defined in the WordNet taxonomy are represented the same way in the human brain. This analysis showed that hierarchical relationships between general categories and atypical examples, such as organism and plant, did not seem to be reflected in representations measured by BOLD fMRI. PMID:27781035

  18. Performance-based workload assessment: Allocation strategy and added task sensitivity

    NASA Technical Reports Server (NTRS)

    Vidulich, Michael A.

    1990-01-01

    The preliminary results of a research program investigating the use of added tasks to evaluate mental workload are reviewed. The focus of the first studies was a reappraisal of the traditional secondary task logic that encouraged the use of low-priority instructions for the added task. It was believed that such low-priority tasks would encourage subjects to split their available resources between the two tasks: the primary task would be assigned all the resources it needed, and any remaining reserve capacity would be assigned to the secondary task. If the model were correct, this approach was expected to combine sensitivity to primary task difficulty with unintrusiveness to primary task performance. The first studies of the current project demonstrated that a high-priority added task, although intrusive, could be more sensitive than the traditional low-priority secondary task. These results suggested that a more appropriate model of the attentional effects associated with added task performance might be based on capacity switching, rather than the traditional optimal allocation model.

  19. FOAM (Functional Ontology Assignments for Metagenomes): A Hidden Markov Model (HMM) database with environmental focus

    DOE PAGES

    Prestat, Emmanuel; David, Maude M.; Hultman, Jenni; ...

    2014-09-26

    A new functional gene database, FOAM (Functional Ontology Assignments for Metagenomes), was developed to screen environmental metagenomic sequence datasets. FOAM provides a new functional ontology dedicated to classifying gene functions relevant to environmental microorganisms based on Hidden Markov Models (HMMs). Sets of aligned protein sequences (i.e. ‘profiles’) were tailored to a large group of target KEGG Orthologs (KOs) from which HMMs were trained. The alignments were checked and curated to make them specific to the targeted KO. Within this process, sequence profiles were enriched with the most abundant sequences available to maximize the yield of accurate classifier models. An associated functional ontology was built to describe the functional groups and hierarchy. FOAM allows the user to select the target search space before HMM-based comparison steps and to easily organize the results into different functional categories and subcategories. FOAM is publicly available at http://portal.nersc.gov/project/m1317/FOAM/.
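As a rough illustration of this style of functional assignment, the sketch below scores peptides against per-family position weight matrices (a simplification of profile HMMs) and assigns each sequence to the best-scoring family. The profiles and KO names are invented, not FOAM entries:

```python
import math

# One toy "profile" per hypothetical KO: a per-position distribution over
# amino acid residues. Real profile HMMs also model insertions/deletions.
PROFILES = {
    "KO:toyA": [{"M": 0.7, "L": 0.3}, {"K": 0.6, "R": 0.4}, {"G": 0.9, "A": 0.1}],
    "KO:toyB": [{"M": 0.5, "V": 0.5}, {"D": 0.8, "E": 0.2}, {"P": 0.9, "G": 0.1}],
}
BACKGROUND = 0.05  # flat background frequency for any residue

def log_odds(seq, profile):
    """Log-odds score of a sequence against a profile, versus background."""
    score = 0.0
    for residue, column in zip(seq, profile):
        p = column.get(residue, 1e-4)   # small pseudo-probability
        score += math.log(p / BACKGROUND)
    return score

def assign_ko(seq):
    """Assign the sequence to the best-scoring functional family."""
    return max(PROFILES, key=lambda ko: log_odds(seq, PROFILES[ko]))

print(assign_ko("MKG"))  # matches the toyA consensus
print(assign_ko("MDP"))  # matches the toyB consensus
```

FOAM's added value over this bare scoring is the curated ontology on top: hits are rolled up into functional categories and subcategories rather than reported as raw best matches.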

  20. Symmetric reconfigurable capacity assignment in a bidirectional DWDM access network.

    PubMed

    Ortega, Beatriz; Mora, José; Puerto, Gustavo; Capmany, José

    2007-12-10

    This paper presents a novel architecture for DWDM bidirectional access networks providing symmetric dynamic capacity allocation for both downlink and uplink signals. A foldback arrayed waveguide grating incorporating an optical switch enables the experimental demonstration of flexible assignment of multiservice capacity. Different analog and digital services, such as CATV, a 10 GHz tone, 155 Mb/s PRBS and UMTS signals, have been transmitted in order to successfully test the system performance under different scenarios of total capacity distribution from the Central Station to different Base Stations, with two reconfigurable extra channels for each of the downstream and upstream directions.

  1. Rest requirements and rest management of personnel in shift work

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammell, B.D.; Scheuerle, A.

    1995-12-31

    A difficulty-weighted shift assignment scheme is proposed for use in prolonged and strenuous field operations such as emergency response, site testing, and short-term hazardous waste remediation projects. The purpose of the work rotation plan is to increase the productivity, safety, and morale of workers. Job weighting is accomplished by assigning adjustments for the mental and physical intensity of the task, the protective equipment worn, and the climatic conditions. The plan is based on medical studies of sleep deprivation, the effects of rest adjustments, and programs to reduce sleep deprivation and normalize shift schedules.
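A difficulty-weighted shift score of the kind described could look like the following sketch; the factor categories and multipliers are invented for illustration and are not taken from the paper:

```python
# Additive difficulty adjustments per factor (illustrative values only).
ADJUSTMENTS = {
    "mental_intensity":     {"low": 0.0, "moderate": 0.1,  "high": 0.25},
    "physical_intensity":   {"low": 0.0, "moderate": 0.15, "high": 0.3},
    "protective_equipment": {"none": 0.0, "partial": 0.1,  "full_suit": 0.35},
    "climate":              {"mild": 0.0, "hot": 0.2,      "extreme": 0.4},
}

def shift_weight(task):
    """Weighted difficulty of a shift: 1.0 base plus additive adjustments."""
    return 1.0 + sum(ADJUSTMENTS[factor][level] for factor, level in task.items())

def max_shift_hours(task, base_hours=10.0):
    """Scale a base shift length down by the difficulty weight."""
    return base_hours / shift_weight(task)

# Example: a hot-weather remediation task in full protective gear.
remediation = {"mental_intensity": "moderate",
               "physical_intensity": "high",
               "protective_equipment": "full_suit",
               "climate": "hot"}
print(f"weight: {shift_weight(remediation):.2f}")
print(f"max shift: {max_shift_hours(remediation):.1f} h")
```

Dividing a base shift length by the weight is one simple way to turn the adjustments into a rotation rule; an actual plan would calibrate both against the medical evidence the abstract cites.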

  2. Speech Correction for Children with Cleft Lip and Palate by Networking of Community-Based Care.

    PubMed

    Hanchanlert, Yotsak; Pramakhatay, Worawat; Pradubwong, Suteera; Prathanee, Benjamas

    2015-08-01

    Prevalence of cleft lip and palate (CLP) is high in Northeast Thailand. Most children with CLP face many problems, particularly compensatory articulation disorders (CAD) beyond surgery, while speech services and the number of speech and language pathologists (SLPs) are limited. To determine the effectiveness of the networking of the Khon Kaen University (KKU) Community-Based Speech Therapy Model: Kosumphisai Hospital, Kosumphisai District and Maha Sarakham Hospital, Mueang District, Maha Sarakham Province, for reducing the number of articulation errors in children with CLP. Eleven children with CLP were recruited in three 1-year projects of the KKU Community-Based Speech Therapy Model. Articulation tests were formally assessed by qualified SLPs for baseline and post-treatment outcomes. Training of speech assistants (SAs) was conducted by SLPs. Assigned speech correction (SC) was performed by SAs at home and at local hospitals. Caregivers also gave SC at home 3-4 days a week. Networking of the Community-Based Speech Therapy Model significantly reduced the number of articulation errors for children with CLP at both word and sentence levels (mean difference = 6.91, 95% confidence interval = 4.15-9.67; mean difference = 5.36, 95% confidence interval = 2.99-7.73, respectively). Networking by Kosumphisai and Maha Sarakham of the KKU Community-Based Speech Therapy Model was a valid and efficient method for providing speech services for children with cleft palate and could be extended to any area in Thailand and to other developing countries with similar contexts.

  3. Development of New Open-Shell Perturbation and Coupled-Cluster Theories Based on Symmetric Spin Orbitals

    NASA Technical Reports Server (NTRS)

    Lee, Timothy J.; Arnold, James O. (Technical Monitor)

    1994-01-01

    A new spin orbital basis is employed in the development of efficient open-shell coupled-cluster and perturbation theories that are based on a restricted Hartree-Fock (RHF) reference function. The spin orbital basis differs from the standard one in the spin functions that are associated with the singly occupied spatial orbital. The occupied orbital (in the spin orbital basis) is assigned the δ+ = (α + β)/√2 spin function, while the unoccupied orbital is assigned the δ− = (α − β)/√2 spin function. The doubly occupied and unoccupied orbitals (in the reference function) are assigned the standard α and β spin functions. The coupled-cluster and perturbation theory wave functions based on this set of "symmetric spin orbitals" exhibit much more symmetry than those based on the standard spin orbital basis. This, together with interacting space arguments, leads to a dramatic reduction in the computational cost for both coupled-cluster and perturbation theory. Additionally, perturbation theory based on "symmetric spin orbitals" obeys Brillouin's theorem provided that spin and spatial excitations are both considered. Other properties of the coupled-cluster and perturbation theory wave functions and models will be discussed.

  4. Food Choice Questionnaire (FCQ) revisited. Suggestions for the development of an enhanced general food motivation model.

    PubMed

    Fotopoulos, Christos; Krystallis, Athanasios; Vassallo, Marco; Pagiaslis, Anastasios

    2009-02-01

    Recognising the need for a more statistically robust instrument to investigate general food selection determinants, the research validates and confirms the Food Choice Questionnaire's (FCQ) factorial design, develops ad hoc a more robust FCQ version and tests its ability to discriminate between consumer segments in terms of the importance they assign to the FCQ motivational factors. The original FCQ appears to represent a comprehensive and reliable research instrument. However, the empirical data do not support the robustness of its 9-factor design. On the other hand, segmentation results at the subpopulation level based on the enhanced FCQ version convey an optimistic message about the FCQ's ability to predict food selection behaviour. The paper concludes that some of the basic components of the original FCQ can be used as a basis for a new general food motivation typology. The development of such a new instrument, with fewer, higher-abstraction FCQ-based dimensions and fewer items per dimension, is a step in the right direction; yet such a step should be theory-driven, and rigorous statistical testing across and within populations would be necessary.

  5. Longitudinal Evaluation of a Scale-up Model for Teaching Mathematics with Trajectories and Technologies: Persistence of Effects in the Third Year

    ERIC Educational Resources Information Center

    Clements, Douglas H.; Sarama, Julie; Wolfe, Christopher B.; Spitler, Mary Elaine

    2013-01-01

    Using a cluster randomized trial design, we evaluated the persistence of effects of a research-based model for scaling up educational interventions. The model was implemented in 42 schools in two city districts serving low-resource communities, randomly assigned to three conditions. In pre-kindergarten, the two experimental interventions were…

  6. Writing-to-Learn the Nature of Science in the Context of the Lewis Dot Structure Model

    ERIC Educational Resources Information Center

    Shultz, Ginger V.; Gere, Anne Ruggles

    2015-01-01

    Traditional methods for teaching the Lewis dot structure model emphasize rule-based learning and often neglect the purpose and function of the model. Thus, many students are unable to extend their understanding of molecular structures in new contexts. The assignment described here addresses this issue by asking students to read and write about the…

  7. Nutrition intervention group program based on preaction-stage-oriented change processes of the Transtheoretical Model promotes long-term reduction in dietary fat intake.

    PubMed

    Finckenor, M; Byrd-Bredbenner, C

    2000-03-01

    To develop and evaluate the long-term effectiveness of an intervention program, based on preaction-stage-oriented change processes of the Transtheoretical Model of Behavior Change, that could be delivered in a group setting to help participants lower dietary fat intake. An enhanced version of the nonequivalent control group experimental design was used. Entire sections of an undergraduate introductory nutrition science course were assigned to an experimental, pretest/posttest control, or posttest-only control group. Daily fat intake and stage of change of the experimental and pretest/posttest control groups were determined at the pretest and posttest and 1-year later at a follow-up test. Every 1 to 2 weeks during the study, stage of change of the experimental group was assessed. Daily fat intake of the experimental group was assessed at study midpoint. Daily fat intake and stage of change of the posttest-only control group was determined at the posttest. Pretest results were used to place participants of the experimental and pretest/posttest control groups in either the preaction stage (i.e., precontemplation, contemplation, or preparation) or the action/maintenance stage. The sample consisted of 38, 30, and 42 undergraduate students who were assigned to the experimental, pretest/posttest control, and posttest-only control groups, respectively. The experimental group participated in a group-based, dietary fat intake intervention that included a series of 11 lessons taught over a 14-week period. Each lesson was based on 1 or 2 of the preaction-stage-oriented change processes of the Transtheoretical Model. Data were evaluated to determine the effects of the intervention program on long-term dietary fat reduction and stage of change progression. Analysis of variance, repeated-measures analysis of variance, and paired t tests. For pretest and posttest dietary fat intake scores, stage and time were significant, and there was a significant time-by-stage interaction. 
Time was significant for pretest and posttest stage scores. Subjects in the preaction-stage experimental group significantly increased their mean stage of change and reduced their fat intake between the pretest and posttest; these changes persisted for 1 year. Pretest/posttest control group participants who began in a preaction stage also significantly increased their mean stage and reduced fat intake by the posttest, but these changes did not endure until the follow-up test. This intervention program produced an enduring, significant reduction in mean dietary fat consumption and a significant progression in mean stage of change among subjects in the experimental group who were in the preaction stage. It may be appropriate to design group interventions to use preaction-stage change processes rather than the more traditionally used action- and maintenance-stage change processes.

  8. The Assignment of Raters to Items: Controlling for Rater Effects.

    ERIC Educational Resources Information Center

    Sykes, Robert C.; Heidorn, Mark; Lee, Guemin

    A study was conducted to evaluate the effect of different modes (modalities) of assigning raters to test items. The impact on total constructed response (c.r.) score, and subsequently on total test score, of assigning a single versus multiple raters to an examination reading of a student's set of c.r. responses was evaluated for several mixed-item…

  9. 75 FR 24934 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-06

    ... student archived data (e.g., state mandated standardized test scores); follow-up surveys for students... experimental design that utilizes the random assignment. LIC is an English Language Arts (ELA)-based character education curriculum that is expected to have positive impacts on student academic performance, attendance...

  10. 75 FR 9189 - Notice of Proposed Information Collection Requests

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-01

    ... mandated standardized test scores); follow-up surveys for students; teacher and parent rating/observation on various student aspects (e.g., student social skills); baseline and follow-up surveys for teachers... Character (LIC) program. This study is based on an experimental design that utilizes the random assignment...

  11. Motivational Interviewing for Smoking Cessation among College Students

    ERIC Educational Resources Information Center

    Bolger, Kelly; Carter, Kimberly; Curtin, Lisa; Martz, Denise M.; Gagnon, Sandy G.; Michael, Kurt D.

    2010-01-01

    Motivational interviewing has shown some success as an intervention for college student cigarette smokers. We tested the efficacy and process of a two session motivational-interviewing-based smoking intervention compared to an assessment/information session. College student participants assigned to the motivational interviewing condition did not…

  12. SIMEDIS: a Discrete-Event Simulation Model for Testing Responses to Mass Casualty Incidents.

    PubMed

    Debacker, Michel; Van Utterbeeck, Filip; Ullrich, Christophe; Dhondt, Erwin; Hubloue, Ives

    2016-12-01

    It is recognized that the study of the disaster medical response (DMR) is a relatively new field. To date, there is no evidence-based literature that clearly defines the best medical response principles, concepts, structures and processes in a disaster setting. Much of what is known about the DMR results from descriptive studies and expert opinion. No experimental studies regarding the effects of DMR interventions on the health outcomes of disaster survivors have been carried out. Traditional analytic methods cannot fully capture the flow of disaster victims through a complex disaster medical response system (DMRS). Computer modelling and simulation make it possible to study and test operational assumptions in a virtual but controlled experimental environment. The SIMEDIS (Simulation for the assessment and optimization of medical disaster management) simulation model consists of 3 interacting components: the victim creation model; the victim monitoring model, where the health state of each victim is monitored and adapted to the evolving clinical conditions of the victims; and the medical response model, where the victims interact with the environment and the resources at the disposal of the healthcare responders. Since the main aim of the DMR is to minimize the mortality and morbidity of survivors as much as possible, we designed a victim-centred model in which the casualties pass through the different components and processes of a DMRS. What distinguishes the SIMEDIS simulation model is that the victim entities evolve in parallel through both the victim monitoring model and the medical response model. The interaction between both models is ensured through a time or medical intervention trigger. At each service point, a triage is performed together with a decision on the disposition of the victims regarding treatment and/or evacuation, based on a priority code assigned to the victim and on the availability of resources at the service point. 
The aim of the case study is to apply the SIMEDIS model to the DMRS of an international airport and to test the medical response plan against a simulated airplane crash at the airport. In order to identify good response options, the model was then used to study the effect of a number of interventional factors on the performance of the DMRS. Our study reflects the potential of SIMEDIS to model complex systems, to test different aspects of DMR, and to be used as a tool in experimental research that might make a substantial contribution to providing the evidence base for the effectiveness and efficiency of disaster medical management.
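The victim-centred, priority-driven flow described above can be miniaturized as a discrete-event loop: events are ordered on a simulation clock, and a service point always treats the highest-priority queued victim first. Names, times and priority codes below are invented; this is not the SIMEDIS implementation:

```python
import heapq
import itertools

counter = itertools.count()  # unique tie-breaker so the heaps never compare payloads

def simulate(victims, treatment_time=10.0):
    """Single service point treating the highest-priority queued victim
    first (lower code = more urgent); events are ordered on a clock."""
    queue = []   # waiting victims: (priority, tiebreak, name)
    events = []  # scheduled events: (time, tiebreak, kind, payload)
    for arrival, priority, name in victims:
        heapq.heappush(events, (arrival, next(counter), "arrive", (priority, name)))
    busy_until = 0.0
    treated = []  # (name, treatment start time)
    while events:
        clock, _, kind, payload = heapq.heappop(events)
        if kind == "arrive":
            priority, name = payload
            heapq.heappush(queue, (priority, next(counter), name))
        # whenever the service point is free, pull the most urgent victim
        if queue and clock >= busy_until:
            _, _, name = heapq.heappop(queue)
            busy_until = clock + treatment_time
            treated.append((name, clock))
            heapq.heappush(events, (busy_until, next(counter), "free", None))
    return treated

# Two T2 victims arrive first; the later T1 victim still jumps the queue.
victims = [(0.0, 2, "v1_T2"), (1.0, 2, "v2_T2"), (2.0, 1, "v3_T1")]
print(simulate(victims))
```

SIMEDIS layers a parallel victim-monitoring model on top of this kind of loop, so a victim's clinical state can deteriorate while waiting; here the health state is frozen to keep the event mechanics visible.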

  13. Dilution-to-extinction cultivation of leaf-inhabiting endophytic fungi in beech (Fagus sylvatica L.)--different cultivation techniques influence fungal biodiversity assessment.

    PubMed

    Unterseher, Martin; Schnittler, Martin

    2009-05-01

    Two cultivation-based isolation techniques - the incubation of leaf fragments (fragment plating) and dilution-to-extinction culturing on malt extract agar - were compared for recovery of foliar endophytic fungi from Fagus sylvatica near Greifswald, north-east Germany. Morphological-anatomical characters of vegetative and sporulating cultures and ITS sequences were used to assign morphotypes and taxonomic information to the isolates. Data analysis included species-accumulation curves, richness estimators, multivariate statistics and null model testing. Fragment plating and extinction culturing were significantly complementary with regard to species composition, because around two-thirds of the 35 fungal taxa were isolated with only one of the two cultivation techniques. The difference in outcomes highlights the need for caution in assessing fungal biodiversity based upon single isolation techniques. Combining the two isolation methods significantly increased the efficiency of cultivation-based studies of fungal endophytes and the resulting estimates of species richness, compared with a 20-year-old reference study, which required three times more isolates with fragment plating alone to attain the same species richness. Intensified testing and optimisation of extinction culturing in endophyte research is advocated.
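Species-richness estimation of the kind used in such analyses can be sketched with the standard nonparametric Chao1 estimator; the paper does not specify which estimators were applied, so this is illustrative, as are the isolate lists:

```python
from collections import Counter

def chao1(isolate_taxa):
    """Chao1 richness estimate: S_obs + F1^2 / (2 * F2), where F1 and F2
    are the numbers of taxa observed exactly once and exactly twice."""
    counts = Counter(isolate_taxa)
    s_obs = len(counts)
    f1 = sum(1 for c in counts.values() if c == 1)
    f2 = sum(1 for c in counts.values() if c == 2)
    if f2 == 0:
        return s_obs + f1 * (f1 - 1) / 2.0  # bias-corrected form
    return s_obs + f1 * f1 / (2.0 * f2)

# Toy isolate lists from two hypothetical techniques; pooling them raises
# estimated richness beyond either method alone, mirroring the
# complementarity reported above.
fragment_plating = ["A", "A", "B", "C", "C", "D"]
extinction_culturing = ["A", "E", "E", "F", "G", "G"]
print(chao1(fragment_plating))
print(chao1(fragment_plating + extinction_culturing))
```

Because Chao1 leans on the rare (singleton and doubleton) taxa, a technique that recovers a different slice of the community shifts the estimate noticeably, which is why single-technique biodiversity assessments can mislead.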

  14. Numerical simulation of groundwater flow in Dar es Salaam Coastal Plain (Tanzania)

    NASA Astrophysics Data System (ADS)

    Luciani, Giulia; Sappa, Giuseppe; Cella, Antonella

    2016-04-01

    We present the results of a groundwater modeling study of the coastal aquifer of Dar es Salaam (Tanzania). Dar es Salaam is one of the fastest-growing coastal cities in Sub-Saharan Africa, with more than 4 million inhabitants and a population growth rate of about 8 per cent per year. The city faces periodic water shortages due to the lack of an adequate water supply network. Over the last ten years, these two factors have driven increasing groundwater exploitation through a large number of private wells drilled to satisfy human demand. A steady-state, three-dimensional groundwater model was set up with the MODFLOW code and calibrated with the UCODE inverse-modeling code. The aim of the model was to characterize the groundwater flow system in the Dar es Salaam coastal plain. Model inputs included the net recharge rate, calculated from a time series of precipitation data (1961-2012), estimates of average groundwater extraction, and estimates of groundwater recharge from zones outside the study area. The hydraulic conductivities were parametrized according to the main geological features of the study area, based on available literature data and information. Boundary conditions were assigned based on hydrogeological boundaries. The conceptual model was refined in successive steps, which added some hydrogeological features and excluded others. Calibration was performed with UCODE 2014, using 76 hydraulic head measurements taken during the same season of 2012. Data were weighted on the basis of their expected errors. Sensitivity analysis performed during calibration identified which parameters could be estimated and which data could support parameter estimation. Calibration was evaluated using statistical indices, maps of the error distribution, and tests of the independence of residuals. Further model analysis was performed after calibration to test model performance under a range of variations of the input variables.
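The error-based weighting of observations used in UCODE-style calibration can be sketched as a weighted least-squares objective (a minimal illustration; the function name and the head values are hypothetical, not from the study):

```python
import numpy as np

# Hedged sketch (function name and numbers are hypothetical): the weighted
# least-squares objective minimized in UCODE-style calibration, where each
# observation is weighted by 1/sigma^2 with sigma its expected error.

def weighted_sse(observed, simulated, expected_error):
    """Weighted sum of squared residuals; smaller means a better fit."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    weights = 1.0 / np.asarray(expected_error, dtype=float) ** 2
    return float(np.sum(weights * (observed - simulated) ** 2))

# Three hydraulic-head observations (m), each with a 0.5 m expected error.
obs = [12.3, 10.8, 9.1]
sim = [12.0, 11.0, 9.5]
print(round(weighted_sse(obs, sim, [0.5, 0.5, 0.5]), 2))  # 1.16
```

Observations with larger expected errors contribute less to the objective, which is why the abstract stresses that data were weighted by their expected errors.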

  15. Comparing the efficacy of multimedia modules with traditional textbooks for learning introductory physics content

    NASA Astrophysics Data System (ADS)

    Stelzer, Timothy; Gladding, Gary; Mestre, José P.; Brookes, David T.

    2009-02-01

    We compared the efficacy of multimedia learning modules with traditional textbooks for the first few topics of a calculus-based introductory electricity and magnetism course. Students were randomly assigned to three groups. One group received the multimedia learning module presentations, and the other two received the presentations via written text. All students were then tested on their learning immediately following the presentations as well as 2 weeks later. The students receiving the multimedia learning modules performed significantly better on both tests than the students experiencing the text-based presentations.

  16. Supervisory Control of Discrete Event Systems Modeled by Mealy Automata with Nondeterministic Output Functions

    NASA Astrophysics Data System (ADS)

    Ushio, Toshimitsu; Takai, Shigemasa

    Supervisory control is a general framework for the logical control of discrete event systems. A supervisor assigns the set of controllable events to be disabled based on observed events so that the controlled discrete event system generates specified languages. In conventional supervisory control, it is assumed that observed events are determined deterministically by internal events. However, this assumption does not hold in discrete event systems with sensor errors or in mobile systems, where each observed event depends not only on an internal event but also on the state just before the occurrence of the internal event. In this paper, we model such a discrete event system by a Mealy automaton with a nondeterministic output function. We introduce two kinds of supervisors: one assigns each control action based on a permissive policy, and the other based on an anti-permissive one. We show necessary and sufficient conditions for the existence of each supervisor. Moreover, we discuss the relationship between the supervisors in the case that the output function is deterministic.
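The modeling idea, a Mealy automaton whose output function maps a (state, event) pair to a set of possible observations rather than a single one, can be sketched as follows (a minimal illustration with hypothetical states and events, not the paper's formalism):

```python
# Hedged sketch (states, events, and names are hypothetical): a Mealy
# automaton whose output function is nondeterministic -- an internal event
# occurring at a given state may be observed as any event from a *set*.

class NondetMealy:
    def __init__(self, delta, lam, initial):
        self.delta = delta   # transition function: (state, event) -> next state
        self.lam = lam       # output function: (state, event) -> set of observations
        self.state = initial

    def step(self, event):
        """Execute an internal event; return the set of possible observations."""
        outputs = self.lam[(self.state, event)]
        self.state = self.delta[(self.state, event)]
        return outputs

# Event 'a' at state 0 may be observed as 'x' or 'y' (e.g. a sensor error),
# so a supervisor that observes 'x' cannot be certain what occurred.
delta = {(0, 'a'): 1, (1, 'b'): 0}
lam = {(0, 'a'): {'x', 'y'}, (1, 'b'): {'z'}}
m = NondetMealy(delta, lam, 0)
print(sorted(m.step('a')))  # ['x', 'y']: the observation is ambiguous
```

Note that the output set depends on the state as well as the event, which is exactly the dependency the paper introduces beyond conventional supervisory control.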

  17. The conceptualization and measurement of cognitive reserve using common proxy indicators: Testing some tenable reflective and formative models.

    PubMed

    Ikanga, Jean; Hill, Elizabeth M; MacDonald, Douglas A

    2017-02-01

    The examination of cognitive reserve (CR) literature reveals a lack of consensus regarding conceptualization and pervasive problems with its measurement. This study aimed at examining the conceptual nature of CR through the analysis of reflective and formative models using eight proxies commonly employed in the CR literature. We hypothesized that all CR proxies would significantly contribute to a one-factor reflective model and that educational and occupational attainment would produce the strongest loadings on a single CR factor. The sample consisted of 149 participants (82 male/67 female), with 18.1 average years of education and ages of 45-99 years. Participants were assessed for eight proxies of CR (parent socioeconomic status, intellectual functioning, level of education, health literacy, occupational prestige, life leisure activities, physical activities, and spiritual and religious activities). Primary statistical analyses consisted of confirmatory factor analysis (CFA) to test reflective models and structural equation modeling (SEM) to evaluate multiple indicators multiple causes (MIMIC) models. CFA did not produce compelling support for a unitary CR construct when using all eight of our CR proxy variables in a reflective model but fairly cogent evidence for a one-factor model with four variable proxies. A second three-factor reflective model based upon an exploratory principal components analysis of the eight proxies was tested using CFA. Though all eight indicators significantly loaded on their assigned factors, evidence in support of overall model fit was mixed. Based upon the results involving the three-factor reflective model, two alternative formative models were developed and evaluated. While some support was obtained for both, the model in which the formative influences were specified as latent variables appeared to best account for the contributions of all eight proxies to the CR construct. 
While the findings provide partial support for our hypothesis regarding CR as a one-dimensional reflective construct, the results strongly suggest that the construct is more complex than what can be captured in a reflective model alone. There is a need for theory to better identify and differentiate formative from reflective indicators and to articulate the mechanisms by which CR develops and operates.

  18. Can You Build It? Using Manipulatives to Assess Student Understanding of Food-Web Concepts

    ERIC Educational Resources Information Center

    Grumbine, Richard

    2012-01-01

    This article outlines an exercise that assesses student knowledge of food-web and energy-flow concepts. Students work in teams and use manipulatives to build food-web models based on criteria assigned by the instructor. The models are then peer reviewed according to guidelines supplied by the instructor.

  19. Holistic Designs for Field Instruction in the Contemporary Social Work Curriculum.

    ERIC Educational Resources Information Center

    Skolnik, Louise; Papell, Catherine P.

    1994-01-01

    Two models for social work field instruction are presented, both introduced in a university-based laboratory setting. Both models attempt to integrate field practice with content of the holistic practice curriculum. They were derived from a holistic/multimethod assignment and a holistic/generalist orientation. Issues in field teaching are…

  20. Effect of Worked Examples on Mental Model Progression in a Computer-Based Simulation Learning Environment

    ERIC Educational Resources Information Center

    Darabi, Aubteen; Nelson, David W.; Meeker, Richard; Liang, Xinya; Boulware, Wilma

    2010-01-01

    In a diagnostic problem solving operation of a computer-simulated chemical plant, chemical engineering students were randomly assigned to two groups: one studying product-oriented worked examples, the other practicing conventional problem solving. Effects of these instructional strategies on the progression of learners' mental models were examined…

  1. Hero/Heroine Modeling for Puerto Rican Adolescents: A Preventive Mental Health Intervention.

    ERIC Educational Resources Information Center

    Malgady, Robert G.; And Others

    1990-01-01

    Developed hero/heroine intervention based on adult Puerto Rican role models to foster ethnic identity, self-concept, and adaptive coping behavior. Screened 90 Puerto Rican eighth and ninth graders for presenting behavior problems in school and randomly assigned them to intervention or control groups. After 19 sessions, intervention significantly…

  2. Estimation of sex-specific survival from capture-recapture data when sex is not always known

    USGS Publications Warehouse

    Nichols, J.D.; Kendall, W.L.; Hines, J.E.; Spendelow, J.A.

    2004-01-01

    Many animals lack obvious sexual dimorphism, making assignment of sex difficult even for observed or captured animals. For many such species it is possible to assign sex with certainty only at some occasions; for example, when they exhibit certain types of behavior. A common approach to handling this situation in capture-recapture studies has been to group capture histories into those of animals eventually identified as male and female and those for which sex was never known. Because group membership is dependent on the number of occasions at which an animal was caught or observed (known-sex animals will, on average, have been observed on more occasions than unknown-sex animals), survival estimates for known-sex animals will be positively biased, and those for unknown-sex animals will be negatively biased. In this paper, we develop capture-recapture models that incorporate sex ratio and sex assignment parameters that permit unbiased estimation in the face of this sampling problem. We demonstrate the magnitude of bias in the traditional capture-recapture approach to this sampling problem, and we explore properties of estimators from other ad hoc approaches. The model is then applied to capture-recapture data for adult Roseate Terns (Sterna dougallii) at Falkner Island, Connecticut, 1993-2002. Sex ratio among adults in this population favors females, and we tested the hypothesis that this population showed sex-specific differences in adult survival. Evidence was provided for higher survival of adult females than males, as predicted. We recommend use of this modeling approach for future capture-recapture studies in which sex cannot always be assigned to captured or observed animals. We also place this problem in the more general context of uncertainty in state classification in multistate capture-recapture models.
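The sampling bias described above can be illustrated with a small simulation (all parameter values are hypothetical): because sex is identifiable only with some probability at each capture, the animals whose sex is eventually known are, by construction, those captured on more occasions.

```python
import random
from statistics import fmean

# Hedged sketch of the sampling bias (hypothetical parameters): sex is
# identified with probability p_sex_id at each capture, so grouping by
# "sex eventually known" selects for frequently captured animals.

def simulate(n_animals=2000, occasions=8, p_capture=0.3, p_sex_id=0.4, seed=1):
    random.seed(seed)
    known, unknown = [], []
    for _ in range(n_animals):
        captures = sum(random.random() < p_capture for _ in range(occasions))
        if captures == 0:
            continue  # never captured: the animal does not appear in the data
        sexed = any(random.random() < p_sex_id for _ in range(captures))
        (known if sexed else unknown).append(captures)
    return fmean(known), fmean(unknown)

mean_known, mean_unknown = simulate()
print(mean_known > mean_unknown)  # known-sex animals average more captures
```

Since apparent survival estimates increase with the number of observations, this selection effect biases the known-sex group high and the unknown-sex group low, which is the problem the paper's sex-ratio and sex-assignment parameters address.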

  3. Assignment of the Internal Vibrational Modes of C70 by Inelastic Neutron Scattering Spectroscopy and Periodic-DFT

    PubMed Central

    Refson, Keith; Parker, Stewart F

    2015-01-01

    The fullerene C70 may be considered as the shortest possible nanotube capped by a hemisphere of C60 at each end. Vibrational spectroscopy is a key tool in characterising fullerenes, and C70 has been studied several times and spectral assignments proposed. Unfortunately, many of the modes are either forbidden or have very low infrared or Raman intensity, even if allowed. Inelastic neutron scattering (INS) spectroscopy is not subject to selection rules, and all the modes are allowed. We have obtained a new INS spectrum from a large sample recorded at the highest resolution available. An advantage of INS spectroscopy is that it is straightforward to calculate the spectral intensity from a model. We demonstrate that all previous assignments are incorrect in at least some respects and propose a new assignment based on periodic density functional theory (DFT) that successfully reproduces the INS, infrared, and Raman spectra. PMID:26491642

  4. Implementing a Systematic Process for Consistent Nursing Care in a NICU: A Quality Improvement Project.

    PubMed

    McCarley, Renay Marie; Dowling, Donna A; Dolansky, Mary A; Bieda, Amy

    2018-03-01

    The global aim of this quality improvement project was to develop and implement a systematic process to assign and maintain consistent bedside nurses for infants and families. A systematic process based on a primary care nursing model was implemented to assign consistent care in a 48-bed, single-family-room NICU. Four PDSA cycles were necessary to obtain agreement from the nursing staff on the best process for assigning primary nurses. Post-intervention data revealed a 9.5 percent decrease in consistent caregivers for infants in the NICU ≤28 days and a 2.3 percent increase in consistent caregivers for infants in the NICU ≥29 days. Although these findings did not meet the goal of the specific aim, a systematic process was created for assigning bedside nurses to infants. Further PDSA cycles will be needed to refine the process to reach the aim.

  5. Effects of rational-emotive therapy on psychophysiological and reported measures of test anxiety arousal.

    PubMed

    Barabasz, A F; Barabasz, M

    1981-07-01

    Developed audio-taped lectures, taped therapy-session models, and homework assignments designed to reduce irrational beliefs associated with test anxiety within Ellis' rational-emotive therapy (RET) approach. The initial sample consisted of 148 university students. Compared with an attention-placebo counseling program (established by a post-experiment inquiry to be equally credible) and a no-treatment group, RET subjects showed significantly lower skin conductance responses to a test anxiety visualization and lower reported anxiety on a questionnaire. However, skin conductance responses to an alternative test anxiety visualization did not show treatment effects.

  6. Decision support for hospital bed management using adaptable individual length of stay estimations and shared resources

    PubMed Central

    2013-01-01

    Background Elective patient admission and assignment planning is an important task of the strategic and operational management of a hospital and early on became a central topic of clinical operations research. The management of hospital beds is an important subtask. Various approaches have been proposed, involving the computation of efficient assignments with regard to the patients’ condition, the necessity of the treatment, and the patients’ preferences. However, these approaches are mostly based on static, unadaptable estimates of the length of stay and, thus, do not take into account the uncertainty of the patient’s recovery. Furthermore, the effect of aggregated bed capacities have not been investigated in this context. Computer supported bed management, combining an adaptable length of stay estimation with the treatment of shared resources (aggregated bed capacities) has not yet been sufficiently investigated. The aim of our work is: 1) to define a cost function for patient admission taking into account adaptable length of stay estimations and aggregated resources, 2) to define a mathematical program formally modeling the assignment problem and an architecture for decision support, 3) to investigate four algorithmic methodologies addressing the assignment problem and one base-line approach, and 4) to evaluate these methodologies w.r.t. cost outcome, performance, and dismissal ratio. Methods The expected free ward capacity is calculated based on individual length of stay estimates, introducing Bernoulli distributed random variables for the ward occupation states and approximating the probability densities. The assignment problem is represented as a binary integer program. Four strategies for solving the problem are applied and compared: an exact approach, using the mixed integer programming solver SCIP; and three heuristic strategies, namely the longest expected processing time, the shortest expected processing time, and random choice. 
A baseline approach serves to compare these optimization strategies with a simple model of the status quo. All the approaches are evaluated by a realistic discrete event simulation: the outcomes are the ratio of successful assignments and dismissals, the computation time, and the model’s cost factors. Results A discrete event simulation of 226,000 cases shows a reduction of the dismissal rate compared to the baseline by more than 30 percentage points (from a mean dismissal ratio of 74.7% to 40.06%, comparing the status quo with the optimization strategies). Each of the optimization strategies leads to an improved assignment. The exact approach has only a marginal advantage over the heuristic strategies in the model’s cost factors (≤3%). Moreover, this marginal advantage was only achieved at the price of a computation time fifty times that of the heuristic models (an average computing time of 141 s using the exact method vs. 2.6 s for the heuristic strategies). Conclusions In terms of its performance and the quality of its solution, the heuristic strategy RAND is the preferred method for bed assignment in the case of shared resources. Future research is needed to investigate whether an equally marked improvement can be achieved in a large-scale clinical application study, ideally one comprising all the departments involved in admission and assignment planning. PMID:23289448
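The expected free ward capacity described in the Methods, computed from Bernoulli-distributed occupation indicators, can be sketched as follows (a minimal illustration with hypothetical numbers, not the paper's implementation):

```python
# Hedged sketch (numbers are hypothetical): expected free ward capacity from
# individual length-of-stay estimates. Each current patient i is still in a
# bed tomorrow with probability p_i (a Bernoulli occupation indicator), so
# the expected occupancy is simply the sum of the p_i.

def expected_free_beds(capacity, stay_probabilities):
    expected_occupied = sum(stay_probabilities)
    return capacity - expected_occupied

# A 10-bed ward with six patients whose chances of still being admitted
# tomorrow range from near-certain to near-discharge.
p_stay = [0.95, 0.9, 0.8, 0.5, 0.3, 0.1]
print(round(expected_free_beds(10, p_stay), 2))  # 6.45
```

Because the p_i are re-estimated as each patient's recovery unfolds, this quantity adapts over time, which is the adaptability the paper contrasts with static length-of-stay estimates.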

  7. A general system for automatic biomedical image segmentation using intensity neighborhoods.

    PubMed

    Chen, Cheng; Ozolek, John A; Wang, Wei; Rohde, Gustavo K

    2011-01-01

    Image segmentation is important, with applications to several problems in biology and medicine. While extensively researched, current segmentation methods generally perform adequately in the applications for which they were designed, but often require extensive modifications or calibrations before being used in a different application. We describe an approach that, with few modifications, can be used in a variety of image segmentation problems. The approach is based on a supervised learning strategy that utilizes intensity neighborhoods to assign each pixel in a test image its correct class based on training data. We describe methods for modeling rotations and variations in scale, as well as a subset selection for training the classifiers. We show that the performance of our approach in tissue segmentation tasks in magnetic resonance and histopathology microscopy images, as well as in nuclei segmentation from fluorescence microscopy images, is similar to or better than that of several algorithms specifically designed for each of these applications.
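The core idea, representing each pixel by its intensity neighborhood and assigning the class of the nearest training example, can be sketched as follows (an illustrative 1-NN reduction with toy images, not the authors' full system with rotation and scale modeling):

```python
import numpy as np

# Hedged sketch of intensity-neighborhood classification (toy images, not
# the authors' full system): each pixel is represented by the vector of
# intensities in a small window around it and is assigned the class of its
# nearest training neighborhood (1-NN, Euclidean distance).

def neighborhoods(image, radius=1):
    """Stack the (2r+1)x(2r+1) window around every pixel into a feature row."""
    padded = np.pad(image, radius, mode='edge')
    h, w = image.shape
    size = 2 * radius + 1
    feats = np.empty((h * w, size * size))
    for i in range(h):
        for j in range(w):
            feats[i * w + j] = padded[i:i + size, j:j + size].ravel()
    return feats

def classify(test_image, train_feats, train_labels, radius=1):
    feats = neighborhoods(test_image, radius)
    d = ((feats[:, None, :] - train_feats[None, :, :]) ** 2).sum(axis=2)
    return train_labels[d.argmin(axis=1)].reshape(test_image.shape)

# Training data: one uniformly dark image (class 0), one bright (class 1).
train_feats = np.vstack([neighborhoods(np.zeros((3, 3))),
                         neighborhoods(np.full((3, 3), 9.0))])
train_labels = np.array([0] * 9 + [1] * 9)

test = np.array([[0., 0., 9.], [0., 0., 9.], [0., 9., 9.]])
pred = classify(test, train_feats, train_labels)
print(pred)  # recovers the dark/bright segmentation of the test image
```

Swapping in a different training set retargets the same code to a new segmentation problem, which is the portability the abstract emphasizes.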

  8. Multiple-Event Seismic Location Using the Markov-Chain Monte Carlo Technique

    NASA Astrophysics Data System (ADS)

    Myers, S. C.; Johannesson, G.; Hanley, W.

    2005-12-01

    We develop a new multiple-event location algorithm (MCMCloc) that utilizes the Markov-Chain Monte Carlo (MCMC) method. Unlike most inverse methods, the MCMC approach produces a suite of solutions, each of which is consistent with observations and prior estimates of data and model uncertainties. Model parameters in MCMCloc consist of event hypocenters and travel-time predictions. Data are arrival time measurements and phase assignments. Posterior estimates of event locations, path corrections, pick errors, and phase assignments are made through analysis of the suite of acceptable solutions. Prior uncertainty estimates include correlations between travel-time predictions, correlations between measurement errors, the probability of misidentifying one phase for another, and the probability of spurious data. Inclusion of prior constraints on location accuracy allows direct utilization of ground-truth locations or well-constrained location parameters (e.g. from InSAR) that aid in the accuracy of the solution. Implementation of a correlation structure for travel-time predictions allows MCMCloc to operate over arbitrarily large geographic areas. The transition in behavior between a multiple-event locator for tightly clustered events and a single-event locator for solitary events is controlled by the spatial correlation of travel-time predictions. We test the MCMC locator on a regional data set of Nevada Test Site nuclear explosions. Event locations and origin times are known for these events, allowing us to test the features of MCMCloc using a high-quality ground truth data set. Preliminary tests suggest that MCMCloc provides excellent relative locations, often outperforming traditional multiple-event location algorithms, and excellent absolute locations are attained when constraints from one or more ground truth events are included.
When phase assignments are switched, we find that MCMCloc properly corrects the error when predicted arrival times are separated by several seconds. In cases where the predicted arrival times are within the combined uncertainty of prediction and measurement errors, MCMCloc determines the probability of one or the other phase assignment and propagates this uncertainty into all model parameters. We find that MCMCloc is a promising method for simultaneously locating large, geographically distributed data sets. Because we incorporate prior knowledge on many parameters, MCMCloc is ideal for combining trusted data with data of unknown reliability. This work was performed under the auspices of the U.S. Department of Energy by the University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48, Contribution UCRL-ABS-215048
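The heart of any MCMC locator is a Metropolis accept/reject step over perturbed solutions, retaining the suite of acceptable samples. A one-dimensional toy sketch (uniform velocity, hypothetical values; not the MCMCloc implementation):

```python
import math
import random

# Hedged sketch (1-D toy problem, hypothetical values; not MCMCloc): a
# Metropolis sampler proposes perturbed epicenters and accepts them with
# probability min(1, L'/L), so the retained samples form a suite of
# solutions consistent with the arrival-time data.

def log_likelihood(x, stations, arrivals, v=5.0, sigma=0.1):
    # Gaussian pick errors around predicted travel times |x - s| / v
    return -sum((t - abs(x - s) / v) ** 2
                for s, t in zip(stations, arrivals)) / (2 * sigma ** 2)

def metropolis(stations, arrivals, n=5000, step=0.5, seed=7):
    random.seed(seed)
    x, samples = 0.0, []
    for _ in range(n):
        proposal = x + random.gauss(0.0, step)
        delta = (log_likelihood(proposal, stations, arrivals)
                 - log_likelihood(x, stations, arrivals))
        if random.random() < math.exp(min(0.0, delta)):
            x = proposal
        samples.append(x)
    return samples

stations = [-10.0, 0.0, 12.0]                      # station positions (km)
arrivals = [abs(3.0 - s) / 5.0 for s in stations]  # noise-free picks, event at x = 3
samples = metropolis(stations, arrivals)
est = sum(samples[1000:]) / len(samples[1000:])    # posterior mean after burn-in
print(round(est, 1))  # typically close to 3.0
```

The spread of the retained samples directly characterizes location uncertainty, which is how a suite of solutions conveys more than a single best-fit location.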

  9. Research on air and missile defense task allocation based on extended contract net protocol

    NASA Astrophysics Data System (ADS)

    Zhang, Yunzhi; Wang, Gang

    2017-10-01

    Against the background of distributed cooperative engagement for air and missile defense, the problem of allocating interception tasks among multiple weapon units facing multiple targets under networked conditions is analyzed. Firstly, a mathematical model of task allocation is established by decomposing the combat task. Secondly, an initial assignment based on auction contracts and an adjustment scheme based on swap contracts are introduced into the task allocation. Finally, simulation of a typical scenario shows that the model can solve the task allocation problem in a complex combat environment.
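The two-phase scheme, auction-contract initialization followed by swap-contract adjustment, can be sketched as follows (hypothetical engagement values; in this toy version each weapon unit intercepts at most one target):

```python
import itertools

# Hedged sketch of the two-phase scheme (hypothetical engagement values):
# an auction builds an initial assignment, then pairwise swap contracts are
# applied while they improve the total engagement value.

def auction(values):
    """Each target goes to the best *still free* unit, in target order."""
    free = set(range(len(values)))
    assign = {}
    for t in range(len(values[0])):
        best = max(free, key=lambda u: values[u][t])
        assign[t] = best
        free.discard(best)
    return assign

def swap_improve(assign, values):
    """Accept pairwise swap contracts until no swap raises the total value."""
    improved = True
    while improved:
        improved = False
        for t1, t2 in itertools.combinations(sorted(assign), 2):
            u1, u2 = assign[t1], assign[t2]
            if values[u2][t1] + values[u1][t2] > values[u1][t1] + values[u2][t2]:
                assign[t1], assign[t2] = u2, u1
                improved = True
    return assign

values = [[9, 8],   # unit 0's engagement value against targets 0 and 1
          [7, 1]]   # unit 1
result = swap_improve(auction(values), values)
print(result)  # {0: 1, 1: 0}: a swap contract corrects the greedy auction
```

Here the auction greedily yields a total value of 10, and a single swap contract raises it to 15, illustrating why the adjustment phase is needed after initialization.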

  10. The Picmonic(®) Learning System: enhancing memory retention of medical sciences, using an audiovisual mnemonic Web-based learning platform.

    PubMed

    Yang, Adeel; Goel, Hersh; Bryan, Matthew; Robertson, Ron; Lim, Jane; Islam, Shehran; Speicher, Mark R

    2014-01-01

    Medical students are required to retain vast amounts of medical knowledge on the path to becoming physicians. To address this challenge, multimedia Web-based learning resources have been developed to supplement traditional text-based materials. The Picmonic(®) Learning System (PLS; Picmonic, Phoenix, AZ, USA) is a novel multimedia Web-based learning platform that delivers audiovisual mnemonics designed to improve memory retention of medical sciences. A single-center, randomized, subject-blinded, controlled study was conducted to compare the PLS with traditional text-based material for retention of medical science topics. Subjects were randomly assigned to use two different types of study materials covering several diseases. Subjects randomly assigned to the PLS group were given audiovisual mnemonics along with text-based materials, whereas subjects in the control group were given the same text-based materials with key terms highlighted. The primary endpoints were the differences in performance on immediate, 1 week, and 1 month delayed free-recall and paired-matching tests. The secondary endpoints were the difference in performance on a 1 week delayed multiple-choice test and self-reported satisfaction with the study materials. Differences were calculated using unpaired two-tailed t-tests. PLS group subjects demonstrated improvements of 65%, 161%, and 208% compared with control group subjects on free-recall tests conducted immediately, 1 week, and 1 month after study of materials, respectively. The results of performance on paired-matching tests showed an improvement of up to 331% for PLS group subjects. PLS group subjects also scored 55% higher than control group subjects on a 1 week delayed multiple-choice test requiring higher-order thinking.
The differences in test performance between the PLS group subjects and the control group subjects were statistically significant (P<0.001), and the PLS group subjects reported higher overall satisfaction with the material. The data of this pilot site demonstrate marked improvements in the retention of disease topics when using the PLS compared with traditional text-based materials. The use of the PLS in medical education is supported.
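The unpaired two-tailed t statistic used for the group comparisons can be computed as follows (equal-variance form; the scores below are hypothetical, not the study's data):

```python
import math

# Hedged sketch of the unpaired two-tailed t statistic used for group
# comparisons (equal-variance form; the scores below are hypothetical).

def t_statistic(a, b):
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)            # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)    # pooled variance
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

pls_scores = [82.0, 75.0, 90.0, 88.0]
control_scores = [60.0, 55.0, 71.0, 66.0]
print(round(t_statistic(pls_scores, control_scores), 2))  # 4.27
```

The statistic is then compared against the t distribution with na + nb - 2 degrees of freedom to obtain the two-tailed p-value reported in the abstract.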

  11. Improvements in osteoporosis testing and care are found following the wide scale implementation of the Ontario Fracture Clinic Screening Program: An interrupted time series analysis.

    PubMed

    Beaton, Dorcas E; Mamdani, Muhammad; Zheng, Hong; Jaglal, Susan; Cadarette, Suzanne M; Bogoch, Earl R; Sale, Joanna E M; Sujic, Rebeka; Jain, Ravi

    2017-12-01

    We evaluated the system-wide impact of a health intervention to improve treatment of osteoporosis after a fragility fracture. The intervention consisted of assigning a screening coordinator to selected fracture clinics to identify, educate, and follow up with fragility fracture patients and inform their physicians of the need to evaluate bone health. Thirty-seven hospitals in the province of Ontario (Canada) were assigned a screening coordinator. Twenty-three similar hospitals were control sites. All hospitals had orthopedic services and handled moderate-to-higher volumes of fracture patients. Administrative health data were used to evaluate the impact of the intervention. Fragility fracture patients (≥50 years; hip, humerus, forearm, spine, or pelvis fracture) were identified from administrative health records. Cases were fractures treated at 1 of the 37 hospitals assigned a coordinator. Controls were the same types of fractures at the control sites. Data were assembled for 20 quarters before and 10 quarters after the implementation (from January 2002 to March 2010). To test for a shift in trends, we employed an interrupted time series analysis, a study design used to evaluate the longitudinal effects of interventions through regression modelling. The primary outcome measure was bone mineral density (BMD) testing. Osteoporosis medication initiation and persistence rates were secondary outcomes in a subset of patients ≥66 years of age. A total of 147,071 patients were used in the analysis. BMD testing rates increased from 17.0% pre-intervention to 20.9% post-intervention at intervention sites (P < .01) compared with no change at control sites (14.9% and 14.9%, P = .33). Medication initiation improved significantly at intervention sites (21.6% to 23.97%; P = .02) but not at control sites (17.5% to 18.5%; P = .27).
Persistence with bisphosphonates decreased at all sites, from 59.9% to 56.4% at intervention sites (P = .02) and more so from 62.3% to 54.2% at control sites (P < .01), using a 50% proportion of days covered (PDC 50). Significant improvements in BMD testing and treatment initiation were observed after the initiation of a coordinator-based screening program to improve osteoporosis management following fragility fracture.
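The interrupted time series analysis described above fits a segmented regression with level-change and slope-change terms at the intervention date. A minimal sketch on synthetic quarterly data (values hypothetical, not the study's):

```python
import numpy as np

# Hedged sketch of the segmented regression behind an interrupted time
# series analysis (synthetic data, hypothetical values):
#   rate = b0 + b1*t + b2*post + b3*(t - t0)*post
# where `post` flags quarters after the intervention at t0, b2 is the
# level change, and b3 the slope change.

t = np.arange(30, dtype=float)        # 20 pre- and 10 post-intervention quarters
t0 = 20.0
post = (t >= t0).astype(float)
rate = 17.0 + 0.05 * t + 3.9 * post   # synthetic series with a 3.9-point level shift

X = np.column_stack([np.ones_like(t), t, post, (t - t0) * post])
beta, *_ = np.linalg.lstsq(X, rate, rcond=None)
print(round(float(beta[2]), 1))  # 3.9: the level change is recovered
```

Separating the level change (b2) from the pre-existing trend (b1) is what lets the design attribute a shift to the intervention rather than to secular improvement.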

  13. Moving beyond "Bookish Knowledge": Using Film-Based Assignments to Promote Deep Learning

    ERIC Educational Resources Information Center

    Olson, Joann S.; Autry, Linda; Moe, Jeffry

    2016-01-01

    This article investigates the effectiveness of a film-based assignment given to adult learners in a graduate-level group counseling class. Semi-structured interviews were conducted with four students; data analysis suggested film-based assignments may promote deep approaches to learning (DALs). Participants indicated the assignment helped them…

  14. Relationship auditing of the FMA ontology

    PubMed Central

    Gu, Huanying (Helen); Wei, Duo; Mejino, Jose L.V.; Elhanan, Gai

    2010-01-01

    The Foundational Model of Anatomy (FMA) ontology is a domain reference ontology based on a disciplined modeling approach. Due to its large size, semantic complexity, and manual data entry process, errors and inconsistencies are unavoidable and might remain undetected within the FMA structure. In this paper, we present computable methods to highlight candidate concepts for various relationship assignment errors. The process starts with locating structures formed by transitive structural relationships (part_of, tributary_of, branch_of) and examining their assignments in the context of the IS-A hierarchy. The algorithms were designed to detect five major categories of possible incorrect relationship assignments: circular, mutually exclusive, redundant, inconsistent, and missed entries. A domain expert reviewed samples of these presumptive errors to confirm the findings. A total of 7052 presumptive errors were detected, with the largest proportion related to part_of relationship assignments. The results highlight the fact that errors are unavoidable in complex ontologies and that well-designed algorithms can help domain experts focus on concepts with a high likelihood of errors and maximize their effort to ensure consistency and reliability. In the future, similar methods might be integrated with data entry processes to offer real-time error detection. PMID:19475727
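One of the audit categories above, circular relationship assignments, reduces to cycle detection over a transitive relation such as part_of. A minimal sketch (the concepts and the erroneous entry are hypothetical):

```python
# Hedged sketch of one audit category -- circular relationship assignments
# (the concepts and the erroneous entry below are hypothetical): a
# transitive relation such as part_of must be acyclic, so any concept that
# can reach itself through the relation is flagged for expert review.

def find_cycles(part_of):
    """Return the set of concepts lying on some part_of cycle."""
    flagged = set()
    visiting = []
    def dfs(node):
        if node in visiting:
            flagged.update(visiting[visiting.index(node):])  # cycle found
            return
        visiting.append(node)
        for parent in part_of.get(node, ()):
            dfs(parent)
        visiting.pop()
    for concept in part_of:
        dfs(concept)
    return flagged

part_of = {
    'left atrium': {'heart'},
    'heart': {'thorax'},
    'thorax': {'left atrium'},   # erroneous entry closing a cycle
    'lung': {'thorax'},
}
print(sorted(find_cycles(part_of)))  # ['heart', 'left atrium', 'thorax']
```

Flagging only the concepts on the cycle, rather than everything reachable from it, keeps the expert's review focused, in line with the paper's goal of maximizing expert effort.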

  15. Randomized Trial Comparing Two Treatment Strategies Using Prize-Based Reinforcement of Abstinence in Cocaine and Opiate Users

    ERIC Educational Resources Information Center

    Preston, Kenzie L.; Ghitza, Udi E.; Schmittner, John P.; Schroeder, Jennifer R.; Epstein, David H.

    2008-01-01

    We compared two strategies of prize-based contingency management (CM) in methadone-maintained outpatients. Urine was tested thrice weekly for 5 weeks pre-CM, 12 weeks CM, and 8 weeks post-CM. Participants were randomly assigned to a cocaine contingency (four prize draws for each cocaine-negative urine, N = 29) or an opiate-cocaine contingency (one…

  16. In vitro and in silico derived relative effect potencies of ah-receptor-mediated effects by PCDD/Fs and PCBs in rat, mouse, and guinea pig CALUX cell lines.

    PubMed

    Ghorbanzadeh, Mehdi; van Ede, Karin I; Larsson, Malin; van Duursen, Majorie B M; Poellinger, Lorenz; Lücke-Johansson, Sandra; Machala, Miroslav; Pěnčíková, Kateřina; Vondráček, Jan; van den Berg, Martin; Denison, Michael S; Ringsted, Tine; Andersson, Patrik L

    2014-07-21

    For a better understanding of species-specific relative effect potencies (REPs), responses of dioxin-like compounds (DLCs) were assessed. REPs were calculated using chemical-activated luciferase gene expression assays (CALUX) derived from guinea pig, rat, and mouse cell lines. Almost all 20 congeners tested in the rodent cell lines were partial agonists and less efficacious than 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD). For this reason, REPs were calculated for each congener using concentrations at which 20% of the maximal TCDD response was reached (REP20TCDD). REP20TCDD values obtained for PCDD/Fs were comparable with their toxic equivalency factors assigned by the World Health Organization (WHO-TEF), while those for PCBs were in general lower than the WHO-TEF values. Moreover, the guinea pig cell line was the most sensitive, as indicated by the 20% effect concentrations of TCDD of 1.5, 5.6, and 11.0 pM for guinea pig, rat, and mouse cells, respectively. A similar response pattern was observed using multivariate statistical analysis between the three CALUX assays and the WHO-TEFs. The mouse assay showed minor deviation due to higher relative induction potential for 2,3,7,8-tetrachlorodibenzofuran and 2,3,4,6,7,8-hexachlorodibenzofuran and lower for 1,2,3,4,6,7,8-heptachlorodibenzofuran and 3,3',4,4',5-pentachlorobiphenyl (PCB126). 2,3,7,8-Tetrachlorodibenzofuran was more than two times more potent in the mouse assay as compared with the rat and guinea pig cells, while the measured REP20TCDD for PCB126 was lower in mouse cells (0.05) as compared with that of the guinea pig (0.2) and rat (0.07). In order to provide REP20TCDD values for all WHO-TEF assigned compounds, quantitative structure-activity relationship (QSAR) models were developed. The QSAR models showed that specific electronic properties and molecular surface characteristics play important roles in the AhR-mediated response. In silico derived REP20TCDD values were generally consistent with the WHO-TEFs with a few exceptions. The QSAR models indicated that, e.g., 1,2,3,7,8-pentachlorodibenzofuran and 1,2,3,7,8,9-hexachlorodibenzofuran were more potent than given by their assigned WHO-TEF values, and the non-ortho PCB 81 was predicted, based on the guinea pig model, to be one order of magnitude above its WHO-TEF value. By combining in vitro and in silico approaches, REPs were established for all WHO-TEF assigned compounds (except OCDD), which will provide future guidance in testing AhR-mediated responses of DLCs and increase our understanding of species variation in AhR-mediated effects.
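    The REP20TCDD calculation described above can be sketched by inverting a Hill dose-response curve at 20% of the maximal TCDD response; the Emax and EC50 values below are illustrative placeholders, not measured CALUX parameters:

```python
def conc_at_response(r, emax, ec50, hill=1.0):
    # Invert the Hill curve R(c) = emax * c**h / (ec50**h + c**h) for R = r,
    # giving the concentration that produces response r.
    if not 0 < r < emax:
        raise ValueError("target response outside the curve's range")
    return ec50 * (r / (emax - r)) ** (1.0 / hill)

# Illustrative parameters only (hypothetical, not measured values):
#   TCDD:     full agonist,    Emax = 1.0, EC50 = 10 pM
#   congener: partial agonist, Emax = 0.6, EC50 = 50 pM
target = 0.2 * 1.0                         # 20% of the maximal TCDD response
c_tcdd = conc_at_response(target, 1.0, 10.0)
c_cong = conc_at_response(target, 0.6, 50.0)
rep20 = c_tcdd / c_cong                    # REP20TCDD = EC20(TCDD) / EC20(congener)
print(c_tcdd, c_cong, rep20)               # 2.5 25.0 0.1
```

    Because the congener is a partial agonist, comparing at a fixed fraction of the TCDD maximum (rather than at EC50s) keeps the comparison on a response level both curves actually reach.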

  17. Distributed resource allocation under communication constraints

    NASA Astrophysics Data System (ADS)

    Dodin, Pierre; Nimier, Vincent

    2001-03-01

    This paper studies the multi-sensor management problem for multi-target tracking. When several sensors observe the same target, they can fuse their data during the information process, and this possibility must be taken into account when computing the optimal sensor-target association at each time step. To solve this problem for a real large-scale system, both the information aspect and the control aspect of the problem must be considered. One way to unify them is to use a decentralized filtering algorithm locally driven by an assignment algorithm. The decentralized filtering algorithm used in our model is that of Grime, which relaxes the usual fully connected hypothesis. By fully connected, one means that the information in a fully connected system is totally distributed everywhere at the same moment, which is unrealistic for a real large-scale system. We model the distributed assignment decision with a greedy algorithm: each sensor performs a global optimization in order to estimate the other sensors' information sets. A consequence of relaxing the fully connected hypothesis is that the sensors' information sets are not the same at each time step, producing an information asymmetry in the system. The assignment algorithm uses local knowledge of this asymmetry. By testing the reactions and the coherence of the local assignment decisions of our system against maneuvering targets, we show that decentralized assignment control remains feasible even though the system is not fully connected.
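    The greedy assignment step can be sketched in a few lines. The gain values below are illustrative placeholders, and the paper's algorithm additionally estimates the other sensors' information sets, which this sketch omits:

```python
def greedy_assign(gain):
    """Greedily pick the best remaining (sensor, target) pair.

    gain[s][t] is an illustrative expected information gain if sensor s
    observes target t.  Each sensor is assigned exactly one target, but
    several sensors may share a target, since collaborating sensors
    fuse their data.
    """
    pairs = sorted(
        ((gain[s][t], s, t)
         for s in range(len(gain))
         for t in range(len(gain[s]))),
        reverse=True)                      # best pairs first
    assignment = {}
    for g, s, t in pairs:
        if s not in assignment:            # one target per sensor
            assignment[s] = t
    return assignment

# three sensors, two targets
print(greedy_assign([[5, 1], [4, 3], [2, 6]]))   # {2: 1, 0: 0, 1: 0}
```

    A centralized solver could instead use the Hungarian algorithm for a one-to-one assignment, but the greedy rule is cheap enough to run locally at every time step.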

  18. Students' Achievement and Homework Assignment Strategies.

    PubMed

    Fernández-Alonso, Rubén; Álvarez-Díaz, Marcos; Suárez-Álvarez, Javier; Muñiz, José

    2017-01-01

    The optimum time students should spend on homework has been widely researched, although the results are far from unanimous. The main objective of this research is to analyze how homework assignment strategies in schools affect students' academic performance and the differences in students' time spent on homework. Participants were a representative sample of Spanish adolescents (N = 26,543) with a mean age of 14.4 (±0.75), 49.7% girls. A test battery was used to measure academic performance in four subjects: Spanish, Mathematics, Science, and Citizenship. A questionnaire allowed the measurement of the indicators used for the description of homework and control variables. Two three-level hierarchical-linear models (student, school, autonomous community) were produced for each subject being evaluated. The relationship between academic results and homework time is negative at the individual level but positive at school level. An increase in the amount of homework a school assigns is associated with an increase in the differences in student time spent on homework. An optimum amount of homework is proposed which schools should assign to maximize gains in achievement for students overall.
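    The sign flip between the individual-level and school-level relationships is the kind of pattern a multilevel model separates out. A minimal simulation (illustrative parameters, not the study's data) that recovers a negative within-school slope alongside a positive between-school slope:

```python
import random
import statistics

def slope(xs, ys):
    # ordinary least-squares slope of y on x
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

random.seed(1)
schools = []
for s in range(20):
    school_hw = 2 + s * 0.2                  # school-assigned homework time
    students = []
    for _ in range(30):
        t = school_hw + random.gauss(0, 1)   # student's own homework time
        # Achievement rises with the school's assignment but falls with
        # a student's extra time relative to schoolmates (hypothetical
        # effect sizes chosen only to reproduce the sign pattern).
        score = 50 + 3 * school_hw - 2 * (t - school_hw) + random.gauss(0, 1)
        students.append((t, score))
    schools.append(students)

# between-school slope: regression on school means
mt = [statistics.fmean(t for t, _ in sch) for sch in schools]
ms = [statistics.fmean(y for _, y in sch) for sch in schools]
between = slope(mt, ms)

# within-school slope: regression on deviations from each school's mean
wx, wy = [], []
for sch, a, b in zip(schools, mt, ms):
    for t, y in sch:
        wx.append(t - a)
        wy.append(y - b)
within = slope(wx, wy)

print("between-school slope:", round(between, 2))
print("within-school slope:", round(within, 2))
```

    Pooling all students into a single regression would blend these two slopes together, which is why the study fits hierarchical models rather than a flat one.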

  19. Students' Achievement and Homework Assignment Strategies

    PubMed Central

    Fernández-Alonso, Rubén; Álvarez-Díaz, Marcos; Suárez-Álvarez, Javier; Muñiz, José

    2017-01-01

    The optimum time students should spend on homework has been widely researched although the results are far from unanimous. The main objective of this research is to analyze how homework assignment strategies in schools affect students' academic performance and the differences in students' time spent on homework. Participants were a representative sample of Spanish adolescents (N = 26,543) with a mean age of 14.4 (±0.75), 49.7% girls. A test battery was used to measure academic performance in four subjects: Spanish, Mathematics, Science, and Citizenship. A questionnaire allowed the measurement of the indicators used for the description of homework and control variables. Two three-level hierarchical-linear models (student, school, autonomous community) were produced for each subject being evaluated. The relationship between academic results and homework time is negative at the individual level but positive at school level. An increase in the amount of homework a school assigns is associated with an increase in the differences in student time spent on homework. An optimum amount of homework is proposed which schools should assign to maximize gains in achievement for students overall. PMID:28326046

  20. Design of multivariable feedback control systems via spectral assignment using reduced-order models and reduced-order observers

    NASA Technical Reports Server (NTRS)

    Mielke, R. R.; Tung, L. J.; Carraway, P. I., III

    1984-01-01

    The feasibility of using reduced-order models and reduced-order observers with eigenvalue/eigenvector assignment procedures is investigated. A review of spectral assignment synthesis procedures is presented. Then, a reduced-order model which retains essential system characteristics is formulated. A constant state feedback matrix which assigns desired closed-loop eigenvalues and approximates specified closed-loop eigenvectors is calculated for the reduced-order model. It is shown that the eigenvalue and eigenvector assignments made in the reduced-order system are retained when the feedback matrix is implemented about the full-order system. In addition, those modes and associated eigenvectors which are not included in the reduced-order model remain unchanged in the closed-loop full-order system. The full state feedback design is then implemented by using a reduced-order observer. It is shown that the eigenvalue and eigenvector assignments of the closed-loop full-order system remain unchanged when a reduced-order observer is used. The design procedure is illustrated by an actual design problem.
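    For a single-input system already in controllable canonical form, the eigenvalue-assignment step reduces to matching characteristic-polynomial coefficients. A minimal sketch for a double-integrator plant (an illustrative example, not the paper's design problem):

```python
import cmath

# Double-integrator plant in controllable canonical form:
#   x1' = x2,  x2' = u   ->   A = [[0, 1], [0, 0]],  B = [0, 1]^T
# With state feedback u = -K x, K = [k1, k2], the closed-loop matrix
# A - B*K = [[0, 1], [-k1, -k2]] has characteristic polynomial
# s^2 + k2*s + k1, so the gains are read directly from the desired
# polynomial coefficients.

def gains_for_poles(p1, p2):
    # desired polynomial (s - p1)(s - p2) = s^2 + a1*s + a0
    a1 = -(p1 + p2)
    a0 = p1 * p2
    return a0, a1                          # K = [a0, a1]

def closed_loop_eigs(k1, k2):
    # eigenvalues of [[0, 1], [-k1, -k2]] via the quadratic formula
    disc = cmath.sqrt(k2 * k2 - 4 * k1)
    return (-k2 + disc) / 2, (-k2 - disc) / 2

k1, k2 = gains_for_poles(-1.0, -2.0)       # place poles at -1 and -2
print(k1, k2)                              # 2.0 3.0
print(closed_loop_eigs(k1, k2))            # ((-1+0j), (-2+0j))
```

    The paper's contribution is that gains computed this way for a reduced-order model, and implemented through a reduced-order observer, preserve the assigned spectrum on the full-order system; general multivariable designs would use a full eigenstructure-assignment routine rather than this canonical-form shortcut.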
