Sample records for programming-based multi-category constrained

  1. Strategic considerations for support of humans in space and Moon/Mars exploration missions. Life sciences research and technology programs, volume 2

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Summary charts of the following topics are presented: the Percentage of Critical Questions in Constrained and Robust Programs; the Executive Committee and AMAC Disposition of Critical Questions for Constrained and Robust Programs; and the Requirements for Ground-based Research and Flight Platforms for Constrained and Robust Programs. Data tables are also presented and cover the following: critical questions from all Life Sciences Division Discipline Science Plans; critical questions listed by category and criticality; all critical questions which require ground-based research; critical questions that would utilize spacelabs, listed by category and criticality; critical questions that would utilize Space Station Freedom (SSF), listed by category and criticality; critical questions that would utilize the SSF Centrifuge facility, listed by category and criticality; critical questions that would utilize a Moon base, listed by category and criticality; critical questions that would utilize robotic missions, listed by category and criticality; critical questions that would utilize free flyers, listed by category and criticality; and critical questions by deliverables.

  2. Functional connectivity constrains the category-related organization of human ventral occipitotemporal cortex

    PubMed Central

    Stevens, W. Dale; Tessler, Michael Henry; Peng, Cynthia S.; Martin, Alex

    2015-01-01

    One of the most robust and oft-replicated findings in cognitive neuroscience is that several spatially distinct, functionally dissociable ventral occipitotemporal cortex (VOTC) regions respond preferentially to different categories of concrete entities. However, the determinants of this category-related organization remain to be fully determined. One recent proposal is that privileged connectivity of these VOTC regions with other regions that store and/or process category-relevant properties may be a major contributing factor. To test this hypothesis, we used a multi-category functional MRI localizer to individually define category-related brain regions of interest (ROIs) in a large group of subjects (n=33). We then used these ROIs in resting-state functional connectivity MRI analyses to explore spontaneous functional connectivity among these regions. We demonstrate that during rest, distinct category-preferential VOTC regions show differentially stronger functional connectivity with other regions that have congruent category-preference, as defined by the functional localizer. Importantly, a ‘tool’-preferential region in the left medial fusiform gyrus showed differentially stronger functional connectivity with other left lateralized cortical regions associated with perceiving and knowing about common tools – posterior middle temporal gyrus (involved in perception of non-biological motion), lateral parietal cortex (critical for reaching, grasping, manipulating), and ventral premotor cortex (involved in storing/executing motor programs) – relative to other category-related regions in VOTC of both the right and left hemisphere. Our findings support the claim that privileged connectivity with other cortical regions that store and/or process category-relevant properties constrains the category-related organization of VOTC. PMID:25704493

  3. Fuzzy multi-objective chance-constrained programming model for hazardous materials transportation

    NASA Astrophysics Data System (ADS)

    Du, Jiaoman; Yu, Lean; Li, Xiang

    2016-04-01

    Hazardous materials transportation is an important and pressing issue of public safety. Based on the shortest-path model, this paper presents a fuzzy multi-objective programming model that minimizes the transportation risk to life, travel time and fuel consumption. First, we present the risk model, travel time model and fuel consumption model. Furthermore, we formulate a chance-constrained programming model within the framework of credibility theory, in which the lengths of arcs in the transportation network are assumed to be fuzzy variables. A hybrid intelligent algorithm integrating fuzzy simulation and a genetic algorithm is designed for finding a satisfactory solution. Finally, some numerical examples are given to demonstrate the efficiency of the proposed model and algorithm.
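
    The chance-constrained step described above can be illustrated with a small simulation: a candidate path is accepted only if the fraction of sampled arc-length scenarios meeting the travel-time limit reaches a confidence level. This is a hedged sketch that uses triangular random sampling as a stand-in for the paper's credibility measure; the arc data and thresholds are hypothetical.

```python
import random

def chance_feasible(path_arcs, limit, confidence, n_samples=2000, seed=42):
    """Estimate whether a chance constraint on total travel time holds.

    Each arc is a triangular fuzzy number (low, mode, high), approximated
    here by triangular random sampling -- a probability proxy for the
    credibility measure used in the paper, not the authors' exact method.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        total = sum(rng.triangular(low, high, mode)
                    for (low, mode, high) in path_arcs)
        if total <= limit:
            hits += 1
    return hits / n_samples >= confidence
```

    A path whose worst-case length already meets the limit is always accepted, while a limit below the best case is rejected; intermediate limits depend on the confidence level chosen.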

  4. Is Statistical Learning Constrained by Lower Level Perceptual Organization?

    PubMed Central

    Emberson, Lauren L.; Liu, Ran; Zevin, Jason D.

    2013-01-01

    In order for statistical information to aid in complex developmental processes such as language acquisition, learning from higher-order statistics (e.g. across successive syllables in a speech stream to support segmentation) must be possible while perceptual abilities (e.g. speech categorization) are still developing. The current study examines how perceptual organization interacts with statistical learning. Adult participants were presented with multiple exemplars from novel, complex sound categories designed to reflect some of the spectral complexity and variability of speech. These categories were organized into sequential pairs and presented such that higher-order statistics, defined based on sound categories, could support stream segmentation. Perceptual similarity judgments and multi-dimensional scaling revealed that participants only perceived three perceptual clusters of sounds and thus did not distinguish the four experimenter-defined categories, creating a tension between lower level perceptual organization and higher-order statistical information. We examined whether the resulting pattern of learning is more consistent with statistical learning being “bottom-up,” constrained by the lower levels of organization, or “top-down,” such that higher-order statistical information of the stimulus stream takes priority over the perceptual organization, and perhaps influences perceptual organization. We consistently find evidence that learning is constrained by perceptual organization. Moreover, participants generalize their learning to novel sounds that occupy a similar perceptual space, suggesting that statistical learning occurs based on regions of or clusters in perceptual space. Overall, these results reveal a constraint on learning of sound sequences, such that statistical information is determined based on lower level organization. These findings have important implications for the role of statistical learning in language acquisition. PMID:23618755

  5. MOFA Software for the COBRA Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griesemer, Marc; Navid, Ali

    MOFA-COBRA is a software code for Matlab that performs Multi-Objective Flux Analysis (MOFA), the solution of linear programming problems with multiple objectives. The leading software package for conducting different types of analyses using constraint-based models is the COBRA Toolbox for Matlab. MOFA-COBRA is an added tool for COBRA that solves multi-objective problems using a novel algorithm.
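
    As a toy illustration of the kind of multi-objective linear program such a tool addresses, the sketch below scalarizes two "flux" objectives with weights and finds the LP optimum by enumerating constraint-intersection vertices. The network, bounds, and weights are invented for illustration and do not reflect MOFA-COBRA's actual algorithm.

```python
from itertools import combinations

# Toy 2-flux problem: maximize w1*v1 + w2*v2 subject to
#   v1 + v2 <= 10   (shared substrate uptake, hypothetical)
#   v1 <= 6, v2 <= 8, v1 >= 0, v2 >= 0
# An LP optimum lies at a vertex, so we enumerate pairwise
# constraint intersections and keep the best feasible one.
CONS = [(1, 1, 10), (1, 0, 6), (0, 1, 8), (-1, 0, 0), (0, -1, 0)]

def solve_weighted(w1, w2, cons=CONS):
    best = None
    for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
        det = a1 * b2 - a2 * b1
        if det == 0:
            continue  # parallel constraints, no vertex
        v1 = (c1 * b2 - c2 * b1) / det
        v2 = (a1 * c2 - a2 * c1) / det
        if all(a * v1 + b * v2 <= c + 1e-9 for a, b, c in cons):
            val = w1 * v1 + w2 * v2
            if best is None or val > best[0]:
                best = (val, v1, v2)
    return best  # (objective value, v1, v2)
```

    Sweeping the weights traces out the Pareto front of the two objectives, which is the idea behind weighted-sum multi-objective flux analysis.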

  6. An expert system for choosing the best combination of options in a general purpose program for automated design synthesis

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.; Barthelemy, J.-F. M.

    1986-01-01

    An expert system called EXADS has been developed to aid users of the Automated Design Synthesis (ADS) general purpose optimization program. ADS has approximately 100 combinations of strategy, optimizer, and one-dimensional search options from which to choose, and it is difficult for a nonexpert to make this choice. The expert system aids the user in choosing the best combination of options based on the user's knowledge of the problem and the expert knowledge stored in the knowledge base. The knowledge base is divided into three categories: constrained problems, unconstrained problems, and constrained problems being treated as unconstrained problems. The inference engine and rules are written in LISP; the system contains about 200 rules and executes on DEC-VAX (with Franz-LISP) and IBM PC (with IQ-LISP) computers.

  7. Advanced Computational Methods for Security Constrained Financial Transmission Rights: Structure and Parallelism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elbert, Stephen T.; Kalsi, Karanjit; Vlachopoulou, Maria

    Financial Transmission Rights (FTRs) help power market participants reduce price risks associated with transmission congestion. FTRs are issued based on a process of solving a constrained optimization problem with the objective to maximize the FTR social welfare under power flow security constraints. Security constraints for different FTR categories (monthly, seasonal or annual) are usually coupled and the number of constraints increases exponentially with the number of categories. Commercial software for FTR calculation can only provide limited categories of FTRs due to the inherent computational challenges mentioned above. In this paper, a novel non-linear dynamical system (NDS) approach is proposed to solve the optimization problem. The new formulation and performance of the NDS solver is benchmarked against widely used linear programming (LP) solvers like CPLEX™ and tested on large-scale systems using data from the Western Electricity Coordinating Council (WECC). The NDS is demonstrated to outperform the widely used CPLEX algorithms while exhibiting superior scalability. Furthermore, the NDS based solver can be easily parallelized, which results in significant computational improvement.

  8. MQ-MAC: A Multi-Constrained QoS-Aware Duty Cycle MAC for Heterogeneous Traffic in Wireless Sensor Networks

    PubMed Central

    Monowar, Muhammad Mostafa; Rahman, Md. Obaidur; Hong, Choong Seon; Lee, Sungwon

    2010-01-01

    Energy conservation is one of the most prominent research issues nowadays for power-constrained wireless sensor networks (WSNs), and hence several duty-cycle based MAC protocols have been devised for WSNs in the last few years. However, the assimilation of diverse applications with different QoS requirements (i.e., delay and reliability) within the same network also necessitates devising a generic duty-cycle based MAC protocol that can achieve both delay and reliability guarantees, termed multi-constrained QoS, while preserving energy efficiency. To address this, in this paper we propose a multi-constrained QoS-aware duty-cycle MAC for heterogeneous traffic in WSNs (MQ-MAC). MQ-MAC classifies the traffic based on their multi-constrained QoS demands. Through extensive simulation using ns-2 we evaluate the performance of MQ-MAC. MQ-MAC provides the desired delay and reliability guarantee according to the nature of the traffic classes as well as achieves energy efficiency. PMID:22163439
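
    The traffic-classification idea (mapping each flow's delay and reliability demands to a service class) might be sketched as follows; the class names and thresholds are hypothetical, not taken from MQ-MAC.

```python
def classify(delay_req_ms, reliability_req):
    """Map a flow's multi-constrained QoS demands to a traffic class.

    Illustrative only: thresholds (100 ms, 0.9) and class names are
    assumptions, not MQ-MAC's actual classification rules.
    """
    delay_sensitive = delay_req_ms is not None and delay_req_ms <= 100
    reliable = reliability_req is not None and reliability_req >= 0.9
    if delay_sensitive and reliable:
        return "critical"
    if delay_sensitive:
        return "delay-sensitive"
    if reliable:
        return "reliability-sensitive"
    return "best-effort"
```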

  9. Development of a multi-space constrained density functional theory approach and its application to graphene-based vertical transistors

    NASA Astrophysics Data System (ADS)

    Kim, Han Seul; Kim, Yong-Hoon

    We have been developing a multi-space-constrained density functional theory approach for the first-principles calculations of nano-scale junctions subjected to non-equilibrium conditions and charge transport through them. In this presentation, we apply the method to vertically-stacked graphene/hexagonal boron nitride (hBN)/graphene van der Waals heterostructures in the context of tunneling transistor applications. Bias-dependent changes in energy level alignment, wavefunction hybridization, and current are extracted. In particular, we compare quantum transport properties of single-layer (graphene) and infinite (graphite) electrode limits on the same ground, which is not possible within the traditional non-equilibrium Green function formalism. The effects of point defects within hBN on the current-voltage characteristics will also be discussed. This work was supported by the Global Frontier Program (2013M3A6B1078881), Nano-Material Technology Development Programs (2016M3A7B4024133, 2016M3A7B4909944, and 2012M3A7B4049888), and Pioneer Program (2016M3C1A3906149) of the National Research Foundation.

  10. Advanced Computational Methods for Security Constrained Financial Transmission Rights

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalsi, Karanjit; Elbert, Stephen T.; Vlachopoulou, Maria

    Financial Transmission Rights (FTRs) are financial insurance tools to help power market participants reduce price risks associated with transmission congestion. FTRs are issued based on a process of solving a constrained optimization problem with the objective to maximize the FTR social welfare under power flow security constraints. Security constraints for different FTR categories (monthly, seasonal or annual) are usually coupled and the number of constraints increases exponentially with the number of categories. Commercial software for FTR calculation can only provide limited categories of FTRs due to the inherent computational challenges mentioned above. In this paper, first an innovative mathematical reformulation of the FTR problem is presented which dramatically improves the computational efficiency of the optimization problem. After having re-formulated the problem, a novel non-linear dynamic system (NDS) approach is proposed to solve the optimization problem. The new formulation and performance of the NDS solver is benchmarked against widely used linear programming (LP) solvers like CPLEX™ and tested on both standard IEEE test systems and large-scale systems using data from the Western Electricity Coordinating Council (WECC). The performance of the NDS is demonstrated to be comparable and in some cases is shown to outperform the widely used CPLEX algorithms. The proposed formulation and NDS based solver is also easily parallelizable, enabling further computational improvement.

  11. Multiple utility constrained multi-objective programs using Bayesian theory

    NASA Astrophysics Data System (ADS)

    Abbasian, Pooneh; Mahdavi-Amiri, Nezam; Fazlollahtabar, Hamed

    2018-03-01

    A utility function is an important tool for representing a decision maker's (DM's) preference. We adjoin utility functions to multi-objective optimization problems. In current studies, usually one utility function is used for each objective function. Situations may arise, however, in which an objective has multiple utility functions. Here, we consider a constrained multi-objective problem with each objective having multiple utility functions. We induce the probability of the utilities for each objective function using Bayesian theory. Illustrative examples considering dependence and independence of variables are worked through to demonstrate the usefulness of the proposed model.

  12. On the nullspace of TLS multi-station adjustment

    NASA Astrophysics Data System (ADS)

    Sterle, Oskar; Kogoj, Dušan; Stopar, Bojan; Kregar, Klemen

    2018-07-01

    In this article we present an analytic aspect of TLS multi-station least-squares adjustment, with the main focus on the datum problem. The datum problem is, in contrast to previously published research, theoretically analyzed and solved, where the solution is based on the nullspace derivation of the mathematical model. The importance of the datum problem solution is seen in a complete description of TLS multi-station adjustment solutions from the set of all minimally constrained least-squares solutions. On the basis of the known nullspace, the estimable parameters are described and the geometric interpretation of all minimally constrained least-squares solutions is presented. Finally, a simulated example is used to analyze the results of TLS multi-station minimally constrained and inner-constrained least-squares adjustment solutions.

  13. A Mixed Integer Linear Programming Approach to Electrical Stimulation Optimization Problems.

    PubMed

    Abouelseoud, Gehan; Abouelseoud, Yasmine; Shoukry, Amin; Ismail, Nour; Mekky, Jaidaa

    2018-02-01

    Electrical stimulation optimization is a challenging problem. Even when a single region is targeted for excitation, the problem remains a constrained multi-objective optimization problem. The constrained nature of the problem results from safety concerns while its multi-objectives originate from the requirement that non-targeted regions should remain unaffected. In this paper, we propose a mixed integer linear programming formulation that can successfully address the challenges facing this problem. Moreover, the proposed framework can conclusively check the feasibility of the stimulation goals. This helps researchers to avoid wasting time trying to achieve goals that are impossible under a chosen stimulation setup. The superiority of the proposed framework over alternative methods is demonstrated through simulation examples.
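
    The "conclusive feasibility check" property of an exact integer formulation can be illustrated at toy scale: exhaustive search either returns an optimum or proves that no integer point satisfies the constraints. This sketch is illustrative only; real MILP solvers establish the same guarantee by branch-and-bound rather than enumeration, and the stimulation model itself is far larger.

```python
from itertools import product

def solve_tiny_milp(c, A_ub, b_ub, int_ranges):
    """Exhaustively solve a tiny integer program:
    minimize c.x  subject to  A_ub x <= b_ub,  x integer in int_ranges.

    Returning None *proves* infeasibility over the grid, mirroring how an
    exact formulation can conclusively rule out unattainable goals.
    """
    best = None
    for x in product(*int_ranges):
        if all(sum(a * v for a, v in zip(row, x)) <= b
               for row, b in zip(A_ub, b_ub)):
            val = sum(ci * v for ci, v in zip(c, x))
            if best is None or val < best[0]:
                best = (val, x)
    return best  # (value, solution) or None if infeasible
```

    Note how the infeasible case is a definitive answer, not a failure to converge, which is the practical advantage the abstract highlights.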

  14. Experimental evaluation of model predictive control and inverse dynamics control for spacecraft proximity and docking maneuvers

    NASA Astrophysics Data System (ADS)

    Virgili-Llop, Josep; Zagaris, Costantinos; Park, Hyeongjun; Zappulla, Richard; Romano, Marcello

    2018-03-01

    An experimental campaign has been conducted to evaluate the performance of two different guidance and control algorithms on a multi-constrained docking maneuver. The evaluated algorithms are model predictive control (MPC) and inverse dynamics in the virtual domain (IDVD). A linear-quadratic approach with a quadratic programming solver is used for the MPC approach. A nonconvex optimization problem results from the IDVD approach, and a nonlinear programming solver is used. The docking scenario is constrained by the presence of a keep-out zone, an entry cone, and by the chaser's maximum actuation level. The performance metrics for the experiments and numerical simulations include the required control effort and time to dock. The experiments have been conducted in a ground-based air-bearing test bed, using spacecraft simulators that float over a granite table.

  15. Integer Linear Programming for Constrained Multi-Aspect Committee Review Assignment

    PubMed Central

    Karimzadehgan, Maryam; Zhai, ChengXiang

    2011-01-01

    Automatic review assignment can significantly improve the productivity of many people such as conference organizers, journal editors and grant administrators. A general setup of the review assignment problem involves assigning a set of reviewers on a committee to a set of documents to be reviewed under the constraint of review quota so that the reviewers assigned to a document can collectively cover multiple topic aspects of the document. No previous work has addressed such a setup of committee review assignments while also considering matching multiple aspects of topics and expertise. In this paper, we tackle the problem of committee review assignment with multi-aspect expertise matching by casting it as an integer linear programming problem. The proposed algorithm can naturally accommodate any probabilistic or deterministic method for modeling multiple aspects to automate committee review assignments. Evaluation using a multi-aspect review assignment test set constructed using ACM SIGIR publications shows that the proposed algorithm is effective and efficient for committee review assignments based on multi-aspect expertise matching. PMID:22711970
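
    A tiny stand-in for the committee-assignment ILP: for each document, choose a quota-sized reviewer team whose combined expertise covers the most topic aspects. All names are hypothetical, and the global reviewer-load constraints of the real ILP are omitted here.

```python
from itertools import combinations

def assign_reviewers(expertise, papers, quota):
    """Greedy per-paper sketch of multi-aspect review assignment.

    expertise: reviewer name -> set of topic aspects they cover
    papers:    paper name    -> set of topic aspects it needs
    Picks `quota` reviewers per paper maximizing covered aspects.
    (The actual ILP also enforces review-quota constraints across
    all papers jointly, which this sketch ignores.)
    """
    assignment = {}
    reviewers = sorted(expertise)
    for paper, aspects in papers.items():
        best = max(
            combinations(reviewers, quota),
            key=lambda team: len(
                aspects & set().union(*(expertise[r] for r in team))
            ),
        )
        assignment[paper] = set(best)
    return assignment
```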

  16. Reaching Mars: multi-criteria R&D portfolio selection for Mars exploration technology planning

    NASA Technical Reports Server (NTRS)

    Smith, J. H.; Dolgin, B. P.; Weisbin, C. R.

    2003-01-01

    The exploration of Mars has been the focus of increasing scientific interest about the planet and its relationship to Earth. A multi-criteria decision-making approach was developed to address the question: given a Mars program composed of mission concepts dependent on a variety of alternative technology development programs, which combination of technologies would enable missions to maximize science return under a constrained budget?
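
    The budget-constrained portfolio question above is, at its core, a 0/1 knapsack problem, sketched here by brute force on a toy instance; the program names, costs, and science-return values are invented for illustration.

```python
from itertools import combinations

def best_portfolio(programs, budget):
    """Select a subset of technology programs maximizing science return
    under a budget cap. `programs` maps name -> (cost, science_return).
    Brute force is fine at this toy scale; real portfolio selection
    would use integer programming or dynamic programming."""
    names = sorted(programs)
    best_value, best_combo = 0, ()
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            cost = sum(programs[n][0] for n in combo)
            if cost <= budget:
                value = sum(programs[n][1] for n in combo)
                if value > best_value:
                    best_value, best_combo = value, combo
    return best_value, best_combo
```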

  17. McMAC: Towards a MAC Protocol with Multi-Constrained QoS Provisioning for Diverse Traffic in Wireless Body Area Networks

    PubMed Central

    Monowar, Muhammad Mostafa; Hassan, Mohammad Mehedi; Bajaber, Fuad; Al-Hussein, Musaed; Alamri, Atif

    2012-01-01

    The emergence of heterogeneous applications with diverse requirements for resource-constrained Wireless Body Area Networks (WBANs) poses significant challenges for provisioning Quality of Service (QoS) with multi-constraints (delay and reliability) while preserving energy efficiency. To address such challenges, this paper proposes McMAC, a MAC protocol with multi-constrained QoS provisioning for diverse traffic classes in WBANs. McMAC classifies traffic based on their multi-constrained QoS demands and introduces a novel superframe structure based on the “transmit-whenever-appropriate” principle, which allows diverse periods for diverse traffic classes according to their respective QoS requirements. Furthermore, a novel emergency packet handling mechanism is proposed to ensure packet delivery with the least possible delay and the highest reliability. McMAC is also modeled analytically, and extensive simulations were performed to evaluate its performance. The results reveal that McMAC achieves the desired delay and reliability guarantee according to the requirements of a particular traffic class while achieving energy efficiency. PMID:23202224

  18. Constrained and Unconstrained Partial Adjacent Category Logit Models for Ordinal Response Variables

    ERIC Educational Resources Information Center

    Fullerton, Andrew S.; Xu, Jun

    2018-01-01

    Adjacent category logit models are ordered regression models that focus on comparisons of adjacent categories. These models are particularly useful for ordinal response variables with categories that are of substantive interest. In this article, we consider unconstrained and constrained versions of the partial adjacent category logit model, which…
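
    For reference, the adjacent-category logit compares each pair of neighboring categories; the constrained model shares one slope vector across comparisons, while partial and unconstrained variants let slopes vary. This is the standard textbook form of the model, not notation taken from the article itself.

```latex
% Adjacent-category logit for ordinal Y with categories j = 1, ..., J - 1:
% constrained (proportional) form: a single slope vector \beta for every
% adjacent comparison
\ln \frac{\Pr(Y = j + 1 \mid x)}{\Pr(Y = j \mid x)} = \alpha_j + \beta' x
% partial / unconstrained form: the slope may vary with j for some or all
% predictors
\ln \frac{\Pr(Y = j + 1 \mid x)}{\Pr(Y = j \mid x)} = \alpha_j + \beta_j' x
```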

  19. Design and Simulation of Material-Integrated Distributed Sensor Processing with a Code-Based Agent Platform and Mobile Multi-Agent Systems

    PubMed Central

    Bosse, Stefan

    2015-01-01

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strong heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques. PMID:25690550
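
    The activity-transition-graph behavior model can be sketched as a tiny interpreter in which each activity updates the agent's data state and names its successor. This is an illustrative abstraction, not the authors' code-morphing, stack-machine implementation; the sense/act activities below are hypothetical.

```python
class ATGAgent:
    """Minimal activity-transition-graph (ATG) agent: activities are
    callables that mutate the shared state dict and return the name of
    the next activity to execute."""

    def __init__(self, activities, start, state=None):
        self.activities = activities  # name -> callable(state) -> next name
        self.current = start
        self.state = state if state is not None else {}

    def step(self):
        self.current = self.activities[self.current](self.state)
        return self.current

# hypothetical two-activity sense/act loop
def sense(state):
    state["readings"] = state.get("readings", 0) + 1
    return "act"

def act(state):
    return "sense"
```

    Running `step()` repeatedly walks the transition graph; in the real platform the same behavior is compiled to self-modifying program code that can migrate between nodes.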

  20. Design and simulation of material-integrated distributed sensor processing with a code-based agent platform and mobile multi-agent systems.

    PubMed

    Bosse, Stefan

    2015-02-16

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strong heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques.

  1. A Bayesian Multi-Level Factor Analytic Model of Consumer Price Sensitivities across Categories

    ERIC Educational Resources Information Center

    Duvvuri, Sri Devi; Gruca, Thomas S.

    2010-01-01

    Identifying price sensitive consumers is an important problem in marketing. We develop a Bayesian multi-level factor analytic model of the covariation among household-level price sensitivities across product categories that are substitutes. Based on a multivariate probit model of category incidence, this framework also allows the researcher to…

  2. Chance-constrained multi-objective optimization of groundwater remediation design at DNAPLs-contaminated sites using a multi-algorithm genetically adaptive method

    NASA Astrophysics Data System (ADS)

    Ouyang, Qi; Lu, Wenxi; Hou, Zeyu; Zhang, Yu; Li, Shuai; Luo, Jiannan

    2017-05-01

    In this paper, a multi-algorithm genetically adaptive multi-objective (AMALGAM) method is proposed as a multi-objective optimization solver. It was implemented in the multi-objective optimization of a groundwater remediation design at sites contaminated by dense non-aqueous phase liquids. In this study, there were two objectives: minimization of the total remediation cost, and minimization of the remediation time. A non-dominated sorting genetic algorithm II (NSGA-II) was adopted to compare with the proposed method. For efficiency, the time-consuming surfactant-enhanced aquifer remediation simulation model was replaced by a surrogate model constructed by a multi-gene genetic programming (MGGP) technique. Similarly, two other surrogate modeling methods, support vector regression (SVR) and Kriging (KRG), were employed to make comparisons with MGGP. In addition, the surrogate-modeling uncertainty was incorporated in the optimization model by chance-constrained programming (CCP). The results showed that, for the problem considered in this study, (1) the solutions obtained by AMALGAM incurred less remediation cost and required less time than those of NSGA-II, indicating that AMALGAM outperformed NSGA-II. It was additionally shown that (2) the MGGP surrogate model was more accurate than SVR and KRG; and (3) the remediation cost and time increased with the confidence level, which can enable decision makers to make a suitable choice by considering the given budget, remediation time, and reliability.

  3. Between unemployment and employment: experience of unemployed long-term pain sufferers.

    PubMed

    Glavare, Maria; Löfgren, Monika; Schult, Marie-Louise

    2012-01-01

    This study explored and analysed how patients experienced possibilities for, and barriers to, return to work after participation in a multi-professional pain-rehabilitation program followed by a coached work-training (CWT) program. Eleven informants (8 women/3 men) with long-term musculoskeletal pain who had participated in the CWT program for 4-21 months (mean = 11) comprised the study sample. A qualitative emergent design was used. Data collected through interviews were analysed using the constant comparison method of grounded theory, with triangulation among researchers. The analyses of the interviews resulted in the development of a three-category theoretical model, named "a way back to work". The main category, "Experience of a way back to work", consisted of the informants' experience of the process between unemployment and employment. The category "Support" describes the help the informants received from various actors, and the category "Negative response" describes negative responses from the actors involved, which constituted an important barrier in the process between unemployment and employment. Professional individualised support, participants feeling involved in their rehabilitation process, coaching at real workplaces, and a multi-professional team including health care personnel were valuable during the process towards work.

  4. EXADS - EXPERT SYSTEM FOR AUTOMATED DESIGN SYNTHESIS

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.

    1994-01-01

    The expert system called EXADS was developed to aid users of the Automated Design Synthesis (ADS) general purpose optimization program. Because of the general purpose nature of ADS, it is difficult for a nonexpert to select the best choice of strategy, optimizer, and one-dimensional search options from the one hundred or so combinations that are available. EXADS aids engineers in determining the best combination based on their knowledge of the problem and the expert knowledge previously stored by the experts who developed ADS. EXADS is a customized application of the AESOP artificial intelligence program (the general version of AESOP is available separately from COSMIC; the ADS program is also available from COSMIC). The expert system consists of two main components. The knowledge base contains about 200 rules and is divided into three categories: constrained, unconstrained, and constrained treated as unconstrained. The EXADS inference engine is rule-based and makes decisions about a particular situation using hypotheses (potential solutions), rules, and answers to questions drawn from the rule base. EXADS is backward-chaining; that is, it works from hypotheses to facts. The rule base was compiled from sources such as literature searches, ADS documentation, and engineer surveys. EXADS will accept answers such as yes, no, maybe, likely, and don't know, or a certainty factor ranging from 0 to 10. When any hypothesis reaches a confidence level of 90% or more, it is deemed the best choice and displayed to the user. If no hypothesis is confirmed, the user can examine explanations of why the hypotheses failed to reach the 90% level. The IBM PC version of EXADS is written in IQ-LISP for execution under DOS 2.0 or higher with a central memory requirement of approximately 512K of 8-bit bytes. This program was developed in 1986.
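
    The certainty-factor mechanics described above (answers on a 0-10 scale, a 90% confirmation threshold) can be sketched with a minimal backward-chaining evaluator. The min/max combination rule and the example rule names below are assumptions for illustration, not EXADS's actual rule base or inference procedure.

```python
def confidence(hypothesis, rules, answers):
    """Backward-chaining sketch: work from a hypothesis down to asked facts.

    `rules` maps a hypothesis to a list of alternative premise lists;
    `answers` holds user certainty factors on a 0-10 scale. A rule fires
    with the confidence of its weakest premise (min); a hypothesis takes
    its strongest supporting rule (max). These combination rules are
    assumed, not documented EXADS behavior.
    """
    if hypothesis not in rules:          # a leaf fact: ask the user
        return answers.get(hypothesis, 0) / 10.0
    return max(
        min(confidence(p, rules, answers) for p in premises)
        for premises in rules[hypothesis]
    )

def confirmed(hypothesis, rules, answers, threshold=0.9):
    """A hypothesis is deemed the best choice at >= 90% confidence."""
    return confidence(hypothesis, rules, answers) >= threshold
```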

  5. A multi-frequency receiver function inversion approach for crustal velocity structure

    NASA Astrophysics Data System (ADS)

    Li, Xuelei; Li, Zhiwei; Hao, Tianyao; Wang, Sheng; Xing, Jian

    2017-05-01

    In order to better constrain crustal velocity structures, we developed a new nonlinear inversion approach based on multi-frequency receiver function waveforms. With the global optimization algorithm of Differential Evolution (DE), low-frequency receiver function waveforms primarily constrain large-scale velocity structures, while high-frequency receiver function waveforms show advantages in recovering small-scale velocity structures. Based on synthetic tests with multi-frequency receiver function waveforms, the proposed approach can constrain both long- and short-wavelength characteristics of the crustal velocity structures simultaneously. Inversions with real data are also conducted for the seismic stations KMNB in southeast China and HYB on the Indian continent, where crustal structures have been well studied by previous researchers. Comparisons of the velocity models inverted in previous studies and ours suggest good consistency, while better waveform fit with fewer model parameters is achieved by our proposed approach. Comprehensive tests with synthetic and real data suggest that the proposed inversion approach with multi-frequency receiver functions is effective and robust for inverting crustal velocity structures.

  6. Optimizing Constrained Single Period Problem under Random Fuzzy Demand

    NASA Astrophysics Data System (ADS)

    Taleizadeh, Ata Allah; Shavandi, Hassan; Riazi, Afshin

    2008-09-01

    In this paper, we consider the multi-product multi-constraint newsboy problem with random fuzzy demands and total discount. Product demand is often stochastic in the real world, but the parameters of its distribution function may be estimated in a fuzzy manner, so an appropriate way to model product demand is with random fuzzy variables. The objective of the proposed model is to maximize the expected profit of the newsboy. We consider constraints such as warehouse space, restrictions on order quantities for products, and a budget restriction; we also consider batch sizes for product orders. We introduce a random fuzzy multi-product multi-constraint newsboy problem (RFM-PM-CNP) and transform it into a multi-objective mixed integer nonlinear programming model. Furthermore, a hybrid intelligent algorithm based on a genetic algorithm, Pareto ranking, and TOPSIS is presented for the developed model. Finally, an illustrative example shows the performance of the developed model and algorithm.
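
    As a point of reference, the crisp single-product newsboy problem underlying this model has a closed-form critical-fractile solution (the random fuzzy, multi-product, multi-constraint version is what requires the hybrid algorithm developed in the paper). The prices and demand parameters below are illustrative:

```python
from statistics import NormalDist

def newsboy_order(price, cost, salvage, mu, sigma):
    """Critical-fractile solution of the crisp single-period (newsboy) problem
    with normally distributed demand N(mu, sigma)."""
    cu = price - cost      # underage cost: margin lost per unit of unmet demand
    co = cost - salvage    # overage cost: loss per leftover unit
    fractile = cu / (cu + co)
    return NormalDist(mu, sigma).inv_cdf(fractile)

# Illustrative numbers: order slightly above mean demand because underage
# costs more than overage here (fractile = 6/10 = 0.6).
q = newsboy_order(price=12.0, cost=6.0, salvage=2.0, mu=100.0, sigma=20.0)
```

    Adding warehouse, budget, and batch-size constraints across many products couples these otherwise independent decisions, which is why the paper resorts to a genetic-algorithm-based solver.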

  7. Chance-constrained multi-objective optimization of groundwater remediation design at DNAPLs-contaminated sites using a multi-algorithm genetically adaptive method.

    PubMed

    Ouyang, Qi; Lu, Wenxi; Hou, Zeyu; Zhang, Yu; Li, Shuai; Luo, Jiannan

    2017-05-01

    In this paper, a multi-algorithm genetically adaptive multi-objective (AMALGAM) method is proposed as a multi-objective optimization solver. It was implemented in the multi-objective optimization of a groundwater remediation design at sites contaminated by dense non-aqueous phase liquids. In this study, there were two objectives: minimization of the total remediation cost, and minimization of the remediation time. A non-dominated sorting genetic algorithm II (NSGA-II) was adopted to compare with the proposed method. For efficiency, the time-consuming surfactant-enhanced aquifer remediation simulation model was replaced by a surrogate model constructed by a multi-gene genetic programming (MGGP) technique. Similarly, two other surrogate modeling methods-support vector regression (SVR) and Kriging (KRG)-were employed to make comparisons with MGGP. In addition, the surrogate-modeling uncertainty was incorporated in the optimization model by chance-constrained programming (CCP). The results showed that, for the problem considered in this study, (1) the solutions obtained by AMALGAM incurred less remediation cost and required less time than those of NSGA-II, indicating that AMALGAM outperformed NSGA-II. It was additionally shown that (2) the MGGP surrogate model was more accurate than SVR and KRG; and (3) the remediation cost and time increased with the confidence level, which can enable decision makers to make a suitable choice by considering the given budget, remediation time, and reliability. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Assessing the Benefits of NASA Category 3, Low Cost Class C/D Missions

    NASA Technical Reports Server (NTRS)

    Bitten, Robert E.; Shinn, Steven A.; Mahr, Eric M.

    2013-01-01

    Category 3, Class C/D missions have the benefit of delivering worthwhile science at minimal cost, which is increasingly important in NASA's constrained budget environment. Although higher-cost Category 1 and 2 missions are necessary to achieve NASA's science objectives, Category 3 missions are shown to be an effective way to provide significant science return at low cost. Category 3 missions, however, are often reviewed in the same way as the more risk-averse Category 1 and 2 missions. Although reviews are not the only aspect of the total engineering effort, they remain a significant concern for NASA programs, and this practice can unnecessarily increase the cost and schedule of Category 3 missions. This paper quantifies the benefit and performance of Category 3 missions by looking at cost vs. capability relative to Category 1 and 2 missions. Lessons learned from organizations that successfully develop low-cost Category 3, Class C/D missions are also investigated to help provide the basis for suggestions to streamline the review of NASA Category 3 missions.

  9. Interval-parameter semi-infinite fuzzy-stochastic mixed-integer programming approach for environmental management under multiple uncertainties.

    PubMed

    Guo, P; Huang, G H

    2010-03-01

    In this study, an interval-parameter semi-infinite fuzzy-chance-constrained mixed-integer linear programming (ISIFCIP) approach is developed for supporting long-term planning of waste-management systems under multiple uncertainties in the City of Regina, Canada. The method improves upon the existing interval-parameter semi-infinite programming (ISIP) and fuzzy-chance-constrained programming (FCCP) by incorporating uncertainties expressed as dual uncertainties of functional intervals and multiple uncertainties of distributions with fuzzy-interval admissible probability of violating constraints within a general optimization framework. The binary-variable solutions represent decisions on waste-management-facility expansion, and the continuous ones relate to decisions on waste-flow allocation. The interval solutions can help decision-makers obtain multiple decision alternatives, as well as provide bases for further analyses of tradeoffs between waste-management cost and system-failure risk. In the application to the City of Regina, Canada, two scenarios are considered. In Scenario 1, the City's waste-management practices would be based on the existing policy over the next 25 years; the total diversion rate for residential waste would be approximately 14%. Scenario 2 is associated with a policy for waste minimization and diversion, where 35% diversion of residential waste should be achieved within 15 years, and 50% diversion over 25 years. In this scenario, not only would the landfill be expanded, but the CF and MRF would be expanded as well. Through the scenario analyses, useful decision support for the City's solid-waste managers and decision-makers has been generated. Three special characteristics of the proposed method make it unique compared with other optimization techniques that deal with uncertainties. Firstly, it is useful for tackling multiple uncertainties expressed as intervals, functional intervals, probability distributions, fuzzy sets, and their combinations; secondly, it can address the temporal variations of the functional intervals; thirdly, it can facilitate dynamic analysis of facility-expansion planning and waste-flow allocation decisions within a multi-facility, multi-period and multi-option context. Copyright 2009 Elsevier Ltd. All rights reserved.
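
    The core chance-constrained programming move, converting a probabilistic constraint into a deterministic equivalent at a chosen confidence level, can be sketched for a single normally distributed capacity. The figures below are illustrative, not values from the Regina case study:

```python
from statistics import NormalDist

def deterministic_capacity(mu, sigma, alpha):
    """Deterministic equivalent of the chance constraint
    P(allocated flow <= random capacity) >= alpha
    for a capacity ~ N(mu, sigma): flow <= mu - z_alpha * sigma."""
    return mu - NormalDist().inv_cdf(alpha) * sigma

# Higher required reliability -> less capacity may be planned against.
cap90 = deterministic_capacity(mu=500.0, sigma=50.0, alpha=0.90)
cap99 = deterministic_capacity(mu=500.0, sigma=50.0, alpha=0.99)
```

    As the admissible violation probability shrinks, the usable capacity shrinks too, which is exactly the cost-versus-system-failure-risk tradeoff the interval solutions let decision-makers explore.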

  10. Multi-faceted Rasch measurement and bias patterns in EFL writing performance assessment.

    PubMed

    He, Tung-Hsien; Gou, Wen Johnny; Chien, Ya-Chen; Chen, I-Shan Jenny; Chang, Shan-Mao

    2013-04-01

    This study applied multi-faceted Rasch measurement to examine rater bias in the assessment of essays written by college students learning English as a foreign language. Four raters who had received different academic training from four distinctive disciplines applied a six-category rating scale to analytically rate essays on an argumentative topic and on a descriptive topic. FACETS, a Rasch computer program, was utilized to pinpoint bias patterns by analyzing the rater-topic, rater-category, and topic-category interactions. Results showed: argumentative essays were rated more severely than were descriptive essays; the linguistics-major rater was the most lenient rater, while the literature-major rater was the severest one; and the category of language use received the severest ratings, whereas content was given the most lenient ratings. The severity hierarchies for raters, essay topics, and rating categories suggested that raters' academic training and their perceptions of the importance of categories were associated with their bias patterns. Implications for rater training are discussed.
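
    The many-facet rating scale model that FACETS estimates combines the facets additively on the logit scale. The sketch below uses invented ability, severity, difficulty, and threshold values to show how writer ability, rater severity, and topic difficulty jointly determine the category probabilities:

```python
import math

def facets_category_probs(theta, severity, difficulty, thresholds):
    """Category probabilities under a many-facet rating scale model:
    the log-odds of category k over k-1 is
    theta - severity - difficulty - tau_k (illustrative parameter values)."""
    logit_sums = [0.0]
    for tau in thresholds:
        logit_sums.append(logit_sums[-1] + (theta - severity - difficulty - tau))
    exps = [math.exp(s) for s in logit_sums]
    z = sum(exps)
    return [e / z for e in exps]

# A fairly able writer (theta=1.0) rated by a somewhat severe rater (0.5)
# on a moderately hard topic (0.2), with three category thresholds.
probs = facets_category_probs(theta=1.0, severity=0.5, difficulty=0.2,
                              thresholds=[-1.0, 0.0, 1.0])
```

    A bias analysis asks whether an interaction term (e.g., rater-by-topic) must be added to this additive structure to fit the observed ratings.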

  11. Non-Negative Spherical Deconvolution (NNSD) for estimation of fiber Orientation Distribution Function in single-/multi-shell diffusion MRI.

    PubMed

    Cheng, Jian; Deriche, Rachid; Jiang, Tianzi; Shen, Dinggang; Yap, Pew-Thian

    2014-11-01

    Spherical Deconvolution (SD) is commonly used for estimating fiber Orientation Distribution Functions (fODFs) from diffusion-weighted signals. Existing SD methods can be classified into two categories: 1) Continuous Representation based SD (CR-SD), where typically a Spherical Harmonic (SH) representation is used for convenient analytical solutions, and 2) Discrete Representation based SD (DR-SD), where the signal profile is represented by a discrete set of basis functions uniformly oriented on the unit sphere. A feasible fODF should be non-negative and should integrate to unity throughout the unit sphere S(2). However, to our knowledge, most existing SH-based SD methods enforce non-negativity only on discretized points and not the whole continuum of S(2). Maximum Entropy SD (MESD) and Cartesian Tensor Fiber Orientation Distributions (CT-FOD) are the only SD methods that ensure non-negativity throughout the unit sphere. They are, however, computationally intensive and susceptible to errors caused by numerical spherical integration. Existing SD methods are also known to overestimate the number of fiber directions, especially in regions with low anisotropy. DR-SD introduces additional error in peak detection owing to the angular discretization of the unit sphere. This paper proposes an SD framework, called Non-Negative SD (NNSD), to overcome all the limitations above. NNSD is significantly less susceptible to false-positive peaks, uses an SH representation for efficient analytical spherical deconvolution, and allows accurate peak detection throughout the whole unit sphere. We further show that NNSD and most existing SD methods can be extended to work on multi-shell data by introducing a three-dimensional fiber response function. We evaluated NNSD in comparison with Constrained SD (CSD), a quadratic programming variant of CSD, MESD, and an L1-norm regularized non-negative least-squares DR-SD. Experiments on synthetic and real single-/multi-shell data indicate that NNSD improves estimation performance in terms of mean difference of angles, peak detection consistency, and anisotropy contrast between isotropic and anisotropic regions. Copyright © 2014 Elsevier Inc. All rights reserved.
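
    The DR-SD formulation the paper compares against can be sketched as a non-negative least-squares problem on a discrete direction grid. This is a toy stand-in (random response matrix, two synthetic fiber weights), not the NNSD method itself:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(4)

# Toy discrete-representation SD: signal = A @ fodf with fodf >= 0.
# A's columns stand in for single-fiber response profiles; real SD samples
# responses over gradient directions on the unit sphere.
n_meas, n_dirs = 30, 10
A = np.abs(rng.normal(size=(n_meas, n_dirs)))
true_fodf = np.zeros(n_dirs)
true_fodf[[2, 7]] = [0.7, 0.3]          # two synthetic "fiber" directions
signal = A @ true_fodf + 0.01 * rng.normal(size=n_meas)

# Non-negativity is enforced only on the discrete grid of directions,
# which is the limitation NNSD's continuous SH formulation avoids.
fodf, _ = nnls(A, signal)
```

    Peak detection on such a discrete grid inherits the grid's angular resolution, one of the DR-SD errors the paper discusses.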

  12. Modeling and predicting urban growth pattern of the Tokyo metropolitan area based on cellular automata

    NASA Astrophysics Data System (ADS)

    Zhao, Yaolong; Zhao, Junsan; Murayama, Yuji

    2008-10-01

    The period of high economic growth in Japan that began in the latter half of the 1950s led to a massive migration of population from rural regions to the Tokyo metropolitan area. This phenomenon brought about rapid urban growth and urban structure changes in this area. The purpose of this study is to establish a constrained CA (Cellular Automata) model with GIS (Geographical Information Systems) to simulate the urban growth pattern of the Tokyo metropolitan area and predict its urban form and landscape for the near future. Urban land-use is classified into multiple categories to interpret the effect of interactions among land-use categories in the spatial process of urban growth. Driving factors of the urban growth pattern, such as land condition, railway network, land-use zoning, random perturbation, and neighborhood interaction, are explored and integrated into the model. These driving factors are calibrated based on exploratory spatial data analysis (ESDA), spatial statistics, logistic regression, and a "trial and error" approach. The simulation is assessed at both macro and micro classification levels in three ways: visually, by fractal dimension, and by spatial metrics. Results indicate that this model provides an effective prototype to simulate and predict the urban growth pattern of the Tokyo metropolitan area.
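
    A constrained CA update of this general kind can be sketched in a few lines. The neighborhood rule, suitability surface, and thresholds below are invented for illustration and omit the model's other driving factors (railway network, zoning, multiple land-use categories):

```python
import numpy as np

rng = np.random.default_rng(1)

def ca_step(grid, suitability, beta=0.1, noise=0.05):
    """One step of a toy constrained urban-growth CA.
    grid: 1 = urban, 0 = non-urban. A non-urban cell converts when its
    development score (3x3 urban density * suitability + random
    perturbation) exceeds the threshold beta."""
    padded = np.pad(grid, 1)
    # Sum over the 3x3 neighborhood, then subtract the center cell.
    neigh = sum(np.roll(np.roll(padded, i, 0), j, 1)
                for i in (-1, 0, 1) for j in (-1, 0, 1))[1:-1, 1:-1] - grid
    density = neigh / 8.0
    score = density * suitability + noise * rng.random(grid.shape)
    return np.where((grid == 0) & (score > beta), 1, grid)

grid = np.zeros((20, 20), dtype=int)
grid[10, 10] = 1                      # seed urban cell
suit = np.ones((20, 20))              # uniform land suitability (illustrative)
for _ in range(10):
    grid = ca_step(grid, suit)
```

    Calibration in the paper amounts to choosing the functional form and weights of this score from observed land-use transitions rather than fixing them by hand.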

  13. Conservative strategy-based ensemble surrogate model for optimal groundwater remediation design at DNAPLs-contaminated sites

    NASA Astrophysics Data System (ADS)

    Ouyang, Qi; Lu, Wenxi; Lin, Jin; Deng, Wenbing; Cheng, Weiguo

    2017-08-01

    Surrogate-based simulation-optimization techniques are frequently used for optimal groundwater remediation design. When this technique is used, surrogate errors caused by surrogate-modeling uncertainty may lead to the generation of infeasible designs. In this paper, a conservative strategy that pushes the optimal design into the feasible region was used to address surrogate-modeling uncertainty. In addition, chance-constrained programming (CCP) was adopted for comparison with the conservative strategy in addressing this uncertainty. Three methods, multi-gene genetic programming (MGGP), Kriging (KRG) and support vector regression (SVR), were used to construct surrogate models for a time-consuming multi-phase flow model. To improve the performance of the surrogate model, ensemble surrogates were constructed based on combinations of different stand-alone surrogate models. The results show that: (1) the surrogate-modeling uncertainty was successfully addressed by the conservative strategy, which means that this method is promising for addressing surrogate-modeling uncertainty; and (2) the ensemble surrogate model that combines MGGP with KRG showed the most favorable performance, which indicates that this ensemble can exploit both stand-alone surrogate models to improve the performance of the surrogate model.
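
    Inverse-error weighting on a validation set is one common way to build such an ensemble surrogate. The "simulator" and stand-alone surrogates below are toy functions, not MGGP/KRG/SVR models:

```python
import numpy as np

rng = np.random.default_rng(2)

def true_model(x):
    # Stand-in for the expensive multi-phase flow simulator (hypothetical).
    return np.sin(3 * x) + 0.5 * x

# Two cheap stand-alone "surrogates" with different fidelity (hypothetical).
def surrogate_a(x):
    return 3 * x - 4.5 * x ** 3 + 0.5 * x   # Taylor-like fit of sin(3x) + trend
def surrogate_b(x):
    return 0.5 * x                           # captures only the linear trend

x_val = rng.uniform(-0.5, 0.5, 200)
y_val = true_model(x_val)

# Weight each surrogate by the inverse of its validation mean-squared error,
# so more accurate members dominate the ensemble prediction.
errs = [np.mean((s(x_val) - y_val) ** 2) for s in (surrogate_a, surrogate_b)]
w = np.array([1.0 / e for e in errs])
w /= w.sum()

def ensemble(x):
    return w[0] * surrogate_a(x) + w[1] * surrogate_b(x)

ens_err = np.mean((ensemble(x_val) - y_val) ** 2)
```

    The paper's ensembles combine MGGP, KRG and SVR in an analogous spirit; the point of the sketch is only that a weighted combination can inherit the accuracy of its best member.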

  14. A Goal Programming/Constrained Regression Review of the Bell System Breakup.

    DTIC Science & Technology

    1985-05-01

    …characteristically employ. 2. MULTI-PRODUCT COST MODEL AND DATA DETAILS: When technical efficiency (i.e., zero waste) can be assumed… but we believe that it was probably technical (= zero waste) efficiency, by virtue of the following reasons. Scale efficiency was a…

  15. On the Nature of Agreement in English-French Acquisition: A Processing Investigation in the Verbal and Nominal Domains

    ERIC Educational Resources Information Center

    Renaud, Claire

    2010-01-01

    Current second language (L2) research focuses on the level of features--that is, the core elements of languages in the Minimalist Program framework. These features, involved in computations, are further divided into two types: those that indicate to which category a word belongs (i.e., interpretable features) versus those that constrain the type…

  16. Toward a dual-learning systems model of speech category learning

    PubMed Central

    Chandrasekaran, Bharath; Koslov, Seth R.; Maddox, W. T.

    2014-01-01

    More than two decades of work in vision posits the existence of dual-learning systems of category learning. The reflective system uses working memory to develop and test rules for classifying in an explicit fashion, while the reflexive system operates by implicitly associating perception with actions that lead to reinforcement. Dual-learning systems models hypothesize that in learning natural categories, learners initially use the reflective system and, with practice, transfer control to the reflexive system. The role of reflective and reflexive systems in auditory category learning and more specifically in speech category learning has not been systematically examined. In this article, we describe a neurobiologically constrained dual-learning systems theoretical framework that is currently being developed in speech category learning and review recent applications of this framework. Using behavioral and computational modeling approaches, we provide evidence that speech category learning is predominantly mediated by the reflexive learning system. In one application, we explore the effects of normal aging on non-speech and speech category learning. Prominently, we find a large age-related deficit in speech learning. The computational modeling suggests that older adults are less likely to transition from simple, reflective, unidimensional rules to more complex, reflexive, multi-dimensional rules. In a second application, we summarize a recent study examining auditory category learning in individuals with elevated depressive symptoms. We find a deficit in reflective-optimal and an enhancement in reflexive-optimal auditory category learning. Interestingly, individuals with elevated depressive symptoms also show an advantage in learning speech categories. We end with a brief summary and description of a number of future directions. PMID:25132827

  17. A Novel Optimal Joint Resource Allocation Method in Cooperative Multicarrier Networks: Theory and Practice

    PubMed Central

    Gao, Yuan; Zhou, Weigui; Ao, Hong; Chu, Jian; Zhou, Quan; Zhou, Bo; Wang, Kang; Li, Yi; Xue, Peng

    2016-01-01

    With the increasing demands for better transmission speed and robust quality of service (QoS), the capacity constrained backhaul gradually becomes a bottleneck in cooperative wireless networks, e.g., in the Internet of Things (IoT) scenario in joint processing mode of LTE-Advanced Pro. This paper focuses on resource allocation within capacity constrained backhaul in uplink cooperative wireless networks, where two base stations (BSs) equipped with single antennae serve multiple single-antennae users via multi-carrier transmission mode. In this work, we propose a novel cooperative transmission scheme based on compress-and-forward with user pairing to solve the joint mixed integer programming problem. To maximize the system capacity under the limited backhaul, we formulate the joint optimization problem of user sorting, subcarrier mapping and backhaul resource sharing among different pairs (subcarriers for users). A novel robust and efficient centralized algorithm based on alternating optimization strategy and perfect mapping is proposed. Simulations show that our novel method can improve the system capacity significantly under the constraint of the backhaul resource compared with the blind alternatives. PMID:27077865

  18. Multi-Constraint Multi-Variable Optimization of Source-Driven Nuclear Systems

    NASA Astrophysics Data System (ADS)

    Watkins, Edward Francis

    1995-01-01

    A novel approach to the search for optimal designs of source-driven nuclear systems is investigated. Such systems include radiation shields, fusion reactor blankets and various neutron spectrum-shaping assemblies. The novel approach involves the replacement of the steepest-descent optimization algorithm incorporated in the code SWAN by the significantly more general and efficient sequential quadratic programming optimization algorithm provided by the code NPSOL. The resulting SWAN/NPSOL code system can be applied to more general, multi-variable, multi-constraint shield optimization problems. The constraints it accounts for may include simple bounds on variables, linear constraints, and smooth nonlinear constraints. It may also be applied to unconstrained, bound-constrained and linearly constrained optimization. The shield optimization capabilities of the SWAN/NPSOL code system are tested and verified in a variety of optimization problems: dose minimization at constant cost, cost minimization at constant dose, and multiple-nonlinear-constraint optimization. The replacement of the optimization part of SWAN with NPSOL is found to be feasible and leads to a very substantial increase in the complexity of optimization problems that can be efficiently handled.
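
    The class of problem SWAN/NPSOL handles, a smooth objective under bounds and nonlinear constraints solved by sequential quadratic programming, can be sketched with SciPy's SLSQP solver standing in for NPSOL. The dose and cost functions here are invented stand-ins, not SWAN's physics:

```python
import numpy as np
from scipy.optimize import minimize

# Toy analogue of "dose minimization at constant cost": choose two
# shield-layer thicknesses t0, t1 to minimize a dose-like objective
# subject to a linear cost budget and simple bounds.
def dose(t):
    # Exponential attenuation stand-in (hypothetical coefficients).
    return np.exp(-0.5 * t[0]) + np.exp(-0.8 * t[1])

def budget(t):
    # SLSQP inequality convention: constraint function must be >= 0.
    return 10.0 - (2.0 * t[0] + 3.0 * t[1])   # cost 2*t0 + 3*t1 <= 10

res = minimize(dose, x0=[1.0, 1.0], method="SLSQP",
               bounds=[(0.0, 5.0), (0.0, 5.0)],
               constraints=[{"type": "ineq", "fun": budget}])
```

    At the optimum the budget binds, mirroring the constant-cost formulation: all allowed cost is spent on the layer mix that attenuates most effectively.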

  19. NASA Applications of Structural Health Monitoring Technology

    NASA Technical Reports Server (NTRS)

    Richards, W Lance; Madaras, Eric I.; Prosser, William H.; Studor, George

    2013-01-01

    This presentation provides examples of research and development that has recently been or is currently being conducted at NASA, with a special emphasis on the application of structural health monitoring (SHM) of aerospace vehicles. SHM applications on several vehicle programs are highlighted, including Space Shuttle Orbiter, International Space Station, Uninhabited Aerial Vehicles, and Expendable Launch Vehicles. Examples of current and previous work are presented in the following categories: acoustic emission impact detection, multi-parameter fiber optic strain-based sensing, wireless sensor system development, and distributed leak detection.

  1. Lattice-free prediction of three-dimensional structure of programmed DNA assemblies

    PubMed Central

    Pan, Keyao; Kim, Do-Nyun; Zhang, Fei; Adendorff, Matthew R.; Yan, Hao; Bathe, Mark

    2014-01-01

    DNA can be programmed to self-assemble into high molecular weight 3D assemblies with precise nanometer-scale structural features. Although numerous sequence design strategies exist to realize these assemblies in solution, there is currently no computational framework to predict their 3D structures on the basis of programmed underlying multi-way junction topologies constrained by DNA duplexes. Here, we introduce such an approach and apply it to assemblies designed using the canonical immobile four-way junction. The procedure is used to predict the 3D structure of high molecular weight planar and spherical ring-like origami objects, a tile-based sheet-like ribbon, and a 3D crystalline tensegrity motif, in quantitative agreement with experiments. Our framework provides a new approach to predict programmed nucleic acid 3D structure on the basis of prescribed secondary structure motifs, with possible application to the design of such assemblies for use in biomolecular and materials science. PMID:25470497

  2. Inductive Reasoning about Causally Transmitted Properties

    ERIC Educational Resources Information Center

    Shafto, Patrick; Kemp, Charles; Bonawitz, Elizabeth Baraff; Coley, John D.; Tenenbaum, Joshua B.

    2008-01-01

    Different intuitive theories constrain and guide inferences in different contexts. Formalizing simple intuitive theories as probabilistic processes operating over structured representations, we present a new computational model of category-based induction about causally transmitted properties. A first experiment demonstrates undergraduates'…

  3. Residual Risk Assessments

    EPA Science Inventory

    Each source category previously subjected to a technology-based standard will be examined to determine if health or ecological risks are significant enough to warrant further regulation. These assessments utilize existing models and data bases to examine the multi-media and multi-...

  4. Using Multi-Objective Genetic Programming to Synthesize Stochastic Processes

    NASA Astrophysics Data System (ADS)

    Ross, Brian; Imada, Janine

    Genetic programming is used to automatically construct stochastic processes written in the stochastic π-calculus. Grammar-guided genetic programming constrains the search to useful process algebra structures. The time-series behaviour of a target process is denoted with a suitable selection of statistical feature tests. Feature tests can permit complex process behaviours to be effectively evaluated. However, they must be selected with care, in order to accurately characterize the desired process behaviour. Multi-objective evaluation is shown to be appropriate for this application, since it permits heterogeneous statistical feature tests to reside as independent objectives. Multiple undominated solutions can be saved after a run and evaluated to determine those that are most appropriate. Since there can be a vast number of candidate solutions, however, strategies for filtering and analyzing this set are required.
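
    The undominated-set bookkeeping at the heart of such multi-objective evaluation can be sketched directly; the objective scores below are made-up numbers, not feature-test values from the paper:

```python
def non_dominated(points):
    """Return the undominated points under componentwise minimization:
    a point survives if no other point is at least as good in every
    objective while differing from it."""
    return [p for p in points
            if not any(all(q[i] <= p[i] for i in range(len(p))) and q != p
                       for q in points)]

# Five candidate processes scored on two statistical feature-test errors
# (illustrative values); the Pareto front is kept for post-run inspection.
scores = [(1, 5), (2, 2), (5, 1), (4, 4), (3, 3)]
front = non_dominated(scores)
```

    Keeping heterogeneous feature tests as separate objectives, rather than collapsing them into one weighted score, is what lets this front preserve qualitatively different candidate processes for later analysis.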

  5. A Demonstrator Intelligent Scheduler For Sensor-Based Robots

    NASA Astrophysics Data System (ADS)

    Perrotta, Gabriella; Allen, Charles R.; Shepherd, Andrew J.

    1987-10-01

    The development of an execution module capable of functioning as an on-line supervisor for a robot equipped with a vision sensor and a tactile sensing gripper system is described. The on-line module is supported by two off-line software modules which provide a procedural assembly-constraints language that allows the assembly task to be defined. This input is then converted into a normalised and minimised form. The host robot programming language permits high-level motions to be issued at the top level, allowing a low programming overhead for the designer, who must describe the assembly sequence. Components are selected for pick-and-place robot movement based on information derived from two cameras, one static and the other mounted on the end effector of the robot. The approach taken is multi-path scheduling as described by Fox. The system is seen to permit robot assembly in a less constrained parts-presentation environment, making full use of the sensory detail available on the robot.

  6. Category learning in the color-word contingency learning paradigm.

    PubMed

    Schmidt, James R; Augustinova, Maria; De Houwer, Jan

    2018-04-01

    In the typical color-word contingency learning paradigm, participants respond to the print color of words where each word is presented most often in one color. Learning is indicated by faster and more accurate responses when a word is presented in its usual color, relative to another color. To eliminate the possibility that this effect is driven exclusively by the familiarity of item-specific word-color pairings, we examine whether contingency learning effects can also be observed when colors are related to categories of words rather than to individual words. To this end, the reported experiments used three categories of words (animals, verbs, and professions) that were each predictive of one color. Importantly, each individual word was presented only once, thus eliminating individual color-word contingencies. Nevertheless, for the first time, a category-based contingency effect was observed, with faster and more accurate responses when a category item was presented in the color in which most of the other items of that category were presented. This finding helps to constrain episodic learning models and sets the stage for new research on category-based contingency learning.

  7. A Study of Interstellar Medium Components of the Ohio State University Bright Spiral Galaxy Survey

    NASA Astrophysics Data System (ADS)

    Butner, Melissa; Deustua, S. E.; Conti, A.; Smith, J.

    2011-01-01

    Multi-wavelength data can be used to provide information on the interstellar medium of galaxies, as well as on their stellar populations. We use the Ohio State University Bright Spiral Galaxy Survey (OSBSGS) to investigate the distribution and properties of the interstellar medium in a set of nearby galaxies. The OSBSGS consists of B, V, R, J, H and K band images for over 200 nearby spiral galaxies. These data allow us to probe the dust temperatures and distribution using color maps. When combined with a pixel-based analysis, it may be possible to tease out, and perhaps better constrain, the heating mechanism for the ISM, as well as constrain dust models. In this paper we will discuss our progress in understanding, in particular, the properties of dust in nearby galaxies. Melissa Butner was a participant in the STScI Summer Student Program supported by the STScI Director's Discretionary Research Fund. MB also acknowledges support and computer cluster access via NSF grant 07-22890.

  8. Automated Discovery of Speech Act Categories in Educational Games

    ERIC Educational Resources Information Center

    Rus, Vasile; Moldovan, Cristian; Niraula, Nobal; Graesser, Arthur C.

    2012-01-01

    In this paper we address the important task of automated discovery of speech act categories in dialogue-based, multi-party educational games. Speech acts are important in dialogue-based educational systems because they help infer the student speaker's intentions (the task of speech act classification) which in turn is crucial to providing adequate…

  9. Multi-voxel patterns of visual category representation during episodic encoding are predictive of subsequent memory

    PubMed Central

    Kuhl, Brice A.; Rissman, Jesse; Wagner, Anthony D.

    2012-01-01

    Successful encoding of episodic memories is thought to depend on contributions from prefrontal and temporal lobe structures. Neural processes that contribute to successful encoding have been extensively explored through univariate analyses of neuroimaging data that compare mean activity levels elicited during the encoding of events that are subsequently remembered vs. those subsequently forgotten. Here, we applied pattern classification to fMRI data to assess the degree to which distributed patterns of activity within prefrontal and temporal lobe structures elicited during the encoding of word-image pairs were diagnostic of the visual category (Face or Scene) of the encoded image. We then assessed whether representation of category information was predictive of subsequent memory. Classification analyses indicated that temporal lobe structures contained information robustly diagnostic of visual category. Information in prefrontal cortex was less diagnostic of visual category, but was nonetheless associated with highly reliable classifier-based evidence for category representation. Critically, trials associated with greater classifier-based estimates of category representation in temporal and prefrontal regions were associated with a higher probability of subsequent remembering. Finally, consideration of trial-by-trial variance in classifier-based measures of category representation revealed positive correlations between prefrontal and temporal lobe representations, with the strength of these correlations varying as a function of the category of image being encoded. Together, these results indicate that multi-voxel representations of encoded information can provide unique insights into how visual experiences are transformed into episodic memories. PMID:21925190
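
    The classification logic can be illustrated with a minimal nearest-centroid pattern classifier on synthetic "voxel" data. The study's actual classifier and data differ, and the evidence measure below is a simplified stand-in for their classifier-based estimates:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "voxel patterns": two categories (Face vs. Scene) with
# distinct mean activity patterns plus trial-level noise.
n_vox = 50
face_mu, scene_mu = rng.normal(size=n_vox), rng.normal(size=n_vox)

def trials(mu, n=40):
    return mu + 0.8 * rng.normal(size=(n, n_vox))

train = np.vstack([trials(face_mu), trials(scene_mu)])
labels = np.array([0] * 40 + [1] * 40)

# Nearest-centroid classifier: correlate each pattern with class centroids.
centroids = np.array([train[labels == k].mean(axis=0) for k in (0, 1)])

def classify(x):
    r = [np.corrcoef(x, c)[0, 1] for c in centroids]
    # Label plus a crude "category evidence" score (correlation margin).
    return int(np.argmax(r)), max(r) - min(r)

test_face = trials(face_mu, n=20)
acc = np.mean([classify(x)[0] == 0 for x in test_face])
```

    The subsequent-memory analysis then asks whether trials with larger evidence scores at encoding are more likely to be remembered later, which is a relationship between a continuous classifier output and a behavioral outcome rather than mere classification accuracy.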

  10. An interactive computer approach to performing resource analysis for a multi-resource/multi-project problem. [Spacelab inventory procurement planning

    NASA Technical Reports Server (NTRS)

    Schlagheck, R. A.

    1977-01-01

    New planning techniques and supporting computer tools are needed for the optimization of resources and costs for space transportation and payload systems. Heavy emphasis on cost effective utilization of resources has caused NASA program planners to look at the impact of various independent variables that affect procurement buying. A description is presented of a category of resource planning which deals with Spacelab inventory procurement analysis. Spacelab is a joint payload project between NASA and the European Space Agency and will be flown aboard the Space Shuttle starting in 1980. In order to respond rapidly to the various procurement planning exercises, a system was built that could perform resource analysis in a quick and efficient manner. This system is known as the Interactive Resource Utilization Program (IRUP). Attention is given to aspects of problem definition, an IRUP system description, questions of data base entry, the approach used for project scheduling, and problems of resource allocation.

  11. A Three-Dimensional Target Depth-Resolution Method with a Single-Vector Sensor

    PubMed Central

    Zhao, Anbang; Bi, Xuejie; Hui, Juan; Zeng, Caigao; Ma, Lin

    2018-01-01

    This paper studies and verifies the target-number and category-resolution method in multi-target cases and the target depth-resolution method for aerial targets. First, target depth resolution is performed using the sign distribution of the reactive component of the vertical complex acoustic intensity; the target category and number resolution in multi-target cases are realized in combination with the bearing-time recording information; and the corresponding simulation verification is carried out. The algorithm proposed in this paper can distinguish between the single-target multi-line-spectrum case and the multi-target multi-line-spectrum case. This paper also presents an improved azimuth-estimation method for multi-target cases, which makes the estimation results more accurate. Using Monte Carlo simulation, the feasibility of the proposed target-number and category-resolution algorithm in multi-target cases is verified. In addition, by studying the field characteristics of aerial and surface targets, the simulation results verify that there is only an amplitude difference between the aerial-target field and the surface-target field under the same environmental parameters, so an aerial target can be treated as a special case of a surface target; aerial-target category resolution can then be realized based on the sign distribution of the reactive component of the vertical acoustic intensity, so as to realize three-dimensional target depth resolution. By processing data from a sea experiment, the feasibility of the proposed aerial-target three-dimensional depth-resolution algorithm is verified. PMID:29649173
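
    The central quantity in this method is the reactive (imaginary) component of the complex vertical acoustic intensity. The sketch below assumes the standard cross-spectral definition, pressure times the conjugate of vertical particle velocity; the authors' full processing chain (line-spectrum extraction, bearing-time records) is more involved:

```python
import numpy as np

def reactive_vertical_intensity(pressure, v_z, fs):
    """Cross-spectrum of pressure and vertical particle velocity.

    The active (propagating) part is the real component; the reactive
    part is the imaginary component, whose sign over line-spectrum bins
    is the depth-resolution feature. A simplified sketch only.
    """
    P = np.fft.rfft(pressure)
    Vz = np.fft.rfft(v_z)
    freqs = np.fft.rfftfreq(len(pressure), d=1.0 / fs)
    I_z = P * np.conj(Vz)          # complex vertical acoustic intensity
    return freqs, I_z.real, I_z.imag

# Toy single-tone example: a 90-degree phase lag between p and v_z
# makes the intensity at the tone purely reactive.
fs, f0 = 1000.0, 50.0
t = np.arange(2000) / fs
p = np.sin(2 * np.pi * f0 * t)
vz = np.cos(2 * np.pi * f0 * t)    # quadrature component
freqs, active, reactive = reactive_vertical_intensity(p, vz, fs)
k = np.argmin(np.abs(freqs - f0))
print("active:", active[k], "reactive:", reactive[k])
```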

  13. Recent Results of NASA's Space Environments and Effects Program

    NASA Technical Reports Server (NTRS)

    Minor, Jody L.; Brewer, Dana S.

    1998-01-01

    The Space Environments and Effects (SEE) Program is a multi-center, multi-agency program managed by the NASA Marshall Space Flight Center. The program evolved from the Long Duration Exposure Facility (LDEF), the analysis of LDEF data, and recognition of the importance of environments and environmental effects on future space missions. It is a comprehensive and focused approach to understanding the space environments, defining the best techniques for both flight and ground-based experimentation, updating the models which predict both the environments and their effects on spacecraft, and finally ensuring that this information is properly maintained and inserted into spacecraft design programs. Formal funding of the SEE Program began in FY95. A NASA Research Announcement (NRA) solicited research proposals in the following categories: 1) engineering environment definitions; 2) environments and effects design guidelines; 3) environments and effects assessment models and databases; and 4) flight/ground simulation/technology assessment data. This solicitation resulted in funding for eighteen technology development activities (TDAs). This paper presents and describes technical results from the first set of TDAs of the SEE Program. It also describes the second set of technology development activities, expected to begin in January 1998, which will enable the SEE Program to start numerous new development activities in support of mission customer needs.

  14. RF model of the distribution system as a communication channel, phase 2. Volume 4: Software source programs and illustrative ASCII database listings

    NASA Technical Reports Server (NTRS)

    Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.

    1982-01-01

    Listings of source programs and some illustrative examples of various ASCII data base files are presented. The listings are grouped into the following categories: main programs, subroutine programs, illustrative ASCII data base files. Within each category files are listed alphabetically.

  15. Three Program Architecture for Design Optimization

    NASA Technical Reports Server (NTRS)

    Miura, Hirokazu; Olson, Lawrence E. (Technical Monitor)

    1998-01-01

    In this presentation, I would like to review the historical perspective on the program architectures used to build design optimization capabilities based on mathematical programming and other numerical search techniques. It is rather straightforward to classify the program architectures into the three categories shown above. However, the relative importance of the three approaches has not been static; it has changed dynamically as the capability of available computational resources has increased. For example, we once considered that the direct coupling architecture would never be used for practical problems, but the availability of computer systems such as multi-processor machines has changed that assessment. In this presentation, I would like to review the roles of the three architectures from historical as well as current and future perspectives. There may also be some possibility for the emergence of hybrid architectures. I hope to provide some seeds for an active discussion of where we are heading in this very dynamic environment for high-speed computing and communication.

  16. State-of-the-Art: DTM Generation Using Airborne LIDAR Data

    PubMed Central

    Chen, Ziyue; Gao, Bingbo; Devereux, Bernard

    2017-01-01

    Digital terrain model (DTM) generation is the fundamental application of airborne Lidar data. In past decades, a large body of studies has been conducted to present and evaluate a variety of DTM generation methods. Although great progress has been made, DTM generation, especially in specific terrain situations, remains challenging. This research introduces the general principles of DTM generation and reviews diverse mainstream DTM generation methods. In accordance with the filtering strategy, these methods are classified into six categories: surface-based adjustment, morphology-based filtering, triangulated irregular network (TIN)-based refinement, segmentation and classification, statistical analysis, and multi-scale comparison. Typical methods for each category are briefly introduced, and the merits and limitations of each category are discussed accordingly. Despite their different filtering strategies, these DTM generation methods present similar difficulties when implemented in sharply changing terrain, areas with dense non-ground features, and complicated landscapes. This paper suggests that the fusion of multiple sources and the integration of different methods can be effective ways of improving the performance of DTM generation. PMID:28098810
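
    As a concrete illustration of the simplest member of the morphology-based category reviewed above, the sketch below takes the lowest return in each grid cell as a local ground estimate and keeps points within a height threshold of it. The scene, cell size, and threshold are invented; production filters iterate with varying window sizes and slope tests:

```python
import numpy as np

def simple_ground_filter(points, cell=5.0, dz=0.5):
    """Classify Lidar points as ground/non-ground (minimal sketch)."""
    ij = np.floor(points[:, :2] / cell).astype(int)
    keys = ij[:, 0] * 100003 + ij[:, 1]      # hash 2-D cell index
    ground = np.zeros(len(points), dtype=bool)
    for key in np.unique(keys):
        mask = keys == key
        zmin = points[mask, 2].min()         # local ground estimate
        ground[mask] = points[mask, 2] <= zmin + dz
    return ground

# Toy scene: flat terrain at z ~ 0 with a few 10 m "building" returns.
rng = np.random.default_rng(1)
xy = rng.uniform(0, 50, size=(500, 2))
z = rng.normal(0.0, 0.1, 500)
z[:50] += 10.0                               # first 50 points: non-ground
points = np.column_stack([xy, z])
labels = simple_ground_filter(points)
print("ground fraction:", labels.mean())
```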

  17. Understanding GRB Physics With Multi-Wavelength Data

    NASA Astrophysics Data System (ADS)

    Zhang, Bing

    The study of gamma-ray bursts (GRBs) has entered a full multi-wavelength era. A rich trove of data from NASA GRB missions and ground-based follow-up observations has been collected. Careful data mining with well-defined scientific objectives holds the key to addressing open questions in GRB physics, such as jet composition, radiation mechanism, progenitor, and central-engine physics. We propose to perform data analyses in the following three directions. 1. The time-resolved GRB spectra have a dominant component that can be fit with a phenomenological "Band" function. The physical meaning of this function remains unclear. Recently we made a breakthrough in theoretical modeling and showed that fast-cooling synchrotron radiation of electrons in a decreasing magnetic field can mimic the Band function in a detector's bandpass while differing from it slightly. We propose to apply this physically motivated model to systematically fit the GRB prompt-emission data collected by Fermi GBM and LAT, and to test whether the dominant GRB emission mechanism is fast-cooling synchrotron radiation. We will also fit time-dependent spectra with a time-dependent model to investigate whether a quasi-thermal "photosphere" emission component is indeed needed to fit the observed spectra. This would shed light on the unknown composition of GRB jets. By fitting the time-resolved spectra, we will also constrain important physical parameters of GRB prompt emission, such as the emission site of GRBs, the strength of the magnetic fields, and their evolution with radius. 2. Recent GRB multi-wavelength observations suggest that it is not straightforward to define the physical category of a GRB based on the traditional classification in the "duration"-"hardness" domain. Some long-duration GRBs may not have a massive-star origin, while some short-duration GRBs may instead have one. We propose to systematically study the gamma-ray (Swift/BAT, Fermi/GBM-LAT), X-ray (Swift/XRT, Chandra), and optical (ground-based and HST) properties of all short GRBs, and to apply multi-wavelength observational criteria to constrain their possible progenitor(s). 3. The GRB central engine is still not identified. Growing observational data and theoretical modeling suggest that at least some GRBs may host a magnetar (in contrast to a hyper-accreting black hole) central engine. We propose to carry out a statistical study of the prompt-emission and afterglow properties of GRBs that show possible evidence of magnetar behavior and to compare their properties with those that do not. We will define three samples: a gold sample showing steady X-ray emission followed by a rapid decline, likely powered by internal dissipation of a magnetar wind; a silver sample showing a shallow decay segment followed by a normal decay, which can be interpreted as external-shock emission with continuous magnetar energy injection into the blastwave; and a sample of other GRBs that show no evidence of a magnetar. We will compare various observational properties (e.g., isotropic energy/luminosity, jet-corrected energy/luminosity, jet opening angle, peak energy) of these samples and investigate whether there are noticeable differences among them. The results would shed light on the difficult problem of the GRB central engine, addressing whether different engines operate in different GRBs and, if so, how they differ. The program conforms to NASA's Strategic Plan and will make use of the public archival data of many NASA missions, including Fermi, Swift, HST, and Chandra.
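
    For reference, the phenomenological Band function mentioned in direction 1 has a standard closed form (Band et al. 1993): a low-energy power law with an exponential cutoff, smoothly joined to a high-energy power law at the break energy. The parameter values below are typical fitted values, chosen only for illustration, not results from this program:

```python
import numpy as np

def band_function(E, alpha=-1.0, beta=-2.3, E_peak=300.0, A=1.0):
    """Band photon spectrum N(E), with E in keV.

    The e-folding energy is E0 = E_peak / (2 + alpha), and the two power
    laws join continuously at the break energy Eb = (alpha - beta) * E0.
    """
    E = np.asarray(E, dtype=float)
    E0 = E_peak / (2.0 + alpha)
    Eb = (alpha - beta) * E0
    low = A * (E / 100.0) ** alpha * np.exp(-E / E0)
    high = (A * ((alpha - beta) * E0 / 100.0) ** (alpha - beta)
            * np.exp(beta - alpha) * (E / 100.0) ** beta)
    return np.where(E < Eb, low, high)

E = np.logspace(0, 4, 400)          # 1 keV to 10 MeV
N = band_function(E)
# The nu-F-nu spectrum E^2 N(E) peaks near E_peak for these parameters.
print("model peak energy:", E[np.argmax(E**2 * N)])
```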

  18. A qualitative study exploring adolescents' experiences with a school-based mental health program.

    PubMed

    Garmy, Pernilla; Berg, Agneta; Clausson, Eva K

    2015-10-21

    Supporting positive mental health development in adolescents is a major public health concern worldwide. Although several school-based programs aimed at preventing depression have been launched, it is crucial to evaluate these programs and to obtain feedback from participating adolescents. This study aimed to explore adolescents' experiences with a school-based cognitive-behavioral depression prevention program. Eighty-nine adolescents aged 13-15 years were divided into 12 focus groups. The focus group interviews were analyzed using qualitative content analysis. Three categories and eight subcategories were found to be related to the experience of the school-based program. The first category, intrapersonal strategies, consisted of the subcategories of directed thinking, improved self-confidence, stress management, and positive activities. The second category, interpersonal awareness, consisted of the subcategories of trusting the group and considering others. The third category, structural constraints, consisted of the subcategories of negative framing and emphasis on performance. The school-based mental health program was perceived as beneficial and meaningful on both individual and group levels, but students expressed a desire for a more health-promoting approach.

  19. Methods used to parameterize the spatially-explicit components of a state-and-transition simulation model

    USGS Publications Warehouse

    Sleeter, Rachel; Acevedo, William; Soulard, Christopher E.; Sleeter, Benjamin M.

    2015-01-01

    Spatially-explicit state-and-transition simulation models of land use and land cover (LULC) increase our ability to assess regional landscape characteristics and associated carbon dynamics across multiple scenarios. By characterizing appropriate spatial attributes such as forest age and land-use distribution, a state-and-transition model can more effectively simulate the pattern and spread of LULC changes. This manuscript describes the methods and input parameters of the Land Use and Carbon Scenario Simulator (LUCAS), a customized state-and-transition simulation model utilized to assess the relative impacts of LULC on carbon stocks for the conterminous U.S. The methods and input parameters are spatially explicit and describe initial conditions (strata, state classes and forest age), spatial multipliers, and carbon stock density. Initial conditions were derived from harmonization of multi-temporal data characterizing changes in land use as well as land cover. Harmonization combines numerous national-level datasets through a cell-based data fusion process to generate maps of primary LULC categories. Forest age was parameterized using data from the North American Carbon Program and spatially-explicit maps showing the locations of past disturbances (i.e. wildfire and harvest). Spatial multipliers were developed to spatially constrain the location of future LULC transitions. Based on distance-decay theory, maps were generated to guide the placement of changes related to forest harvest, agricultural intensification/extensification, and urbanization. We analyze the spatially-explicit input parameters with a sensitivity analysis, by showing how LUCAS responds to variations in the model input. This manuscript uses Mediterranean California as a regional subset to highlight local to regional aspects of land change, which demonstrates the utility of LUCAS at many scales and applications.
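
    The distance-decay spatial multipliers described above can be sketched as a simple weighting surface. The exponential half-distance form and the parameter value below are assumptions for illustration, not the LUCAS calibration:

```python
import numpy as np

def distance_decay_multiplier(dist_km, half_distance_km=5.0):
    """Spatial multiplier from distance-decay theory (illustrative).

    Cells near existing development get multipliers near 1 (likely to
    transition); the weight halves every `half_distance_km`.
    """
    return 0.5 ** (np.asarray(dist_km, dtype=float) / half_distance_km)

# Multiplier surface on a 1 km grid around a town at the origin; such a
# surface would guide the placement of simulated urbanization.
x, y = np.meshgrid(np.arange(-20, 21), np.arange(-20, 21))
dist = np.hypot(x, y)
m = distance_decay_multiplier(dist)
print(m[20, 20], m[20, 25])   # at the town, and 5 km away
```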

  20. A generalized network flow model for the multi-mode resource-constrained project scheduling problem with discounted cash flows

    NASA Astrophysics Data System (ADS)

    Chen, Miawjane; Yan, Shangyao; Wang, Sin-Siang; Liu, Chiu-Lan

    2015-02-01

    An effective project schedule is essential for enterprises to increase their efficiency of project execution, to maximize profit, and to minimize wastage of resources. Heuristic algorithms have been developed to efficiently solve the complicated multi-mode resource-constrained project scheduling problem with discounted cash flows (MRCPSPDCF) that characterize real problems. However, the solutions obtained in past studies have been approximate and are difficult to evaluate in terms of optimality. In this study, a generalized network flow model, embedded in a time-precedence network, is proposed to formulate the MRCPSPDCF with the payment at activity completion times. Mathematically, the model is formulated as an integer network flow problem with side constraints, which can be efficiently solved for optimality, using existing mathematical programming software. To evaluate the model performance, numerical tests are performed. The test results indicate that the model could be a useful planning tool for project scheduling in the real world.
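
    The objective structure of the MRCPSPDCF can be illustrated with a deliberately tiny instance: each activity offers several execution modes trading duration against resource usage, and cash flows received at activity completion are discounted. The activity data below are invented, the precedence structure is a simple series, and exhaustive enumeration stands in for the paper's network flow formulation:

```python
from itertools import product

# Toy multi-mode project: activities in series; each mode is
# (duration, resource_usage, cash_flow_at_completion).
activities = [
    [(2, 3, 100.0), (4, 1, 110.0)],   # fast/resource-hungry vs slow/cheap
    [(3, 2, 200.0), (1, 4, 180.0)],
    [(2, 2, 150.0), (3, 1, 160.0)],
]
R = 3          # renewable resource capacity per period
rate = 0.05    # per-period discount rate

def npv_of_modes(modes):
    """Serial schedule: feasible iff every chosen mode fits within R."""
    t, npv = 0, 0.0
    for dur, res, cash in modes:
        if res > R:
            return None               # infeasible mode choice
        t += dur
        npv += cash / (1.0 + rate) ** t
    return npv

best = None
for modes in product(*activities):
    npv = npv_of_modes(modes)
    if npv is not None and (best is None or npv > best):
        best = npv
print("optimal NPV:", round(best, 2))
```

    Real instances add general precedence networks and multiple resource types, which is what makes exact optimization hard and motivates the integer network flow formulation in the paper.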

  1. Multi-task linear programming discriminant analysis for the identification of progressive MCI individuals.

    PubMed

    Yu, Guan; Liu, Yufeng; Thung, Kim-Han; Shen, Dinggang

    2014-01-01

    Accurately identifying mild cognitive impairment (MCI) individuals who will progress to Alzheimer's disease (AD) is very important for making early interventions. Many classification methods focus on integrating multiple imaging modalities such as magnetic resonance imaging (MRI) and fluorodeoxyglucose positron emission tomography (FDG-PET). However, the main challenge for MCI classification using multiple imaging modalities is the large amount of missing data across subjects. For example, in the Alzheimer's Disease Neuroimaging Initiative (ADNI) study, almost half of the subjects do not have PET images. In this paper, we propose a new and flexible binary classification method, namely Multi-task Linear Programming Discriminant (MLPD) analysis, for incomplete multi-source feature learning. Specifically, we decompose the classification problem into different classification tasks, i.e., one for each combination of available data sources. To solve all the classification tasks jointly, our proposed MLPD method links them together by constraining them to achieve similar estimated mean differences between the two classes for those shared features. Compared with the state-of-the-art incomplete Multi-Source Feature (iMSF) learning method, which constrains different classification tasks to choose a common feature subset for shared features, MLPD can flexibly and adaptively choose different feature subsets for different classification tasks. Furthermore, MLPD can be efficiently implemented by linear programming. To validate our MLPD method, we perform experiments on the ADNI baseline dataset with incomplete MRI and PET images from 167 progressive MCI (pMCI) subjects and 226 stable MCI (sMCI) subjects. We further compare our method with the iMSF method (using incomplete MRI and PET images) and with single-task classification (using only MRI, or only subjects with both MRI and PET images). Experimental results show very promising performance of our proposed MLPD method. PMID:24820966
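
    The first step of MLPD, decomposing the incomplete-data problem into one classification task per combination of available sources, can be sketched directly. The subject records below are invented examples, not ADNI data:

```python
from collections import defaultdict

# Each subject may be missing some modalities; group subjects by which
# sources they actually have, yielding one task per combination.
subjects = [
    {"id": "s01", "label": "pMCI", "MRI": [0.2, 0.1], "PET": [1.1]},
    {"id": "s02", "label": "sMCI", "MRI": [0.3, 0.4], "PET": None},
    {"id": "s03", "label": "pMCI", "MRI": [0.1, 0.2], "PET": None},
    {"id": "s04", "label": "sMCI", "MRI": [0.5, 0.3], "PET": [0.9]},
]

tasks = defaultdict(list)
for s in subjects:
    sources = tuple(k for k in ("MRI", "PET") if s[k] is not None)
    tasks[sources].append(s["id"])

# Each task is then trained jointly; MLPD couples the tasks by
# constraining the estimated class-mean differences on the shared
# (here: MRI) features to be similar across tasks.
for sources, ids in sorted(tasks.items()):
    print(sources, ids)
```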

  3. Multi-Cultural Graduate Library Education. Historical Paper 5

    ERIC Educational Resources Information Center

    Carter, Jane Robbins

    2015-01-01

    This paper examines factors influencing the number of minority students enrolling in library schools during the 10 years prior to 1978. Robbins notes that there are four categories of barriers likely obstructing recruitment of students of color into LIS programs: financial, educational, psychosocial, and cultural. [For the commentary on this…

  4. Power-Aware Intrusion Detection in Mobile Ad Hoc Networks

    NASA Astrophysics Data System (ADS)

    Şen, Sevil; Clark, John A.; Tapiador, Juan E.

    Mobile ad hoc networks (MANETs) are a highly promising new form of networking. However they are more vulnerable to attacks than wired networks. In addition, conventional intrusion detection systems (IDS) are ineffective and inefficient for highly dynamic and resource-constrained environments. Achieving an effective operational MANET requires tradeoffs to be made between functional and non-functional criteria. In this paper we show how Genetic Programming (GP) together with a Multi-Objective Evolutionary Algorithm (MOEA) can be used to synthesise intrusion detection programs that make optimal tradeoffs between security criteria and the power they consume.
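
    A core ingredient of any MOEA of the kind used above is Pareto dominance over the competing objectives, here detection ability versus energy use. The sketch below extracts the non-dominated set from a handful of invented (detection_rate, power_mW) candidate scores; the evolved GP programs themselves are beyond a short example:

```python
# Candidate detector programs scored on two objectives:
# detection rate (maximize) and power draw in mW (minimize).
candidates = {
    "prog_a": (0.95, 120.0),
    "prog_b": (0.90, 40.0),
    "prog_c": (0.85, 35.0),
    "prog_d": (0.80, 60.0),   # dominated by prog_b (worse on both)
    "prog_e": (0.60, 30.0),
}

def dominates(x, y):
    """x dominates y: detection no worse AND power no higher, with at
    least one strict improvement."""
    return x != y and x[0] >= y[0] and x[1] <= y[1]

pareto = {
    name for name, score in candidates.items()
    if not any(dominates(other, score) for other in candidates.values())
}
print(sorted(pareto))
```

    The MOEA keeps and refines such a non-dominated front across generations, presenting the operator with the whole tradeoff curve rather than a single detector.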

  5. Density-to-Potential Inversions to Guide Development of Exchange-Correlation Approximations at Finite Temperature

    NASA Astrophysics Data System (ADS)

    Jensen, Daniel; Wasserman, Adam; Baczewski, Andrew

    The construction of approximations to the exchange-correlation potential for warm dense matter (WDM) is a topic of significant recent interest. In this work, we study the inverse problem of Kohn-Sham (KS) DFT as a means of guiding functional design at zero temperature and in WDM. Whereas the forward problem solves the KS equations to produce a density from a specified exchange-correlation potential, the inverse problem seeks to construct the exchange-correlation potential from specified densities. These two problems require different computational methods and convergence criteria despite sharing the same mathematical equations. We present two new inversion methods based on constrained-variational and PDE-constrained optimization methods. We adapt these methods to finite-temperature calculations to reveal the exchange-correlation potential's temperature dependence under WDM-relevant conditions. The different inversion methods presented are applied to both non-interacting and interacting model systems for comparison. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94.
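
    The flavor of density-to-potential inversion can be shown in its simplest special case: for a single occupied orbital, psi = sqrt(n) and the Schrodinger equation can be solved algebraically for the potential, v(x) = E + psi''(x) / (2 psi(x)) in atomic units. This textbook identity, verified numerically below on the harmonic oscillator, is the one-particle limit of what the constrained-variational and PDE-constrained methods in the abstract generalize to interacting, finite-temperature KS systems:

```python
import numpy as np

def invert_single_orbital(x, n, E):
    """Exact density-to-potential inversion for one particle (a.u.)."""
    psi = np.sqrt(n)
    dx = x[1] - x[0]
    d2psi = np.gradient(np.gradient(psi, dx), dx)   # finite-difference psi''
    return E + 0.5 * d2psi / psi

# Check against the harmonic oscillator: n is the exact ground-state
# density, so the recovered potential should be x**2 / 2 (with E = 1/2).
x = np.linspace(-4, 4, 801)
n = np.exp(-x**2) / np.sqrt(np.pi)
v = invert_single_orbital(x, n, E=0.5)
print("v at x=1:", v[np.argmin(np.abs(x - 1.0))])   # exact value: 0.5
```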

  6. Optimization-based channel constrained data aggregation routing algorithms in multi-radio wireless sensor networks.

    PubMed

    Yen, Hong-Hsu

    2009-01-01

    In wireless sensor networks, data aggregation routing can reduce the number of data transmissions and thereby achieve energy-efficient transmission. However, data aggregation introduces data retransmission caused by co-channel interference from neighboring sensor nodes. This co-channel interference can result in extra energy consumption and significant retransmission latency, which would jeopardize the benefits of data aggregation. One possible way to circumvent retransmission caused by co-channel interference is to assign different channels to all sensor nodes within each other's interference range on the data aggregation tree. By associating each radio with a different channel, a sensor node can receive data from all of its children on the data aggregation tree simultaneously. This reduces the latency from the data source nodes back to the sink so as to meet the user's delay QoS. Since the number of radios on each sensor node and the number of non-overlapping channels are both limited resources in wireless sensor networks, a challenging question is how to minimize the total transmission cost under a limited number of non-overlapping channels in multi-radio wireless sensor networks. This channel-constrained data aggregation routing problem in multi-radio wireless sensor networks is NP-hard. I first model the problem as a mixed integer linear programming problem whose objective is to minimize the total transmission cost subject to the data aggregation routing, channel, and radio resource constraints. The solution approach is based on the Lagrangean relaxation technique, relaxing some constraints into the objective function to derive a set of independent subproblems. By optimally solving these subproblems, we can not only calculate a lower bound on the original primal problem but also obtain useful information for constructing primal feasible solutions. By incorporating the Lagrangean multipliers as link-arc weights, optimization-based heuristics are proposed to obtain energy-efficient data aggregation trees with better resource (channel and radio) utilization. Computational experiments show that the proposed optimization-based approach is superior to existing heuristics in all tested cases.
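
    The mechanics of the Lagrangean relaxation approach (relax a complicating constraint into the objective, solve the now-separable subproblem, update the multiplier by subgradient steps, keep the best dual bound) can be shown on a deliberately tiny surrogate. The 0/1 knapsack below is a stand-in with invented numbers, a maximization whose dual gives an upper bound, mirroring the lower bound obtained for the paper's minimization:

```python
# Maximize sum(v_i x_i) s.t. sum(w_i x_i) <= W, x binary.
values = [10.0, 7.0, 4.0, 3.0]
weights = [5.0, 4.0, 3.0, 2.0]
W = 7.0

def lagrangean_bound(lam):
    """L(lam) = max_x sum((v_i - lam*w_i) x_i) + lam*W.

    With the capacity constraint relaxed, the problem separates per
    item: take item i iff its reduced profit v_i - lam*w_i is positive.
    """
    x = [1 if v - lam * w > 0 else 0 for v, w in zip(values, weights)]
    obj = sum((v - lam * w) * xi
              for v, w, xi in zip(values, weights, x)) + lam * W
    return obj, x

lam, best_bound = 0.0, float("inf")
for step in range(100):
    bound, x = lagrangean_bound(lam)
    best_bound = min(best_bound, bound)                  # best dual bound
    subgrad = sum(w * xi for w, xi in zip(weights, x)) - W
    lam = max(0.0, lam + (1.0 / (step + 1)) * subgrad)   # diminishing step

print("best dual (upper) bound:", round(best_bound, 2))
# The optimal primal value here is 13.0 (items 1 and 4); the dual bound
# can never fall below it, and the relaxed solutions x guide heuristics
# for feasible solutions, as in the paper.
```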

  7. Orchestra Festival Evaluations: Interjudge Agreement and Relationships between Performance Categories and Final Ratings.

    ERIC Educational Resources Information Center

    Garman, Barry R.; And Others

    1991-01-01

    Band, orchestra, and choir festival evaluations are a regular part of many secondary school music programs, and most such festivals engage adjudicators who rate each group's performance. Because music ensemble performance is complex and multi-dimensional, it does not lend itself readily to precise measurement; generally, musical performances are…

  8. Study on stimulus-responsive cellulose-based polymeric materials

    NASA Astrophysics Data System (ADS)

    Luo, Hongsheng

    Stimulus-responsive cellulose-based polymeric materials were developed by physical and chemical approaches. The thermal, structural, mechanical, and morphological properties of the samples were comprehensively investigated with multiple tools. The shape memory effect (SME), programming-structure-property relationships, and underlying mechanisms were emphasized in this study. Some new concepts, such as the heterogeneous twin switch, path-dependent multi-shape memory, and rapidly switchable water-sensitive SME, were established. The samples were divided into two categories. For the first category, cellulose nano-whiskers (CNWs) were incorporated into crystalline shape memory polyurethane (SMPU) and thermoplastic polyurethane (TPU). The CNW-SMPU nano-composites had heterogeneous switches. Triple- and multi-shape effects were achieved for the CNW-SMPU nano-composites by applying appropriate thermal-aqueous-mechanical programming. Furthermore, the thermally triggered shape recovery of the composites was found to be tuneable, depending on the CNW content. Theoretical prediction along with numerical analysis was conducted, providing evidence on the possible microstructure of the CNW-SMPU nano-composites. The rapidly switchable water-sensitive SME of the CNW-TPU nano-composites, which originates from the reversible regulation of hydrogen bonding by water, was studied for the first time. The samples in the second category consisted of cellulose-polyurethane (PU) blends, cellulose-poly(acrylic acid) (PAA) composites, and modified cellulose with supramolecular switches, all requiring a homogeneous cellulose solution in the synthesis process. The reversible behaviours of the cellulose-PU blends in wet-dry cycles, as well as the underlying shape memory mechanism, were characterized. The micro-patterns of the blends were found to be self-similar in fractal dimensions. Cellulose-PAA semi-interpenetrating networks exhibited mechanical adaptability in wet-dry cycles. A type of thermally reversible quadruple hydrogen-bonding unit, ureidopyrimidinone (UPy), was reacted with the cellulose as pendant side-groups, which may impart the modified cellulose with thermal sensitivity. This is the first attempt to explore natural cellulose systematically and comprehensively as a smart polymeric material. The concepts originally created in this study provide new viewpoints and routes for the development of novel shape memory polymers, and the findings significantly benefit the extension of potential applications of cellulose in the smart polymeric materials field.

  9. Studies of jet cross-sections and production properties with the ATLAS and CMS detectors

    NASA Astrophysics Data System (ADS)

    Anjos, Nuno

    2016-07-01

    Several characteristics of jet production in pp collisions have been measured by the ATLAS and CMS collaborations at the LHC. Measurements of event shapes and multi-jet production probe the dynamics of QCD in the soft regime and can constrain parton shower and hadronisation models. Measurements of multi-jet systems with a veto on additional jets probe QCD radiation effects. Double-differential cross-sections for three- and four-jet final states are measured at different centre-of-mass energies of pp collisions and are compared to expectations based on NLO QCD calculations. The distribution of the jet charge has been measured in di-jet events and compared to predictions from different hadronisation models and tunes. Jet-jet energy correlations are sensitive to the strong coupling constant. These measurements constitute precision tests of QCD in a new energy regime. Work supported by the Beatriu de Pinós program managed by Agència de Gestió d'Ajuts Universitaris i de Recerca with the support of the Secretaria d'Universitats i Recerca of the Departament d'Economia i Coneixement of the Generalitat de Catalunya, and the Cofund program of the Marie Curie Actions of the 7th R&D Framework Program of the European Union. Work partially supported by MINECO under grants SEV-2012-0234, FPA2013-48308, and FPA2012-38713, which include FEDER funds from the European Union.
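
    The jet-charge observable mentioned above is conventionally defined as a transverse-momentum-weighted sum of the constituent charges, Q_kappa = sum_i q_i * (pT_i)^kappa / (pT_jet)^kappa, with kappa typically between about 0.3 and 0.7. The constituent values below are invented for illustration:

```python
def jet_charge(constituents, pt_jet, kappa=0.5):
    """pT-weighted jet charge from (charge, pT) constituent pairs."""
    return sum(q * pt**kappa for q, pt in constituents) / pt_jet**kappa

# (charge in units of e, pT in GeV) of toy jet constituents.
constituents = [(+1, 40.0), (-1, 25.0), (0, 20.0), (+1, 10.0)]
pt_jet = sum(pt for _, pt in constituents)
print("jet charge:", round(jet_charge(constituents, pt_jet), 3))
```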

  10. Serving some and serving all: how providers navigate the challenges of providing racially targeted health services.

    PubMed

    Zhou, Amy

    2017-10-01

Racially targeted healthcare provides racial minorities with culturally and linguistically appropriate health services. This mandate, however, can conflict with the professional obligation of healthcare providers to serve patients based on their health needs. The dilemma between serving a particular population and serving all is heightened when the patients seeking care are racially diverse. This study examines how providers in a multi-racial context decide whom to include or exclude from health programs. This study draws on 12 months of ethnographic fieldwork at an Asian-specific HIV organization. Fieldwork included participant observation of HIV support groups, community outreach programs, and substance abuse recovery groups, as well as interviews with providers and clients. Providers managed the dilemma in different ways. While some programs in the organization focused on an Asian clientele, others de-emphasized race and served a predominantly Latino and African American clientele. Organizational structures shaped whether services were delivered according to racial categories. When funders examined client documents, providers prioritized finding Asian clients so that their documents reflected program goals to serve the Asian population. In contrast, when funders used qualitative methods, providers could construct an image of a program that targets Asians during evaluations while they included other racial minorities in their everyday practice. Program services were organized more broadly by health needs. Even within racially targeted programs, the meaning of race fluctuates and is contested. Patients' health needs cut across racial boundaries, and in some circumstances, the boundaries of inclusion can expand beyond specific racial categories to include racial minorities and underserved populations more generally.

  11. Towards designing an optical-flow based colonoscopy tracking algorithm: a comparative study

    NASA Astrophysics Data System (ADS)

    Liu, Jianfei; Subramanian, Kalpathi R.; Yoo, Terry S.

    2013-03-01

Automatic co-alignment of optical and virtual colonoscopy images can supplement traditional endoscopic procedures by providing more complete information of clinical value to the gastroenterologist. In this work, we present a comparative analysis of our optical-flow based technique for colonoscopy tracking, in relation to current state-of-the-art methods, in terms of tracking accuracy, system stability, and computational efficiency. Our optical-flow based colonoscopy tracking algorithm starts by computing multi-scale dense and sparse optical flow fields to measure image displacements. Camera motion parameters are then determined from the optical flow fields by employing a Focus of Expansion (FOE) constrained egomotion estimation scheme. We analyze the design choices involved in the three major components of our algorithm: dense optical flow, sparse optical flow, and egomotion estimation. Brox's optical flow method [1], due to its high accuracy, was used to compare and evaluate our multi-scale dense optical flow scheme. SIFT [6] and Harris-affine features [7] were used to assess the accuracy of the multi-scale sparse optical flow, because of their wide use in tracking applications; the FOE-constrained egomotion estimation was compared with collinear [2], image deformation [10], and image derivative [4] based egomotion estimation methods, to understand the stability of our tracking system. Two virtual colonoscopy (VC) image sequences were used in the study, since the exact camera parameters (for each frame) were known. Dense optical flow results indicated that Brox's method was superior to multi-scale dense optical flow in estimating camera rotational velocities, but the final tracking errors were comparable, viz., 6 mm vs. 8 mm after the VC camera traveled 110 mm. Our approach was computationally more efficient, averaging 7.2 sec. vs. 38 sec. per frame. SIFT and Harris-affine features resulted in tracking errors of up to 70 mm, while our sparse optical flow error was 6 mm. The comparison among egomotion estimation algorithms showed that our FOE-constrained egomotion estimation method achieved the best balance between tracking accuracy and robustness. The comparative study demonstrated that our optical-flow based colonoscopy tracking algorithm maintains good accuracy and stability for routine use in clinical practice.
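The FOE constraint at the heart of this tracking approach admits a compact illustration: under pure camera translation, every optical flow vector lies along the ray from the focus of expansion through its image point, so the FOE can be recovered by linear least squares. The sketch below is not the authors' implementation, just the core estimate on an ideal synthetic flow field (all numbers hypothetical):

```python
import numpy as np

def estimate_foe(points, flows):
    """Least-squares focus-of-expansion estimate.

    Under pure camera translation each flow vector v at image point p
    is collinear with (p - foe), i.e. cross(p - foe, v) = 0.  Stacking
    that constraint over all points gives a linear system in foe.
    """
    vx, vy = flows[:, 0], flows[:, 1]
    px, py = points[:, 0], points[:, 1]
    A = np.column_stack([vy, -vx])        # coefficients of (fx, fy)
    b = px * vy - py * vx                 # right-hand side
    foe, *_ = np.linalg.lstsq(A, b, rcond=None)
    return foe

# Synthetic forward-motion flow field expanding from (160, 120)
rng = np.random.default_rng(0)
pts = rng.uniform(0, 320, size=(200, 2))
true_foe = np.array([160.0, 120.0])
flow = 0.05 * (pts - true_foe)            # ideal radial expansion
print(estimate_foe(pts, flow))            # recovers ≈ (160, 120)
```

With noisy real flow fields the same system is solved robustly (e.g. with outlier rejection); the noise-free case above recovers the FOE exactly.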

  12. APPLICATION OF A BIP CONSTRAINED OPTIMIZATION MODEL COMBINED WITH NASA's ATLAS MODEL TO OPTIMIZE THE SOCIETAL BENEFITS OF THE USA's INTERNATIONAL SPACE EXPLORATION AND UTILIZATION INITIATIVE OF 1/14/04

    NASA Technical Reports Server (NTRS)

    Morgenthaler, George W.; Glover, Fred W.; Woodcock, Gordon R.; Laguna, Manuel

    2005-01-01

The 1/14/04 USA Space Exploration/Utilization Initiative invites all Space-faring Nations and all Space User Groups in Science, Space Entrepreneuring, Advocates of Robotic and Human Space Exploration, Space Tourism and Colonization Promoters, etc., to join an International Space Partnership. With more Space-faring Nations and Space User Groups each year, such a Partnership would require Multi-year (35 yr.-45 yr.) Space Mission Planning. With each Nation and Space User Group demanding priority for its missions, one needs a methodology for objectively selecting the best mission sequences to be added annually to this 45 yr. Moving Space Mission Plan. How can this be done? Planners have suggested building a Reusable, Sustainable, Space Transportation Infrastructure (RSSTI) to increase Mission synergism, reduce cost, and increase scientific and societal returns from this Space Initiative. Morgenthaler and Woodcock presented a Paper at the 55th IAC, Vancouver B.C., Canada, entitled Constrained Optimization Models For Optimizing Multi-Year Space Programs. This Paper showed that a Binary Integer Programming (BIP) Constrained Optimization Model combined with the NASA ATLAS Cost and Space System Operational Parameter Estimating Model has the theoretical capability to solve such problems. IAA Commission III, Space Technology and Space System Development, in its ACADEMY DAY meeting at Vancouver, requested that the Authors and NASA experts find several Space Exploration Architectures (SEAs), apply the combined BIP/ATLAS Models, and report the results at the 56th Fukuoka IAC. While the mathematical Model is in Ref. [2], this Paper presents the Application saga of that effort.

  13. Start small, dream big: Experiences of physical activity in public spaces in Colombia.

    PubMed

    Díaz Del Castillo, Adriana; González, Silvia Alejandra; Ríos, Ana Paola; Páez, Diana C; Torres, Andrea; Díaz, María Paula; Pratt, Michael; Sarmiento, Olga L

    2017-10-01

Multi-sectoral strategies to promote active recreation and physical activity in public spaces are crucial to building a "culture of health". However, studies on the sustainability and scalability of these strategies are limited. This paper identifies the factors related to the sustainability and scaling up of two community-based programs offering physical activity classes in public spaces in Colombia: Bogotá's Recreovía and Colombia's "Healthy Habits and Lifestyles Program-HEVS". Both programs have been sustained for more than 10 years and have benefited 1455 communities. We used a mixed-methods approach including semi-structured interviews, document review and an analysis of data regarding the programs' history, characteristics, funding, capacity building and challenges. Interviews were conducted between May and October 2015. Based on the sustainability frameworks of Shediac-Rizkallah and Bone and of Scheirer, we developed categories to independently code each interview. All information was independently analyzed by four of the authors and cross-compared between programs. Findings showed that these programs underwent adaptation processes to address the challenges that threatened their continuation and growth. The primary strategies included flexibility/adaptability, investing in the working conditions and training of instructors, allocating public funds and requesting accountability, diversifying resources, having community support and champions at different levels and positions, and carrying out continuous advocacy to include physical activity in public policies. Recreovía and HEVS illustrate sustainability as an incremental process operating at multiple levels. Lessons learned for similar initiatives include the importance of individual actions and small events, a willingness to start small while dreaming big, being flexible, and prioritizing the human factor. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Toward an Innovative, Basic Program Model for the Improvement of Professional Instruction in Dental Education: A Review of the Literature.

    ERIC Educational Resources Information Center

    Wulf, Kathleen M.; And Others

    1980-01-01

    An analysis of the massive amount of literature pertaining to the improvement of professional instruction in dental education resulted in the formation of a comprehensive model of 10 categories, including Delphi technique; systems approach; agencies; workshops; multi-media, self-instruction; evaluation paradigms, measurement, courses, and…

  15. Ground-Based Research within NASA's Materials Science Program

    NASA Technical Reports Server (NTRS)

    Gillies, Donald C.; Curreri, Peter (Technical Monitor)

    2002-01-01

Ground-based research in Materials Science for NASA's Microgravity program serves several purposes, and includes approximately four Principal Investigators for every one in the flight program. While exact classification is difficult, the ground program falls roughly into the following categories: (1) Intellectual Underpinning of the Flight Program - Theoretical Studies; (2) Intellectual Underpinning of the Flight Program - Bringing to Maturity New Research; (3) Intellectual Underpinning of the Flight Program - Enabling Characterization; (4) Intellectual Underpinning of the Flight Program - Thermophysical Property Determination; (5) Radiation Shielding; (6) Preliminary In Situ Resource Utilization; (7) Biomaterials; (8) Nanostructured Materials; (9) Materials Science for Advanced Space Propulsion. It must be noted that while the first four categories are aimed at using long-duration low-gravity conditions, the other categories pertain to more recent NASA initiatives in materials science. These new initiatives address NASA's future materials science needs in the realms of crew health and safety, and exploration, and have been included in the most recent NASA Research Announcements (NRA). A description of each of these nine categories will be given together with examples of the kinds of research being undertaken.

  16. Multi-modal molecular diffuse optical tomography system for small animal imaging

    PubMed Central

    Guggenheim, James A.; Basevi, Hector R. A.; Frampton, Jon; Styles, Iain B.; Dehghani, Hamid

    2013-01-01

A multi-modal optical imaging system for quantitative 3D bioluminescence and functional diffuse imaging is presented, which has no moving parts and uses mirrors to provide multi-view tomographic data for image reconstruction. It is demonstrated that through the use of trans-illuminated spectral near-infrared measurements and spectrally constrained tomographic reconstruction, recovered concentrations of absorbing agents can be used as prior knowledge for bioluminescence imaging within the visible spectrum. Additionally, the first use of a recently developed multi-view optical surface capture technique is shown and its application to model-based image reconstruction and free-space light modelling is demonstrated. The benefits of model-based tomographic image recovery as compared to 2D planar imaging are highlighted in a number of scenarios where the internal luminescence source is not visible or is confounding in 2D images. The results presented show that the luminescence tomographic imaging method produces 3D reconstructions of individual light sources within a mouse-sized solid phantom that are accurately localised to within 1.5 mm for a range of target locations and depths, indicating sensitivity and accurate imaging throughout the phantom volume. Additionally, the total reconstructed luminescence source intensity is consistent to within 15%, which is a dramatic improvement upon standard bioluminescence imaging. Finally, results from a heterogeneous phantom with an absorbing anomaly are presented, demonstrating the use and benefits of a multi-view, spectrally constrained coupled imaging system that provides accurate 3D luminescence images. PMID:24954977

  17. A Supply and Demand Management Perspective on the Accelerated Global Introductions of Inactivated Poliovirus Vaccine in a Constrained Supply Market

    PubMed Central

    Ottosen, Ann; Rubin, Jennifer; Blanc, Diana Chang; Zipursky, Simona; Wootton, Emily

    2017-01-01

Abstract A total of 105 countries have introduced IPV as of September 2016, of which 85 have procured the vaccine through UNICEF. The Global Eradication and Endgame Strategic Plan 2013-2018 called for the rapid introduction of at least one dose of IPV into routine immunization schedules in all 126 OPV-using countries by the end of 2015. At the time of initiating the procurement process, demand was estimated based on global modeling rather than individual country indications. In its capacity as procurement agency for the Global Polio Eradication Initiative and Gavi, the Vaccine Alliance, UNICEF set out to secure access to IPV supply for around 100 countries. Based on offers received, sufficient supply was awarded to two manufacturers to meet projected routine requirements. However, due to technical issues scaling up vaccine production and an unforecasted demand for IPV use in campaigns to interrupt wild poliovirus and to control type 2 vaccine-derived poliovirus outbreaks, IPV supplies are severely constrained. Activities to stretch supplies and to suppress demand have been ongoing since 2014, including delaying IPV introduction in countries where risks of type 2 reintroduction are lower, implementing the multi-dose vial policy, and encouraging the use of fractional doses delivered intradermally. Despite these efforts, there is still insufficient IPV supply to meet demand. The impact of the supply situation on IPV introduction timelines in countries is the focus of this article, and based on lessons learned from the IPV introductions, it is recommended that future health programs undertaking accelerated scale-up take a cautious approach to supply commitments, put in place clear allocation criteria in case of shortages or delays, and establish a communication strategy vis-à-vis beneficiaries. PMID:28838159

  18. Problems in the Multi-Service Acquisition of Less-Than-Major Ground Communications-Electronics Systems.

    DTIC Science & Technology

    1981-06-01

...education about multi-service programs, especially at higher levels ... provisioning. Two interviewees mentioned the need for a cost ... generator project. Also, the Program Manager was only a Lieutenant Colonel. Colonel Haney felt that a higher rank would be required since the ... higher than for the military in both services in all three categories. Table 3-4 illustrates average years of experience by system studied. Again ...

  19. Constrained orbital intercept-evasion

    NASA Astrophysics Data System (ADS)

    Zatezalo, Aleksandar; Stipanovic, Dusan M.; Mehra, Raman K.; Pham, Khanh

    2014-06-01

An effective characterization of intercept-evasion confrontations in various space environments, and the derivation of corresponding solutions under a variety of real-world constraints, are daunting theoretical and practical challenges. Current and future space-based platforms have to operate simultaneously as components of satellite formations and/or systems and, at the same time, have a capability to evade potential collisions with other maneuver-constrained space objects. In this article, we formulate and numerically approximate solutions of a Low Earth Orbit (LEO) intercept-maneuver problem in terms of game-theoretic capture-evasion guaranteed strategies. The space intercept-evasion approach is based on the Lyapunov methodology that has been successfully implemented in a number of air- and ground-based multi-player, multi-goal game/control applications. The corresponding numerical algorithms are derived using computationally efficient, orbital-propagator-independent methods previously developed for Space Situational Awareness (SSA). This game-theoretic yet robust and practical approach is demonstrated on a realistic LEO scenario using existing Two Line Element (TLE) sets and the Simplified General Perturbation-4 (SGP-4) propagator.

  20. An Examination of Strategy Implementation During Abstract Nonlinguistic Category Learning in Aphasia.

    PubMed

    Vallila-Rohter, Sofia; Kiran, Swathi

    2015-08-01

Our purpose was to study strategy use during nonlinguistic category learning in aphasia. Twelve control participants without aphasia and 53 participants with aphasia (PWA) completed a computerized feedback-based category learning task consisting of training and testing phases. Accuracy rates of categorization in testing phases were calculated. To evaluate strategy use, strategy analyses were conducted over training and testing phases. Participant data were compared with model data that simulated complex multi-cue, single-feature, and random pattern strategies. Learning success and strategy use were evaluated within the context of standardized cognitive-linguistic assessments. Categorization accuracy was higher among control participants than among PWA. The majority of control participants implemented suboptimal or optimal multi-cue and single-feature strategies by the testing phases of the experiment. In contrast, a large subgroup of PWA implemented random patterns, or no strategy, during both training and testing phases of the experiment. Person-to-person variability arises not only in category learning ability but also in the strategies implemented to complete category learning tasks. PWA developed effective strategies during category learning tasks less frequently than control participants did. Certain PWA may have impairments of strategy development or feedback processing that are not captured by the language and cognitive abilities currently probed.

  1. Inexact fuzzy-stochastic mixed-integer programming approach for long-term planning of waste management--Part A: methodology.

    PubMed

    Guo, P; Huang, G H

    2009-01-01

In this study, an inexact fuzzy chance-constrained two-stage mixed-integer linear programming (IFCTIP) approach is proposed for supporting long-term planning of waste-management systems under multiple uncertainties in the City of Regina, Canada. The method improves upon the existing inexact two-stage programming and mixed-integer linear programming techniques by incorporating multiple uncertainties, expressed as intervals and dual probability distributions, within a general optimization framework. The developed method can provide an effective linkage between the predefined environmental policies and the associated economic implications. Four special characteristics of the proposed method make it unique compared with other optimization techniques that deal with uncertainties. Firstly, it provides a linkage to predefined policies that have to be respected when a modeling effort is undertaken; secondly, it is useful for tackling uncertainties presented as intervals, probabilities, fuzzy sets, and combinations thereof; thirdly, it facilitates dynamic analysis for decisions of facility-expansion planning and waste-flow allocation within a multi-facility, multi-period, multi-level, and multi-option context; fourthly, the penalties are exercised with recourse against any infeasibility, which permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised solid waste-generation rates are violated. In a companion paper, the developed method is applied to a real case for the long-term planning of waste management in the City of Regina, Canada.
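The full IFCTIP formulation combines intervals, fuzzy sets and two-stage recourse; its chance-constrained ingredient alone, however, reduces to a simple deterministic quantile equivalent, which can be illustrated with standard-library Python. The landfill numbers below are hypothetical, not from the Regina case:

```python
from statistics import NormalDist

def chance_constrained_capacity(mu, sigma, alpha):
    """Deterministic equivalent of the chance constraint
    P(x <= b) >= alpha with b ~ Normal(mu, sigma):
    the decision x may be at most the (1 - alpha)-quantile of b."""
    return NormalDist(mu, sigma).inv_cdf(1 - alpha)

# Waste-flow decision: ship at most x tonnes/day to a landfill whose
# remaining daily capacity b is uncertain, b ~ N(100, 15).
for alpha in (0.90, 0.95, 0.99):
    x_max = chance_constrained_capacity(100, 15, alpha)
    print(f"alpha={alpha:.2f}  x <= {x_max:.1f} t/day")
```

As the required reliability alpha rises, the admissible shipment shrinks below the mean capacity; in a full model this quantile bound would enter a mixed-integer program as an ordinary linear constraint.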

  2. Decoding Multiple Sound Categories in the Human Temporal Cortex Using High Resolution fMRI

    PubMed Central

    Zhang, Fengqing; Wang, Ji-Ping; Kim, Jieun; Parrish, Todd; Wong, Patrick C. M.

    2015-01-01

    Perception of sound categories is an important aspect of auditory perception. The extent to which the brain’s representation of sound categories is encoded in specialized subregions or distributed across the auditory cortex remains unclear. Recent studies using multivariate pattern analysis (MVPA) of brain activations have provided important insights into how the brain decodes perceptual information. In the large existing literature on brain decoding using MVPA methods, relatively few studies have been conducted on multi-class categorization in the auditory domain. Here, we investigated the representation and processing of auditory categories within the human temporal cortex using high resolution fMRI and MVPA methods. More importantly, we considered decoding multiple sound categories simultaneously through multi-class support vector machine-recursive feature elimination (MSVM-RFE) as our MVPA tool. Results show that for all classifications the model MSVM-RFE was able to learn the functional relation between the multiple sound categories and the corresponding evoked spatial patterns and classify the unlabeled sound-evoked patterns significantly above chance. This indicates the feasibility of decoding multiple sound categories not only within but across subjects. However, the across-subject variation affects classification performance more than the within-subject variation, as the across-subject analysis has significantly lower classification accuracies. Sound category-selective brain maps were identified based on multi-class classification and revealed distributed patterns of brain activity in the superior temporal gyrus and the middle temporal gyrus. This is in accordance with previous studies, indicating that information in the spatially distributed patterns may reflect a more abstract perceptual level of representation of sound categories. 
Further, we show that the across-subject classification performance can be significantly improved by averaging the fMRI images over items, because the irrelevant variations between different items of the same sound category are reduced and in turn the proportion of signals relevant to sound categorization increases. PMID:25692885
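The recursive-feature-elimination idea behind MSVM-RFE can be sketched without the full SVM machinery: fit a multi-class linear model, score each feature by its squared weights across classes, and iteratively discard the weakest feature. The toy below substitutes a one-vs-rest least-squares classifier for the SVM and uses synthetic data, so it illustrates the elimination loop rather than the paper's exact method:

```python
import numpy as np

def rfe_rank(X, y, n_keep):
    """Recursive feature elimination with a one-vs-rest linear model.

    A simplified stand-in for MSVM-RFE: each round fits a linear
    classifier, scores every feature by its summed squared weights
    across classes, and drops the weakest feature.
    """
    active = list(range(X.shape[1]))
    Y = np.eye(int(y.max()) + 1)[y]            # one-hot class targets
    while len(active) > n_keep:
        W, *_ = np.linalg.lstsq(X[:, active], Y, rcond=None)
        scores = (W ** 2).sum(axis=1)          # per-feature relevance
        active.pop(int(np.argmin(scores)))     # eliminate weakest, refit
    return active

# Synthetic "voxel" data: only features 0 and 1 carry category signal.
rng = np.random.default_rng(1)
y = rng.integers(0, 3, size=300)
X = rng.normal(size=(300, 10))
X[:, 0] += 3.0 * (y == 0)
X[:, 1] += 3.0 * (y == 1)
print(sorted(rfe_rank(X, y, n_keep=2)))
```

On this data the loop retains the two informative features; in the fMRI setting the surviving features are the voxels that jointly discriminate the sound categories.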

  3. Decoding multiple sound categories in the human temporal cortex using high resolution fMRI.

    PubMed

    Zhang, Fengqing; Wang, Ji-Ping; Kim, Jieun; Parrish, Todd; Wong, Patrick C M

    2015-01-01

    Perception of sound categories is an important aspect of auditory perception. The extent to which the brain's representation of sound categories is encoded in specialized subregions or distributed across the auditory cortex remains unclear. Recent studies using multivariate pattern analysis (MVPA) of brain activations have provided important insights into how the brain decodes perceptual information. In the large existing literature on brain decoding using MVPA methods, relatively few studies have been conducted on multi-class categorization in the auditory domain. Here, we investigated the representation and processing of auditory categories within the human temporal cortex using high resolution fMRI and MVPA methods. More importantly, we considered decoding multiple sound categories simultaneously through multi-class support vector machine-recursive feature elimination (MSVM-RFE) as our MVPA tool. Results show that for all classifications the model MSVM-RFE was able to learn the functional relation between the multiple sound categories and the corresponding evoked spatial patterns and classify the unlabeled sound-evoked patterns significantly above chance. This indicates the feasibility of decoding multiple sound categories not only within but across subjects. However, the across-subject variation affects classification performance more than the within-subject variation, as the across-subject analysis has significantly lower classification accuracies. Sound category-selective brain maps were identified based on multi-class classification and revealed distributed patterns of brain activity in the superior temporal gyrus and the middle temporal gyrus. This is in accordance with previous studies, indicating that information in the spatially distributed patterns may reflect a more abstract perceptual level of representation of sound categories. 
Further, we show that the across-subject classification performance can be significantly improved by averaging the fMRI images over items, because the irrelevant variations between different items of the same sound category are reduced and in turn the proportion of signals relevant to sound categorization increases.

  4. What makes up marginal lands and how can it be defined and classified?

    NASA Astrophysics Data System (ADS)

    Ivanina, Vadym

    2017-04-01

Definitions of marginal lands are often not explicit. The term "marginal" is supported neither by a precise definition nor by research determining which lands fall into this category. The terminology/methodology used to identify marginal lands varies between physical characteristics and the current land use of a site as the basic perspective. The term 'marginal' is most commonly attached to 'degraded' lands, alongside other widely used terms such as 'abandoned', 'idle', 'pasture', 'surplus agricultural land', 'Conservation Reserve Programme (CRP)', 'barren and carbon-poor land', etc. Some terms are used synonymously. The category of "marginal" lands predominantly includes lands excluded from cultivation due to economic infeasibility or physical restrictions on growing conventional crops. Such sites may still have potential for alternative agricultural practices, e.g. bioenergy feedstock production. The existing categorization of marginal lands allows neither an evaluation of soil fertility potential nor a definition of the type and level of constraints on growing crops, which is why it has low practical value for land use planning. A new marginal land classification therefore has to be established and developed. This classification should be built on criteria of soil biophysical properties and ecological, environmental and climate handicaps for growing crops, and should be easy to use and of high practical value. The SEEMLA consortium took steps to build such a marginal land classification, based on direct criteria that depict soil properties and constraints and define their productivity potential. Under this classification, marginal lands are divided into eleven categories: shallow rooting, low fertility, stony texture, sandy texture, clay texture, salinic, sodicic, acidic, overwet, eroded, and contaminated. The criteria were modified and adapted from Regulation (EU) 1305/2013.
To also cover marginal lands with climate and economic limitations, SEEMLA established and implemented the term "area of land marginality", which is broader than marginal lands alone: it comprises the marginal lands themselves plus an evaluation of climate constraints and of the economic efficiency of growing crops. This approach makes it possible to define, categorize and classify marginal land by direct indicators of soil biophysical properties and ecological and environmental constraints, and provides an additional evaluation of land marginality with regard to suitability for growing crops based on climate criteria.
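A classification of this kind is naturally expressed as a set of threshold rules over directly measured soil properties. The sketch below uses the eleven SEEMLA category names from the text, but the numeric thresholds and the field names of the `site` record are illustrative placeholders, not the SEEMLA criteria:

```python
def classify_marginal(site):
    """Assign a site (dict of soil properties) to marginality categories.

    The eleven category names follow the SEEMLA classification; the
    numeric thresholds below are hypothetical placeholders chosen only
    to illustrate the rule-based structure.
    """
    rules = [
        ("shallow rooting", site.get("rooting_depth_cm", 999) < 30),
        ("low fertility",   site.get("organic_carbon_pct", 99) < 1.0),
        ("stony texture",   site.get("stone_content_pct", 0) > 35),
        ("sandy texture",   site.get("sand_pct", 0) > 85),
        ("clay texture",    site.get("clay_pct", 0) > 60),
        ("salinic",         site.get("ec_ds_m", 0) > 4.0),
        ("sodicic",         site.get("esp_pct", 0) > 15),
        ("acidic",          site.get("ph", 7.0) < 4.5),
        ("overwet",         site.get("waterlogged_days", 0) > 90),
        ("eroded",          site.get("erosion_t_ha_yr", 0) > 10),
        ("contaminated",    site.get("heavy_metal_excess", False)),
    ]
    hits = [name for name, cond in rules if cond]
    return hits or ["not marginal"]

print(classify_marginal({"ph": 4.1, "sand_pct": 90}))
# → ['sandy texture', 'acidic']
```

A site can fall into several categories at once, which matches the text's point that marginality is multi-dimensional; climate and economic screens for the broader "area of land marginality" would be applied as a separate layer on top.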

  5. Phonological Phrase Boundaries Constrain the Online Syntactic Analysis of Spoken Sentences

    ERIC Educational Resources Information Center

    Millotte, Severine; Rene, Alice; Wales, Roger; Christophe, Anne

    2008-01-01

    Two experiments tested whether phonological phrase boundaries constrain online syntactic analysis in French. Pairs of homophones belonging to different syntactic categories (verb and adjective) were used to create sentences with a local syntactic ambiguity (e.g., [le petit chien "mort"], in English, the "dead" little dog, vs.…

  6. Superiorization-based multi-energy CT image reconstruction

    PubMed Central

    Yang, Q; Cong, W; Wang, G

    2017-01-01

    The recently-developed superiorization approach is efficient and robust for solving various constrained optimization problems. This methodology can be applied to multi-energy CT image reconstruction with the regularization in terms of the prior rank, intensity and sparsity model (PRISM). In this paper, we propose a superiorized version of the simultaneous algebraic reconstruction technique (SART) based on the PRISM model. Then, we compare the proposed superiorized algorithm with the Split-Bregman algorithm in numerical experiments. The results show that both the Superiorized-SART and the Split-Bregman algorithms generate good results with weak noise and reduced artefacts. PMID:28983142
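The superiorization pattern is easy to sketch: interleave feasibility-seeking SART sweeps with small, summable perturbations that reduce a secondary objective (here 1-D total variation, standing in for the PRISM regularizer). The example below is a minimal illustration on a synthetic linear system, not the paper's multi-energy CT implementation:

```python
import numpy as np

def sart_step(x, A, b, lam=1.0):
    """One SART sweep: simultaneous update weighted by row/column sums."""
    row_sums = A.sum(axis=1)
    col_sums = A.sum(axis=0)
    return x + lam * (A.T @ ((b - A @ x) / row_sums)) / col_sums

def tv_subgradient(x):
    """Subgradient of the 1-D total variation sum_i |x[i+1] - x[i]|."""
    g = np.zeros_like(x)
    d = np.sign(np.diff(x))
    g[:-1] -= d
    g[1:] += d
    return g

def superiorized_sart(A, b, n_iter=50, beta0=0.5):
    """Superiorization: before each SART step, nudge the iterate down
    the TV subgradient with a geometrically shrinking step size, so the
    perturbations are summable and feasibility-seeking is preserved."""
    x = np.zeros(A.shape[1])
    for k in range(n_iter):
        g = tv_subgradient(x)
        norm = np.linalg.norm(g)
        if norm > 0:
            x = x - beta0 * (0.5 ** k) * g / norm   # TV-reducing nudge
        x = sart_step(x, A, b)                      # feasibility step
    return x

# Piecewise-constant ground truth observed through a random system.
rng = np.random.default_rng(0)
x_true = np.repeat([1.0, 3.0, 2.0], 10)
A = rng.uniform(0.1, 1.0, size=(60, 30))
b = A @ x_true
x_rec = superiorized_sart(A, b)
print(float(np.linalg.norm(A @ x_rec - b)))   # data residual
```

The geometric decay of the perturbation steps is what distinguishes superiorization from ordinary regularized optimization: the perturbations vanish fast enough that the convergence of the underlying SART iteration is retained.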

  7. The dynamics of the multi-planet system orbiting Kepler-56

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Gongjie; Naoz, Smadar; Johnson, John Asher

    2014-10-20

Kepler-56 is a multi-planet system containing two coplanar inner planets that are in orbits misaligned with respect to the spin axis of the host star, and an outer planet. Various mechanisms have been proposed to explain the broad distribution of spin-orbit angles among exoplanets, and these theories fall under two broad categories. The first is based on dynamical interactions in a multi-body system, while the other assumes that disk migration is the driving mechanism in planetary configuration and that the star (or disk) is tilted with respect to the planetary plane. Here we show that the large observed obliquity of the Kepler-56 system is consistent with a dynamical origin. In addition, we use observations by Huber et al. to derive the obliquity's probability distribution function, thus improving the constrained lower limit. The outer planet may be the cause of the inner planets' large obliquities, and we give the probability distribution function of its inclination, which depends on the initial orbital configuration of the planetary system. We show that even in the presence of precise measurement of the true obliquity, one cannot distinguish the initial configurations. Finally, we consider the fate of the system as the star continues to evolve beyond the main sequence, and we find that the obliquity of the system will not undergo major variations as the star climbs the red giant branch. We follow the evolution of the system and find that the innermost planet will be engulfed in ∼129 Myr. Furthermore, we put an upper limit of ∼155 Myr on the engulfment of the second planet. This corresponds to ∼3% of the current age of the star.

  8. A frozen Gaussian approximation-based multi-level particle swarm optimization for seismic inversion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jinglai, E-mail: jinglaili@sjtu.edu.cn; Lin, Guang, E-mail: lin491@purdue.edu; Computational Sciences and Mathematics Division, Pacific Northwest National Laboratory, Richland, WA 99352

    2015-09-01

In this paper, we propose a frozen Gaussian approximation (FGA)-based multi-level particle swarm optimization (MLPSO) method for seismic inversion of high-frequency wave data. The method addresses two challenges. First, the optimization problem is highly non-convex, which makes it hard for gradient-based methods to reach global minima; this is tackled by MLPSO, which can escape from undesired local minima. Second, the high-frequency character of seismic waves requires a large number of grid points in direct computational methods, and thus places an extremely high computational demand on the simulation of each sample in MLPSO. We overcome this difficulty in three steps: first, we use FGA to compute high-frequency wave propagation based on asymptotic analysis on the phase plane; then we design a constrained full waveform inversion problem to prevent the optimization search from entering velocity regions where FGA is not accurate; finally, we solve the constrained optimization problem by MLPSO, employing FGA solvers of different fidelity. The performance of the proposed method is demonstrated by a two-dimensional full-waveform inversion example on the smoothed Marmousi model.
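The PSO component of the method is the easiest part to sketch. The code below is a plain single-level particle swarm (the paper's MLPSO additionally switches between FGA solvers of different fidelity, which is not reproduced here), run on the Rastrigin function as a stand-in for the non-convex seismic misfit:

```python
import numpy as np

def pso(f, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer: each particle is pulled toward
    its personal best and the swarm's global best, with inertia w."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pval = np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pval)]
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                 # keep inside bounds
        val = np.apply_along_axis(f, 1, x)
        better = val < pval                        # update personal bests
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[np.argmin(pval)]                 # update global best
    return g, pval.min()

# Non-convex test objective (Rastrigin): many local minima, global min 0 at 0.
def rastrigin(z):
    return 10 * z.size + np.sum(z**2 - 10 * np.cos(2 * np.pi * z))

best_x, best_f = pso(rastrigin, (np.full(2, -5.12), np.full(2, 5.12)))
print(best_x, best_f)
```

The swarm escapes most of the lattice of local minima that would trap a gradient descent started at random; in the seismic setting each `f` evaluation would be a (FGA-accelerated) forward wave simulation.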

  9. (In)Flexibility of Constituency in Japanese in Multi-Modal Categorial Grammar with Structured Phonology

    ERIC Educational Resources Information Center

    Kubota, Yusuke

    2010-01-01

    This dissertation proposes a theory of categorial grammar called Multi-Modal Categorial Grammar with Structured Phonology. The central feature that distinguishes this theory from the majority of contemporary syntactic theories is that it decouples (without completely segregating) two aspects of syntax--hierarchical organization (reflecting…

  10. How to constrain multi-objective calibrations of the SWAT model using water balance components

    USDA-ARS?s Scientific Manuscript database

    Automated procedures are often used to provide adequate fits between hydrologic model estimates and observed data. While the models may provide good fits based upon numeric criteria, they may still not accurately represent the basic hydrologic characteristics of the represented watershed. Here we ...

  11. The SDSS-III Multi-object APO Radial-velocity Exoplanet Large-area Survey

    NASA Astrophysics Data System (ADS)

    Ge, Jian; Mahadevan, S.; Lee, B.; Wan, X.; Zhao, B.; van Eyken, J.; Kane, S.; Guo, P.; Ford, E. B.; Agol, E.; Gaudi, S.; Fleming, S.; Crepp, J.; Cohen, R.; Groot, J.; Galvez, M.; Liu, J.; Ford, H.; Schneider, D.; Seager, S.; Hawley, S. L.; Weinberg, D.; Eisenstein, D.

    2007-12-01

    As part of the SDSS-III survey in 2008-2014, the Multi-object APO Radial-Velocity Exoplanet Large-area Survey (MARVELS) will conduct the largest ground-based Doppler planet survey to date using the SDSS telescope and new-generation multi-object Doppler instruments with 120-object capability and 10-20 m/s Doppler precision. The baseline survey plan is to monitor a total of 11,000 V=8-12 stars (~10,000 main sequence stars and ~1000 giant stars) over 800 square degrees over the 6 years. The primary goal is to produce a large, statistically well-defined sample of giant planets (~200) with a wide range of masses (~0.2-10 Jupiter masses) and orbits (1 day-2 years), drawn from a large number of host stars with a diverse set of masses, compositions, and ages, for studying the diversity of extrasolar planets and constraining planet formation, migration, and dynamical evolution of planetary systems. The survey data will also be used to provide a statistical sample for theoretical comparison, to discover rare systems, and to identify signposts for lower-mass or more distant planets. Early science results from the pilot program will be reported. We would like to thank the SDSS MC for allocation of the telescope time and the W.M. Keck Foundation, NSF, NASA and UF for support.

  12. Robust fuzzy control subject to state variance and passivity constraints for perturbed nonlinear systems with multiplicative noises.

    PubMed

    Chang, Wen-Jer; Huang, Bo-Jyun

    2014-11-01

    The multi-constrained robust fuzzy control problem is investigated in this paper for perturbed continuous-time nonlinear stochastic systems. The nonlinear system considered in this paper is represented by a Takagi-Sugeno fuzzy model with perturbations and state multiplicative noises. The multiple performance constraints considered in this paper include stability, passivity and individual state variance constraints. The Lyapunov stability theory is employed to derive sufficient conditions to achieve the above performance constraints. By solving these sufficient conditions, the contribution of this paper is to develop a parallel distributed compensation-based robust fuzzy control approach to satisfy multiple performance constraints for perturbed nonlinear systems with multiplicative noises. Finally, a numerical example for the control of a perturbed inverted pendulum system is provided to illustrate the applicability and effectiveness of the proposed multi-constrained robust fuzzy control method. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  13. Geopotential Field Anomaly Continuation with Multi-Altitude Observations

    NASA Technical Reports Server (NTRS)

    Kim, Jeong Woo; Kim, Hyung Rae; von Frese, Ralph; Taylor, Patrick; Rangelova, Elena

    2012-01-01

    Conventional gravity and magnetic anomaly continuation invokes the standard Poisson boundary condition of a zero anomaly at an infinite vertical distance from the observation surface. This simple continuation is limited, however, where multiple altitude slices of the anomaly field have been observed. Increasingly, areas are becoming available constrained by multiple boundary conditions from surface, airborne, and satellite surveys. This paper describes the implementation of continuation with multi-altitude boundary conditions in Cartesian and spherical coordinates and investigates the advantages and limitations of these applications. Continuations by EPS (Equivalent Point Source) inversion and the FT (Fourier Transform), as well as by SCHA (Spherical Cap Harmonic Analysis) are considered. These methods were selected because they are especially well suited for analyzing multi-altitude data over finite patches of the earth such as covered by the ADMAP database. In general, continuations constrained by multi-altitude data surfaces are invariably superior to those constrained by a single altitude data surface due to anomaly measurement errors and the non-uniqueness of continuation.
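Of the continuation methods named above, the FT approach in flat (Cartesian) geometry reduces to multiplying the anomaly spectrum by exp(-|k|Δz), which attenuates short wavelengths fastest. A toy one-dimensional sketch of that standard formula (my own illustration, not the authors' implementation; the naive DFT is for clarity only and would be replaced by an FFT in practice):

```python
import cmath
import math

def upward_continue(field, dx, dz):
    """Continue a 1-D potential-field profile upward by dz via the
    Fourier-domain attenuation factor exp(-|k| dz)."""
    n = len(field)
    # forward DFT (naive O(n^2), fine for short profiles)
    F = [sum(field[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
         for k in range(n)]
    # attenuate each wavenumber; indices above n//2 are negative frequencies
    for k in range(n):
        kk = k if k <= n // 2 else k - n
        wavenum = 2 * math.pi * abs(kk) / (n * dx)
        F[k] *= math.exp(-wavenum * dz)
    # inverse DFT; output is real because the attenuation is symmetric in +/-k
    return [sum(F[k] * cmath.exp(2j * math.pi * k * j / n) for k in range(n)).real / n
            for j in range(n)]
```

The mean (k = 0) component passes unchanged, while an oscillatory component of wavenumber k shrinks by exp(-k·Δz), which is why single-altitude continuation amplifies the relative effect of measurement noise and multi-altitude constraints help.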

  14. Geopotential Field Anomaly Continuation with Multi-Altitude Observations

    NASA Technical Reports Server (NTRS)

    Kim, Jeong Woo; Kim, Hyung Rae; vonFrese, Ralph; Taylor, Patrick; Rangelova, Elena

    2011-01-01

    Conventional gravity and magnetic anomaly continuation invokes the standard Poisson boundary condition of a zero anomaly at an infinite vertical distance from the observation surface. This simple continuation is limited, however, where multiple altitude slices of the anomaly field have been observed. Increasingly, areas are becoming available constrained by multiple boundary conditions from surface, airborne, and satellite surveys. This paper describes the implementation of continuation with multi-altitude boundary conditions in Cartesian and spherical coordinates and investigates the advantages and limitations of these applications. Continuations by EPS (Equivalent Point Source) inversion and the FT (Fourier Transform), as well as by SCHA (Spherical Cap Harmonic Analysis) are considered. These methods were selected because they are especially well suited for analyzing multi-altitude data over finite patches of the earth such as covered by the ADMAP database. In general, continuations constrained by multi-altitude data surfaces are invariably superior to those constrained by a single altitude data surface due to anomaly measurement errors and the non-uniqueness of continuation.

  15. Language Program Evaluation

    ERIC Educational Resources Information Center

    Norris, John M.

    2016-01-01

    Language program evaluation is a pragmatic mode of inquiry that illuminates the complex nature of language-related interventions of various kinds, the factors that foster or constrain them, and the consequences that ensue. Program evaluation enables a variety of evidence-based decisions and actions, from designing programs and implementing…

  16. An Examination of Strategy Implementation During Abstract Nonlinguistic Category Learning in Aphasia

    PubMed Central

    Kiran, Swathi

    2015-01-01

    Purpose Our purpose was to study strategy use during nonlinguistic category learning in aphasia. Method Twelve control participants without aphasia and 53 participants with aphasia (PWA) completed a computerized feedback-based category learning task consisting of training and testing phases. Accuracy rates of categorization in testing phases were calculated. To evaluate strategy use, strategy analyses were conducted over training and testing phases. Participant data were compared with model data that simulated complex multi-cue, single-feature, and random pattern strategies. Learning success and strategy use were evaluated within the context of standardized cognitive-linguistic assessments. Results Categorization accuracy was higher among control participants than among PWA. The majority of control participants implemented suboptimal or optimal multi-cue and single-feature strategies by the testing phases of the experiment. In contrast, a large subgroup of PWA implemented random patterns, or no strategy, during both training and testing phases of the experiment. Conclusions Person-to-person variability arises not only in category learning ability but also in the strategies implemented to complete category learning tasks. PWA developed effective strategies during category learning tasks less frequently than control participants did. Certain PWA may have impairments of strategy development or feedback processing not captured by the language and cognitive abilities currently probed. PMID:25908438

  17. Adaptive Fuzzy Output Constrained Control Design for Multi-Input Multioutput Stochastic Nonstrict-Feedback Nonlinear Systems.

    PubMed

    Li, Yongming; Tong, Shaocheng

    2017-12-01

    In this paper, an adaptive fuzzy output constrained control design approach is addressed for multi-input multioutput uncertain stochastic nonlinear systems in nonstrict-feedback form. The nonlinear systems addressed in this paper possess unstructured uncertainties, unknown gain functions and unknown stochastic disturbances. Fuzzy logic systems are utilized to tackle the problem of unknown nonlinear uncertainties. The barrier Lyapunov function technique is employed to solve the output constrained problem. In the framework of backstepping design, an adaptive fuzzy control design scheme is constructed. All the signals in the closed-loop system are proved to be bounded in probability, and the system outputs are constrained in a given compact set. Finally, the applicability of the proposed controller is illustrated by a simulation example.

  18. Effect of e-learning program on risk assessment and pressure ulcer classification - A randomized study.

    PubMed

    Bredesen, Ida Marie; Bjøro, Karen; Gunningberg, Lena; Hofoss, Dag

    2016-05-01

    Pressure ulcers (PUs) are a problem in health care. Staff competency is paramount to PU prevention. Education is essential to increase skills in pressure ulcer classification and risk assessment. Currently, no pressure ulcer learning programs are available in Norwegian. Develop and test an e-learning program for assessment of pressure ulcer risk and pressure ulcer classification. Forty-four nurses working in acute care hospital wards or nursing homes participated and were assigned randomly into two groups: an e-learning program group (intervention) and a traditional classroom lecture group (control). Data was collected immediately before and after training, and again after three months. The study was conducted at one nursing home and two hospitals between May and December 2012. Accuracy of risk assessment (five patient cases) and pressure ulcer classification (40 photos [normal skin, pressure ulcer categories I-IV] split into two sets) were measured by comparing nurse evaluations in each of the two groups to a pre-established standard based on ratings by experts in pressure ulcer classification and risk assessment. Inter-rater reliability was measured by exact percent agreement and multi-rater Fleiss kappa. A Mann-Whitney U test was used for continuous sum score variables. An e-learning program did not improve Braden subscale scoring. For pressure ulcer classification, however, the intervention group scored significantly higher than the control group on several of the categories in the post-test immediately after training. However, after three months there were no significant differences in classification skills between the groups. An e-learning program appears to have a greater effect on the accuracy of pressure ulcer classification than classroom teaching in the short term. For proficiency in Braden scoring, no significant effect of educational methods on learning results was detected. Copyright © 2016 Elsevier Ltd. All rights reserved.
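The inter-rater reliability statistic used here, multi-rater Fleiss' kappa, has a simple closed form: chance-corrected agreement across items each rated by the same number of raters. A minimal sketch of the standard computation (not the study's own code; the count tables below are invented):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa. counts[i][j] = number of raters assigning item i to
    category j; every item must have the same total number of raters."""
    n = len(counts)                      # number of items
    r = sum(counts[0])                   # raters per item
    k = len(counts[0])                   # number of categories
    # overall proportion of assignments to each category
    p_j = [sum(row[j] for row in counts) / (n * r) for j in range(k)]
    # per-item observed agreement
    P_i = [(sum(c * c for c in row) - r) / (r * (r - 1)) for row in counts]
    P_bar = sum(P_i) / n                 # mean observed agreement
    P_e = sum(p * p for p in p_j)        # chance agreement
    return (P_bar - P_e) / (1 - P_e)
```

For example, a table where all four raters agree on every item yields kappa = 1, while systematic half-half splits drive kappa negative.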

  19. A Risk-Constrained Multi-Stage Decision Making Approach to the Architectural Analysis of Mars Missions

    NASA Technical Reports Server (NTRS)

    Kuwata, Yoshiaki; Pavone, Marco; Balaram, J. (Bob)

    2012-01-01

    This paper presents a novel risk-constrained multi-stage decision-making approach to the architectural analysis of planetary rover missions. In particular, focusing on a 2018 Mars rover concept, which was considered as part of a potential Mars Sample Return campaign, we model the entry, descent, and landing (EDL) phase and the rover traverse phase as four sequential decision-making stages. The problem is to find a sequence of divert and driving maneuvers so that the rover drive is minimized and the probability of a mission failure (e.g., due to a failed landing) is below a user-specified bound. By solving this problem for several different values of the model parameters (e.g., divert authority), this approach enables rigorous, accurate and systematic trade-offs for the EDL system vs. the mobility system and, more generally, cross-domain trade-offs for the different phases of a space mission. The overall optimization problem can be seen as a chance-constrained dynamic programming problem, with the additional complexity that 1) in some stages the disturbances do not have any probabilistic characterization, and 2) the state space is extremely large (i.e., hundreds of millions of states for trade-offs with high-resolution Martian maps). To this purpose, we solve the problem by performing an unconventional combination of average and minimax cost analysis and by leveraging highly efficient computation tools from the image processing community. Preliminary trade-off results are presented.

  20. Spectral Prior Image Constrained Compressed Sensing (Spectral PICCS) for Photon-Counting Computed Tomography

    PubMed Central

    Yu, Zhicong; Leng, Shuai; Li, Zhoubo; McCollough, Cynthia H.

    2016-01-01

    Photon-counting computed tomography (PCCT) is an emerging imaging technique that enables multi-energy imaging with only a single scan acquisition. To enable multi-energy imaging, the detected photons corresponding to the full x-ray spectrum are divided into several subgroups of bin data that correspond to narrower energy windows. Consequently, noise in each energy bin increases compared to the full-spectrum data. This work proposes an iterative reconstruction algorithm for noise suppression in the narrower energy bins used in PCCT imaging. The algorithm is based on the framework of prior image constrained compressed sensing (PICCS) and is called spectral PICCS; it uses the full-spectrum image reconstructed using conventional filtered back-projection as the prior image. The spectral PICCS algorithm is implemented using a constrained optimization scheme with adaptive iterative step sizes such that only two tuning parameters are required in most cases. The algorithm was first evaluated using computer simulations, and then validated by both physical phantoms and in-vivo swine studies using a research PCCT system. Results from both computer-simulation and experimental studies showed substantial image noise reduction in narrow energy bins (43~73%) without sacrificing CT number accuracy or spatial resolution. PMID:27551878

  1. Spectral prior image constrained compressed sensing (spectral PICCS) for photon-counting computed tomography

    NASA Astrophysics Data System (ADS)

    Yu, Zhicong; Leng, Shuai; Li, Zhoubo; McCollough, Cynthia H.

    2016-09-01

    Photon-counting computed tomography (PCCT) is an emerging imaging technique that enables multi-energy imaging with only a single scan acquisition. To enable multi-energy imaging, the detected photons corresponding to the full x-ray spectrum are divided into several subgroups of bin data that correspond to narrower energy windows. Consequently, noise in each energy bin increases compared to the full-spectrum data. This work proposes an iterative reconstruction algorithm for noise suppression in the narrower energy bins used in PCCT imaging. The algorithm is based on the framework of prior image constrained compressed sensing (PICCS) and is called spectral PICCS; it uses the full-spectrum image reconstructed using conventional filtered back-projection as the prior image. The spectral PICCS algorithm is implemented using a constrained optimization scheme with adaptive iterative step sizes such that only two tuning parameters are required in most cases. The algorithm was first evaluated using computer simulations, and then validated by both physical phantoms and in vivo swine studies using a research PCCT system. Results from both computer-simulation and experimental studies showed substantial image noise reduction in narrow energy bins (43-73%) without sacrificing CT number accuracy or spatial resolution.
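The PICCS framework penalizes sparsity (commonly total variation) of both the image and its difference from the prior image, weighted by a parameter α, subject to data consistency. A toy one-dimensional sketch of that objective with a quadratic fidelity term and plain numerical-gradient descent (my own simplification: the paper uses a constrained scheme with adaptive step sizes, not this unconstrained form, and all weights here are made up):

```python
import math

def tv(x, eps=1e-6):
    """Smoothed 1-D total variation."""
    return sum(math.sqrt((x[i + 1] - x[i]) ** 2 + eps) for i in range(len(x) - 1))

def piccs_obj(x, prior, noisy, alpha=0.5, lam=0.2):
    """PICCS-style objective: weighted TV of (x - prior) and of x itself,
    plus a quadratic fidelity term tying x to the measured (noisy) data."""
    diff = [a - b for a, b in zip(x, prior)]
    fid = sum((a - b) ** 2 for a, b in zip(x, noisy))
    return lam * (alpha * tv(diff) + (1 - alpha) * tv(x)) + fid

def descend(prior, noisy, steps=300, lr=0.05, h=1e-5):
    """Central-difference gradient descent; enough to show the objective falling."""
    x = list(noisy)
    for _ in range(steps):
        grad = []
        for i in range(len(x)):
            x[i] += h; up = piccs_obj(x, prior, noisy)
            x[i] -= 2 * h; dn = piccs_obj(x, prior, noisy)
            x[i] += h                      # restore coordinate
            grad.append((up - dn) / (2 * h))
        x = [xi - lr * g for xi, g in zip(x, grad)]
    return x
```

With a clean step edge as the prior and alternating noise on the measurement, the descent suppresses the oscillation (lower TV) while the fidelity term keeps the result close to the data, which is the intuition behind using the low-noise full-spectrum image as the prior for each narrow energy bin.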

  2. Content Analysis of Student Essays after Attending a Problem-Based Learning Course: Facilitating the Development of Critical Thinking and Communication Skills in Japanese Nursing Students.

    PubMed

    Itatani, Tomoya; Nagata, Kyoko; Yanagihara, Kiyoko; Tabuchi, Noriko

    2017-08-22

    The importance of active learning has continued to increase in Japan. The authors conducted classes for first-year students who entered the nursing program using the problem-based learning method, a form of active learning. Students discussed social topics in classes. The purposes of this study were to analyze the post-class essays and to describe students' logical and critical thinking after they attended a Problem-Based Learning (PBL) course. The authors used Mayring's methodology for qualitative content analysis and text mining. In the descriptions of the skills required to resolve social issues, seven categories were extracted: (recognition of diverse social issues), (attitudes about resolving social issues), (discerning the root cause), (multi-lateral information processing skills), (making a path to resolve issues), (processivity in dealing with issues), and (reflecting). In the descriptions of communication, five categories were extracted: (simple statement), (robust theories), (respecting the opponent), (communication skills), and (attractive presentations). As a result of text mining, the words extracted more than 100 times included "issue," "society," "resolve," "myself," "ability," "opinion," and "information." Education using PBL could be an effective means of improving the skills that students described, and communication in general. Some students felt communication was difficult owing to characteristics of the Japanese language.

  3. Microgrid Optimal Scheduling With Chance-Constrained Islanding Capability

    DOE PAGES

    Liu, Guodong; Starke, Michael R.; Xiao, B.; ...

    2017-01-13

    To facilitate the integration of variable renewable generation and improve the resilience of electricity supply in a microgrid, this paper proposes an optimal scheduling strategy for microgrid operation considering constraints of islanding capability. A new concept, the probability of successful islanding (PSI), is developed, indicating the probability that a microgrid maintains enough spinning reserve (both up and down) to meet local demand and accommodate local renewable generation after instantaneously islanding from the main grid. The PSI is formulated as a mixed-integer linear program using a multi-interval approximation taking into account the probability distributions of the forecast errors of wind, PV and load. With the goal of minimizing the total operating cost while preserving a user-specified PSI, a chance-constrained optimization problem is formulated for the optimal scheduling of microgrids and solved by mixed-integer linear programming (MILP). Numerical simulations on a microgrid consisting of a wind turbine, a PV panel, a fuel cell, a micro-turbine, a diesel generator and a battery demonstrate the effectiveness of the proposed scheduling strategy. Lastly, we verify the relationship between PSI and various factors.
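The paper linearizes the PSI chance constraint into a MILP via a multi-interval approximation. The underlying idea, holding enough reserve to cover the forecast-error quantile, can be sketched for the simplest case of independent zero-mean Gaussian wind, PV, and load errors (a textbook reduction, not the paper's formulation; the σ values in the test are invented):

```python
from statistics import NormalDist

def required_reserve(psi, sigmas):
    """Spinning reserve (in the same units as the sigmas, e.g. MW) needed so
    that, under independent zero-mean Gaussian forecast errors with the given
    standard deviations (wind, PV, load), the net imbalance after instantaneous
    islanding stays within reserve with probability `psi`."""
    sigma_total = sum(s * s for s in sigmas) ** 0.5   # variances add
    return NormalDist().inv_cdf(psi) * sigma_total
```

Raising the PSI target pushes the required reserve up along the Gaussian quantile curve, which is the cost-vs-islanding-reliability trade-off the scheduling model explores.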

  4. Novel methods for Solving Economic Dispatch of Security-Constrained Unit Commitment Based on Linear Programming

    NASA Astrophysics Data System (ADS)

    Guo, Sangang

    2017-09-01

    There are two stages in solving security-constrained unit commitment problems (SCUC) within the Lagrangian framework: one is to obtain feasible units' states (UC), the other is power economic dispatch (ED) for each unit. Solving ED accurately is the more important for enhancing the efficiency of the SCUC solution once feasible units' states are fixed. Two novel methods, named the Convex Combinatorial Coefficient Method and the Power Increment Method, both based on linear programming, are proposed for solving ED via piecewise linear approximation of the nonlinear convex fuel cost functions. Numerical testing results show that the methods are effective and efficient.
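For convex piecewise-linear fuel costs with only generation limits and a demand-balance constraint, the LP optimum can be reached by filling cost segments in global merit order, which is the intuition behind incremental-cost schemes such as the Power Increment Method. A sketch of that reduced case (my own simplification: it ignores network and security constraints, and the unit data in the test are invented):

```python
def dispatch(units, demand):
    """Economic dispatch for convex piecewise-linear cost curves.
    `units` maps a unit name to a list of (segment_MW, cost_per_MW) pairs,
    ordered by increasing incremental cost (convexity). Because costs are
    convex and separable, filling segments in global merit order yields the
    same optimum as the LP formulation."""
    segments = sorted(
        ((cost, name, size) for name, segs in units.items() for size, cost in segs),
        key=lambda t: t[0])
    out = {name: 0.0 for name in units}
    total_cost, remaining = 0.0, demand
    for cost, name, size in segments:
        take = min(size, remaining)       # fill the cheapest segment first
        out[name] += take
        total_cost += take * cost
        remaining -= take
        if remaining <= 0:
            break
    if remaining > 0:
        raise ValueError("demand exceeds total capacity")
    return out, total_cost
```

The paper's methods handle the same piecewise-linear structure inside a general LP, which is what allows coupling with the other SCUC constraints.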

  5. Behavior-based aggregation of land categories for temporal change analysis

    NASA Astrophysics Data System (ADS)

    Aldwaik, Safaa Zakaria; Onsted, Jeffrey A.; Pontius, Robert Gilmore, Jr.

    2015-03-01

    Comparison between two time points of the same categorical variable for the same study extent can reveal changes among categories over time, such as transitions among land categories. If many categories exist, then analysis can be difficult to interpret. Category aggregation is the procedure that combines two or more categories to create a single broader category. Aggregation can simplify interpretation, and can also influence the sizes and types of changes. Some classifications have an a priori hierarchy to facilitate aggregation, but an a priori aggregation might make researchers blind to important category dynamics. We created an algorithm to aggregate categories in a sequence of steps based on the categories' behaviors in terms of gross losses and gross gains. The behavior-based algorithm aggregates net gaining categories with net gaining categories and aggregates net losing categories with net losing categories, but never aggregates a net gaining category with a net losing category. The behavior-based algorithm at each step in the sequence maintains net change and maximizes swap change. We present a case study where data from 2001 and 2006 for 64 land categories indicate change on 17% of the study extent. The behavior-based algorithm produces a set of 10 categories that maintains nearly the original amount of change. In contrast, an a priori aggregation produces 10 categories while reducing the change to 9%. We offer a free computer program to perform the behavior-based aggregation.
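A much-simplified sketch of one aggregation step in this spirit, merging only categories whose net changes share a sign so that total net change is preserved, might look like the following (the paper's algorithm additionally maximizes swap change when choosing the pair; the merge rule and category data here are my own invention):

```python
def net(c):
    """Net change of a category: gross gain minus gross loss."""
    return c["gain"] - c["loss"]

def aggregate_step(cats):
    """One aggregation step: merge the two smallest-|net| categories whose
    nets share a sign (gainers with gainers, losers with losers). Merging
    sums gross gains and losses, so total net change is preserved."""
    gainers = sorted((k for k in cats if net(cats[k]) > 0),
                     key=lambda k: net(cats[k]))
    losers = sorted((k for k in cats if net(cats[k]) < 0),
                    key=lambda k: -net(cats[k]))
    for group in (gainers, losers):
        if len(group) >= 2:
            a, b = group[0], group[1]
            merged = {"gain": cats[a]["gain"] + cats[b]["gain"],
                      "loss": cats[a]["loss"] + cats[b]["loss"]}
            out = {k: v for k, v in cats.items() if k not in (a, b)}
            out[a + "+" + b] = merged
            return out
    return cats                      # nothing left to merge
```

Repeating the step gives a sequence of progressively coarser maps, each keeping the map-total net change intact, in contrast to an a priori hierarchy that may merge a gainer with a loser and cancel real change.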

  6. Comprehensive review of the evidence regarding the effectiveness of community-based primary health care in improving maternal, neonatal and child health: 6. strategies used by effective projects.

    PubMed

    Perry, Henry B; Sacks, Emma; Schleiff, Meike; Kumapley, Richard; Gupta, Sundeep; Rassekh, Bahie M; Freeman, Paul A

    2017-06-01

    As part of our review of the evidence of the effectiveness of community-based primary health care (CBPHC) in improving maternal, neonatal and child health (MNCH), we summarize here the common delivery strategies of projects, programs and field research studies (collectively referred to as projects) that have demonstrated effectiveness in improving child mortality. Other articles in this series address specifically the effects of CBPHC on improving MNCH, while this paper explores the specific strategies used. We screened 12 166 published reports in PubMed of community-based approaches to improving maternal, neonatal and child health in high-mortality, resource-constrained settings from 1950-2015. A total of 700 assessments, including 148 reports from other publicly available sources (mostly unpublished evaluation reports and books) met the criteria for inclusion and were reviewed using a data extraction form. Here we identify and categorize key strategies used in project implementation. Six categories of strategies for program implementation were identified, all of which required working in partnership with communities and health systems: (a) program design and evaluation, (b) community collaboration, (c) education for community-level staff, volunteers, beneficiaries and community members, (d) health systems strengthening, (e) use of community-level workers, and (f) intervention delivery. Four specific strategies for intervention delivery were identified: (a) recognition, referral, and (when possible) treatment of serious childhood illness by mothers and/or trained community agents, (b) routine systematic visitation of all homes, (c) facilitator-led participatory women's groups, and (d) health service provision at outreach sites by mobile health teams. The strategies identified here provide useful starting points for program design in strengthening the effectiveness of CBPHC for improving MNCH.

  7. Comprehensive review of the evidence regarding the effectiveness of community–based primary health care in improving maternal, neonatal and child health: 6. strategies used by effective projects

    PubMed Central

    Perry, Henry B; Sacks, Emma; Schleiff, Meike; Kumapley, Richard; Gupta, Sundeep; Rassekh, Bahie M; Freeman, Paul A

    2017-01-01

    Background As part of our review of the evidence of the effectiveness of community–based primary health care (CBPHC) in improving maternal, neonatal and child health (MNCH), we summarize here the common delivery strategies of projects, programs and field research studies (collectively referred to as projects) that have demonstrated effectiveness in improving child mortality. Other articles in this series address specifically the effects of CBPHC on improving MNCH, while this paper explores the specific strategies used. Methods We screened 12 166 published reports in PubMed of community–based approaches to improving maternal, neonatal and child health in high–mortality, resource–constrained settings from 1950–2015. A total of 700 assessments, including 148 reports from other publicly available sources (mostly unpublished evaluation reports and books) met the criteria for inclusion and were reviewed using a data extraction form. Here we identify and categorize key strategies used in project implementation. Results Six categories of strategies for program implementation were identified, all of which required working in partnership with communities and health systems: (a) program design and evaluation, (b) community collaboration, (c) education for community–level staff, volunteers, beneficiaries and community members, (d) health systems strengthening, (e) use of community–level workers, and (f) intervention delivery. Four specific strategies for intervention delivery were identified: (a) recognition, referral, and (when possible) treatment of serious childhood illness by mothers and/or trained community agents, (b) routine systematic visitation of all homes, (c) facilitator–led participatory women’s groups, and (d) health service provision at outreach sites by mobile health teams. Conclusions The strategies identified here provide useful starting points for program design in strengthening the effectiveness of CBPHC for improving MNCH. PMID:28685044

  8. CMS Readiness for Multi-Core Workload Scheduling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez-Calero Yzquierdo, A.; Balcas, J.; Hernandez, J.

    In the present run of the LHC, CMS data reconstruction and simulation algorithms benefit greatly from being executed as multiple threads running on several processor cores. The complexity of the Run 2 events requires parallelization of the code to reduce the memory-per-core footprint constraining serial execution programs, thus optimizing the exploitation of present multi-core processor architectures. The allocation of computing resources for multi-core tasks, however, becomes a complex problem in itself. The CMS workload submission infrastructure employs multi-slot partitionable pilots, built on HTCondor and GlideinWMS native features, to enable scheduling of single and multi-core jobs simultaneously. This provides a solution for the scheduling problem in a uniform way across grid sites running a diversity of gateways to compute resources and batch system technologies. This paper presents this strategy and the tools on which it has been implemented. The experience of managing multi-core resources at the Tier-0 and Tier-1 sites during 2015, along with the deployment phase to Tier-2 sites during early 2016, is reported. The process of performance monitoring and optimization to achieve efficient and flexible use of the resources is also described.

  9. CMS readiness for multi-core workload scheduling

    NASA Astrophysics Data System (ADS)

    Perez-Calero Yzquierdo, A.; Balcas, J.; Hernandez, J.; Aftab Khan, F.; Letts, J.; Mason, D.; Verguilov, V.

    2017-10-01

    In the present run of the LHC, CMS data reconstruction and simulation algorithms benefit greatly from being executed as multiple threads running on several processor cores. The complexity of the Run 2 events requires parallelization of the code to reduce the memory-per-core footprint constraining serial execution programs, thus optimizing the exploitation of present multi-core processor architectures. The allocation of computing resources for multi-core tasks, however, becomes a complex problem in itself. The CMS workload submission infrastructure employs multi-slot partitionable pilots, built on HTCondor and GlideinWMS native features, to enable scheduling of single and multi-core jobs simultaneously. This provides a solution for the scheduling problem in a uniform way across grid sites running a diversity of gateways to compute resources and batch system technologies. This paper presents this strategy and the tools on which it has been implemented. The experience of managing multi-core resources at the Tier-0 and Tier-1 sites during 2015, along with the deployment phase to Tier-2 sites during early 2016, is reported. The process of performance monitoring and optimization to achieve efficient and flexible use of the resources is also described.

  10. Gamma-Ray Burst Afterglows with ALMA

    NASA Astrophysics Data System (ADS)

    Urata, Y.; Huang, K.; Takahashi, S.

    2015-12-01

    We present multi-wavelength observations, including sub-millimeter follow-ups, for two GRB afterglows. Rapid SMA and multi-wavelength observations of GRB120326A revealed complex emission arising as synchrotron self-inverse Compton radiation from the reverse shock. Observations including ALMA for GRB131030A also showed a significant X-ray excess over the standard forward-shock synchrotron model. Based on these results, we also discuss further observations for (A) constraining the progenitor mass with polarization, (B) the first confirmation of GRB jet collimation, and (C) revealing the origin of optically dark GRBs.

  11. Detection and 3D reconstruction of traffic signs from multiple view color images

    NASA Astrophysics Data System (ADS)

    Soheilian, Bahman; Paparoditis, Nicolas; Vallet, Bruno

    2013-03-01

3D reconstruction of traffic signs is of great interest in many applications such as image-based localization and navigation. To reflect reality, the reconstruction process must be both accurate and precise. Reaching such a valid reconstruction from calibrated multi-view images requires accurate and precise extraction of the signs in every individual view. This paper first presents an automatic pipeline for identifying and extracting the silhouettes of signs in each individual image. Then, a multi-view constrained 3D reconstruction algorithm provides an optimal 3D silhouette for the detected signs. The first step, called detection, uses color-based segmentation to generate ROIs (regions of interest) in the image. The shape of every ROI is estimated by fitting an ellipse, a quadrilateral, or a triangle to its edge points. An ROI is rejected if none of the three shapes can be fitted sufficiently precisely. Using the estimated shape, the remaining candidate ROIs are rectified to remove perspective distortion and then matched against a set of reference signs using textural information. Poor matches are rejected and the types of the remaining candidates are identified. The output of the detection algorithm is a set of identified road signs whose silhouettes in the image plane are represented by an ellipse, a quadrilateral, or a triangle. The 3D reconstruction process is based on hypothesis generation and verification. Hypotheses are generated by a stereo matching approach that takes into account epipolar geometry as well as the similarity of the sign categories. Hypotheses that plausibly correspond to the same 3D road sign are identified and grouped during this process. Finally, all the hypotheses in the same group are merged into a unique 3D road sign by a multi-view algorithm that integrates a priori knowledge about the 3D shapes of road signs as constraints.
The algorithm is assessed on real and synthetic images and reached an average accuracy of 3.5 cm for position and 4.5° for orientation.
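The textural matching step is described only as using "textural information"; one common similarity measure for comparing a rectified ROI against a reference template is normalized cross-correlation, sketched here on flattened intensity vectors. The function names and the 0.8 rejection threshold are illustrative assumptions, not the authors' values:

```python
from math import sqrt

def ncc(a, b):
    """Normalized cross-correlation of two equal-length intensity vectors.

    Returns a value in [-1, 1]; values near 1 indicate a strong match,
    and the score is invariant to affine brightness/contrast changes.
    """
    n = len(a)
    ma = sum(a) / n
    mb = sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def best_reference(roi, references, threshold=0.8):
    """Return the name of the best-matching reference sign, or None.

    Matches below `threshold` are rejected, mirroring the paper's
    rejection of poor matches.
    """
    score, name = max((ncc(roi, ref), n) for n, ref in references.items())
    return name if score >= threshold else None

references = {"stop": [1.0, 2.0, 3.0, 4.0], "yield": [4.0, 3.0, 2.0, 1.0]}
roi = [10.0, 20.0, 30.0, 40.0]   # same pattern as "stop", rescaled in intensity
match = best_reference(roi, references)
```

Because NCC normalizes out mean and variance, the rescaled ROI still matches the "stop" template with a perfect score of 1.0.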

  12. Separating Decision and Encoding Noise in Signal Detection Tasks

    PubMed Central

    Cabrera, Carlos Alexander; Lu, Zhong-Lin; Dosher, Barbara Anne

    2015-01-01

In this paper we develop an extension of the Signal Detection Theory (SDT) framework to separately estimate internal noise arising from representational and decision processes. Our approach constrains SDT models with decision noise by combining a multi-pass external noise paradigm with confidence-rating responses. In a simulation study we present evidence that representation and decision noise can be separately estimated over a range of representative underlying representational and decision noise configurations. These results also hold across a number of decision rules and show resilience to rule misspecification. The new theoretical framework is applied to a visual detection confidence-rating task with three and five response categories. This study complements and extends the recent efforts of researchers (Benjamin, Diaz, & Wee, 2009; Mueller & Weidemann, 2008; Rosner & Kochanski, 2009; Kellen, Klauer, & Singmann, 2012) to separate and quantify underlying sources of response variability in signal detection tasks. PMID:26120907
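The logic of the multi-pass paradigm can be illustrated with a toy double-pass simulation (illustrative only, far simpler than the paper's estimation machinery): the external noise sample for each trial is frozen across passes, while encoding noise and criterion (decision) noise are redrawn on every pass, so any response inconsistency across passes must come from the internal stages:

```python
import random

def run_passes(n_trials, n_passes, sigma_enc, sigma_dec, seed=0):
    """Simulate a yes/no detection task in the multi-pass paradigm.

    The external noise sample for each trial is frozen across passes;
    encoding noise and criterion (decision) noise are drawn fresh on
    every pass. Returns one response list per pass.
    """
    rng = random.Random(seed)
    signal = 1.0
    criterion = 0.5
    external = [rng.gauss(0.0, 1.0) for _ in range(n_trials)]  # frozen
    passes = []
    for _ in range(n_passes):
        responses = []
        for e in external:
            internal = signal + e + rng.gauss(0.0, sigma_enc)  # encoding stage
            c = criterion + rng.gauss(0.0, sigma_dec)          # decision stage
            responses.append(internal > c)
        passes.append(responses)
    return passes

def consistency(passes):
    """Fraction of trials receiving the same response on every pass."""
    agree = sum(1 for trial in zip(*passes) if len(set(trial)) == 1)
    return agree / len(passes[0])
```

With both internal sigmas at zero the observer is deterministic given the frozen external noise, so consistency is exactly 1.0; noise at either internal stage pushes it below 1.0, and it is the confidence ratings in the paper's framework that allow the two contributions to be disentangled.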

  13. A New Model for Solving Time-Cost-Quality Trade-Off Problems in Construction

    PubMed Central

    Fu, Fang; Zhang, Tao

    2016-01-01

Poor quality affects project makespan and total cost negatively, but it can be recovered by repair works during construction. We construct a new non-linear programming model, based on the classic multi-mode resource-constrained project scheduling problem, that accounts for repair works. To obtain satisfactory quality without a large increase in project cost, the objective is to minimize total quality cost, which consists of prevention cost and failure cost according to quality-cost analysis. A binary dependent normal distribution function is adopted to describe activity quality; cumulative quality is defined to determine whether to initiate repair works, according to the different relationships among activity qualities, namely the coordinative and precedence relationships. Furthermore, a shuffled frog-leaping algorithm is developed to solve this discrete trade-off problem, based on an adaptive serial schedule generation scheme and an adjusted activity list. In the algorithm, the frog-leaping step combines the crossover operator of a genetic algorithm with a permutation-based local search. Finally, an example of a construction project for a framed railway overpass is provided to examine the algorithm's performance; it assists decision making in the search for an appropriate makespan and quality threshold with minimal cost. PMID:27911939
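The shuffled frog-leaping algorithm itself is only named in the abstract; a minimal generic sketch on a toy continuous function (without the paper's schedule-generation scheme, crossover operator, or local search) illustrates the memeplex partitioning and the worst-toward-best leap:

```python
import random

def sfla(objective, dim=2, n_frogs=30, n_memeplexes=3, iterations=50, seed=0):
    """Minimal shuffled frog-leaping algorithm for continuous minimisation.

    Frogs are sorted by fitness and dealt into memeplexes round-robin;
    within each memeplex the worst frog leaps toward the memeplex best,
    falling back to a random restart if the leap does not improve it.
    """
    rng = random.Random(seed)

    def new_frog():
        return [rng.uniform(-5.0, 5.0) for _ in range(dim)]

    frogs = [new_frog() for _ in range(n_frogs)]
    for _ in range(iterations):
        frogs.sort(key=objective)                        # best (lowest) first
        memeplexes = [frogs[i::n_memeplexes] for i in range(n_memeplexes)]
        for m in memeplexes:
            best, worst = m[0], m[-1]
            step = [rng.random() * (b - w) for b, w in zip(best, worst)]
            candidate = [w + s for w, s in zip(worst, step)]
            if objective(candidate) < objective(worst):
                m[-1] = candidate                        # accept the leap
            else:
                m[-1] = new_frog()                       # censor the worst frog
        frogs = [f for m in memeplexes for f in m]       # shuffle back together
    return min(frogs, key=objective)

def sphere(x):
    return sum(v * v for v in x)

best = sfla(sphere)
```

The population's global best can only improve: only the worst frog of each memeplex is ever replaced, so the incumbent best survives every iteration.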

  14. Mapping and localization for extraterrestrial robotic explorations

    NASA Astrophysics Data System (ADS)

    Xu, Fengliang

In the exploration of an extraterrestrial environment such as Mars, orbital data, such as high-resolution imagery from the Mars Orbital Camera-Narrow Angle (MOC-NA), laser ranging data from the Mars Orbital Laser Altimeter (MOLA), and multi-spectral imagery from the Thermal Emission Imaging System (THEMIS), play increasingly important roles. However, these remote sensing techniques can never replace landers and rovers, which provide a close-up, ground-level view. Similarly, orbital mapping cannot compete with ground-level close-range mapping in resolution, precision, and speed. This dissertation addresses two tasks related to robotic extraterrestrial exploration: mapping and rover localization. Image registration is also discussed as an important aspect of both. Techniques from computer vision and photogrammetry are applied for automation and precision. Image registration is classified into three sub-categories: intra-stereo, inter-stereo, and cross-site, according to the relationship between stereo images. For intra-stereo registration, which is the most fundamental sub-category, interest point-based registration and verification by parallax continuity in the principal direction are proposed. Two other techniques, inter-scanline search with constrained dynamic programming for far-range matching and Markov Random Field (MRF) based registration for large terrain variation, are explored as possible improvements. Mapping using rover ground images mainly involves the generation of a Digital Terrain Model (DTM) and an ortho-rectified map (orthomap). The first task is to derive spatial distribution statistics from the first panorama and model the DTM with a dual polynomial model. This model is used for interpolation of the DTM, using Kriging in the close range and a Triangular Irregular Network (TIN) in the far range. To generate a uniformly illuminated orthomap from the DTM, a least-squares-based automatic intensity balancing method is proposed.
Finally, a seamless orthomap is constructed by a split-and-merge technique: the mapped area is subdivided into small regions of image overlap, each small map piece is processed, and all of the pieces are merged together to form a seamless map. Rover localization has three stages, all of which use a least-squares adjustment procedure: (1) an initial localization accomplished by adjustment over features common to rover images and orbital images, (2) an adjustment of image pointing angles at a single site through inter- and intra-stereo tie points, and (3) an adjustment of the rover traverse through manual cross-site tie points. The first stage is based on adjustment of the observation angles of features; the second and third stages are based on bundle adjustment. In the third stage, an incremental adjustment method is proposed. Automation in rover localization includes automatic intra/inter-stereo tie point selection, computer-assisted cross-site tie point selection, and automatic verification of accuracy. (Abstract shortened by UMI.)

  15. Multi-Level Partnerships Support a Comprehensive Faith-Based Health Promotion Program

    ERIC Educational Resources Information Center

    Hardison-Moody, Annie; Dunn, Carolyn; Hall, David; Jones, Lorelei; Newkirk, Jimmy; Thomas, Cathy

    2011-01-01

    This article examines the role of multi-level partnerships in implementing Faithful Families Eating Smart and Moving More, a faith-based health promotion program that works with low-resource faith communities in North Carolina. This program incorporates a nine-lesson individual behavior change program in concert with policy and environmental…

  16. Expanding the Use of Time-Based Metering: Multi-Center Traffic Management Advisor

    NASA Technical Reports Server (NTRS)

    Landry, Steven J.; Farley, Todd; Hoang, Ty

    2005-01-01

    Time-based metering is an efficient air traffic management alternative to the more common practice of distance-based metering (or "miles-in-trail spacing"). Despite having demonstrated significant operational benefit to airspace users and service providers, time-based metering is used in the United States for arrivals to just nine airports and is not used at all for non-arrival traffic flows. The Multi-Center Traffic Management Advisor promises to bring time-based metering into the mainstream of air traffic management techniques. Not constrained to operate solely on arrival traffic, Multi-Center Traffic Management Advisor is flexible enough to work in highly congested or heavily partitioned airspace for any and all traffic flows in a region. This broader and more general application of time-based metering is expected to bring the operational benefits of time-based metering to a much wider pool of beneficiaries than is possible with existing technology. It also promises to facilitate more collaborative traffic management on a regional basis. This paper focuses on the operational concept of the Multi-Center Traffic Management Advisor, touching also on its system architecture, field test results, and prospects for near-term deployment to the United States National Airspace System.

  17. Online and mobile technologies for self-management in bipolar disorder: A systematic review.

    PubMed

    Gliddon, Emma; Barnes, Steven J; Murray, Greg; Michalak, Erin E

    2017-09-01

    Internet (eHealth) and smartphone-based (mHealth) approaches to self-management for bipolar disorder are increasingly common. Evidence-based self-management strategies are available for bipolar disorder and provide a useful framework for reviewing existing eHealth/mHealth programs to determine whether these strategies are supported by current technologies. This review assesses which self-management strategies are most supported by technology. Based on 3 previous studies, 7 categories of self-management strategies related to bipolar disorder were identified, followed by a systematic literature review to identify existing eHealth and mHealth programs for this disorder. Searches were conducted by using PubMed, CINAHL, PsycINFO, EMBASE, and the Cochrane Database of Systematic Reviews for relevant peer-reviewed articles published January 2005 to May 2015. eHealth and mHealth programs were summarized and reviewed to identify which of the 7 self-management strategy categories were supported by eHealth or mHealth programs. From 1,654 publications, 15 papers were identified for inclusion. From these, 9 eHealth programs and 2 mHealth programs were identified. The most commonly supported self-management strategy categories were "ongoing monitoring," "maintaining hope," "education," and "planning for and taking action"; the least commonly supported categories were "relaxation" and "maintaining a healthy lifestyle." eHealth programs appear to provide more comprehensive coverage of self-management strategies compared with mHealth programs. Both eHealth and mHealth programs present a wide range of self-management strategies for bipolar disorder, although individuals seeking comprehensive interventions might be best served by eHealth programs, while those seeking more condensed and direct interventions might prefer mHealth programs. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  18. A role for the developing lexicon in phonetic category acquisition

    PubMed Central

    Feldman, Naomi H.; Griffiths, Thomas L.; Goldwater, Sharon; Morgan, James L.

    2013-01-01

    Infants segment words from fluent speech during the same period when they are learning phonetic categories, yet accounts of phonetic category acquisition typically ignore information about the words in which sounds appear. We use a Bayesian model to illustrate how feedback from segmented words might constrain phonetic category learning by providing information about which sounds occur together in words. Simulations demonstrate that word-level information can successfully disambiguate overlapping English vowel categories. Learning patterns in the model are shown to parallel human behavior from artificial language learning tasks. These findings point to a central role for the developing lexicon in phonetic category acquisition and provide a framework for incorporating top-down constraints into models of category learning. PMID:24219848
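As a toy illustration of the mechanism (not the paper's actual model, which jointly infers words and categories): a sound that is ambiguous between two overlapping vowel categories becomes identifiable when the word frame it occurs in is informative. All numeric values below are hypothetical:

```python
from math import exp, pi, sqrt

def gaussian(x, mu, sigma):
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

# Two overlapping vowel categories on a 1-D formant axis (hypothetical values).
categories = {"i": (2.0, 1.0), "e": (3.0, 1.0)}

# Hypothetical lexical statistics: how often each vowel fills this word frame.
frame_prior = {"i": 0.9, "e": 0.1}   # e.g. a frame like "b_t" strongly favours /i/

def posterior(x, prior):
    """P(category | acoustic value x) under the given category prior."""
    like = {c: prior[c] * gaussian(x, mu, sd) for c, (mu, sd) in categories.items()}
    z = sum(like.values())
    return {c: v / z for c, v in like.items()}

ambiguous = 2.5                      # midway between the two category means
flat = posterior(ambiguous, {"i": 0.5, "e": 0.5})
lexical = posterior(ambiguous, frame_prior)
```

With a flat prior the ambiguous token is a 50/50 split; conditioning on the word frame pushes the posterior for /i/ to 0.9. This is the top-down disambiguation that the paper's simulations demonstrate at the level of whole category systems.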

  19. Analogical and category-based inference: a theoretical integration with Bayesian causal models.

    PubMed

    Holyoak, Keith J; Lee, Hee Seung; Lu, Hongjing

    2010-11-01

    A fundamental issue for theories of human induction is to specify constraints on potential inferences. For inferences based on shared category membership, an analogy, and/or a relational schema, it appears that the basic goal of induction is to make accurate and goal-relevant inferences that are sensitive to uncertainty. People can use source information at various levels of abstraction (including both specific instances and more general categories), coupled with prior causal knowledge, to build a causal model for a target situation, which in turn constrains inferences about the target. We propose a computational theory in the framework of Bayesian inference and test its predictions (parameter-free for the cases we consider) in a series of experiments in which people were asked to assess the probabilities of various causal predictions and attributions about a target on the basis of source knowledge about generative and preventive causes. The theory proved successful in accounting for systematic patterns of judgments about interrelated types of causal inferences, including evidence that analogical inferences are partially dissociable from overall mapping quality.
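The causal-model computations are not spelled out in the abstract; a standard parameterisation for generative and preventive causes in this Bayesian causal-model literature (assumed here for illustration, not quoted from the paper) is the noisy-OR/noisy-AND-NOT, in which generative causes independently produce the effect and preventers independently block it:

```python
def p_effect(generative, preventive, background=0.0):
    """P(effect) with noisy-OR generative causes and noisy-AND-NOT preventers.

    `generative` and `preventive` are lists of causal strengths in [0, 1]
    for the causes present in the target situation; `background` is the
    base rate of the effect from unobserved causes.
    """
    not_generated = 1.0 - background
    for w in generative:
        not_generated *= 1.0 - w       # each generative cause fails independently
    generated = 1.0 - not_generated
    for w in preventive:
        generated *= 1.0 - w           # each preventer blocks independently
    return generated
```

Transferring a generative cause of strength 0.8 from source to target predicts the effect with probability 0.8; adding a preventer of strength 0.5 halves that prediction to 0.4, which is the kind of graded causal attribution the experiments probe.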

  20. Optimizing Experimental Designs Relative to Costs and Effect Sizes.

    ERIC Educational Resources Information Center

    Headrick, Todd C.; Zumbo, Bruno D.

    A general model is derived for the purpose of efficiently allocating integral numbers of units in multi-level designs given prespecified power levels. The derivation of the model is based on a constrained optimization problem that maximizes a general form of a ratio of expected mean squares subject to a budget constraint. This model provides more…

  1. Programming support environment issues in the Byron programming environment

    NASA Technical Reports Server (NTRS)

    Larsen, Matthew J.

    1986-01-01

    Issues are discussed which programming support environments need to address in order to successfully support software engineering. These concerns are divided into two categories. The first category, issues of how software development is supported by an environment, includes support of the full life cycle, methodology flexibility, and support of software reusability. The second category contains issues of how environments should operate, such as tool reusability and integration, user friendliness, networking, and use of a central data base. This discussion is followed by an examination of Byron, an Ada based programming support environment developed at Intermetrics, focusing on the solutions Byron offers to these problems, including the support provided for software reusability and the test and maintenance phases of the life cycle. The use of Byron in project development is described briefly, and some suggestions for future Byron tools and user written tools are presented.

  2. A graph-based approach for the retrieval of multi-modality medical images.

    PubMed

    Kumar, Ashnil; Kim, Jinman; Wen, Lingfeng; Fulham, Michael; Feng, Dagan

    2014-02-01

    In this paper, we address the retrieval of multi-modality medical volumes, which consist of two different imaging modalities, acquired sequentially, from the same scanner. One such example, positron emission tomography and computed tomography (PET-CT), provides physicians with complementary functional and anatomical features as well as spatial relationships and has led to improved cancer diagnosis, localisation, and staging. The challenge of multi-modality volume retrieval for cancer patients lies in representing the complementary geometric and topologic attributes between tumours and organs. These attributes and relationships, which are used for tumour staging and classification, can be formulated as a graph. It has been demonstrated that graph-based methods have high accuracy for retrieval by spatial similarity. However, naïvely representing all relationships on a complete graph obscures the structure of the tumour-anatomy relationships. We propose a new graph structure derived from complete graphs that structurally constrains the edges connected to tumour vertices based upon the spatial proximity of tumours and organs. This enables retrieval on the basis of tumour localisation. We also present a similarity matching algorithm that accounts for different feature sets for graph elements from different imaging modalities. Our method emphasises the relationships between a tumour and related organs, while still modelling patient-specific anatomical variations. Constraining tumours to related anatomical structures improves the discrimination potential of graphs, making it easier to retrieve similar images based on tumour location. We evaluated our retrieval methodology on a dataset of clinical PET-CT volumes. Our results showed that our method enabled the retrieval of multi-modality images using spatial features. 
Our graph-based retrieval algorithm achieved a higher precision than several other retrieval techniques: gray-level histograms as well as state-of-the-art methods such as visual words using the scale-invariant feature transform (SIFT) and relational matrices representing the spatial arrangements of objects. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Designing an optimal software intensive system acquisition: A game theoretic approach

    NASA Astrophysics Data System (ADS)

    Buettner, Douglas John

The development of schedule-constrained software-intensive space systems is challenging. Case study data from national security space programs developed at the U.S. Air Force Space and Missile Systems Center (USAF SMC) provide evidence of a strong tendency among contractors to skip or severely reduce software design and early defect detection methods in these schedule-constrained environments. The research findings suggest recommendations to address these issues at numerous levels. However, the observations lead us to investigate modeling and theoretical methods to understand what motivated this behavior in the first place. As a result, Madachy's inspection-based system dynamics model is modified to include unit testing and an integration test feedback loop. This Modified Madachy Model (MMM) is used as a tool to investigate the consequences of this behavior on the observed defect dynamics for two remarkably different case study software projects. Latin hypercube sampling of the MMM with sample distributions for quality-, schedule-, and cost-driven strategies demonstrates that the higher-cost, higher-effort quality-driven strategies consistently provide better schedule performance than the schedule-driven up-front effort-reduction strategies. Game-theoretic reasoning about schedule-driven engineers cutting corners on inspections and unit testing is based on the case study evidence and Austin's agency model, which describe the observed phenomena. Game theory concepts are then used to argue that the source of the problem, and hence the solution to developers cutting corners on quality in schedule-driven system acquisitions, ultimately lies with the government. The game theory arguments also suggest that a multi-player dynamic Nash bargaining game provides a solution to the observed quality-shirking game between the government (the acquirer) and "large-corporation" software developers.
A note argues that this multi-player dynamic Nash bargaining game also provides a solution to Freeman Dyson's problem of finding a way to label systems as good or bad.

  4. Only multi-taxon studies show the full range of arthropod responses to fire

    PubMed Central

    Pryke, James S.; Gaigher, René; Samways, Michael J.

    2018-01-01

Fire is a major driver in many ecosystems, yet little is known about how different ground-living arthropods survive fire. Using three sampling methods and three time-since-fire categories (last fire event: 3 months, 1 year, and 7 years), we investigate how ground-living arthropod diversity responds to fire, and how species richness, diversity, abundance, and composition of the four dominant taxa (ants, beetles, cockroaches, and mites) respond. We did this in the naturally fire-prone Mediterranean-type scrubland vegetation (fynbos) of the Cape Floristic Region. Surprisingly, overall species richness and diversity were the same for all time-since-fire categories. However, when each dominant taxon was analysed separately, the effect of fire on species richness and abundance varied among taxa. This emphasizes that many taxa must be investigated to fully understand fire-driven events. We also highlight the importance of using different diversity measures, as fire did not influence species richness and abundance of particular taxa while affecting others, and overall it greatly affected the assemblages of all taxa. Rockiness affected species richness, abundance, and composition of a few taxa. We found that all time-since-fire categories supported distinctive assemblages. Some indicator species occurred across all time-since-fire categories, while others were restricted to a single category, showing that there is a wide range of responses to fire between taxa. Details of local landscape structure, both abiotic and biotic, and the frequency and intensity of fire add complexity to the fire-arthropod interaction. Overall, we show that the relationship between fire and arthropods is phylogenetically constrained, having been honed by many millennia of fire events, and highly complex. Present-day species manifest a variety of adaptations for surviving the great natural selective force of fire. PMID:29614132

  5. Constrained maximum consistency multi-path mitigation

    NASA Astrophysics Data System (ADS)

    Smith, George B.

    2003-10-01

Blind deconvolution algorithms can be useful as pre-processors for signal classification algorithms in shallow water. These algorithms remove the distortion of the signal caused by multipath propagation when no knowledge of the environment is available. A framework has been presented in which filters produce signal estimates from each data channel that are as consistent with each other as possible in a least-squares sense [Smith, J. Acoust. Soc. Am. 107 (2000)]. This framework provides a solution to the blind deconvolution problem. One implementation of this framework yields the cross-relation on which EVAM [Gurelli and Nikias, IEEE Trans. Signal Process. 43 (1995)] and Rietsch [Rietsch, Geophysics 62(6) (1997)] processing are based. In this presentation, partially blind implementations that have good noise stability properties are compared using Classification Operating Characteristics (CLOC) analysis. [Work supported by ONR under Program Element 62747N and NRL, Stennis Space Center, MS.]
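The cross-relation behind EVAM-style processing can be stated compactly: for a single source s observed through two channels, x1 = s*h1 and x2 = s*h2, so x1*h2 = x2*h1, because convolution is commutative and associative. A small numeric check (illustrative values only):

```python
def conv(a, b):
    """Full linear convolution of two sequences."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

s = [1.0, -2.0, 0.5, 3.0]        # unknown source
h1 = [1.0, 0.3]                  # channel impulse responses
h2 = [0.7, -0.1, 0.2]

x1 = conv(s, h1)                 # what sensor 1 records
x2 = conv(s, h2)                 # what sensor 2 records

lhs = conv(x1, h2)               # cross-relation: these two sides agree
rhs = conv(x2, h1)
```

The source s never appears in the comparison, which is what makes the relation useful for blind deconvolution: one can solve for the filters that make the two sides agree in a least-squares sense without knowing the transmitted signal.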

  6. A content-based news video retrieval system: NVRS

    NASA Astrophysics Data System (ADS)

    Liu, Huayong; He, Tingting

    2009-10-01

This paper focuses on TV news programs and presents a content-based news video browsing and retrieval system, NVRS, which makes it convenient for users to quickly browse and retrieve news video by categories such as politics, finance, and entertainment. Combining audiovisual features and caption text information, the system automatically segments a complete news program into separate news stories. NVRS supports keyword-based news story retrieval and category-based news story browsing, and generates a key-frame-based video abstract for each story. Experiments show that the story segmentation method is effective and the retrieval is efficient.

  7. Life Cycle Assessment of Mixed Municipal Solid Waste: Multi-input versus multi-output perspective.

    PubMed

    Fiorentino, G; Ripa, M; Protano, G; Hornsby, C; Ulgiati, S

    2015-12-01

This paper analyses four strategies for managing Mixed Municipal Solid Waste (MMSW) in terms of their environmental impacts and potential advantages by means of Life Cycle Assessment (LCA) methodology. To this aim, both a multi-input and a multi-output approach are applied, to evaluate the effect of these perspectives on selected impact categories. The analyzed management options include direct landfilling with energy recovery (S-1), Mechanical-Biological Treatment (MBT) followed by Waste-to-Energy (WtE) conversion (S-2), a combination of an innovative MBT/MARSS (Material Advanced Recovery Sustainable Systems) process and landfill disposal (S-3), and finally a combination of the MBT/MARSS process with WtE conversion (S-4). The MARSS technology, developed within a European LIFE PLUS framework and currently implemented at pilot-plant scale, is an innovative MBT plant whose main goal is to yield a Renewable Refined Biomass Fuel (RRBF) to be used for combined heat and power (CHP) production under the regulations enforced for biomass-based plants, rather than for Waste-to-Energy systems, for increased environmental performance. The four scenarios are characterized by different resource investments for plant and infrastructure construction and different quantities of matter, heat, and electricity recovered and recycled. Results, calculated per unit mass of waste treated and per unit exergy delivered under both multi-input and multi-output LCA perspectives, point out improved performance for scenarios characterized by increased matter and energy recovery. Although none of the investigated scenarios provides the best performance in all the analyzed impact categories, scenario S-4 shows the best LCA results in the human toxicity and freshwater eutrophication categories, i.e., the ones with the highest impacts in all waste management processes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. 50 CFR 679.50 - Groundfish Observer Program.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... following: (A) Identification of the management, organizational structure, and ownership structure of the.../processors. A catcher/processor will be assigned to a fishery category based on the retained groundfish catch... in Federal waters will be assigned to a fishery category based on the retained groundfish catch...

  9. Multi-categorical deep learning neural network to classify retinal images: A pilot study employing small database.

    PubMed

    Choi, Joon Yul; Yoo, Tae Keun; Seo, Jeong Gi; Kwak, Jiyong; Um, Terry Taewoong; Rim, Tyler Hyungtaek

    2017-01-01

Deep learning is emerging as a powerful tool for analyzing medical images, and computer-aided diagnosis from fundus images has emerged as a new method for retinal disease detection. We applied a deep learning convolutional neural network, implemented in MatConvNet, to the automated detection of multiple retinal diseases using fundus photographs from the STructured Analysis of the REtina (STARE) database. The dataset was built by expanding data across 10 categories, including normal retina and nine retinal diseases. The best outcomes were acquired using random forest transfer learning based on the VGG-19 architecture. The classification results depended greatly on the number of categories: as the number of categories increased, the performance of the deep learning models diminished. When all 10 categories were included, we obtained an accuracy of 30.5%, relative classifier information (RCI) of 0.052, and Cohen's kappa of 0.224. Considering only the three integrated categories of normal, background diabetic retinopathy, and dry age-related macular degeneration, the multi-categorical classifier showed an accuracy of 72.8%, 0.283 RCI, and 0.577 kappa. In addition, several ensemble classifiers enhanced the multi-categorical classification performance: transfer learning combined with an ensemble classifier using a clustering-and-voting approach presented the best performance in the 10-disease classification problem, with an accuracy of 36.7%, 0.053 RCI, and 0.225 kappa. We draw two conclusions. First, owing to the small dataset, the deep learning techniques in this study are not yet effective enough to be applied in clinics, where numerous patients suffering from various types of retinal disorders present for diagnosis and treatment. Second, transfer learning combined with ensemble classifiers can improve classification performance for detecting multi-categorical retinal diseases. Further studies should confirm the effectiveness of these algorithms with large datasets obtained from hospitals.
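The "clustering and voting" ensemble is not detailed in the abstract; plain majority voting over independently trained classifiers (a simplified stand-in, with hypothetical labels) illustrates the combination step:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier label predictions by majority vote.

    `predictions` is a list of label lists, one per classifier, all over
    the same samples. Ties resolve to the label encountered first.
    """
    return [Counter(labels).most_common(1)[0][0] for labels in zip(*predictions)]

# Three hypothetical classifiers labelling four fundus images:
clf_a = ["normal", "dr", "amd", "dr"]
clf_b = ["normal", "dr", "normal", "amd"]
clf_c = ["dr", "dr", "amd", "amd"]
consensus = majority_vote([clf_a, clf_b, clf_c])
```

A vote can only help when the individual classifiers make partly independent errors; with a small dataset, as the study notes, the base classifiers are weak and the ensemble's gain is correspondingly modest.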

  10. Learning Incoherent Sparse and Low-Rank Patterns from Multiple Tasks

    PubMed Central

    Chen, Jianhui; Liu, Ji; Ye, Jieping

    2013-01-01

We consider the problem of learning incoherent sparse and low-rank patterns from multiple tasks. Our approach is based on a linear multi-task learning formulation, in which the sparse and low-rank patterns are induced by a cardinality regularization term and a low-rank constraint, respectively. This formulation is non-convex; we convert it into its convex surrogate, which can be routinely solved via semidefinite programming for small-size problems. We propose to employ the general projected gradient scheme to efficiently solve such a convex surrogate; however, in the optimization formulation, the objective function is non-differentiable and the feasible domain is non-trivial. We present the procedures for computing the projected gradient and ensuring the global convergence of the projected gradient scheme. The computation of the projected gradient involves a constrained optimization problem; we show that the optimal solution to such a problem can be obtained via solving an unconstrained optimization subproblem and a Euclidean projection subproblem. We also present two projected gradient algorithms and analyze their rates of convergence in detail. In addition, we illustrate the use of the presented projected gradient algorithms for the proposed multi-task learning formulation using the least squares loss. Experimental results on a collection of real-world data sets demonstrate the effectiveness of the proposed multi-task learning formulation and the efficiency of the proposed projected gradient algorithms. PMID:24077658
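The two subproblems mentioned at the end of the abstract have simple, well-known solutions, sketched here in vector form as an assumed illustration of the standard operators rather than the authors' exact derivation: the unconstrained subproblem for the sparse (L1-regularized) component is elementwise soft-thresholding, and the Euclidean projection for the low-rank component reduces, after an SVD, to projecting the singular-value vector onto an L1 ball:

```python
def soft_threshold(x, t):
    """Prox operator of t*||.||_1: shrink each entry toward zero by t."""
    return [max(abs(v) - t, 0.0) * (1.0 if v >= 0 else -1.0) for v in x]

def project_l1_ball(x, radius):
    """Euclidean projection of x onto {y : ||y||_1 <= radius}.

    Standard sort-based algorithm; applied to the singular values of a
    matrix, this yields the projection onto the trace-norm ball.
    """
    if sum(abs(v) for v in x) <= radius:
        return list(x)
    u = sorted((abs(v) for v in x), reverse=True)
    css = 0.0
    theta = 0.0
    for i, ui in enumerate(u, start=1):
        css += ui
        if ui - (css - radius) / i > 0:
            theta = (css - radius) / i   # last index where the condition holds
    return soft_threshold(x, theta)
```

Each projected gradient iteration then alternates a gradient step on the smooth loss with these two maps, one per component of the model.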

  11. Learning Incoherent Sparse and Low-Rank Patterns from Multiple Tasks.

    PubMed

    Chen, Jianhui; Liu, Ji; Ye, Jieping

    2012-02-01

We consider the problem of learning incoherent sparse and low-rank patterns from multiple tasks. Our approach is based on a linear multi-task learning formulation, in which the sparse and low-rank patterns are induced by a cardinality regularization term and a low-rank constraint, respectively. This formulation is non-convex; we convert it into its convex surrogate, which can be routinely solved via semidefinite programming for small-size problems. We propose to employ the general projected gradient scheme to efficiently solve such a convex surrogate; however, in the optimization formulation, the objective function is non-differentiable and the feasible domain is non-trivial. We present the procedures for computing the projected gradient and ensuring the global convergence of the projected gradient scheme. The computation of the projected gradient involves a constrained optimization problem; we show that the optimal solution to such a problem can be obtained via solving an unconstrained optimization subproblem and a Euclidean projection subproblem. We also present two projected gradient algorithms and analyze their rates of convergence in detail. In addition, we illustrate the use of the presented projected gradient algorithms for the proposed multi-task learning formulation using the least squares loss. Experimental results on a collection of real-world data sets demonstrate the effectiveness of the proposed multi-task learning formulation and the efficiency of the proposed projected gradient algorithms.

  12. Object-oriented and pixel-based classification approach for land cover using airborne long-wave infrared hyperspectral data

    NASA Astrophysics Data System (ADS)

    Marwaha, Richa; Kumar, Anil; Kumar, Arumugam Senthil

    2015-01-01

Our primary objective was to explore a classification algorithm for thermal hyperspectral data. Minimum noise fraction transformation is applied to the thermal hyperspectral data, and eight pixel-based classifiers are tested: constrained energy minimization, matched filter, spectral angle mapper (SAM), adaptive coherence estimator, orthogonal subspace projection, mixture-tuned matched filter, target-constrained interference-minimized filter, and mixture-tuned target-constrained interference-minimized filter. The long-wave infrared (LWIR) region has not yet been widely exploited for classification purposes; LWIR data contain emissivity and temperature information about an object. The highest overall accuracy, 90.99%, was obtained using the SAM algorithm on the combination of thermal data with a colored digital photograph. Similarly, an object-oriented approach is applied to the thermal data: the image is segmented into meaningful objects based on properties such as geometry and length, with pixels grouped into objects using a watershed algorithm, and a supervised classifier, the support vector machine (SVM), is then applied. The best algorithm in the pixel-based category is the SAM technique. SVM is useful for thermal data, providing an accuracy of 80.00% at a scale value of 83 and a merge value of 90, whereas for the combination of thermal data with a colored digital photograph, SVM gives its highest accuracy of 85.71% at a scale value of 82 and a merge value of 90.
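The SAM classifier named above assigns each pixel to the material whose reference spectrum subtends the smallest angle with the pixel's spectrum; a minimal sketch (the spectra and library names are hypothetical):

```python
from math import acos, sqrt

def spectral_angle(a, b):
    """Angle in radians between two spectra treated as vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return acos(max(-1.0, min(1.0, dot / (na * nb))))

def sam_classify(pixel, library):
    """Label the pixel with the library spectrum of minimum angle."""
    return min(library, key=lambda name: spectral_angle(pixel, library[name]))

library = {"vegetation": [0.1, 0.4, 0.5], "soil": [0.4, 0.4, 0.2]}
pixel = [0.2, 0.8, 1.0]          # vegetation spectrum scaled by a factor of 2
label = sam_classify(pixel, library)
```

Because the angle ignores vector magnitude, SAM is insensitive to a uniform scaling of the spectrum, which is one reason it is a popular baseline for hyperspectral classification.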

  13. Optimization of Regional Geodynamic Models for Mantle Dynamics

    NASA Astrophysics Data System (ADS)

    Knepley, M.; Isaac, T.; Jadamec, M. A.

    2016-12-01

    The SubductionGenerator program is used to construct high resolution, 3D regional thermal structures for mantle convection simulations using a variety of data sources, including sea floor ages and geographically referenced 3D slab locations based on seismic observations. The initial bulk temperature field is constructed using a half-space cooling model or plate cooling model, and related smoothing functions based on a diffusion length-scale analysis. In this work, we seek to improve the 3D thermal model and test different model geometries and dynamically driven flow fields using constraints from observed seismic velocities and plate motions. Through a formal adjoint analysis, we construct the primal-dual version of the multi-objective PDE-constrained optimization problem for the plate motions and seismic misfit. We have efficient, scalable preconditioners for both the forward and adjoint problems based upon a block preconditioning strategy, and a simple gradient update is used to improve the control residual. The full optimal control problem is formulated on a nested hierarchy of grids, allowing a nonlinear multigrid method to accelerate the solution.

  14. Prioritizing Genes Related to Nicotine Addiction Via a Multi-source-Based Approach.

    PubMed

    Liu, Xinhua; Liu, Meng; Li, Xia; Zhang, Lihua; Fan, Rui; Wang, Ju

    2015-08-01

    Nicotine has a broad impact on both the central and peripheral nervous systems. Over the past decades, an increasing number of genes potentially involved in nicotine addiction have been identified by different technical approaches. However, the molecular mechanisms underlying nicotine addiction remain largely unknown. In this situation, prioritizing the candidate genes for further investigation is becoming increasingly important. In this study, we presented a multi-source-based gene prioritization approach for nicotine addiction by utilizing the vast amounts of information generated by nicotine addiction studies over the past years. In this approach, we first collected and curated genes from studies in four categories, i.e., genetic association analysis, genetic linkage analysis, high-throughput gene/protein expression analysis, and literature search of single gene/protein-based studies. Based on these resources, the genes were scored and a weight value was determined for each category. Finally, the genes were ranked by their combined scores, and 220 genes were selected as the prioritized nicotine addiction-related genes. Evaluation suggested the prioritized genes were promising targets for further analysis and replication studies.
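
    The score-weight-combine step lends itself to a compact sketch; the gene names, scores, and weights in the example are placeholders, not values from the study:

```python
def prioritize_genes(evidence, weights, top_n=3):
    # evidence: {category: {gene: score}}; weights: {category: weight}
    combined = {}
    for category, gene_scores in evidence.items():
        w = weights[category]
        for gene, score in gene_scores.items():
            combined[gene] = combined.get(gene, 0.0) + w * score
    # rank genes by weighted combined score, highest first
    return sorted(combined, key=combined.get, reverse=True)[:top_n]
```

    With hypothetical inputs such as evidence = {"association": {"CHRNA4": 2.0, "DRD2": 1.0}, "linkage": {"CHRNA4": 1.0, "GABBR2": 3.0}} and weights = {"association": 1.0, "linkage": 0.5}, the combined scores rank CHRNA4 first.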

  15. A Supply and Demand Management Perspective on the Accelerated Global Introductions of Inactivated Poliovirus Vaccine in a Constrained Supply Market.

    PubMed

    Lewis, Ian; Ottosen, Ann; Rubin, Jennifer; Blanc, Diana Chang; Zipursky, Simona; Wootton, Emily

    2017-07-01

    A total of 105 countries have introduced IPV as of September 2016, of which 85 have procured the vaccine through UNICEF. The Global Eradication and Endgame Strategic Plan 2013-2018 called for the rapid introduction of at least one dose of IPV into routine immunization schedules in all 126 OPV-using countries by the end of 2015. At the time of initiating the procurement process, demand was estimated based on global modeling rather than individual country indications. In its capacity as procurement agency for the Global Polio Eradication Initiative and Gavi, the Vaccine Alliance, UNICEF set out to secure access to IPV supply for around 100 countries. Based on offers received, sufficient supply was awarded to two manufacturers to meet projected routine requirements. However, due to technical issues scaling up vaccine production and unforecasted demand for IPV use in campaigns to interrupt wild polio virus and to control type 2 vaccine-derived polio virus outbreaks, IPV supplies are severely constrained. Activities to stretch supplies and to suppress demand have been ongoing since 2014, including delaying IPV introduction in countries where risks of type 2 reintroduction are lower, implementing the multi-dose vial policy, and encouraging the use of fractional doses delivered intradermally. Despite these efforts, there is still insufficient IPV supply to meet demand. The impact of the supply situation on IPV introduction timelines in countries is the focus of this article; based on lessons learned from the IPV introductions, it is recommended that future health programs with accelerated scale-up take a cautious approach to supply commitments, put in place clear allocation criteria in case of shortages or delays, and establish a communication strategy vis-à-vis beneficiaries. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.

  16. Effectiveness of employee internet-based weight management program.

    PubMed

    Petersen, Ruth; Sill, Stewart; Lu, Chifung; Young, Joyce; Edington, Dee W

    2008-02-01

    To evaluate an employee Internet-based weight management program. Changes in eating habits, stage of change, body weight, and weight categories were compared between enrollment and 6 months after enrollment. Weights and weight categories were compared among a subset of participants and non-participants at 12 months. Seven thousand seven hundred forty-three International Business Machines employees enrolled in the program between December 2004 and February 2006, and 74% were overweight or obese (body mass index ≥ 25). At 6 months, follow-up survey respondents (n = 1639) had significantly increased most healthy eating habits (eg, 20% decrease in junk foods) and the frequency of healthy foods eaten (eg, 12% increase in fruits). The percentage of participants in the normal weight category had increased from 27.0% to 29.8%, while average weight decreased from 182.6 to 180.2 lbs (P < 0.05). Increased web site usage was associated with increased weight loss and stage of change improvements. At 12 months, a higher percentage of participants had moved into the normal weight category compared with the percentage of non-participants (+2.0% points; P < 0.05), although there were no differences in average weight change. Despite issues of limited penetration and potential self-selection, this Internet-based program had utility in reaching a large number of employees in dispersed work settings, and it led to improved eating habits and improved stage of change at 6 months and more individuals moving into the normal weight category at 6 and 12 months.

  17. Characterizing multi-pollutant air pollution in China: Comparison of three air quality indices.

    PubMed

    Hu, Jianlin; Ying, Qi; Wang, Yungang; Zhang, Hongliang

    2015-11-01

    Multi-pollutant air pollution (i.e., several pollutants reaching very high concentrations simultaneously) frequently occurs in many regions across China. Air quality index (AQI) is used worldwide to inform the public about levels of air pollution and associated health risks. The current AQI approach used in China is based on the maximum value of individual pollutants, and does not consider the combined health effects of exposure to multiple pollutants. In this study, two novel alternative indices--aggregate air quality index (AAQI) and health-risk based air quality index (HAQI)--were calculated based on data collected in six megacities of China (Beijing, Shanghai, Guangzhou, Shijiazhuang, Xi'an, and Wuhan) from 2013 to 2014. Both AAQI and HAQI take into account the combined health effects of various pollutants, and the HAQI considers the exposure (or concentration)-response relationships of pollutants. AAQI and HAQI were compared to AQI to examine the effectiveness of the current AQI in characterizing multi-pollutant air pollution in China. The AAQI and HAQI values are higher than the AQI on days when two or more pollutants simultaneously exceed the Chinese Ambient Air Quality Standards (CAAQS) 24-hour Grade II standards. The results of the comparison of the classification of risk categories based on the three indices indicate that the current AQI approach underestimates the severity of health risk associated with exposure to multi-pollutant air pollution. For the AQI-based risk category of 'unhealthy', 96% and 80% of the days would be 'very unhealthy' or 'hazardous' if based on AAQI and HAQI, respectively; and for the AQI-based risk category of 'very unhealthy', 67% and 75% of the days would be 'hazardous' if based on AAQI and HAQI, respectively. The results suggest that the general public, especially sensitive population groups such as children and the elderly, should take more stringent actions than those currently suggested based on the AQI approach during high air pollution events. Sensitivity studies were conducted to examine the assumptions used in the AAQI and HAQI approaches. Results show that the AAQI is sensitive to the choice of the pollutant-irrelevant constant, while the HAQI is sensitive to the choice of both the threshold values and the pollutants included in the total risk calculation. Copyright © 2015 Elsevier Ltd. All rights reserved.
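
    The contrast between a maximum-based index and an aggregate one can be sketched as follows; the breakpoint table and the damping constant gamma are illustrative stand-ins, not the CAAQS tables or the AAQI's actual pollutant-irrelevant constant:

```python
def sub_index(conc, breakpoints):
    # piecewise-linear interpolation of a concentration onto the index scale
    for c_lo, c_hi, i_lo, i_hi in breakpoints:
        if c_lo <= conc <= c_hi:
            return i_lo + (i_hi - i_lo) * (conc - c_lo) / (c_hi - c_lo)
    raise ValueError("concentration outside breakpoint table")

def aqi_max(sub_indices):
    # current Chinese AQI: the single worst pollutant dominates
    return max(sub_indices)

def aqi_aggregate(sub_indices, gamma=0.4):
    # illustrative aggregate index: worst pollutant plus a damped
    # contribution from the others, so multi-pollutant days score higher
    ordered = sorted(sub_indices, reverse=True)
    return ordered[0] + gamma * sum(ordered[1:])
```

    On a day when several sub-indices are elevated, aqi_aggregate exceeds aqi_max, mirroring the underestimation by the maximum-based approach reported above.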

  18. Striatal and Hippocampal Entropy and Recognition Signals in Category Learning: Simultaneous Processes Revealed by Model-Based fMRI

    PubMed Central

    Davis, Tyler; Love, Bradley C.; Preston, Alison R.

    2012-01-01

    Category learning is a complex phenomenon that engages multiple cognitive processes, many of which occur simultaneously and unfold dynamically over time. For example, as people encounter objects in the world, they simultaneously engage processes to determine their fit with current knowledge structures, gather new information about the objects, and adjust their representations to support behavior in future encounters. Many techniques that are available to understand the neural basis of category learning assume that the multiple processes that subserve it can be neatly separated between different trials of an experiment. Model-based functional magnetic resonance imaging offers a promising tool to separate multiple, simultaneously occurring processes and bring the analysis of neuroimaging data more in line with category learning’s dynamic and multifaceted nature. We use model-based imaging to explore the neural basis of recognition and entropy signals in the medial temporal lobe and striatum that are engaged while participants learn to categorize novel stimuli. Consistent with theories suggesting a role for the anterior hippocampus and ventral striatum in motivated learning in response to uncertainty, we find that activation in both regions correlates with a model-based measure of entropy. Simultaneously, separate subregions of the hippocampus and striatum exhibit activation correlated with a model-based recognition strength measure. Our results suggest that model-based analyses are exceptionally useful for extracting information about cognitive processes from neuroimaging data. Models provide a basis for identifying the multiple neural processes that contribute to behavior, and neuroimaging data can provide a powerful test bed for constraining and testing model predictions. PMID:22746951
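
    The entropy signal used as a model-based regressor is Shannon entropy over the model's trial-by-trial category probabilities; a minimal sketch:

```python
import math

def entropy(probs):
    # Shannon entropy (bits) of model-derived category probabilities;
    # high when the model is uncertain, zero when one category is certain
    return -sum(p * math.log2(p) for p in probs if p > 0)
```

    Uniform probabilities give maximal entropy (maximal uncertainty), while a one-hot distribution gives zero; the per-trial values would then enter the fMRI design matrix as a parametric regressor.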

  19. Community-based implementation and effectiveness in a randomized trial of a risk reduction intervention for HIV-serodiscordant couples: study protocol

    PubMed Central

    2014-01-01

    Background The HIV/AIDS epidemic continues to disproportionately affect African American communities in the US, particularly those located in urban areas. Despite the fact that HIV is often transmitted from one sexual partner to another, most HIV prevention interventions have focused only on individuals, rather than couples. This five-year study investigates community-based implementation, effectiveness, and sustainability of ‘Eban II,’ an evidence-based risk reduction intervention for African-American heterosexual, serodiscordant couples. Methods/design This hybrid implementation/effectiveness study is guided by organizational change theory as conceptualized in the Texas Christian University Program Change Model (PCM), a model of phased organizational change from exposure to adoption, implementation, and sustainability. The primary implementation aims are to assist 10 community-based organizations (CBOs) to implement and sustain Eban II; specifically, to partner with CBOs to expose providers to the intervention; facilitate its adoption, implementation and sustainment; and to evaluate processes and determinants of implementation, effectiveness, fidelity, and sustainment. The primary effectiveness aim is to evaluate the effect of Eban II on participant (n = 200 couples) outcomes, specifically incidents of protected sex and proportion of condom use. We will also determine the cost-effectiveness of implementation, as measured by implementation costs and potential cost savings. A mixed methods evaluation will examine implementation at the agency level; staff members from the CBOs will complete baseline measures of organizational context and climate, while key stakeholders will be interviewed periodically throughout implementation. Effectiveness of Eban II will be assessed using a randomized delayed enrollment (waitlist) control design to evaluate the impact of treatment on outcomes at posttest and three-month follow-up. Multi-level hierarchical modeling with a nested structure will be used to evaluate the effects of agency- and couples-level characteristics on couples-level outcomes (e.g., condom use). Discussion This study will produce important information regarding the value of the Eban II program and a theory-guided implementation process and tools designed for use in implementing Eban II and other evidence-based programs in demographically diverse, resource-constrained treatment settings. Trial registration NCT00644163 PMID:24950708

  20. Community-based implementation and effectiveness in a randomized trial of a risk reduction intervention for HIV-serodiscordant couples: study protocol.

    PubMed

    Hamilton, Alison B; Mittman, Brian S; Williams, John K; Liu, Honghu H; Eccles, Alicia M; Hutchinson, Craig S; Wyatt, Gail E

    2014-06-20

    The HIV/AIDS epidemic continues to disproportionately affect African American communities in the US, particularly those located in urban areas. Despite the fact that HIV is often transmitted from one sexual partner to another, most HIV prevention interventions have focused only on individuals, rather than couples. This five-year study investigates community-based implementation, effectiveness, and sustainability of 'Eban II,' an evidence-based risk reduction intervention for African-American heterosexual, serodiscordant couples. This hybrid implementation/effectiveness study is guided by organizational change theory as conceptualized in the Texas Christian University Program Change Model (PCM), a model of phased organizational change from exposure to adoption, implementation, and sustainability. The primary implementation aims are to assist 10 community-based organizations (CBOs) to implement and sustain Eban II; specifically, to partner with CBOs to expose providers to the intervention; facilitate its adoption, implementation and sustainment; and to evaluate processes and determinants of implementation, effectiveness, fidelity, and sustainment. The primary effectiveness aim is to evaluate the effect of Eban II on participant (n = 200 couples) outcomes, specifically incidents of protected sex and proportion of condom use. We will also determine the cost-effectiveness of implementation, as measured by implementation costs and potential cost savings. A mixed methods evaluation will examine implementation at the agency level; staff members from the CBOs will complete baseline measures of organizational context and climate, while key stakeholders will be interviewed periodically throughout implementation. Effectiveness of Eban II will be assessed using a randomized delayed enrollment (waitlist) control design to evaluate the impact of treatment on outcomes at posttest and three-month follow-up. Multi-level hierarchical modeling with a nested structure will be used to evaluate the effects of agency- and couples-level characteristics on couples-level outcomes (e.g., condom use). This study will produce important information regarding the value of the Eban II program and a theory-guided implementation process and tools designed for use in implementing Eban II and other evidence-based programs in demographically diverse, resource-constrained treatment settings. NCT00644163.

  1. The crustal structure in the transition zone between the western and eastern Barents Sea

    NASA Astrophysics Data System (ADS)

    Shulgin, Alexey; Mjelde, Rolf; Faleide, Jan Inge; Høy, Tore; Flueh, Ernst; Thybo, Hans

    2018-04-01

    We present a crustal-scale seismic profile in the Barents Sea based on new data. Wide-angle seismic data were recorded along a 600 km long profile at 38 ocean bottom seismometer and 52 onshore station locations. The modeling uses the joint refraction/reflection tomography approach where co-located multi-channel seismic reflection data constrain the sedimentary structure. Further, forward gravity modeling is based on the seismic model. We also calculate net regional erosion based on the calculated shallow velocity structure.

  2. Social class rank, essentialism, and punitive judgment.

    PubMed

    Kraus, Michael W; Keltner, Dacher

    2013-08-01

    Recent evidence suggests that perceptions of social class rank influence a variety of social cognitive tendencies, from patterns of causal attribution to moral judgment. In the present studies we tested the hypotheses that upper-class rank individuals would be more likely to endorse essentialist lay theories of social class categories (i.e., that social class is founded in genetically based, biological differences) than would lower-class rank individuals and that these beliefs would decrease support for restorative justice--which seeks to rehabilitate offenders, rather than punish unlawful action. Across studies, higher social class rank was associated with increased essentialism of social class categories (Studies 1, 2, and 4) and decreased support for restorative justice (Study 4). Moreover, manipulated essentialist beliefs decreased preferences for restorative justice (Study 3), and the association between social class rank and class-based essentialist theories was explained by the tendency to endorse beliefs in a just world (Study 2). Implications for how class-based essentialist beliefs potentially constrain social opportunity and mobility are discussed.

  3. Outcome Evaluation of a Community Center-Based Program for Mothers at High Psychosocial Risk

    ERIC Educational Resources Information Center

    Rodrigo, Maria Jose; Maiquez, Maria Luisa; Correa, Ana Delia; Martin, Juan Carlos; Rodriguez, Guacimara

    2006-01-01

    Objective: This study reported the outcome evaluation of the "Apoyo Personal y Familiar" (APF) program for poorly-educated mothers from multi-problem families, showing inadequate behavior with their children. APF is a community-based multi-site program delivered through weekly group meetings in municipal resource centers. Method: A total…

  4. A Matter of Timing: Identifying Significant Multi-Dose Radiotherapy Improvements by Numerical Simulation and Genetic Algorithm Search

    PubMed Central

    Angus, Simon D.; Piotrowska, Monika Joanna

    2014-01-01

    Multi-dose radiotherapy protocols (fraction dose and timing) currently used in the clinic are the product of human selection based on habit, received wisdom, physician experience and intra-day patient timetabling. However, due to combinatorial considerations, the potential treatment protocol space for a given total dose or treatment length is enormous, even for relatively coarse search; well beyond the capacity of traditional in-vitro methods. In contrast, high fidelity numerical simulation of tumor development is well suited to the challenge. Building on our previous single-dose numerical simulation model of EMT6/Ro spheroids, a multi-dose irradiation response module is added and calibrated to the effective dose arising from 18 independent multi-dose treatment programs available in the experimental literature. With the developed model a constrained, non-linear search for better performing candidate protocols is conducted within the vicinity of two benchmarks by genetic algorithm (GA) techniques. After evaluating less than 0.01% of the potential benchmark protocol space, candidate protocols were identified by the GA which conferred an average of 9.4% (max benefit 16.5%) and 7.1% (13.3%) improvement (reduction) in tumour cell count compared to the two benchmarks, respectively. Noticing that a convergent phenomenon of the top performing protocols was their temporal synchronicity, a further series of numerical experiments was conducted with periodic time-gap protocols (10 h to 23 h), leading to the discovery that the performance of the GA search candidates could be replicated by 17–18 h periodic candidates. Further dynamic irradiation-response cell-phase analysis revealed that such periodicity cohered with latent EMT6/Ro cell-phase temporal patterning. Taken together, this study provides powerful evidence towards the hypothesis that even simple inter-fraction timing variations for a given fractional dose program may present a facile, and highly cost-effective, means of significantly improving clinical efficacy. PMID:25460164
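
    The GA search over inter-fraction timings can be sketched on a toy surrogate. The fitness function below merely penalizes deviation from a hypothetical 17.5 h resonant period and stands in for the far more expensive EMT6/Ro simulation; population sizes and mutation parameters are likewise illustrative:

```python
import random

def toy_cell_count(gaps, period=17.5):
    # hypothetical surrogate: protocols whose inter-fraction gaps sit near a
    # resonant period leave fewer cells (stands in for the tumor simulator)
    return sum((g - period) ** 2 for g in gaps)

def ga_search(n_gaps=5, pop=30, gens=60, lo=10.0, hi=23.0, seed=0):
    rng = random.Random(seed)
    population = [[rng.uniform(lo, hi) for _ in range(n_gaps)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=toy_cell_count)       # minimize surrogate count
        parents = population[:pop // 2]           # elitist truncation selection
        children = []
        for _ in range(pop - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_gaps)        # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n_gaps)             # mutate a single gap
            child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.5)))
            children.append(child)
        population = parents + children
    return min(population, key=toy_cell_count)
```

    Elitism guarantees the best protocol never degrades between generations, while crossover and single-gene Gaussian mutation explore the timing space around the benchmarks.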

  5. A matter of timing: identifying significant multi-dose radiotherapy improvements by numerical simulation and genetic algorithm search.

    PubMed

    Angus, Simon D; Piotrowska, Monika Joanna

    2014-01-01

    Multi-dose radiotherapy protocols (fraction dose and timing) currently used in the clinic are the product of human selection based on habit, received wisdom, physician experience and intra-day patient timetabling. However, due to combinatorial considerations, the potential treatment protocol space for a given total dose or treatment length is enormous, even for relatively coarse search; well beyond the capacity of traditional in-vitro methods. In contrast, high fidelity numerical simulation of tumor development is well suited to the challenge. Building on our previous single-dose numerical simulation model of EMT6/Ro spheroids, a multi-dose irradiation response module is added and calibrated to the effective dose arising from 18 independent multi-dose treatment programs available in the experimental literature. With the developed model a constrained, non-linear search for better performing candidate protocols is conducted within the vicinity of two benchmarks by genetic algorithm (GA) techniques. After evaluating less than 0.01% of the potential benchmark protocol space, candidate protocols were identified by the GA which conferred an average of 9.4% (max benefit 16.5%) and 7.1% (13.3%) improvement (reduction) in tumour cell count compared to the two benchmarks, respectively. Noticing that a convergent phenomenon of the top performing protocols was their temporal synchronicity, a further series of numerical experiments was conducted with periodic time-gap protocols (10 h to 23 h), leading to the discovery that the performance of the GA search candidates could be replicated by 17-18 h periodic candidates. Further dynamic irradiation-response cell-phase analysis revealed that such periodicity cohered with latent EMT6/Ro cell-phase temporal patterning. Taken together, this study provides powerful evidence towards the hypothesis that even simple inter-fraction timing variations for a given fractional dose program may present a facile, and highly cost-effective, means of significantly improving clinical efficacy.

  6. An inexact log-normal distribution-based stochastic chance-constrained model for agricultural water quality management

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong

    2018-05-01

    In this study, an inexact log-normal-based stochastic chance-constrained programming model was developed for solving the non-point source pollution issues caused by agricultural activities. Compared to the general stochastic chance-constrained programming model, the main advantage of the proposed model is that it allows random variables to be expressed as a log-normal distribution, rather than a general normal distribution; possible deviations in solutions caused by unrealistic parameter assumptions are thereby avoided. The agricultural system management in the Erhai Lake watershed was used as a case study, where critical system factors, including rainfall and runoff amounts, show characteristics of a log-normal distribution. Several interval solutions were obtained under different constraint-satisfaction levels, which were useful in evaluating the trade-off between system economy and reliability. The applied results show that the proposed model could help decision makers to design optimal production patterns under complex uncertainties. The successful application of this model is expected to provide a good example for agricultural management in many other watersheds.
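
    The core modelling step, converting a chance constraint involving a log-normal random variable into a deterministic bound, can be sketched with the standard quantile substitution; the watershed-capacity framing and all parameter values are invented for illustration:

```python
import math
from statistics import NormalDist

def lognormal_quantile(mu, sigma, q):
    # q-quantile of LogN(mu, sigma): exp(mu + sigma * Phi^{-1}(q))
    return math.exp(mu + sigma * NormalDist().inv_cdf(q))

def max_allowable_load(mu, sigma, alpha):
    # chance constraint P(load <= capacity) >= alpha, with log-normal
    # capacity: the deterministic equivalent caps the decision variable
    # at the (1 - alpha)-quantile of the capacity distribution
    return lognormal_quantile(mu, sigma, 1.0 - alpha)
```

    Raising the constraint-satisfaction level alpha tightens the allowable load, which is exactly the economy-versus-reliability trade-off the interval solutions above explore.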

  7. Tau lepton production and decays: perspective of multi-dimensional distributions and Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Was, Z.

    2017-06-01

    The status of the τ lepton decay Monte Carlo generator TAUOLA, its main applications and recent developments are reviewed. It is underlined that, in recent efforts to develop new hadronic currents, the multi-dimensional nature of the distributions of the experimental data must be treated with great care: the lesson from comparisons and fits to the BaBar and Belle data is recalled. It was found that, as in the past at the time of comparisons with CLEO and ALEPH data, proper fitting to as detailed a representation of the experimental data as possible is essential for the appropriate development of models of τ decay dynamics. This multi-dimensional nature of the distributions is also important for observables where τ leptons are used to constrain experimental data. In the later part of the presentation, the use of the TAUOLA program for the phenomenology of W, Z and H decays at the LHC is addressed, in particular in the context of Higgs boson parity measurements. Some new results relevant for QED lepton pair emission are mentioned as well.

  8. Camouflage target reconnaissance based on hyperspectral imaging technology

    NASA Astrophysics Data System (ADS)

    Hua, Wenshen; Guo, Tong; Liu, Xun

    2015-08-01

    Efficient camouflaged target reconnaissance technology has a great influence on modern warfare. Hyperspectral images can provide large spectral range and high spectral resolution, which are invaluable in discriminating between camouflaged targets and backgrounds. Hyperspectral target detection and classification technology are utilized to achieve single-class and multi-class camouflaged target reconnaissance, respectively. Constrained energy minimization (CEM), a widely used algorithm in hyperspectral target detection, is employed to achieve single-class camouflaged target reconnaissance. Then, support vector machine (SVM), a classification method, is proposed to achieve multi-class camouflaged target reconnaissance. Experiments have been conducted to demonstrate the efficiency of the proposed method.
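
    CEM admits a closed form: the filter minimizes average output energy subject to a unit response on the target signature, w = R⁻¹d / (dᵀR⁻¹d). A minimal numpy sketch (the small ridge term is our own numerical-stability assumption, not part of the classical formula):

```python
import numpy as np

def cem_filter(X, d, eps=1e-6):
    # X: (B, N) pixel spectra as columns; d: (B,) target signature
    B, N = X.shape
    R = X @ X.T / N + eps * np.eye(B)   # sample correlation matrix + ridge
    Rinv_d = np.linalg.solve(R, d)
    return Rinv_d / (d @ Rinv_d)        # unit response on the target
```

    The detector output scores = w @ X is exactly 1 on a pixel matching the target signature and suppressed on background pixels, which is what makes CEM suited to single-class reconnaissance.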

  9. Content Analysis of Student Essays after Attending a Problem-Based Learning Course: Facilitating the Development of Critical Thinking and Communication Skills in Japanese Nursing Students

    PubMed Central

    Itatani, Tomoya; Nagata, Kyoko; Yanagihara, Kiyoko; Tabuchi, Noriko

    2017-01-01

    The importance of active learning has continued to increase in Japan. The authors conducted classes for first-year students who entered the nursing program using the problem-based learning (PBL) method, a form of active learning. Students discussed social topics in classes. The purposes of this study were to analyze the post-class essays and to describe logical and critical thinking after students attended the PBL course. The authors used Mayring’s methodology for qualitative content analysis and text mining. In the description about the skills required to resolve social issues, seven categories were extracted: (recognition of diverse social issues), (attitudes about resolving social issues), (discerning the root cause), (multi-lateral information processing skills), (making a path to resolve issues), (processivity in dealing with issues), and (reflecting). In the description about communication, five categories were extracted: (simple statement), (robust theories), (respecting the opponent), (communication skills), and (attractive presentations). As the result of text mining, the words extracted more than 100 times included “issue,” “society,” “resolve,” “myself,” “ability,” “opinion,” and “information.” Education using PBL could be an effective means of improving the skills that students described, and communication in general. Some students felt that communication was difficult owing to characteristics of the Japanese language. PMID:28829362

  10. Periodic Forced Response of Structures Having Three-Dimensional Frictional Constraints

    NASA Astrophysics Data System (ADS)

    CHEN, J. J.; YANG, B. D.; MENQ, C. H.

    2000-01-01

    Many mechanical systems have moving components that are mutually constrained through frictional contacts. When subjected to cyclic excitations, a contact interface may undergo constant changes among sticks, slips and separations, which leads to very complex contact kinematics. In this paper, a 3-D friction contact model is employed to predict the periodic forced response of structures having 3-D frictional constraints. Analytical criteria based on this friction contact model are used to determine the transitions among sticks, slips and separations of the friction contact, and subsequently the constrained force, which consists of the induced stick-slip friction force on the contact plane and the contact normal load. The resulting constrained force is often a periodic function and can be considered as a feedback force that influences the response of the constrained structures. By using the Multi-Harmonic Balance Method along with the Fast Fourier Transform, the constrained force can be integrated with the receptance of the structures so as to calculate the forced response of the constrained structures. This results in a set of non-linear algebraic equations that can be solved iteratively to yield the relative motion as well as the constrained force at the friction contact. This method is used to predict the periodic response of a frictionally constrained 3-d.o.f. oscillator. The predicted results are compared with those of the direct time integration method so as to validate the proposed method. In addition, the effect of super-harmonic components on the resonant response and jump phenomenon is examined.
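
    The stick-slip portion of such a constrained force can be illustrated in one dimension with a Jenkins (elastic-Coulomb) element driven by cyclic relative motion, its harmonic content then extracted by FFT. This is a deliberate simplification of the paper's 3-D contact model and multi-harmonic balance procedure; the stiffness and slip limit below are arbitrary illustrative values:

```python
import numpy as np

def jenkins_force(x, kt, mu_n):
    # tangential force builds with stiffness kt while sticking and
    # saturates at the Coulomb limit +/- mu_n while slipping
    f = np.zeros_like(x)
    for k in range(1, len(x)):
        f[k] = np.clip(f[k - 1] + kt * (x[k] - x[k - 1]), -mu_n, mu_n)
    return f

t = np.linspace(0.0, 2.0 * np.pi, 512, endpoint=False)
x = np.sin(t)                                    # cyclic relative motion
two_cycles = jenkins_force(np.tile(x, 2), kt=5.0, mu_n=1.0)
f = two_cycles[512:]                             # keep the settled cycle
harmonics = np.abs(np.fft.rfft(f)) / len(f)      # harmonic amplitudes
```

    For the settled cycle the hysteresis loop is antisymmetric, so odd harmonics dominate; coefficients of this kind are what a multi-harmonic balance scheme couples to the structure's receptance to form the non-linear algebraic equations mentioned above.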

  11. Two Pathways to Stimulus Encoding in Category Learning?

    PubMed Central

    Davis, Tyler; Love, Bradley C.; Maddox, W. Todd

    2008-01-01

    Category learning theorists tacitly assume that stimuli are encoded by a single pathway. Motivated by theories of object recognition, we evaluate a dual-pathway account of stimulus encoding. The part-based pathway establishes mappings between sensory input and symbols that encode discrete stimulus features, whereas the image-based pathway applies holistic templates to sensory input. Our experiments use rule-plus-exception structures in which one exception item in each category violates a salient regularity and must be distinguished from other items. In Experiment 1, we find that discrete representations are crucial for recognition of exceptions following brief training. Experiments 2 and 3 involve multi-session training regimens designed to encourage either part-based or image-based encoding. We find that both pathways are able to support exception encoding, but have unique characteristics. We speculate that one advantage of the part-based pathway is the ability to generalize across domains, whereas the image-based pathway provides faster and more effortless recognition. PMID:19460948

  12. Multi-physics optimization of three-dimensional microvascular polymeric components

    NASA Astrophysics Data System (ADS)

    Aragón, Alejandro M.; Saksena, Rajat; Kozola, Brian D.; Geubelle, Philippe H.; Christensen, Kenneth T.; White, Scott R.

    2013-01-01

    This work discusses the computational design of microvascular polymeric materials, which aim at mimicking the behavior found in some living organisms that contain a vascular system. The optimization of the topology of the embedded three-dimensional microvascular network is carried out by coupling a multi-objective constrained genetic algorithm with a finite-element based physics solver, the latter validated through experiments. The optimization is carried out on multiple conflicting objective functions, namely the void volume fraction left by the network, the energy required to drive the fluid through the network and the maximum temperature when the material is subjected to thermal loads. The methodology presented in this work results in a viable alternative for the multi-physics optimization of these materials for active-cooling applications.
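    Because the objectives above (void volume fraction, pumping energy, peak temperature) conflict, the optimization yields a Pareto set rather than a single best design. A minimal sketch of the underlying dominance test used in multi-objective search (this is not the paper's genetic algorithm, and the candidate designs below are invented numbers):

```python
def dominates(p, q):
    """True if design p is at least as good as q in every objective
    and strictly better in at least one (all objectives minimized)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# (void fraction, pumping energy, max temperature) for four hypothetical networks
designs = [(0.02, 5.0, 80.0), (0.03, 4.0, 85.0),
           (0.02, 6.0, 90.0), (0.04, 4.5, 86.0)]
front = pareto_front(designs)
```

    A genetic algorithm repeatedly applies such a filter to rank populations, so the surviving designs trade one objective off against another rather than optimizing any single one.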

  13. SLA-based optimisation of virtualised resource for multi-tier web applications in cloud data centres

    NASA Astrophysics Data System (ADS)

    Bi, Jing; Yuan, Haitao; Tie, Ming; Tan, Wei

    2015-10-01

    Dynamic virtualised resource allocation is the key to quality-of-service assurance for multi-tier web application services in cloud data centres. In this paper, we develop a self-management architecture for cloud data centres with a virtualisation mechanism for multi-tier web application services. Based on this architecture, we establish a flexible hybrid queueing model to determine the number of virtual machines for each tier of the virtualised application service environment. We then formulate a non-linear constrained optimisation problem with restrictions defined in the service level agreement, and develop a heuristic mixed optimisation algorithm to maximise the profit of cloud infrastructure providers while meeting the performance requirements of different clients. Finally, we compare the effectiveness of our dynamic allocation strategy with two other allocation strategies. The simulation results show that the proposed resource allocation method is efficient in improving overall performance and reducing resource energy cost.
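    The paper's hybrid queueing model is more elaborate, but the basic sizing question it answers — how many virtual machines per tier keep queueing within an SLA bound — can be illustrated with a plain M/M/c (Erlang C) approximation. This is a hedged sketch under that simplification; the arrival rate, per-VM service rate, and SLA target below are assumptions, not figures from the paper.

```python
import math

def erlang_c(c, lam, mu):
    """Probability an arriving request must queue in an M/M/c system."""
    a = lam / mu                       # offered load in Erlangs
    if a >= c:
        return 1.0                     # unstable: requests always queue
    s = sum(a**k / math.factorial(k) for k in range(c))
    top = a**c / math.factorial(c) * (c / (c - a))
    return top / (s + top)

def vms_needed(lam, mu, sla_wait_prob):
    """Smallest VM count whose queueing probability meets the SLA."""
    c = max(1, math.ceil(lam / mu))
    while erlang_c(c, lam, mu) > sla_wait_prob:
        c += 1
    return c

# hypothetical tier: 90 req/s arriving, each VM serves 10 req/s,
# SLA requires that at most 10% of requests queue
c = vms_needed(lam=90.0, mu=10.0, sla_wait_prob=0.1)
```

    An allocation controller would re-run such a computation per tier as measured arrival rates change, trading VM (energy) cost against the SLA bound.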

  14. 14 CFR Appendix G to Part 135 - Extended Operations (ETOPS)

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the FAA; (b) The operation is conducted in a multi-engine transport category turbine-powered airplane... Mexico) with multi-engine transport category turbine-engine powered airplanes. The certificate holder may... speed, corrected for wind and temperature) may not exceed the time specified in the Airplane Flight...

  15. 14 CFR Appendix G to Part 135 - Extended Operations (ETOPS)

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... the FAA; (b) The operation is conducted in a multi-engine transport category turbine-powered airplane... Mexico) with multi-engine transport category turbine-engine powered airplanes. The certificate holder may... speed, corrected for wind and temperature) may not exceed the time specified in the Airplane Flight...

  16. 14 CFR Appendix G to Part 135 - Extended Operations (ETOPS)

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... the FAA; (b) The operation is conducted in a multi-engine transport category turbine-powered airplane... Mexico) with multi-engine transport category turbine-engine powered airplanes. The certificate holder may... speed, corrected for wind and temperature) may not exceed the time specified in the Airplane Flight...

  17. 14 CFR Appendix G to Part 135 - Extended Operations (ETOPS)

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... the FAA; (b) The operation is conducted in a multi-engine transport category turbine-powered airplane... Mexico) with multi-engine transport category turbine-engine powered airplanes. The certificate holder may... speed, corrected for wind and temperature) may not exceed the time specified in the Airplane Flight...

  18. 14 CFR Appendix G to Part 135 - Extended Operations (ETOPS)

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... the FAA; (b) The operation is conducted in a multi-engine transport category turbine-powered airplane... Mexico) with multi-engine transport category turbine-engine powered airplanes. The certificate holder may... speed, corrected for wind and temperature) may not exceed the time specified in the Airplane Flight...

  19. Reliability of Multi-Category Rating Scales

    ERIC Educational Resources Information Center

    Parker, Richard I.; Vannest, Kimberly J.; Davis, John L.

    2013-01-01

    The use of multi-category scales is increasing for the monitoring of IEP goals, classroom and school rules, and Behavior Improvement Plans (BIPs). Although they require greater inference than traditional data counting, little is known about the inter-rater reliability of these scales. This simulation study examined the performance of nine…

  20. Fuzzy robust credibility-constrained programming for environmental management and planning.

    PubMed

    Zhang, Yimei; Huang, Guohe

    2010-06-01

    In this study, a fuzzy robust credibility-constrained programming (FRCCP) model is developed and applied to the planning of waste management systems. It incorporates the concepts of credibility-based chance-constrained programming and robust programming within an optimization framework. The developed method can reflect uncertainties expressed as possibility distributions through fuzzy membership functions. Fuzzy credibility constraints are transformed into crisp equivalents at different credibility levels, and ordinary fuzzy inclusion constraints are replaced by their robust deterministic counterparts by setting α-cut levels. The FRCCP method can provide different system costs under different credibility levels (λ). The sensitivity analyses show that the operating cost of the landfill is a critical parameter: any factor that could cause cost fluctuations during landfill operation deserves close monitoring and analysis. With FRCCP, useful solutions can be obtained to support decision making in the long-term planning of solid waste management systems. The method could be further enhanced by incorporating inexact analysis techniques into its framework, and it can also be applied to other environmental management problems.
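    The credibility-to-crisp transformation referred to above has a simple closed form for a triangular fuzzy number ξ = (a, b, c), following the standard credibility measure Cr = (Pos + Nec)/2. The sketch below illustrates that transformation only; the fuzzy cost figures are invented, not taken from the waste-management case study.

```python
def cr_leq(x, tri):
    """Credibility of the fuzzy event {xi <= x} for triangular xi = (a, b, c)."""
    a, b, c = tri
    if x < a:
        return 0.0
    if x < b:
        return (x - a) / (2 * (b - a))          # rises from 0 to 0.5 at b
    if x < c:
        return (x - 2 * b + c) / (2 * (c - b))  # rises from 0.5 to 1 at c
    return 1.0

def crisp_bound(tri, lam):
    """Smallest x with Cr{xi <= x} >= lam: the crisp equivalent of the
    fuzzy chance constraint at credibility level lam."""
    a, b, c = tri
    if lam <= 0.5:
        return a + 2 * lam * (b - a)
    return 2 * b - c + 2 * lam * (c - b)

cost = (100.0, 120.0, 150.0)    # hypothetical fuzzy unit cost, e.g. $/tonne
x90 = crisp_bound(cost, 0.9)    # budget satisfying the constraint at Cr >= 0.9
```

    Sweeping λ through such bounds is what produces the family of system costs under different credibility levels that the abstract describes.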

  1. A New Approach to Modeling Densities and Equilibria of Ice and Gas Hydrate Phases

    NASA Astrophysics Data System (ADS)

    Zyvoloski, G.; Lucia, A.; Lewis, K. C.

    2011-12-01

    The Gibbs-Helmholtz Constrained (GHC) equation is a new cubic equation of state that was recently derived by Lucia (2010) and Lucia et al. (2011) by constraining the energy parameter in the Soave form of the Redlich-Kwong equation to satisfy the Gibbs-Helmholtz equation. The key attributes of the GHC equation are: 1) It is a multi-scale equation because it uses the internal energy of departure, UD, as a natural bridge between the molecular and bulk phase length scales. 2) It does not require acentric factors, volume translation, regression of parameters to experimental data, binary (kij) interaction parameters, or other forms of empirical correlations. 3) It is a predictive equation of state because it uses a database of values of UD determined from NTP Monte Carlo simulations. 4) It can readily account for differences in molecular size and shape. 5) It has been successfully applied to non-electrolyte mixtures as well as weak and strong aqueous electrolyte mixtures over wide ranges of temperature, pressure and composition to predict liquid density and phase equilibrium with up to four phases. 6) It has been extensively validated with experimental data. 7) The AAD% error between predicted and experimental liquid density is 1%, while the AAD% error in phase equilibrium predictions is 2.5%. 8) It has been used successfully within the subsurface flow simulation program FEHM. In this work we describe recent extensions of the multi-scale predictive GHC equation to modeling the phase densities and equilibrium behavior of hexagonal ice and gas hydrates. In particular, we show that radial distribution functions, which can be determined by NTP Monte Carlo simulations, can be used to establish correct standard-state fugacities of ice Ih and gas hydrates. From this, it is straightforward to determine both the phase density of ice or gas hydrates and any equilibria involving ice and/or hydrate phases.
A number of numerical results for mixtures of N2, O2, CH4, CO2, water, and NaCl in permafrost conditions are presented to illustrate the predictive capabilities of the multi-scale GHC equation. In particular, we show that the GHC equation correctly predicts 1) The density of ice Ih and methane hydrate to within 1%. 2) The melting curve for hexagonal ice. 3) The hydrate-gas phase co-existence curve. 4) Various phase equilibria involving ice and hydrate phases. We also show that the GHC equation approach can be readily incorporated into subsurface flow simulation programs like FEHM to predict the behavior of permafrost and other reservoirs where ice and/or hydrates are present. Many geometric illustrations are used to elucidate key concepts. References A. Lucia, A Multi-Scale Gibbs-Helmholtz Constrained Cubic Equation of State. J. Thermodynamics: Special Issue on Advances in Gas Hydrate Thermodynamics and Transport Properties. Available on-line [doi:10.1155/2010/238365]. A. Lucia, B.M. Bonk, A. Roy and R.R. Waterman, A Multi-Scale Framework for Multi-Phase Equilibrium Flash. Comput. Chem. Engng. In press.
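    The GHC equation keeps the Soave-Redlich-Kwong (SRK) functional form but constrains the energy parameter through the Gibbs-Helmholtz equation and Monte Carlo internal energies of departure. As a hedged illustration of the shared machinery only, here is the standard SRK compressibility-factor solve for pure methane (critical constants from standard tables); the empirical Soave alpha below is exactly the piece the GHC equation replaces.

```python
import numpy as np

R = 8.314462                              # J/(mol K)
Tc, Pc, omega = 190.56, 4.5992e6, 0.011   # methane critical constants

def srk_z(T, P):
    """Largest real root of the SRK cubic in Z (vapor-like branch)."""
    m = 0.480 + 1.574 * omega - 0.176 * omega**2
    alpha = (1 + m * (1 - np.sqrt(T / Tc)))**2
    a = 0.42748 * R**2 * Tc**2 / Pc * alpha   # energy parameter: the term
                                              # the GHC equation constrains
    b = 0.08664 * R * Tc / Pc
    A, B = a * P / (R * T)**2, b * P / (R * T)
    roots = np.roots([1.0, -1.0, A - B - B**2, -A * B])
    real = roots.real[np.abs(roots.imag) < 1e-8]
    return real.max()

Z_low = srk_z(300.0, 1.0e5)    # near-ideal conditions: Z close to 1
Z_high = srk_z(300.0, 1.0e7)   # compressed gas: attraction lowers Z
```

    In the GHC framework the parameter a(T) is instead evaluated from the departure internal energy UD, which is what makes the equation predictive rather than correlated.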

  2. Impact of model complexity and multi-scale data integration on the estimation of hydrogeological parameters in a dual-porosity aquifer

    NASA Astrophysics Data System (ADS)

    Tamayo-Mas, Elena; Bianchi, Marco; Mansour, Majdi

    2018-03-01

    This study investigates the impact of model complexity and multi-scale prior hydrogeological data on the interpretation of pumping test data in a dual-porosity aquifer (the Chalk aquifer in England, UK). In order to characterize the hydrogeological properties, different approaches ranging from a traditional analytical solution (Theis approach) to more sophisticated numerical models with automatically calibrated input parameters are applied. Comparisons of results from the different approaches show that neither traditional analytical solutions nor a numerical model assuming a homogeneous and isotropic aquifer can adequately explain the observed drawdowns. A better reproduction of the observed drawdowns in all seven monitoring locations is instead achieved when medium and local-scale prior information about the vertical hydraulic conductivity (K) distribution is used to constrain the model calibration process. In particular, the integration of medium-scale vertical K variations based on flowmeter measurements led to an improvement of about 30% in the goodness-of-fit of the simulated drawdowns. Further improvements (up to 70%) were observed when a simple upscaling approach was used to integrate small-scale K data to constrain the automatic calibration process of the numerical model. Although the analysis focuses on a specific case study, these results provide insights about the representativeness of the estimates of hydrogeological properties based on different interpretations of pumping test data, and promote the integration of multi-scale data for the characterization of heterogeneous aquifers in complex hydrogeological settings.
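    A simple upscaling step of the kind mentioned above combines small-scale hydraulic conductivity (K) measurements into one effective block value. For layered media, flow parallel to layering is bounded by the arithmetic mean and flow across layering by the harmonic mean, with the geometric mean in between. This is a generic textbook sketch, not the paper's specific scheme, and the K values below are invented.

```python
import math

def arithmetic_mean(ks):
    """Effective K for flow parallel to layering (upper bound)."""
    return sum(ks) / len(ks)

def harmonic_mean(ks):
    """Effective K for flow perpendicular to layering (lower bound)."""
    return len(ks) / sum(1.0 / k for k in ks)

def geometric_mean(ks):
    """Common choice for statistically isotropic heterogeneity."""
    return math.exp(sum(math.log(k) for k in ks) / len(ks))

k_small = [2.0, 0.5, 10.0, 1.0]   # hypothetical core-scale K values, m/day
k_perp = harmonic_mean(k_small)
k_geo = geometric_mean(k_small)
k_par = arithmetic_mean(k_small)
```

    Feeding such block-scale values into the calibration, instead of the raw point measurements, is what allows small-scale data to constrain a coarser numerical model.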

  3. Probability-based constrained MPC for structured uncertain systems with state and random input delays

    NASA Astrophysics Data System (ADS)

    Lu, Jianbo; Li, Dewei; Xi, Yugeng

    2013-07-01

    This article is concerned with probability-based constrained model predictive control (MPC) for systems with both structured uncertainties and time delays, where a random input delay and multiple fixed state delays are included. The process of input delay is governed by a discrete-time finite-state Markov chain. By invoking an appropriate augmented state, the system is transformed into a standard structured uncertain time-delay Markov jump linear system (MJLS). For the resulting system, a multi-step feedback control law is utilised to minimise an upper bound on the expected value of performance objective. The proposed design has been proved to stabilise the closed-loop system in the mean square sense and to guarantee constraints on control inputs and system states. Finally, a numerical example is given to illustrate the proposed results.

  4. Polynomial Size Formulations for the Distance and Capacity Constrained Vehicle Routing Problem

    NASA Astrophysics Data System (ADS)

    Kara, Imdat; Derya, Tusan

    2011-09-01

    The Distance and Capacity Constrained Vehicle Routing Problem (DCVRP) is an extension of the well-known Traveling Salesman Problem (TSP) that arises in distribution and logistics. Constructing new formulations for it is the main motivation and contribution of this paper. We focus on two-index integer programming formulations for the DCVRP: one node-based and one arc (flow)-based formulation are presented. Both formulations have O(n²) binary variables and O(n²) constraints; that is, the numbers of decision variables and constraints grow polynomially with the number of nodes of the underlying graph. It is shown that the proposed arc-based formulation produces a better lower bound than the existing formulation (Water's formulation, as discussed in the paper). Finally, various problems from the literature are solved with the node-based and arc-based formulations using CPLEX 8.0. Preliminary computational analysis shows that the arc-based formulation outperforms the node-based formulation in terms of linear programming relaxation.
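    To make the DCVRP constraints concrete, here is a toy instance solved by exhaustive search: each of two vehicles leaves depot 0, serves a subset of customers, and returns, subject to a capacity limit and a maximum route length. Brute force is only feasible at this size — the point of the two-index formulations above is to scale far beyond it — and all data below are invented.

```python
from itertools import permutations

dist = {  # symmetric distances; node 0 is the depot
    (0, 1): 4, (0, 2): 5, (0, 3): 3, (1, 2): 2, (1, 3): 6, (2, 3): 4,
}
def d(i, j):
    return 0 if i == j else dist[(min(i, j), max(i, j))]

demand = {1: 3, 2: 4, 3: 2}
CAP, MAX_LEN = 6, 14          # vehicle capacity, distance limit per route

def route_len(route):
    stops = (0, *route, 0)
    return sum(d(a, b) for a, b in zip(stops, stops[1:]))

def feasible(route):
    return sum(demand[c] for c in route) <= CAP and route_len(route) <= MAX_LEN

best = None
for perm in permutations((1, 2, 3)):
    for cut in range(len(perm) + 1):      # split the sequence into 2 routes
        r1, r2 = perm[:cut], perm[cut:]
        if feasible(r1) and feasible(r2):
            cost = route_len(r1) + route_len(r2)
            if best is None or cost < best[0]:
                best = (cost, r1, r2)
```

    An integer programming formulation encodes the same feasibility conditions as linear constraints on binary arc or node variables, which is what lets a MILP solver such as CPLEX prune the search instead of enumerating it.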

  5. Direct care worker's perceptions of job satisfaction following implementation of work-based learning.

    PubMed

    Lopez, Cynthia; White, Diana L; Carder, Paula C

    2014-02-01

    The purpose of this study was to understand the impact of a work-based learning program on the work lives of Direct Care Workers (DCWs) at assisted living (AL) residences. The research questions were addressed using focus group data collected as part of a larger evaluation of a work-based learning (WBL) program called Jobs to Careers. The theoretical perspective of symbolic interactionism was used to frame the qualitative data analysis. Results indicated that the WBL program impacted DCWs' job satisfaction through the program curriculum and design and through three primary categories: relational aspects of work, worker identity, and finding time. This article presents a conceptual model for understanding how these categories are interrelated and the implications for WBL programs. Job satisfaction is an important topic that has been linked to quality of care and reduced turnover in long-term care settings.

  6. Inexact nonlinear improved fuzzy chance-constrained programming model for irrigation water management under uncertainty

    NASA Astrophysics Data System (ADS)

    Zhang, Chenglong; Zhang, Fan; Guo, Shanshan; Liu, Xiao; Guo, Ping

    2018-01-01

    An inexact nonlinear mλ-measure fuzzy chance-constrained programming (INMFCCP) model is developed for irrigation water allocation under uncertainty. Techniques of inexact quadratic programming (IQP), the mλ-measure, and fuzzy chance-constrained programming (FCCP) are integrated into a general optimization framework. The INMFCCP model can deal not only with nonlinearities in the objective function, but also with uncertainties presented as discrete intervals in the objective function, variables and left-hand-side constraints, and with fuzziness in the right-hand-side constraints. Moreover, this model improves upon conventional fuzzy chance-constrained programming by introducing a linear combination of the possibility and necessity measures with varying preference parameters. To demonstrate its applicability, the model is applied to a case study in the middle reaches of the Heihe River Basin, northwest China. An interval regression analysis method is used to obtain interval crop water production functions over the whole growth period under uncertainty, so that more flexible solutions can be generated for optimal irrigation water allocation. The variation of the results can be examined by specifying different confidence levels and preference parameters, which also reflects the interrelationships among system benefits, preference parameters, confidence levels and the corresponding risk levels. Comparison between interval crop water production functions and deterministic ones based on the developed INMFCCP model indicates that the former can capture more of the complexities and uncertainties encountered in practical applications. These results can provide a more reliable scientific basis for supporting irrigation water management in arid areas.

  7. Integrating male sexual diversity into violence prevention efforts with men and boys: evidence from the Asia-Pacific Region.

    PubMed

    Miedema, Stephanie S; Yount, Kathryn M; Chirwa, Esnat; Dunkle, Kristin; Fulu, Emma

    2017-02-01

    Men's perpetration of gender-based violence remains a global public health issue. Violence prevention experts call for engagement of boys and men to change social norms around masculinity in order to prevent gender-based violence. Yet, men do not comprise a homogeneous category. Drawing on probability estimates of men who report same-sex practices and preferences captured in a multi-country gender-based violence prevention survey in the Asia-Pacific region, we test the effects of sexuality-related factors on men's adverse life experiences. We find that sexual minority men face statistically higher risk of lifetime adversity related to gender-based violence, stemming from gender-inequitable norms in society. Sexuality is thus a key axis of differentiation among men in the Asia-Pacific region, influencing health and wellbeing and reflecting men's differential engagement with dominant norms of masculinity. Integrating awareness of male sexual diversity into gender-based violence prevention interventions, particularly those that work with boys and men, and bridging violence prevention programming between sexual minority communities and women, are essential to tackle the root drivers of violence.

  8. Developing Inventory and Monitoring Programs Based on Multiple Objectives

    Treesearch

    Daniel L. Schmoldt; David L. Peterson; David G. Silsbee

    1995-01-01

    Resource inventory and monitoring (I&M) programs in national parks combine multiple objectives in order to create a plan of action over a finite time horizon. Because all program activities are constrained by time and money, it is critical to plan I&M activities that make the best use of available agency resources. However, multiple objectives complicate a...

  9. Locality constrained joint dynamic sparse representation for local matching based face recognition.

    PubMed

    Wang, Jianzhong; Yi, Yugen; Zhou, Wei; Shi, Yanjiao; Qi, Miao; Zhang, Ming; Zhang, Baoxue; Kong, Jun

    2014-01-01

    Recently, Sparse Representation-based Classification (SRC) has attracted a lot of attention for its applications to various tasks, especially in biometric techniques such as face recognition. However, factors such as lighting, expression, pose and disguise variations in face images degrade the performance of SRC and of most other face recognition techniques. To overcome these limitations, we propose a robust face recognition method named Locality Constrained Joint Dynamic Sparse Representation-based Classification (LCJDSRC). In our method, a face image is first partitioned into several smaller sub-images. Then, these sub-images are sparsely represented using the proposed locality constrained joint dynamic sparse representation algorithm. Finally, the representation results for all sub-images are aggregated to obtain the final recognition result. Compared with other algorithms that process each sub-image of a face image independently, the proposed algorithm treats local matching-based face recognition as a multi-task learning problem, so the latent relationships among the sub-images from the same face image are taken into account. Meanwhile, the locality information of the data is also considered. We evaluate our algorithm by comparing it with state-of-the-art approaches; extensive experiments on four benchmark face databases (ORL, Extended YaleB, AR and LFW) demonstrate the effectiveness of LCJDSRC.
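    SRC represents a test face as a sparse combination of all training faces and assigns the class with the smallest reconstruction residual. As a hedged simplification (ordinary least squares per class instead of the joint l1-sparse solve used by SRC and LCJDSRC), the residual-based decision rule looks like this; the "faces" below are synthetic random vectors, not image data.

```python
import numpy as np

def classify_by_residual(train_by_class, y):
    """Return the class whose training samples reconstruct y best."""
    best_cls, best_res = None, np.inf
    for cls, X in train_by_class.items():          # X: columns are samples
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        res = np.linalg.norm(y - X @ coef)         # reconstruction residual
        if res < best_res:
            best_cls, best_res = cls, res
    return best_cls

rng = np.random.default_rng(0)
basis = {c: rng.normal(size=(50, 3)) for c in ("A", "B")}     # 2 class subspaces
train = {c: B @ rng.normal(size=(3, 5)) for c, B in basis.items()}
probe = basis["A"] @ rng.normal(size=3)    # probe lies in class A's subspace
pred = classify_by_residual(train, probe)
```

    LCJDSRC applies this idea per sub-image with a shared sparsity pattern across sub-images, which is what couples the local matches into one multi-task decision.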

  10. Real-time Kinematic Positioning of INS Tightly Aided Multi-GNSS Ionospheric Constrained PPP

    PubMed Central

    Gao, Zhouzheng; Shen, Wenbin; Zhang, Hongping; Niu, Xiaoji; Ge, Maorong

    2016-01-01

    The real-time Precise Point Positioning (PPP) technique is being widely applied to provide precise positioning services, supported by significant improvements in the accuracy of precise satellite products. With the rapid development of the multi-constellation Global Navigation Satellite Systems (multi-GNSS), about 80 navigation satellites are currently operational in orbit, and PPP performance improves markedly when all of them are used compared with GPS-only PPP. However, PPP performance can still be severely affected by unexpected and unavoidable harsh observing environments, especially in dynamic applications. Consequently, we apply an Inertial Navigation System (INS) to Ionospheric-Constrained (IC) PPP to overcome such drawbacks. The INS tightly aided multi-GNSS IC-PPP model can make full use of GNSS and INS observations to improve PPP performance in terms of accuracy, availability, continuity, and convergence speed. A set of airborne data is then analyzed to evaluate and validate the improvement of multi-GNSS and INS on the performance of IC-PPP. PMID:27470270

  11. Real-time Kinematic Positioning of INS Tightly Aided Multi-GNSS Ionospheric Constrained PPP.

    PubMed

    Gao, Zhouzheng; Shen, Wenbin; Zhang, Hongping; Niu, Xiaoji; Ge, Maorong

    2016-07-29

    The real-time Precise Point Positioning (PPP) technique is being widely applied to provide precise positioning services, supported by significant improvements in the accuracy of precise satellite products. With the rapid development of the multi-constellation Global Navigation Satellite Systems (multi-GNSS), about 80 navigation satellites are currently operational in orbit, and PPP performance improves markedly when all of them are used compared with GPS-only PPP. However, PPP performance can still be severely affected by unexpected and unavoidable harsh observing environments, especially in dynamic applications. Consequently, we apply an Inertial Navigation System (INS) to Ionospheric-Constrained (IC) PPP to overcome such drawbacks. The INS tightly aided multi-GNSS IC-PPP model can make full use of GNSS and INS observations to improve PPP performance in terms of accuracy, availability, continuity, and convergence speed. A set of airborne data is then analyzed to evaluate and validate the improvement of multi-GNSS and INS on the performance of IC-PPP.

  12. Very high-energy gamma-ray follow-up program using neutrino triggers from IceCube

    NASA Astrophysics Data System (ADS)

    IceCube Collaboration; Aartsen, M. G.; Abraham, K.; Ackermann, M.; Adams, J.; Aguilar, J. A.; Ahlers, M.; Ahrens, M.; Altmann, D.; Andeen, K.; Anderson, T.; Ansseau, I.; Anton, G.; Archinger, M.; Argüelles, C.; Auffenberg, J.; Axani, S.; Bai, X.; Barwick, S. W.; Baum, V.; Bay, R.; Beatty, J. J.; Becker Tjus, J.; Becker, K.-H.; BenZvi, S.; Berley, D.; Bernardini, E.; Bernhard, A.; Besson, D. Z.; Binder, G.; Bindig, D.; Bissok, M.; Blaufuss, E.; Blot, S.; Bohm, C.; Börner, M.; Bos, F.; Bose, D.; Böser, S.; Botner, O.; Braun, J.; Brayeur, L.; Bretz, H.-P.; Bron, S.; Burgman, A.; Carver, T.; Casier, M.; Cheung, E.; Chirkin, D.; Christov, A.; Clark, K.; Classen, L.; Coenders, S.; Collin, G. H.; Conrad, J. M.; Cowen, D. F.; Cross, R.; Day, M.; de André, J. P. A. M.; De Clercq, C.; del Pino Rosendo, E.; Dembinski, H.; De Ridder, S.; Desiati, P.; de Vries, K. D.; de Wasseige, G.; de With, M.; DeYoung, T.; Díaz-Vélez, J. C.; di Lorenzo, V.; Dujmovic, H.; Dumm, J. P.; Dunkman, M.; Eberhardt, B.; Ehrhardt, T.; Eichmann, B.; Eller, P.; Euler, S.; Evenson, P. A.; Fahey, S.; Fazely, A. R.; Feintzeig, J.; Felde, J.; Filimonov, K.; Finley, C.; Flis, S.; Fösig, C.-C.; Franckowiak, A.; Franke, R.; Friedman, E.; Fuchs, T.; Gaisser, T. K.; Gallagher, J.; Gerhardt, L.; Ghorbani, K.; Giang, W.; Gladstone, L.; Glauch, T.; Glüsenkamp, T.; Goldschmidt, A.; Golup, G.; Gonzalez, J. G.; Grant, D.; Griffith, Z.; Haack, C.; Haj Ismail, A.; Hallgren, A.; Halzen, F.; Hansen, E.; Hansmann, T.; Hanson, K.; Hebecker, D.; Heereman, D.; Helbing, K.; Hellauer, R.; Hickford, S.; Hignight, J.; Hill, G. C.; Hoffman, K. D.; Hoffmann, R.; Holzapfel, K.; Hoshina, K.; Huang, F.; Huber, M.; Hultqvist, K.; In, S.; Ishihara, A.; Jacobi, E.; Japaridze, G. S.; Jeong, M.; Jero, K.; Jones, B. J. P.; Jurkovic, M.; Kappes, A.; Karg, T.; Karle, A.; Katz, U.; Kauer, M.; Keivani, A.; Kelley, J. L.; Kheirandish, A.; Kim, M.; Kintscher, T.; Kiryluk, J.; Kittler, T.; Klein, S. 
R.; Kohnen, G.; Koirala, R.; Kolanoski, H.; Konietz, R.; Köpke, L.; Kopper, C.; Kopper, S.; Koskinen, D. J.; Kowalski, M.; Krings, K.; Kroll, M.; Krückl, G.; Krüger, C.; Kunnen, J.; Kunwar, S.; Kurahashi, N.; Kuwabara, T.; Labare, M.; Lanfranchi, J. L.; Larson, M. J.; Lauber, F.; Lennarz, D.; Lesiak-Bzdak, M.; Leuermann, M.; Lu, L.; Lünemann, J.; Madsen, J.; Maggi, G.; Mahn, K. B. M.; Mancina, S.; Mandelartz, M.; Maruyama, R.; Mase, K.; Maunu, R.; McNally, F.; Meagher, K.; Medici, M.; Meier, M.; Meli, A.; Menne, T.; Merino, G.; Meures, T.; Miarecki, S.; Mohrmann, L.; Montaruli, T.; Moulai, M.; Nahnhauer, R.; Naumann, U.; Neer, G.; Niederhausen, H.; Nowicki, S. C.; Nygren, D. R.; Obertacke Pollmann, A.; Olivas, A.; O'Murchadha, A.; Palczewski, T.; Pandya, H.; Pankova, D. V.; Peiffer, P.; Penek, Ö.; Pepper, J. A.; Pérez de los Heros, C.; Pieloth, D.; Pinat, E.; Price, P. B.; Przybylski, G. T.; Quinnan, M.; Raab, C.; Rädel, L.; Rameez, M.; Rawlins, K.; Reimann, R.; Relethford, B.; Relich, M.; Resconi, E.; Rhode, W.; Richman, M.; Riedel, B.; Robertson, S.; Rongen, M.; Rott, C.; Ruhe, T.; Ryckbosch, D.; Rysewyk, D.; Sabbatini, L.; Sanchez Herrera, S. E.; Sandrock, A.; Sandroos, J.; Sarkar, S.; Satalecka, K.; Schlunder, P.; Schmidt, T.; Schoenen, S.; Schöneberg, S.; Schumacher, L.; Seckel, D.; Seunarine, S.; Soldin, D.; Song, M.; Spiczak, G. M.; Spiering, C.; Stanev, T.; Stasik, A.; Stettner, J.; Steuer, A.; Stezelberger, T.; Stokstad, R. G.; Stößl, A.; Ström, R.; Strotjohann, N. L.; Sullivan, G. W.; Sutherland, M.; Taavola, H.; Taboada, I.; Tatar, J.; Tenholt, F.; Ter-Antonyan, S.; Terliuk, A.; Tešić, G.; Tilav, S.; Toale, P. A.; Tobin, M. N.; Toscano, S.; Tosi, D.; Tselengidou, M.; Turcati, A.; Unger, E.; Usner, M.; Vandenbroucke, J.; van Eijndhoven, N.; Vanheule, S.; van Rossem, M.; van Santen, J.; Veenkamp, J.; Vehring, M.; Voge, M.; Vogel, E.; Vraeghe, M.; Walck, C.; Wallace, A.; Wallraff, M.; Wandkowsky, N.; Weaver, Ch.; Weiss, M. 
J.; Wendt, C.; Westerhoff, S.; Whelan, B. J.; Wickmann, S.; Wiebe, K.; Wiebusch, C. H.; Wille, L.; Williams, D. R.; Wills, L.; Wolf, M.; Wood, T. R.; Woolsey, E.; Woschnagg, K.; Xu, D. L.; Xu, X. W.; Xu, Y.; Yanez, J. P.; Yodh, G.; Yoshida, S.; Zoll, M.; MAGIC Collaboration; Ahnen, M. L.; Ansoldi, S.; Antonelli, L. A.; Antoranz, P.; Babic, A.; Banerjee, B.; Bangale, P.; Barres de Almeida, U.; Barrio, J. A.; Becerra González, J.; Bednarek, W.; Bernardini, E.; Berti, A.; Biasuzzi, B.; Biland, A.; Blanch, O.; Bonnefoy, S.; Bonnoli, G.; Borracci, F.; Bretz, T.; Buson, S.; Carosi, A.; Chatterjee, A.; Clavero, R.; Colin, P.; Colombo, E.; Contreras, J. L.; Cortina, J.; Covino, S.; Da Vela, P.; Dazzi, F.; De Angelis, A.; De Lotto, B.; de Oña Wilhelmi, E.; Di Pierro, F.; Doert, M.; Domínguez, A.; Dominis Prester, D.; Dorner, D.; Doro, M.; Einecke, S.; Eisenacher Glawion, D.; Elsaesser, D.; Engelkemeier, M.; Fallah Ramazani, V.; Fernández-Barral, A.; Fidalgo, D.; Fonseca, M. V.; Font, L.; Frantzen, K.; Fruck, C.; Galindo, D.; García López, R. J.; Garczarczyk, M.; Garrido Terrats, D.; Gaug, M.; Giammaria, P.; Godinović, N.; González Muñoz, A.; Góra, D.; Guberman, D.; Hadasch, D.; Hahn, A.; Hanabata, Y.; Hayashida, M.; Herrera, J.; Hose, J.; Hrupec, D.; Hughes, G.; Idec, W.; Kodani, K.; Konno, Y.; Kubo, H.; Kushida, J.; La Barbera, A.; Lelas, D.; Lindfors, E.; Lombardi, S.; Longo, F.; López, M.; López-Coto, R.; Majumdar, P.; Makariev, M.; Mallot, K.; Maneva, G.; Manganaro, M.; Mannheim, K.; Maraschi, L.; Marcote, B.; Mariotti, M.; Martínez, M.; Mazin, D.; Menzel, U.; Miranda, J. M.; Mirzoyan, R.; Moralejo, A.; Moretti, E.; Nakajima, D.; Neustroev, V.; Niedzwiecki, A.; Nievas Rosillo, M.; Nilsson, K.; Nishijima, K.; Noda, K.; Nogués, L.; Overkemping, A.; Paiano, S.; Palacio, J.; Palatiello, M.; Paneque, D.; Paoletti, R.; Paredes, J. M.; Paredes-Fortuny, X.; Pedaletti, G.; Peresano, M.; Perri, L.; Persic, M.; Poutanen, J.; Prada Moroni, P. 
G.; Prandini, E.; Puljak, I.; Reichardt, I.; Rhode, W.; Ribó, M.; Rico, J.; Rodriguez Garcia, J.; Saito, T.; Satalecka, K.; Schroeder, S.; Schultz, C.; Schweizer, T.; Sillanpää, A.; Sitarek, J.; Snidaric, I.; Sobczynska, D.; Stamerra, A.; Steinbring, T.; Strzys, M.; Surić, T.; Takalo, L.; Tavecchio, F.; Temnikov, P.; Terzić, T.; Tescaro, D.; Teshima, M.; Thaele, J.; Torres, D. F.; Toyama, T.; Treves, A.; Vanzo, G.; Verguilov, V.; Vovk, I.; Ward, J. E.; Will, M.; Wu, M. H.; Zanin, .; VERITAS Collaboration; Abeysekara, A. U.; Archambault, S.; Archer, A.; Benbow, W.; Bird, R.; Bourbeau, E.; Buchovecky, M.; Bugaev, V.; Byrum, K.; Cardenzana, J. V.; Cerruti, M.; Ciupik, L.; Connolly, M. P.; Cui, W.; Dickinson, H. J.; Dumm, J.; Eisch, J. D.; Errando, M.; Falcone, A.; Feng, Q.; Finley, J. P.; Fleischhack, H.; Flinders, A.; Fortson, L.; Furniss, A.; Gillanders, G. H.; Griffin, S.; Hütten, J. Grube M.; Håkansson, N.; Hervet, O.; Holder, J.; Humensky, T. B.; Johnson, C. A.; Kaaret, P.; Kar, P.; Kelley-Hoskins, N.; Kertzman, M.; Kieda, D.; Krause, M.; Krennrich, F.; Kumar, S.; Lang, M. J.; Maier, G.; McArthur, S.; McCann, A.; Moriarty, P.; Mukherjee, R.; Nguyen, T.; Nieto, D.; O'Brien, S.; Ong, R. A.; Otte, A. N.; Park, N.; Pohl, M.; Popkow, A.; Pueschel, E.; Quinn, J.; Ragan, K.; Reynolds, P. T.; Richards, G. T.; Roache, E.; Rulten, C.; Sadeh, I.; Santander, M.; Sembroski, G. H.; Shahinyan, K.; Staszak, D.; Telezhinsky, I.; Tucci, J. V.; Tyler, J.; Wakely, S. P.; Weinstein, A.; Wilcox, P.; Wilhelm, A.; Williams, D. A.; Zitzer, B.

    2016-11-01

    We describe and report the status of a neutrino-triggered program in IceCube that generates real-time alerts for gamma-ray follow-up observations by atmospheric-Cherenkov telescopes (MAGIC and VERITAS). While IceCube is capable of monitoring the whole sky continuously, high-energy gamma-ray telescopes have restricted fields of view and in general are unlikely to be observing a potential neutrino-flaring source at the time such neutrinos are recorded. The use of neutrino-triggered alerts thus aims at increasing the availability of simultaneous multi-messenger data during potential neutrino flaring activity, which can increase the discovery potential and constrain the phenomenological interpretation of the high-energy emission of selected source classes (e.g. blazars). The requirements of a fast and stable online analysis of potential neutrino signals and its operation are presented, along with first results of the program operating between 14 March 2012 and 31 December 2015.

  13. The effect of health care reform on academic medicine in Canada. Editorial Committee of the Canadian Institute for Academic Medicine.

    PubMed

    Hollenberg, C H

    1996-05-15

    Although Canadian health care reform has constrained costs and improved efficiency, it has had a profound and mixed effect on Canadian academic medicine. Teaching hospitals have been reduced in number and size, and inpatient programs have shifted to ambulatory and community settings. Specialized care programs are now multi-institutional and multidisciplinary. Furthermore, the influence of regional planning bodies has grown markedly. Although these changes have likely improved clinical service, their impact on the quality of clinical education is uncertain. Within the academic clinical department, recruitment of young faculty has been greatly complicated by constraints on licensing, billing numbers, fee-for-service income and research funding. The departmental practice plan based on university funds and fee-for-service income is being replaced by less favourable funding arrangements. However, emphasis on multidisciplinary programs has rendered these departments more flexible in structure. The future of Canadian academic medicine depends on an effective alliance with government. Academia and government must agree, particularly on human-resource requirements, research objectives and the delivery of clinical and academic programs in regional and community settings. The establishment of focal points for academic health sciences planning within academic health sciences centres and within governments would assist in these developments. Finally, government and the academic health sciences sector must work together to remove the current impediments to the recruitment of highly qualified young faculty.

  14. A fault-tolerant control architecture for unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Drozeski, Graham R.

    Research has presented several approaches to achieve varying degrees of fault-tolerance in unmanned aircraft. Approaches in reconfigurable flight control are generally divided into two categories: those which incorporate multiple non-adaptive controllers and switch between them based on the output of a fault detection and identification element, and those that employ a single adaptive controller capable of compensating for a variety of fault modes. Regardless of the approach for reconfigurable flight control, certain fault modes dictate system restructuring in order to prevent a catastrophic failure. System restructuring enables active control of actuation not employed by the nominal system to recover controllability of the aircraft. After system restructuring, continued operation requires the generation of flight paths that adhere to an altered flight envelope. The control architecture developed in this research employs a multi-tiered hierarchy to allow unmanned aircraft to generate and track safe flight paths despite the occurrence of potentially catastrophic faults. The hierarchical architecture increases the level of autonomy of the system by integrating five functionalities with the baseline system: fault detection and identification, active system restructuring, reconfigurable flight control, reconfigurable path planning, and mission adaptation. Fault detection and identification algorithms continually monitor aircraft performance and issue fault declarations. When the severity of a fault exceeds the capability of the baseline flight controller, active system restructuring expands the controllability of the aircraft using unconventional control strategies not exploited by the baseline controller. Each of the reconfigurable flight controllers and the baseline controller employ a proven adaptive neural network control strategy. A reconfigurable path planner employs an adaptive model of the vehicle to re-shape the desired flight path. 
Generation of the revised flight path is posed as a linear program constrained by the response of the degraded system. Finally, a mission adaptation component estimates limitations on the closed-loop performance of the aircraft and adjusts the aircraft mission accordingly. A combination of simulation and flight test results using two unmanned helicopters validates the utility of the hierarchical architecture.

  15. A joint precoding scheme for indoor downlink multi-user MIMO VLC systems

    NASA Astrophysics Data System (ADS)

    Zhao, Qiong; Fan, Yangyu; Kang, Bochao

    2017-11-01

    In this study, we aim to improve the system performance and reduce the implementation complexity of precoding schemes for visible light communication (VLC) systems. By incorporating the power-method algorithm and the block diagonalization (BD) algorithm, we propose a joint precoding scheme for indoor downlink multi-user multi-input-multi-output (MU-MIMO) VLC systems. In this scheme, we first apply the BD algorithm to eliminate the co-channel interference (CCI) among users. Second, the power-method algorithm is used to search the precoding weight for each user based on the optimal criterion of signal to interference plus noise ratio (SINR) maximization. Finally, the optical power restrictions of VLC systems are taken into account to constrain the precoding weight matrix. Comprehensive computer simulations in two scenarios indicate that the proposed scheme always has better bit error rate (BER) performance and lower computational complexity than the traditional scheme.
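
    The two-stage structure described above can be sketched in a few lines. This is a generic MIMO illustration, not the paper's VLC channel model: the channel shapes, the SVD tolerance, and the function names are our assumptions, and the final optical-power scaling step is omitted.

```python
import numpy as np

def bd_precoders(channels):
    """Block diagonalization: user k's precoder is drawn from the null
    space of all other users' stacked channels, so co-channel
    interference between users is cancelled by construction."""
    precoders = []
    for k in range(len(channels)):
        others = np.vstack([H for j, H in enumerate(channels) if j != k])
        _, s, Vt = np.linalg.svd(others)
        rank = int(np.sum(s > 1e-10))
        precoders.append(Vt[rank:].T)   # columns span null(others)
    return precoders

def power_method(A, iters=200):
    """Power iteration: dominant eigenvector of A, standing in for the
    paper's power-method search for the SINR-maximizing weight."""
    x = np.ones(A.shape[0])
    for _ in range(iters):
        x = A @ x
        x = x / np.linalg.norm(x)
    return x
```

    In the actual scheme the resulting weights would additionally be scaled to respect the optical power constraints of the LEDs.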

  16. Multi-point objective-oriented sequential sampling strategy for constrained robust design

    NASA Astrophysics Data System (ADS)

    Zhu, Ping; Zhang, Siliang; Chen, Wei

    2015-03-01

    Metamodelling techniques are widely used to approximate system responses of expensive simulation models. In association with the use of metamodels, objective-oriented sequential sampling methods have been demonstrated to be effective in balancing the need for searching an optimal solution versus reducing the metamodelling uncertainty. However, existing infilling criteria are developed for deterministic problems and restricted to one sampling point in one iteration. To exploit the use of multiple samples and identify the true robust solution in fewer iterations, a multi-point objective-oriented sequential sampling strategy is proposed for constrained robust design problems. In this article, earlier development of objective-oriented sequential sampling strategy for unconstrained robust design is first extended to constrained problems. Next, a double-loop multi-point sequential sampling strategy is developed. The proposed methods are validated using two mathematical examples followed by a highly nonlinear automotive crashworthiness design example. The results show that the proposed method can mitigate the effect of both metamodelling uncertainty and design uncertainty, and identify the robust design solution more efficiently than the single-point sequential sampling approach.
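
    As a hedged illustration of objective-oriented infilling, the sketch below scores candidates by expected improvement and takes the k best as a multi-point batch. The article's actual infill criterion and its robust-design extension differ; `expected_improvement`, `pick_batch`, and the `model` interface returning a (mean, standard deviation) pair are our illustrative assumptions.

```python
import math

def expected_improvement(mu, sigma, f_best):
    """EI of a candidate with metamodel prediction mu and uncertainty
    sigma, relative to the best observed value f_best (minimization)."""
    if sigma <= 0:
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))      # normal CDF
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # normal PDF
    return (f_best - mu) * Phi + sigma * phi

def pick_batch(candidates, model, f_best, k=3):
    """Naive multi-point infill: the k candidates with highest EI."""
    ranked = sorted(candidates,
                    key=lambda c: -expected_improvement(*model(c), f_best))
    return ranked[:k]
```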

  17. An adaptive evolutionary multi-objective approach based on simulated annealing.

    PubMed

    Li, H; Landa-Silva, D

    2011-01-01

    A multi-objective optimization problem can be solved by decomposing it into one or more single objective subproblems in some multi-objective metaheuristic algorithms. Each subproblem corresponds to one weighted aggregation function. For example, MOEA/D is an evolutionary multi-objective optimization (EMO) algorithm that attempts to optimize multiple subproblems simultaneously by evolving a population of solutions. However, the performance of MOEA/D highly depends on the initial setting and diversity of the weight vectors. In this paper, we present an improved version of MOEA/D, called EMOSA, which incorporates an advanced local search technique (simulated annealing) and adapts the search directions (weight vectors) corresponding to various subproblems. In EMOSA, the weight vector of each subproblem is adaptively modified at the lowest temperature in order to diversify the search toward the unexplored parts of the Pareto-optimal front. Our computational results show that EMOSA outperforms six other well established multi-objective metaheuristic algorithms on both the (constrained) multi-objective knapsack problem and the (unconstrained) multi-objective traveling salesman problem. Moreover, the effects of the main algorithmic components and parameter sensitivities on the search performance of EMOSA are experimentally investigated.
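
    Two of EMOSA's ingredients can be sketched directly: the weighted aggregation that turns a subproblem's weight vector into a scalar objective (here the weighted Tchebycheff form), and the simulated-annealing acceptance rule. This is a minimal sketch under our own naming, not the authors' implementation.

```python
import math
import random

def tchebycheff(f, weights, ideal):
    """Weighted Tchebycheff aggregation of objective vector f with
    respect to an ideal point: one scalar subproblem per weight vector."""
    return max(w * abs(fi - zi) for w, fi, zi in zip(weights, f, ideal))

def sa_accept(delta, temperature, rng=random):
    """Simulated-annealing rule: always accept an improvement
    (delta <= 0); accept a worsening move with probability
    exp(-delta / T), which vanishes as the temperature cools."""
    if delta <= 0:
        return True
    return rng.random() < math.exp(-delta / temperature)
```

    In EMOSA the weight vectors of these subproblems are themselves adapted at the lowest temperature to push the search toward unexplored parts of the Pareto front.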

  18. Multi-objective optimization in quantum parameter estimation

    NASA Astrophysics Data System (ADS)

    Gong, BeiLi; Cui, Wei

    2018-04-01

    We investigate quantum parameter estimation based on linear and Kerr-type nonlinear controls in an open quantum system, and consider the dissipation rate as an unknown parameter. We show that while the precision of parameter estimation is improved, it usually introduces a significant deformation to the system state. Moreover, we propose a multi-objective model to optimize the two conflicting objectives: (1) maximizing the Fisher information, improving the parameter estimation precision, and (2) minimizing the deformation of the system state, which maintains its fidelity. Finally, simulations of a simplified ɛ-constrained model demonstrate the feasibility of the Hamiltonian control in improving the precision of the quantum parameter estimation.
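
    The ε-constrained handling of two conflicting objectives can be illustrated generically: optimize one objective while the other is held within a budget ε. The function below is a hypothetical sketch over a finite candidate set, not the paper's quantum-control model.

```python
def epsilon_constrained(candidates, f1, f2, eps):
    """Epsilon-constraint scalarisation: maximise f1 over candidates
    whose second objective f2 stays within the budget eps. Sweeping eps
    traces out points of the Pareto front one at a time."""
    feasible = [c for c in candidates if f2(c) <= eps]
    return max(feasible, key=f1) if feasible else None
```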

  19. TU-CD-BRB-09: Prediction of Chemo-Radiation Outcome for Rectal Cancer Based On Radiomics of Tumor Clinical Characteristics and Multi-Parametric MRI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, K; Yue, N; Shi, L

    2015-06-15

    Purpose: To evaluate tumor clinical characteristics and quantitative multi-parametric MR imaging features for prediction of response to chemo-radiation treatment (CRT) in locally advanced rectal cancer (LARC). Methods: Forty-three consecutive patients (59.7±6.9 years, from 09/2013 – 06/2014) receiving neoadjuvant CRT followed by surgery were enrolled. All underwent MRI including anatomical T1/T2, Dynamic Contrast Enhanced (DCE)-MRI and Diffusion-Weighted MRI (DWI) prior to the treatment. A total of 151 quantitative features, including morphology/Gray Level Co-occurrence Matrix (GLCM) texture from T1/T2, enhancement kinetics and the voxelized distribution from DCE-MRI, and apparent diffusion coefficient (ADC) from DWI, along with clinical information (carcinoembryonic antigen (CEA) level, TNM staging, etc.), were extracted for each patient. Response groups were separated based on down-staging, good response and pathological complete response (pCR) status. Logistic regression analysis (LRA) was used to select the best predictors to classify the different groups, and predictive performance was calculated using receiver operating characteristic (ROC) analysis. Results: Individual imaging categories or clinical characteristics yielded a certain level of power in assessing response, but the combined model outperformed any single category in prediction. With the selected features Volume, GLCM AutoCorrelation (T2), MaxEnhancementProbability (DCE-MRI), and MeanADC (DWI), the down-staging prediction accuracy (area under the ROC curve, AUC) reached 0.95, better than individual tumor metrics with AUC from 0.53–0.85. For pCR prediction, the best set included CEA (clinical characteristics), Homogeneity (DCE-MRI) and MeanADC (DWI), with an AUC of 0.89, more favorable than conventional tumor metrics with AUC ranging from 0.511–0.79. 
Conclusion: Through a systematic analysis of multi-parametric MR imaging features, we are able to build models with improved predictive value over conventional imaging or clinical metrics. This is encouraging and suggests that the wealth of imaging radiomics should be further explored to help tailor treatment in the era of personalized medicine. This work is supported by the National Science Foundation of China (NSFC Grant No. 81201091), National High Technology Research and Development Program of China (863 program, Grant No. 2015AA020917), and the Fund Project for Excellent Abroad Scholar Personnel in Science and Technology.
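
    The AUC values quoted above have a simple rank-statistic reading: the probability that a randomly chosen responder is scored above a randomly chosen non-responder. A minimal sketch of that computation (ours, not the authors' analysis code):

```python
def roc_auc(labels, scores):
    """AUC via the Mann-Whitney rank statistic: the fraction of
    positive/negative pairs in which the positive case scores higher,
    counting ties as half a win."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```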

  20. Privacy Act System of Records: Federal Lead-Based Paint Program System of Records, EPA-54

    EPA Pesticide Factsheets

    Learn about the Federal Lead-Based Paint Program System of Records (FLPPSOR), including the security classification, individuals covered by the system, categories of records, routine uses of the records, and other security procedures.

  1. Interior tomography in microscopic CT with image reconstruction constrained by full field of view scan at low spatial resolution

    NASA Astrophysics Data System (ADS)

    Luo, Shouhua; Shen, Tao; Sun, Yi; Li, Jing; Li, Guang; Tang, Xiangyang

    2018-04-01

    In high resolution (microscopic) CT applications, the scan field of view should cover the entire specimen or sample to allow complete data acquisition and image reconstruction. However, truncation may occur in the projection data and result in artifacts in reconstructed images. In this study, we propose a low resolution image constrained reconstruction algorithm (LRICR) for interior tomography in microscopic CT at high resolution. In general, multi-resolution acquisition based methods can be employed to solve the data truncation problem if the projection data acquired at low resolution are utilized to fill in the truncated projection data acquired at high resolution. However, most existing methods place quite strict restrictions on the data acquisition geometry, which greatly limits their utility in practice. In the proposed LRICR algorithm, full and partial data acquisition (scan) at low and high resolutions, respectively, are carried out. Using the image reconstructed from sparse projection data acquired at low resolution as the prior, a microscopic image at high resolution is reconstructed from the truncated projection data acquired at high resolution. Two synthesized digital phantoms, a raw bamboo culm and a specimen of mouse femur, were utilized to evaluate and verify the performance of the proposed LRICR algorithm. Compared with the conventional TV minimization based algorithm and the multi-resolution scout-reconstruction algorithm, the proposed LRICR algorithm shows significant improvement in reducing the artifacts caused by data truncation, providing a practical solution for high quality and reliable interior tomography in microscopic CT applications.
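
    The idea of constraining an interior reconstruction with a low-resolution prior can be caricatured as projected gradient descent on a tiny linear system, clamping voxels outside the region of interest to the prior after every step. This toy is our assumption-laden sketch (system matrix, step size, and iteration count are arbitrary), not the LRICR algorithm itself.

```python
import numpy as np

def lricr_sketch(A, p, prior, roi, iters=500, step=0.01):
    """Toy projected-gradient reconstruction: least-squares fit to the
    (possibly truncated) projection data p, with voxels outside the ROI
    clamped to the low-resolution prior image after every update."""
    x = prior.astype(float).copy()
    for _ in range(iters):
        x -= step * A.T @ (A @ x - p)   # gradient of ||Ax - p||^2 / 2
        x[~roi] = prior[~roi]           # exterior constrained by prior
    return x
```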

  2. Impact of a hospital-wide hand hygiene promotion strategy on healthcare-associated infections.

    PubMed

    Ling, Moi Lin; How, Kue Bien

    2012-03-23

    During the Severe Acute Respiratory Syndrome (SARS) outbreak, high compliance with hand hygiene among healthcare workers was primarily driven by fear. However, the post-SARS period confirmed that this practice was not sustainable. At the Singapore General Hospital, a 1,600-bed acute tertiary care hospital, the hand hygiene program was revised in early 2007 following Singapore's signing of the pledge to the World Health Organization (WHO) "Clean Care is Safer Care" program. A multi-prong approach was used in designing the hand hygiene program. This included system change; training and education; evaluation and feedback; reminders in the workplace; and institutional safety climate. The hand hygiene compliance rate improved from 20% (in January 2007) to 61% (2010). Improvement was also seen annually in compliance with each of the 5 moments as well as in all staff categories. Healthcare-associated MRSA infections were reduced from 0.6 (2007) to 0.3 (2010) per 1,000 patient-days. Leadership support of the program, evidenced through visible leadership presence, messaging and release of resources, is the key factor in making the program a true success. The hospital was recognised as a Global Hand Hygiene Expert Centre in January 2011. The WHO multi-prong interventions are effective in improving compliance and reducing healthcare-associated infections.

  3. Agent Based Intelligence in a Tetrahedral Rover

    NASA Technical Reports Server (NTRS)

    Phelps, Peter; Truszkowski, Walt

    2007-01-01

    A tetrahedron is a 4-node, 6-strut pyramid structure which is being used by the NASA Goddard Space Flight Center as the basic building block for a new approach to robotic motion. The struts are extendable; the tetrahedron "moves" through the sequence of activities: strut extension, changing the center of gravity, and falling. Currently, strut extension is handled by human remote control. There is an effort underway to make the movement of the tetrahedron autonomous, driven by an attempt to achieve a goal. The approach being taken is to associate an intelligent agent with each node. Thus, the autonomous tetrahedron is realized as a constrained multi-agent system, where the constraints arise from the fact that between any two agents there is an extendable strut. The hypothesis of this work is that, by proper composition of such automated tetrahedra, robotic structures of various levels of complexity can be developed which will support more complex dynamic motions. This is the basis of the new approach to robotic motion which is under investigation. A Java-based simulator for the single tetrahedron, realized as a constrained multi-agent system, has been developed and evaluated. This paper reports on this project and presents a discussion of the structure and dynamics of the simulator.

  4. Risk-based analysis and decision making in multi-disciplinary environments

    NASA Technical Reports Server (NTRS)

    Feather, Martin S.; Cornford, Steven L.; Moran, Kelly

    2003-01-01

    A risk-based decision-making process conceived of and developed at JPL and NASA has been used to help plan and guide novel technology applications for use on spacecraft. These applications exemplify key challenges inherent in the multi-disciplinary design of novel technologies deployed in mission-critical settings. 1) Cross-disciplinary concerns are numerous (e.g., spacecraft involve navigation, propulsion, telecommunications). These concerns are cross-coupled and interact in multiple ways (e.g., electromagnetic interference, heat transfer). 2) Time and budget pressures constrain development, and operational resources constrain the resulting system (e.g., mass, volume, power). 3) Spacecraft are critical systems that must operate correctly the first time in only partially understood environments, with no chance for repair. 4) Past experience provides only a partial guide: new mission concepts are enhanced and enabled by new technologies, for which past experience is lacking. The decision-making process rests on quantitative assessments of the relationships between three classes of information - objectives (the things the system is to accomplish and constraints on its operation and development), risks (whose occurrence detracts from objectives), and mitigations (options for reducing the likelihood and/or severity of risks). The process successfully guides experts to pool their knowledge, using custom-built software to support information gathering and decision-making.

  5. Multi-categorical deep learning neural network to classify retinal images: A pilot study employing small database

    PubMed Central

    Seo, Jeong Gi; Kwak, Jiyong; Um, Terry Taewoong; Rim, Tyler Hyungtaek

    2017-01-01

    Deep learning is emerging as a powerful tool for analyzing medical images. Retinal disease detection using computer-aided diagnosis from fundus images has emerged as a new method. We applied a deep learning convolutional neural network, using MatConvNet, for automated detection of multiple retinal diseases with fundus photographs from the STructured Analysis of the REtina (STARE) database. The dataset was built by expanding data on 10 categories, including normal retina and nine retinal diseases. The optimal outcomes were acquired by using random forest transfer learning based on the VGG-19 architecture. The classification results depended greatly on the number of categories. As the number of categories increased, the performance of the deep learning models diminished. When all 10 categories were included, we obtained results with an accuracy of 30.5%, relative classifier information (RCI) of 0.052, and Cohen’s kappa of 0.224. Considering three integrated categories (normal, background diabetic retinopathy, and dry age-related macular degeneration), the multi-categorical classifier showed an accuracy of 72.8%, 0.283 RCI, and 0.577 kappa. In addition, several ensemble classifiers enhanced the multi-categorical classification performance. Transfer learning incorporated with an ensemble classifier using a clustering and voting approach presented the best performance, with accuracy of 36.7%, 0.053 RCI, and 0.225 kappa on the 10 retinal disease classification problem. First, due to the small size of the datasets, the deep learning techniques in this study were not ready for application in clinics, where numerous patients suffering from various types of retinal disorders visit for diagnosis and treatment. Second, we found that transfer learning incorporated with ensemble classifiers can improve classification performance for detecting multi-categorical retinal diseases. Further studies should confirm the effectiveness of the algorithms with large datasets obtained from hospitals. 
PMID:29095872

  6. Hydrologic and hydraulic flood forecasting constrained by remote sensing data

    NASA Astrophysics Data System (ADS)

    Li, Y.; Grimaldi, S.; Pauwels, V. R. N.; Walker, J. P.; Wright, A. J.

    2017-12-01

    Flooding is one of the most destructive natural disasters, resulting in many deaths and billions of dollars of damages each year. An indispensable tool to mitigate the effect of floods is to provide accurate and timely forecasts. An operational flood forecasting system typically consists of a hydrologic model, converting rainfall data into flood volumes entering the river system, and a hydraulic model, converting these flood volumes into water levels and flood extents. Such a system is prone to various sources of uncertainties from the initial conditions, meteorological forcing, topographic data, model parameters and model structure. To reduce those uncertainties, current forecasting systems are typically calibrated and/or updated using ground-based streamflow measurements, and such applications are limited to well-gauged areas. The recent increasing availability of spatially distributed remote sensing (RS) data offers new opportunities to improve flood forecasting skill. Based on an Australian case study, this presentation will discuss the use of 1) RS soil moisture to constrain a hydrologic model, and 2) RS flood extent and level to constrain a hydraulic model. The GRKAL hydrological model is calibrated through a joint calibration scheme using both ground-based streamflow and RS soil moisture observations. A lag-aware data assimilation approach is tested through a set of synthetic experiments to integrate RS soil moisture to constrain the streamflow forecasting in real-time. The hydraulic model is LISFLOOD-FP, which solves the 2-dimensional inertial approximation of the Shallow Water Equations. Gauged water level time series and RS-derived flood extent and levels are used to apply a multi-objective calibration protocol. The effectiveness with which each data source or combination of data sources constrains the parameter space will be discussed.
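
    A joint calibration objective of the kind described, trading a streamflow fit against an RS soil-moisture fit, can be sketched with Nash-Sutcliffe efficiencies. The 50/50 weighting and the function names are our illustrative assumptions, not the GRKAL calibration scheme.

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the
    simulation is no better than the mean of the observations."""
    mean = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean) ** 2 for o in obs)
    return 1.0 - num / den

def joint_objective(q_obs, q_sim, sm_obs, sm_sim, w=0.5):
    """Hypothetical joint score: weighted sum of the streamflow NSE and
    the remotely sensed soil-moisture NSE."""
    return w * nse(q_obs, q_sim) + (1.0 - w) * nse(sm_obs, sm_sim)
```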

  7. Identifying and assessing highly hazardous drugs within quality risk management programs.

    PubMed

    Sussman, Robert G; Schatz, Anthony R; Kimmel, Tracy A; Ader, Allan; Naumann, Bruce D; Weideman, Patricia A

    2016-08-01

    Historically, pharmaceutical industry regulatory guidelines have assigned certain active pharmaceutical ingredients (APIs) to various categories of concern, such as "cytotoxic", "hormones", and "steroids". These categories have been used to identify APIs requiring segregation or dedication in order to prevent cross-contamination and protect the quality and safety of drug products. Since these terms were never defined by regulatory authorities, and many novel pharmacological mechanisms challenge these categories, there is a recognized need to modify the historical use of these terms. The application of a risk-based approach using a health-based limit, such as an acceptable daily exposure (ADE), is more appropriate for the development of a Quality Risk Management Program (QRMP) than the use of categories of concern. The toxicological and pharmacological characteristics of these categories are discussed to help identify and prioritize compounds requiring special attention. Controlling airborne concentrations and the contamination of product contact surfaces in accordance with values derived from quantitative risk assessments can prevent adverse effects in workers and patients, regardless of specific categorical designations to which these APIs have been assigned. The authors acknowledge the movement away from placing compounds into categories and, while not yet universal, the importance of basing QRMPs on compound-specific ADEs and risk assessments. Based on the results of a risk assessment, segregation and dedication may also be required for some compounds to prevent cross contamination during manufacture of APIs. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. PrimeSupplier Cross-Program Impact Analysis and Supplier Stability Indicator Simulation Model

    NASA Technical Reports Server (NTRS)

    Calluzzi, Michael

    2009-01-01

    PrimeSupplier, a supplier cross-program and element-impact simulation model, with supplier solvency indicator (SSI), has been developed so that the shuttle program can see early indicators of supplier and product line stability, while identifying the various elements and/or programs that have a particular supplier or product designed into the system. The model calculates two categories of benchmarks to determine the SSI, with one category focusing on agency programmatic data and the other focusing on a supplier's financial liquidity. PrimeSupplier was developed to help NASA smoothly transition design, manufacturing, and repair operations from the Shuttle program to the Constellation program, without disruption in the industrial supply base.

  9. RESIDUAL RISK ASSESSMENTS - RESIDUAL RISK ...

    EPA Pesticide Factsheets

    This source category, previously subjected to a technology-based standard, will be examined to determine if health or ecological risks are significant enough to warrant further regulation for Coke Ovens. These assessments utilize existing models and databases to examine the multi-media and multi-pollutant impacts of air toxics emissions on human health and the environment. Details on the assessment process and methodologies can be found in EPA's Residual Risk Report to Congress issued in March of 1999 (see web site). The goal is to assess the health risks posed by air toxics emissions from Coke Ovens to determine if control technology standards previously established are adequately protecting public health.

  10. Evaluating the effects of real power losses in optimal power flow based storage integration

    DOE PAGES

    Castillo, Anya; Gayme, Dennice

    2017-03-27

    This study proposes a DC optimal power flow (DCOPF) with losses formulation (the ℓ-DCOPF+S problem) and uses it to investigate the role of real power losses in OPF based grid-scale storage integration. We derive the ℓ-DCOPF+S problem by augmenting a standard DCOPF with storage (DCOPF+S) problem to include quadratic real power loss approximations. This procedure leads to a multi-period nonconvex quadratically constrained quadratic program, which we prove can be solved to optimality using either a semidefinite or second order cone relaxation. Our approach has some important benefits over existing models. It is more computationally tractable than ACOPF with storage (ACOPF+S) formulations, and the provably exact convex relaxations guarantee that an optimal solution can be attained for a feasible problem. Adding loss approximations to a DCOPF+S model leads to a more accurate representation of locational marginal prices, which have been shown to be critical to determining optimal storage dispatch and siting in prior ACOPF+S based studies. Case studies demonstrate the improved accuracy of the ℓ-DCOPF+S model over a DCOPF+S model and the computational advantages over an ACOPF+S formulation.

  11. Automatic classification for mammogram backgrounds based on bi-rads complexity definition and on a multi content analysis framework

    NASA Astrophysics Data System (ADS)

    Wu, Jie; Besnehard, Quentin; Marchessoux, Cédric

    2011-03-01

    Clinical studies for the validation of new medical imaging devices require hundreds of images. An important step in creating and tuning the study protocol is the classification of images into "difficult" and "easy" cases. This consists of classifying the image based on features like the complexity of the background and the visibility of the disease (lesions). Therefore, an automatic medical background classification tool for mammograms would help in such clinical studies. This classification tool is based on a multi-content analysis (MCA) framework which was first developed to recognize the image content of computer screen shots. With the implementation of new texture features and a defined breast density scale, the MCA framework is able to automatically classify digital mammograms with satisfying accuracy. The BI-RADS (Breast Imaging Reporting Data System) density scale, which standardizes mammography reporting terminology and assessment and recommendation categories, is used for grouping the mammograms. Selected features are input into a decision tree classification scheme in the MCA framework, which is the so-called "weak classifier" (any classifier with a global error rate below 50%). With the AdaBoost iteration algorithm, these "weak classifiers" are combined into a "strong classifier" (a classifier with a low global error rate) for classifying one category. The classification results for one "strong classifier" show good accuracy with high true positive rates. For the four categories the results are: TP=90.38%, TN=67.88%, FP=32.12% and FN=9.62%.
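
    The weak-to-strong combination via AdaBoost can be sketched with one-feature threshold stumps standing in for the decision-tree weak classifiers; this is a generic AdaBoost illustration under our own naming, not the authors' MCA framework.

```python
import math

def adaboost(X, y, rounds=10):
    """AdaBoost with threshold stumps as weak classifiers: each round
    keeps the stump with the lowest weighted error (a 'weak classifier'
    as long as that error is below 50%), then up-weights the samples it
    misclassified so the next round focuses on them."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []                        # (alpha, feature, thresh, sign)
    for _ in range(rounds):
        best = None
        for j in range(len(X[0])):
            for t in sorted({x[j] for x in X}):
                for sign in (1, -1):
                    pred = [sign if x[j] > t else -sign for x in X]
                    err = sum(wi for wi, p, yi in zip(w, pred, y) if p != yi)
                    if best is None or err < best[0]:
                        best = (err, j, t, sign, pred)
        err, j, t, sign, pred = best
        if err >= 0.5:
            break                        # no weak classifier remains
        alpha = 0.5 * math.log((1.0 - err) / max(err, 1e-12))
        ensemble.append((alpha, j, t, sign))
        w = [wi * math.exp(-alpha * yi * p) for wi, p, yi in zip(w, pred, y)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted vote of the boosted stumps (the 'strong classifier')."""
    score = sum(a * (s if x[j] > t else -s) for a, j, t, s in ensemble)
    return 1 if score > 0 else -1
```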

  12. Current-State Constrained Filter Bank for Wald Testing of Spacecraft Conjunctions

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Markley, F. Landis

    2012-01-01

    We propose a filter bank consisting of an ordinary current-state extended Kalman filter, and two similar but constrained filters: one is constrained by a null hypothesis that the miss distance between two conjuncting spacecraft is inside their combined hard body radius at the predicted time of closest approach, and one is constrained by an alternative complementary hypothesis. The unconstrained filter is the basis of an initial screening for close approaches of interest. Once the initial screening detects a possibly risky conjunction, the unconstrained filter also governs measurement editing for all three filters, and predicts the time of closest approach. The constrained filters operate only when conjunctions of interest occur. The computed likelihoods of the innovations of the two constrained filters form a ratio for a Wald sequential probability ratio test. The Wald test guides risk mitigation maneuver decisions based on explicit false alarm and missed detection criteria. Since only current-state Kalman filtering is required to compute the innovations for the likelihood ratio, the present approach does not require the mapping of probability density forward to the time of closest approach. Instead, the hard-body constraint manifold is mapped to the filter update time by applying a sigma-point transformation to a projection function. Although many projectors are available, we choose one based on Lambert-style differential correction of the current-state velocity. We have tested our method using a scenario based on the Magnetospheric Multi-Scale mission, scheduled for launch in late 2014. This mission involves formation flight in highly elliptical orbits of four spinning spacecraft equipped with antennas extending 120 meters tip-to-tip. Eccentricities range from 0.82 to 0.91, and close approaches generally occur in the vicinity of perigee, where rapid changes in geometry may occur. 
Testing the method using two 12,000-case Monte Carlo simulations, we found the method achieved a missed detection rate of 0.1%, and a false alarm rate of 2%.
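
    The Wald test at the heart of the maneuver decision admits a compact sketch. The following minimal Python example illustrates the generic Wald sequential probability ratio test, not the authors' filter bank: it accumulates innovation log-likelihood ratios and compares the running sum against thresholds derived from explicit false alarm and missed detection criteria. The 2% false alarm and 0.1% missed detection rates reported in the Monte Carlo study are borrowed here as assumed design values.

```python
import math

def wald_sprt(log_likelihood_ratios, alpha=0.02, beta=0.001):
    """Wald sequential probability ratio test (illustrative sketch).

    Accumulates log-likelihood ratios of H1 (collision) vs H0 (miss)
    until one of the two decision thresholds is crossed.
    alpha = tolerated false-alarm rate, beta = tolerated missed-detection rate.
    """
    upper = math.log((1.0 - beta) / alpha)   # cross it: accept H1, maneuver
    lower = math.log(beta / (1.0 - alpha))   # cross it: accept H0, no maneuver
    s = 0.0
    for k, llr in enumerate(log_likelihood_ratios, start=1):
        s += llr
        if s >= upper:
            return ("H1", k)
        if s <= lower:
            return ("H0", k)
    return ("undecided", len(log_likelihood_ratios))
```

    In the filter-bank setting, each per-update log-likelihood ratio would come from the innovations of the two constrained filters; here it is simply an input sequence.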

  13. Combustion Research Aboard the ISS Utilizing the Combustion Integrated Rack and Microgravity Science Glovebox

    NASA Technical Reports Server (NTRS)

    Sutliff, Thomas J.; Otero, Angel M.; Urban, David L.

    2002-01-01

    The Physical Sciences Research Program of NASA sponsors a broad suite of peer-reviewed research investigating fundamental combustion phenomena and applied combustion research topics. This research is performed through both ground-based and on-orbit research capabilities. The International Space Station (ISS) and two facilities, the Combustion Integrated Rack and the Microgravity Science Glovebox, are key elements in the execution of microgravity combustion flight research planned for the foreseeable future. This paper reviews the Microgravity Combustion Science research planned for implementation on the International Space Station from 2003 through 2012. Examples of selected research topics, expected outcomes, and potential benefits are provided. This paper also summarizes a multi-user hardware development approach, recapping the progress made in preparing these research hardware systems. Within the description of this approach, an operational strategy is presented that illustrates how the utilization of constrained ISS resources may be maximized dynamically to increase science return through design decisions made during hardware development.

  14. Constrained non-linear multi-objective optimisation of preventive maintenance scheduling for offshore wind farms

    NASA Astrophysics Data System (ADS)

    Zhong, Shuya; Pantelous, Athanasios A.; Beer, Michael; Zhou, Jian

    2018-05-01

    Offshore wind farms are an emerging source of renewable energy and have shown tremendous potential in recent years. In this blooming area, a key challenge is that the preventive maintenance of offshore turbines must be scheduled reasonably so that the power supply is satisfied without failure. In this direction, two significant goals should be considered simultaneously as a trade-off: one is to maximise system reliability and the other is to minimise maintenance-related cost. Thus, a non-linear multi-objective programming model is proposed, including two newly defined objectives and thirteen families of constraints suitable for the preventive maintenance of offshore wind farms. In order to solve the model effectively, the non-dominated sorting genetic algorithm II (NSGA-II), designed specifically for multi-objective optimisation, is utilised, and Pareto-optimal schedules are obtained to offer adequate support to decision-makers. Finally, an example is given to illustrate the performance of the devised model and algorithm, and to explore the relationship between the two targets with the help of a contrast model.
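
    The core of NSGA-II's selection step is non-dominated sorting. As a hedged illustration (not the paper's thirteen-constraint model), the sketch below extracts the first Pareto front from candidate schedules scored by two minimized objectives, e.g. maintenance-related cost and negated reliability:

```python
def pareto_front(points):
    """Return the non-dominated subset of 2-objective points
    (both objectives minimized), preserving input order.

    Illustrates the sorting criterion inside NSGA-II: q dominates p
    if q is no worse in both objectives and differs from p.
    """
    front = []
    for i, p in enumerate(points):
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and q != p
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(p)
    return front
```

    NSGA-II then ranks the remaining points into successive fronts and adds a crowding-distance measure to keep the front well spread.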

  15. MONSS: A multi-objective nonlinear simplex search approach

    NASA Astrophysics Data System (ADS)

    Zapotecas-Martínez, Saúl; Coello Coello, Carlos A.

    2016-01-01

    This article presents a novel methodology for dealing with continuous box-constrained multi-objective optimization problems (MOPs). The proposed algorithm adopts a nonlinear simplex search scheme in order to obtain multiple elements of the Pareto optimal set. The search is directed by a well-distributed set of weight vectors, each of which defines a scalarization problem that is solved by deforming a simplex according to the movements described by Nelder and Mead's method. Considering an MOP with n decision variables, the simplex is constructed using n+1 solutions which minimize different scalarization problems defined by n+1 neighbor weight vectors. All solutions found in the search are used to update a set of solutions considered to be the minima for each separate problem. In this way, the proposed algorithm collectively obtains multiple trade-offs among the different conflicting objectives, while maintaining a proper representation of the Pareto optimal front. In this article, it is shown that a well-designed strategy using just mathematical programming techniques can be competitive with respect to the state-of-the-art multi-objective evolutionary algorithms against which it was compared.
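
    The decomposition idea can be sketched with SciPy's Nelder-Mead implementation: each weight vector defines a scalarized problem (here a weighted Tchebycheff function on a toy bi-objective problem of our own choosing, not the article's test suite), and solving one scalarization per weight yields a spread of Pareto-optimal points:

```python
import numpy as np
from scipy.optimize import minimize

# Toy bi-objective problem (both minimized): f1(x) = x^2, f2(x) = (x - 2)^2.
# Its Pareto-optimal set is the interval [0, 2].
def f(x):
    return np.array([x[0] ** 2, (x[0] - 2.0) ** 2])

def tchebycheff(x, w, z_star):
    """Weighted Tchebycheff scalarization with ideal point z_star."""
    return np.max(w * np.abs(f(x) - z_star))

z_star = np.zeros(2)  # ideal point of the toy problem (assumed known here)
weights = [np.array([w, 1.0 - w]) for w in (0.1, 0.5, 0.9)]
pareto_xs = []
for w in weights:
    # One simplex search per weight vector, as in the MONSS scheme.
    res = minimize(tchebycheff, x0=[1.0], args=(w, z_star),
                   method="Nelder-Mead")
    pareto_xs.append(res.x[0])
```

    MONSS additionally shares solutions between neighboring weight vectors to build each starting simplex; the sketch starts every search from the same point for brevity.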

  16. Structural optimization: Status and promise

    NASA Astrophysics Data System (ADS)

    Kamat, Manohar P.

    Chapters contained in this book include fundamental concepts of optimum design, mathematical programming methods for constrained optimization, function approximations, approximate reanalysis methods, dual mathematical programming methods for constrained optimization, a generalized optimality criteria method, and a tutorial and survey of multicriteria optimization in engineering. Also included are chapters on the compromise decision support problem and the adaptive linear programming algorithm, sensitivity analyses of discrete and distributed systems, the design sensitivity analysis of nonlinear structures, optimization by decomposition, mixed elements in shape sensitivity analysis of structures based on local criteria, and optimization of stiffened cylindrical shells subjected to destabilizing loads. Other chapters are on applications to fixed-wing aircraft and spacecraft, integrated optimum structural and control design, modeling concurrency in the design of composite structures, and tools for structural optimization. (No individual items are abstracted in this volume)

  17. Energy-Efficient Deadline-Aware Data-Gathering Scheme Using Multiple Mobile Data Collectors.

    PubMed

    Dasgupta, Rumpa; Yoon, Seokhoon

    2017-04-01

    In wireless sensor networks, the data collected by sensors are usually forwarded to the sink through multi-hop forwarding. However, multi-hop forwarding can be inefficient due to the energy hole problem and high communications overhead. Moreover, when the monitored area is large and the number of sensors is small, sensors cannot send the data via multi-hop forwarding due to the lack of network connectivity. In order to address those problems of multi-hop forwarding, in this paper, we consider a data collection scheme that uses mobile data collectors (MDCs), which visit sensors and collect data from them. Due to the recent breakthroughs in wireless power transfer technology, MDCs can also be used to recharge the sensors to keep them from draining their energy. In MDC-based data-gathering schemes, a big challenge is how to find the MDCs' traveling paths in a balanced way, such that their energy consumption is minimized and the packet-delay constraint is satisfied. Therefore, in this paper, we aim at finding the MDCs' paths, taking energy efficiency and delay constraints into account. We first define an optimization problem, named the delay-constrained energy minimization (DCEM) problem, to find the paths for MDCs. An integer linear programming problem is formulated to find the optimal solution. We also propose a two-phase path-selection algorithm to efficiently solve the DCEM problem. Simulations are performed to compare the performance of the proposed algorithms with two heuristic algorithms for the vehicle routing problem under various scenarios. The simulation results show that the proposed algorithms can outperform existing algorithms in terms of energy efficiency and packet delay.
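
    As a rough illustration of a two-phase path-selection scheme (the concrete DCEM formulation and algorithm are in the paper; the assignment and routing rules below are our own simplifications), sensors are first partitioned among collector depots, then a greedy path is built per MDC and checked against the delay bound:

```python
import math

def nearest_neighbor_path(start, points):
    """Phase 2 (sketch): greedy nearest-neighbor tour from a depot."""
    path, rest, cur = [start], list(points), start
    while rest:
        nxt = min(rest, key=lambda p: math.dist(cur, p))
        rest.remove(nxt)
        path.append(nxt)
        cur = nxt
    return path

def path_length(path):
    return sum(math.dist(a, b) for a, b in zip(path, path[1:]))

def plan_mdc_paths(sensors, depots, speed, max_delay):
    """Phase 1 (sketch): assign each sensor to its nearest depot; then
    build one greedy path per MDC and flag delay-bound violations.
    Travel time stands in for both energy use and packet delay."""
    groups = {d: [] for d in depots}
    for s in sensors:
        groups[min(depots, key=lambda d: math.dist(d, s))].append(s)
    plans = {}
    for d, pts in groups.items():
        p = nearest_neighbor_path(d, pts)
        t = path_length(p) / speed
        plans[d] = (p, t, t <= max_delay)
    return plans
```

    The paper's ILP instead finds provably optimal paths; a greedy sketch like this only conveys the shape of the two-phase heuristic.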

  18. Magnetic MIMO Signal Processing and Optimization for Wireless Power Transfer

    NASA Astrophysics Data System (ADS)

    Yang, Gang; Moghadam, Mohammad R. Vedady; Zhang, Rui

    2017-06-01

    In magnetic resonant coupling (MRC) enabled multiple-input multiple-output (MIMO) wireless power transfer (WPT) systems, multiple transmitters (TXs) each with one single coil are used to enhance the efficiency of simultaneous power transfer to multiple single-coil receivers (RXs) by constructively combining their induced magnetic fields at the RXs, a technique termed "magnetic beamforming". In this paper, we study the optimal magnetic beamforming design in a multi-user MIMO MRC-WPT system. We introduce the multi-user power region that constitutes all the achievable power tuples for all RXs, subject to the given total power constraint over all TXs as well as their individual peak voltage and current constraints. We characterize each boundary point of the power region by maximizing the sum-power deliverable to all RXs subject to their minimum harvested power constraints. For the special case without the TX peak voltage and current constraints, we derive the optimal TX current allocation for the single-RX setup in closed-form as well as that for the multi-RX setup. In general, the problem is a non-convex quadratically constrained quadratic programming (QCQP), which is difficult to solve. For the case of one single RX, we show that the semidefinite relaxation (SDR) of the problem is tight. For the general case with multiple RXs, based on SDR we obtain two approximate solutions by applying time-sharing and randomization, respectively. Moreover, for practical implementation of magnetic beamforming, we propose a novel signal processing method to estimate the magnetic MIMO channel due to the mutual inductances between TXs and RXs. Numerical results show that our proposed magnetic channel estimation and adaptive beamforming schemes are practically effective, and can significantly improve the power transfer efficiency and multi-user performance trade-off in MIMO MRC-WPT systems.
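
    For intuition on the single-RX special case, suppose the delivered power grows with (m^T i)^2 for a vector m collecting the (scaled) TX-RX mutual inductances, and ignore the per-TX peak voltage/current limits and coil resistances. Cauchy-Schwarz then gives a closed-form current allocation, sketched below as a simplified stand-in for the paper's derivation:

```python
import numpy as np

def optimal_tx_currents(m, p_total):
    """Single-RX magnetic beamforming sketch (simplifying assumptions:
    delivered power ~ (m^T i)^2, only a total constraint ||i||^2 <= p_total,
    unit coil resistances). By Cauchy-Schwarz the maximizer is
    i* = sqrt(p_total) * m / ||m||: TX currents in phase with, and
    proportional to, their mutual inductances with the RX."""
    m = np.asarray(m, dtype=float)
    return np.sqrt(p_total) * m / np.linalg.norm(m)
```

    With the peak voltage/current constraints restored, the problem becomes the non-convex QCQP discussed in the abstract, handled there via semidefinite relaxation.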

  19. Energy-Efficient Deadline-Aware Data-Gathering Scheme Using Multiple Mobile Data Collectors

    PubMed Central

    Dasgupta, Rumpa; Yoon, Seokhoon

    2017-01-01

    In wireless sensor networks, the data collected by sensors are usually forwarded to the sink through multi-hop forwarding. However, multi-hop forwarding can be inefficient due to the energy hole problem and high communications overhead. Moreover, when the monitored area is large and the number of sensors is small, sensors cannot send the data via multi-hop forwarding due to the lack of network connectivity. In order to address those problems of multi-hop forwarding, in this paper, we consider a data collection scheme that uses mobile data collectors (MDCs), which visit sensors and collect data from them. Due to the recent breakthroughs in wireless power transfer technology, MDCs can also be used to recharge the sensors to keep them from draining their energy. In MDC-based data-gathering schemes, a big challenge is how to find the MDCs’ traveling paths in a balanced way, such that their energy consumption is minimized and the packet-delay constraint is satisfied. Therefore, in this paper, we aim at finding the MDCs’ paths, taking energy efficiency and delay constraints into account. We first define an optimization problem, named the delay-constrained energy minimization (DCEM) problem, to find the paths for MDCs. An integer linear programming problem is formulated to find the optimal solution. We also propose a two-phase path-selection algorithm to efficiently solve the DCEM problem. Simulations are performed to compare the performance of the proposed algorithms with two heuristic algorithms for the vehicle routing problem under various scenarios. The simulation results show that the proposed algorithms can outperform existing algorithms in terms of energy efficiency and packet delay. PMID:28368300

  20. Cost versus life cycle assessment-based environmental impact optimization of drinking water production plants.

    PubMed

    Capitanescu, F; Rege, S; Marvuglia, A; Benetto, E; Ahmadi, A; Gutiérrez, T Navarrete; Tiruta-Barna, L

    2016-07-15

    Empowering decision makers with cost-effective solutions for reducing the environmental burden of industrial processes, at both design and operation stages, is nowadays a major worldwide concern. The paper addresses this issue for the sector of drinking water production plants (DWPPs), seeking optimal solutions that trade off operation cost and life cycle assessment (LCA)-based environmental impact while satisfying outlet water quality criteria. This leads to a challenging bi-objective constrained optimization problem, which relies on a computationally expensive, intricate process-modelling simulator of the DWPP and has to be solved with a limited computational budget. Since mathematical programming methods are unusable in this case, the paper examines the performance of six off-the-shelf state-of-the-art global meta-heuristic optimization algorithms suitable for such simulation-based optimization, namely Strength Pareto Evolutionary Algorithm (SPEA2), Non-dominated Sorting Genetic Algorithm (NSGA-II), Indicator-based Evolutionary Algorithm (IBEA), Multi-Objective Evolutionary Algorithm based on Decomposition (MOEA/D), Differential Evolution (DE), and Particle Swarm Optimization (PSO). The results of the optimization reveal that a good reduction in both operating cost and environmental impact of the DWPP can be obtained. Furthermore, NSGA-II outperforms the other competing algorithms, while MOEA/D and DE perform unexpectedly poorly. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Multi-Cultural Competency-Based Vocational Curricula. Fiberglass Technician. Multi-Cultural Competency-Based Vocational/Technical Curricula Series.

    ERIC Educational Resources Information Center

    Hepburn, Larry; Shin, Masako

    This document, one of eight in a multi-cultural competency-based vocational/technical curricula series, is on fiberglass technician. This program covers 12 instructional areas: orientation, safety, introduction to fiberglass-reinforced plastics, hand lay-up, introduction to equipment operation, chopper operation, gel-coat equipment, finish and…

  2. Therapist strategies early in therapy associated with good or poor outcomes among clients with low proactive agency.

    PubMed

    von der Lippe, Anna Louise; Oddli, Hanne Weie; Halvorsen, Margrethe Seeger

    2017-09-10

    Within a mixed-methods program of research, the present study aimed at expanding knowledge about interactions in the initial therapeutic collaboration by combining a focus on client interpersonal style and therapist contribution. The study involves in-depth analyses of therapist-client interactions in the initial two sessions of good and poor outcome therapies. Based on interpersonal theory and previous research, the Inventory of Interpersonal Problems (IIP-64-C) was used to define poor outcome cases, that is, low proactive agency cases. To compare good and poor outcome cases matched on this interpersonal pattern, cases were drawn from two different samples: nine poor outcome cases from a large multi-site outpatient clinic study and nine good outcome cases from a process-outcome study of highly experienced therapists. Qualitative analysis of therapist behaviors resulted in two main categories (fostering the client's proactive agentic involvement in change work and discouraging the client's proactive agentic involvement in change work), eight categories, and 22 sub-categories. The findings revealed distinct and cohesive differences in therapist behaviors between the two outcome groups, and point to the particular therapist role of fostering client agency through engagement in a shared work on change when clients display strong unassertiveness and low readiness for change. Clinical or Methodological Significance Summary: The present analysis combines a focus on client interpersonal style, therapist strategies/process, and outcome. The categories generated from the present grounded theory analysis may serve as a foundation for identifying interactions that are associated with agentic involvement in future process research and practice, and hence we have formulated principles/strategies that were identified by the analysis.

  3. The Introductory Anthropology Course: A Multi-Track Approach for Community College Instruction

    ERIC Educational Resources Information Center

    Foster, Daniel J.

    1976-01-01

    Asserts that the most basic types of understandings that students should gain from the beginning anthropology course could be grouped into two broad categories based upon two very important precepts of anthropology: overcoming anthropocentrism and combating ethnocentrism. Using this as a guide, two lists of course objectives were compiled and the…

  4. The Religious-Secular Interface and Representations of Islam in Phenomenological Religious Education

    ERIC Educational Resources Information Center

    Thobani, Shiraz

    2017-01-01

    Alongside community-based education, a principal agency which has contributed to defining multi-faith identities in England and Wales over the past five decades has been the subject of religious education in state maintained schools. Over this period, formulations of the social category of "Muslims" and the curricular concept of…

  5. How can we further improve the LDL-cholesterol target level achievement rate based on the Hungarian MULTI GAP 2011 study results and considering the new European dyslipidemia guidelines?

    PubMed

    Mark, Laszlo; Paragh, György; Karadi, Istvan; Reiber, Istvan; Pados, Gyula; Kiss, Zoltan

    2012-09-08

    Despite the continuous improvement of the quality of lipid lowering therapy, the achievement of target values is still not satisfactory, mainly in the very high cardiovascular risk category of patients, where the goal for low density lipoprotein cholesterol (LDL-C) is 1.80 mmol/l. The trends in lipid lowering treatment of 17,420 patients from different studies conducted between 2004 and 2010 were compared to those of 1626 patients of MULTI GAP (MULTI Goal Attainment Problem) 2011 treated by general practitioners (GPs) and specialists. In MULTI GAP 2011 the mean LDL-C level (±SD) of patients treated by GPs was found to be 2.87 ±1.01 mmol/l, and the target value of 2.50 mmol/l was achieved by 40% of them; in the specialists' patients the mean LDL-C level proved to be 2.77 ±1.10 mmol/l and the achievement rate was 45%. The 2.50 mmol/l achievement rate among GPs' patients showed a satisfactory improvement over the studied years, but the 1.80 mmol/l LDL-C goal in 2011 was attained in only 11% of very high risk cases. There was a linear correlation between the patient compliance estimated by the physicians and the LDL-C achievement rate. As the number of very high risk category patients has increased according to the new European dyslipidemia guidelines, growing attention needs to be placed on attainment of the 1.80 mmol/l LDL-C level. Based on the results of the MULTI GAP studies, improving patients' adherence and the continuous training of physicians are necessary.

  6. Sequential Adaptive Multi-Modality Target Detection and Classification Using Physics Based Models

    DTIC Science & Technology

    2006-09-01

    estimation," R. Raghuram, R. Raich and A.O. Hero, IEEE Intl. Conf. on Acoustics, Speech, and Signal Processing, Toulouse, France, June 2006, <http... can then be solved using off-the-shelf classifiers such as radial basis functions, SVM, or kNN classifier structures. When applied to mine detection we... stage waveform selection for adaptive resource-constrained state estimation," 2006 IEEE Intl. Conf. on Acoustics, Speech, and Signal Processing

  7. Re-engineering NASA's space communications to remain viable in a constrained fiscal environment

    NASA Astrophysics Data System (ADS)

    Hornstein, Rhoda Shaller; Hei, Donald J., Jr.; Kelly, Angelita C.; Lightfoot, Patricia C.; Bell, Holland T.; Cureton-Snead, Izeller E.; Hurd, William J.; Scales, Charles H.

    1994-11-01

    Along with the Red and Blue Teams commissioned by the NASA Administrator in 1992, NASA's Associate Administrator for Space Communications commissioned a Blue Team to review the Office of Space Communications (Code O) Core Program and determine how the program could be conducted faster, better, and cheaper. Since there was no corresponding Red Team for the Code O Blue Team, the Blue Team assumed a Red Team independent attitude and challenged the status quo, including current work processes, functional distinctions, interfaces, and information flow, as well as traditional management and system development practices. The Blue Team's unconstrained, non-parochial, and imaginative look at NASA's space communications program produced a simplified representation of the space communications infrastructure that transcends organizational and functional boundaries, in addition to existing systems and facilities. Further, the Blue Team adapted the 'faster, better, cheaper' charter to be relevant to the multi-mission, continuous nature of the space communications program and to serve as a gauge for improving customer services concurrent with achieving more efficient operations and infrastructure life cycle economies. This simplified representation, together with the adapted metrics, offers a future view and process model for reengineering NASA's space communications to remain viable in a constrained fiscal environment. Code O remains firm in its commitment to improve productivity, effectiveness, and efficiency. In October 1992, the Associate Administrator reconstituted the Blue Team as the Code O Success Team (COST) to serve as a catalyst for change. In this paper, the COST presents the chronicle and significance of the simplified representation and adapted metrics, and their application during the FY 1993-1994 activities.

  8. Re-engineering NASA's space communications to remain viable in a constrained fiscal environment

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda Shaller; Hei, Donald J., Jr.; Kelly, Angelita C.; Lightfoot, Patricia C.; Bell, Holland T.; Cureton-Snead, Izeller E.; Hurd, William J.; Scales, Charles H.

    1994-01-01

    Along with the Red and Blue Teams commissioned by the NASA Administrator in 1992, NASA's Associate Administrator for Space Communications commissioned a Blue Team to review the Office of Space Communications (Code O) Core Program and determine how the program could be conducted faster, better, and cheaper. Since there was no corresponding Red Team for the Code O Blue Team, the Blue Team assumed a Red Team independent attitude and challenged the status quo, including current work processes, functional distinctions, interfaces, and information flow, as well as traditional management and system development practices. The Blue Team's unconstrained, non-parochial, and imaginative look at NASA's space communications program produced a simplified representation of the space communications infrastructure that transcends organizational and functional boundaries, in addition to existing systems and facilities. Further, the Blue Team adapted the 'faster, better, cheaper' charter to be relevant to the multi-mission, continuous nature of the space communications program and to serve as a gauge for improving customer services concurrent with achieving more efficient operations and infrastructure life cycle economies. This simplified representation, together with the adapted metrics, offers a future view and process model for reengineering NASA's space communications to remain viable in a constrained fiscal environment. Code O remains firm in its commitment to improve productivity, effectiveness, and efficiency. In October 1992, the Associate Administrator reconstituted the Blue Team as the Code O Success Team (COST) to serve as a catalyst for change. In this paper, the COST presents the chronicle and significance of the simplified representation and adapted metrics, and their application during the FY 1993-1994 activities.

  9. Assessing the Altitude and Dispersion of Volcanic Plumes Using MISR Multi-angle Imaging from Space: Sixteen Years of Volcanic Activity in the Kamchatka Peninsula, Russia

    NASA Technical Reports Server (NTRS)

    Flower, Verity J. B.; Kahn, Ralph A.

    2017-01-01

    Volcanic eruptions represent a significant source of atmospheric aerosols and can display local, regional and global effects, impacting earth systems and human populations. In order to assess the relative impacts of these events, accurate plume injection altitude measurements are needed. In this work, volcanic plumes generated from seven Kamchatka Peninsula volcanoes (Shiveluch, Kliuchevskoi, Bezymianny, Tolbachik, Kizimen, Karymsky and Zhupanovsky) were identified using over 16 years of Multi-angle Imaging SpectroRadiometer (MISR) measurements. Eighty-eight volcanic plumes were observed by MISR, capturing 3-25% of reported events at individual volcanoes. Retrievals were most successful where high intensity events persisted over a period of weeks to months. Compared with existing ground and airborne observations, and alternative satellite-based reports compiled by the Global Volcanism Program (GVP), MISR plume height retrievals showed general consistency; the comparison reports appear to be skewed towards the region of highest concentration observed in MISR-constrained vertical plume extent. The report observations display less discrepancy with MISR toward the end of the analysis period, with improvements in the suborbital data likely the result of the deployment of new instrumentation. Conversely, the general consistency of MISR plume heights with conventionally reported observations supports the use of MISR in the ongoing assessment of volcanic activity globally, especially where other types of volcanic plume observations are unavailable. Differences between the northern (Shiveluch, Kliuchevskoi, Bezymianny and Tolbachik) and southern (Kizimen, Karymsky and Zhupanovsky) volcanoes broadly correspond to the Central Kamchatka Depression (CKD) and Eastern Volcanic Front (EVF), respectively, geological sub-regions of Kamchatka distinguished by varying magma composition.
    For example, by comparison with reanalysis-model simulations of local meteorological conditions, CKD plumes generally were less constrained by mid-tropospheric (< 6 km) layers of vertical stability above the boundary layer, suggesting that these eruptions were more energetic than those in the EVF.

  10. Constrained multiple indicator kriging using sequential quadratic programming

    NASA Astrophysics Data System (ADS)

    Soltani-Mohammadi, Saeed; Erhan Tercan, A.

    2012-11-01

    Multiple indicator kriging (MIK) is a nonparametric method used to estimate conditional cumulative distribution functions (CCDF). Indicator estimates produced by MIK may not satisfy the order relations of a valid CCDF, which must be non-decreasing and bounded between 0 and 1. In this paper a new method is presented that guarantees the order relations of the cumulative distribution functions estimated by multiple indicator kriging. The method is based on minimizing the sum of kriging variances for each cutoff under unbiasedness and order-relations constraints, and on solving the constrained indicator kriging system by sequential quadratic programming. Computer code written in the Matlab environment implements the developed algorithm, and the method is applied to thickness data.
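
    The order relations being enforced can be illustrated post hoc: given raw MIK estimates at increasing cutoffs, the nearest valid CCDF (non-decreasing, bounded in [0, 1]) is found by a small constrained least-squares problem solved with SciPy's SLSQP routine. Note this projection is only illustrative; the paper constrains the kriging system itself rather than correcting its output:

```python
import numpy as np
from scipy.optimize import minimize

def correct_order_relations(raw_ccdf):
    """Project raw MIK estimates onto the nearest valid CCDF:
    non-decreasing across cutoffs and bounded in [0, 1].
    Solved as a small constrained least-squares problem via SLSQP."""
    raw = np.asarray(raw_ccdf, dtype=float)
    # One monotonicity constraint per adjacent pair of cutoffs.
    cons = [{"type": "ineq", "fun": lambda p, k=k: p[k + 1] - p[k]}
            for k in range(len(raw) - 1)]
    res = minimize(lambda p: np.sum((p - raw) ** 2),
                   x0=np.clip(raw, 0.0, 1.0),
                   bounds=[(0.0, 1.0)] * len(raw),
                   constraints=cons, method="SLSQP")
    return res.x
```

    For the decreasing pair below, the projection pools the two violating estimates to their average, as in isotonic regression, and clips the out-of-range value to 1.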

  11. Protein classification using sequential pattern mining.

    PubMed

    Exarchos, Themis P; Papaloukas, Costas; Lampros, Christos; Fotiadis, Dimitrios I

    2006-01-01

    Protein classification in terms of fold recognition can be employed to determine the structural and functional properties of a newly discovered protein. In this work sequential pattern mining (SPM) is utilized for sequence-based fold recognition. One of the most efficient SPM algorithms, cSPADE, is employed for protein primary structure analysis. Then a classifier uses the extracted sequential patterns for classifying proteins of unknown structure in the appropriate fold category. The proposed methodology exhibited an overall accuracy of 36% in a multi-class problem of 17 candidate categories. The classification performance reaches up to 65% when the three most probable protein folds are considered.
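
    The containment notion behind SPM-based classification can be sketched in a few lines of Python (the patterns and fold names below are hypothetical; cSPADE itself mines the patterns, which this sketch takes as given):

```python
def contains_pattern(sequence, pattern):
    """True if `pattern` occurs in order (not necessarily contiguously)
    in `sequence` -- the subsequence-containment notion used by SPM
    algorithms such as cSPADE."""
    it = iter(sequence)
    return all(symbol in it for symbol in pattern)

def classify_fold(sequence, fold_patterns):
    """Toy classifier sketch: score each fold by how many of its mined
    patterns the sequence contains; return the best-scoring fold."""
    scores = {fold: sum(contains_pattern(sequence, p) for p in pats)
              for fold, pats in fold_patterns.items()}
    return max(scores, key=scores.get)
```

    A real fold classifier would weight patterns by support and consider the top few candidate folds, as the 65% top-three figure in the abstract suggests.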

  12. Competency Based Competitive Events. Integrating DECA into the DE Instructional Program.

    ERIC Educational Resources Information Center

    Cosgrove, Glenna; Moore, Harold W.

    Designed to be integrated into a competency-based distributive education program, these competitive DECA (Distributive Education Clubs of America) events were developed, utilized, and evaluated by distributive education and cooperative education coordinators in Arkansas. These events are organized under the following occupational categories: food…

  13. Approximate error conjugation gradient minimization methods

    DOEpatents

    Kallman, Jeffrey S

    2013-05-21

    In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described capable of using approximate error in constrained conjugate gradient minimization problems.
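
    A hedged sketch of the idea (for an unconstrained least-squares problem with toy data of our own choosing, omitting the patent's constraints): run conjugate gradients on the normal equations, but evaluate the convergence check only on a chosen subset of rays, i.e. rows of the system, as the cheap approximate error:

```python
import numpy as np

def cg_subset_error(A, b, ray_subset, tol=1e-8, max_iter=200):
    """Conjugate-gradient least squares for A x ~= b, where the stopping
    test uses only the rows in `ray_subset` (the 'subset of rays') as an
    approximate error, in the spirit of the patent's idea. Sketch only."""
    At = A.T
    x = np.zeros(A.shape[1])
    r = At @ (b - A @ x)          # negative gradient of 0.5*||Ax - b||^2
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = At @ (A @ p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        # Approximate error: residual restricted to the chosen rays.
        if np.linalg.norm(A[ray_subset] @ x - b[ray_subset]) < tol:
            break
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```

    In tomographic settings the full residual over all rays is expensive, which is what motivates checking a subset.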

  14. Intelligent microchip networks: an agent-on-chip synthesis framework for the design of smart and robust sensor networks

    NASA Astrophysics Data System (ADS)

    Bosse, Stefan

    2013-05-01

    Sensorial materials consisting of high-density, miniaturized, and embedded sensor networks require new robust and reliable data processing and communication approaches. Structural health monitoring is one major field of application for sensorial materials. Each sensor node provides some kind of sensor, electronics, data processing, and communication, with a strong focus on microchip-level implementation to meet the goals of miniaturization and low-power energy environments, a prerequisite for autonomous behaviour and operation. Reliability requires robustness of the entire system in the presence of node, link, data processing, and communication failures. Interaction between nodes is required to manage and distribute information. One common interaction model is the mobile agent. An agent approach provides stronger autonomy than a traditional object or remote-procedure-call based approach. Agents can decide for themselves which actions are performed, and they are capable of flexible behaviour, reacting to the environment and other agents, providing some degree of robustness. Traditionally, multi-agent systems are abstract programming models which are implemented in software and executed on program-controlled computer architectures. This approach does not scale well to the microchip level, requires fully equipped computers and communication structures, and its hardware architecture does not reflect the requirements of agent processing and interaction. We propose and demonstrate a novel design paradigm for reliable distributed data processing systems and a synthesis methodology and framework for multi-agent systems implementable entirely at microchip level with resource- and power-constrained digital logic, supporting Agent-On-Chip architectures (AoC). The agent behaviour and mobility are fully integrated on the microchip using pipelined communicating processes implemented with finite-state machines and register-transfer logic.
    The agent behaviour, interaction (communication), and mobility features are modelled and specified at a machine-independent, abstract programming level using a state-based agent behaviour programming language (APL). With this APL, a high-level agent compiler is able to synthesize a hardware model (RTL, VHDL), a software model (C, ML), or a simulation model (XML) suitable for simulating a multi-agent system using the SeSAm simulator framework. Agent communication is provided by a simple tuple-space database implemented at node level, providing fault-tolerant access to global data. A novel synthesis development kit (SynDK) based on a graph-structured database approach is introduced to support the rapid development of compilers and synthesis tools, used for example for the design and implementation of the APL compiler.
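
    The node-level tuple-space database mentioned above can be sketched as a Linda-style store (a generic illustration, not Bosse's microchip implementation), where templates with wildcard fields support non-destructive reads and destructive takes:

```python
import threading

class TupleSpace:
    """Minimal Linda-style tuple space sketch: a shared store of tuples
    with atomic write (out), read (rd), and take (inp) operations.
    `None` fields in a template act as wildcards."""
    def __init__(self):
        self._tuples = []
        self._lock = threading.Lock()

    @staticmethod
    def _matches(template, tup):
        return len(template) == len(tup) and all(
            t is None or t == v for t, v in zip(template, tup))

    def out(self, tup):                      # write a tuple
        with self._lock:
            self._tuples.append(tup)

    def rd(self, template):                  # non-destructive read
        with self._lock:
            return next((t for t in self._tuples
                         if self._matches(template, t)), None)

    def inp(self, template):                 # destructive take
        with self._lock:
            for i, t in enumerate(self._tuples):
                if self._matches(template, t):
                    return self._tuples.pop(i)
            return None
```

    Agents publishing sensor readings as tuples and matching on templates is one simple way such a store decouples producers from consumers across nodes.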

  15. R programming for parameters estimation of geographically weighted ordinal logistic regression (GWOLR) model based on Newton Raphson

    NASA Astrophysics Data System (ADS)

    Zuhdi, Shaifudin; Saputro, Dewi Retno Sari

    2017-03-01

The GWOLR model represents the relationship between a dependent variable whose categories are on an ordinal scale and independent variables, where the relationship is influenced by the geographical location of the observation site. Parameter estimation of the GWOLR model by maximum likelihood yields a system of nonlinear equations whose solution is hard to obtain analytically. Solving this system amounts to finding the maximum of the likelihood, i.e. an optimization problem, and it is addressed here by numerical approximation using the Newton-Raphson method. The purpose of this research is to construct the Newton-Raphson iteration algorithm and a program in the R software to estimate the GWOLR model. The research shows that the R program can estimate the parameters of the GWOLR model, with the iteration implemented using the "while" command.
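The Newton-Raphson iteration the authors implement with R's "while" command can be illustrated on a deliberately simplified likelihood. The sketch below (in Python, for consistency with the other examples) fits only an intercept-only logistic model rather than the full GWOLR likelihood, which adds ordinal cut-points and geographic weights; the scalar update b_new = b - score/hessian has the same shape as the multivariate step theta <- theta - H^-1 g.

```python
import math

def newton_raphson_logit(y, tol=1e-8, max_iter=100):
    """MLE of the intercept b in P(y=1) = 1/(1+exp(-b)) via Newton-Raphson."""
    b = 0.0
    it = 0
    while it < max_iter:                      # the abstract's "while" loop
        p = 1.0 / (1.0 + math.exp(-b))
        score = sum(yi - p for yi in y)       # gradient of the log-likelihood
        hessian = -len(y) * p * (1.0 - p)     # its second derivative
        step = score / hessian
        b -= step                             # b_new = b - hessian^-1 * score
        it += 1
        if abs(step) < tol:                   # stop when the step is tiny
            break
    return b

y = [1, 1, 1, 0]                              # three successes in four trials
b_hat = newton_raphson_logit(y)
print(b_hat)                                  # ~ 1.0986 = log(3/1), the MLE
```

The closed-form MLE here is log(3/1), so the iteration can be checked directly; the GWOLR case replaces the scalar score and hessian with a gradient vector and Hessian matrix per location.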

  16. Multi-gene genetic programming based predictive models for municipal solid waste gasification in a fluidized bed gasifier.

    PubMed

    Pandey, Daya Shankar; Pan, Indranil; Das, Saptarshi; Leahy, James J; Kwapinski, Witold

    2015-03-01

A multi-gene genetic programming technique is proposed as a new method to predict syngas yield and the lower heating value for municipal solid waste gasification in a fluidized bed gasifier. The study shows that the predicted outputs of the municipal solid waste gasification process are in good agreement with the experimental dataset and also generalise well to validation (untrained) data. Published experimental datasets are used for model training and validation purposes. The results show the effectiveness of the genetic programming technique for solving complex nonlinear regression problems. The multi-gene genetic programming model is also compared with a single-gene genetic programming model to show the relative merits and demerits of the technique. This study demonstrates that the genetic programming based data-driven modelling strategy can be a good candidate for developing models for other types of fuels as well. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Application of constrained k-means clustering in ground motion simulation validation

    NASA Astrophysics Data System (ADS)

    Khoshnevis, N.; Taborda, R.

    2017-12-01

The validation of ground motion synthetics has received increased attention over the last few years due to advances in physics-based deterministic and hybrid simulation methods. Unlike low-frequency simulations (f ≤ 0.5 Hz), for which it has become reasonable to expect a good match between synthetics and data, high-frequency simulations (f ≥ 1 Hz) cannot be matched on a wiggle-by-wiggle basis. This is mostly due to the various complexities and uncertainties involved in earthquake ground motion modeling. Therefore, in order to compare synthetics with data we turn to different time series metrics, which are used as a means to characterize how well the synthetics match the data in a qualitative and statistical sense. In general, these metrics provide goodness-of-fit (GOF) scores that measure the level of similarity in the time and frequency domains. It is common for these scores to be scaled from 0 to 10, with 10 representing a perfect match. Although using individual metrics for particular applications is considered more adequate, there is no consensus or unified method to classify the comparison between a set of synthetic and recorded seismograms when the various metrics offer different scores. We study the relationship among these metrics through a constrained k-means clustering approach. We define four hypothetical stations with scores of 3, 5, 7, and 9 on all metrics and treat these stations as cannot-link constraints. We generate the dataset through the validation of results from a deterministic (physics-based) ground motion simulation of a moderate-magnitude earthquake in the greater Los Angeles basin using three velocity models. The maximum frequency of the simulation is 4 Hz. The dataset involves over 300 stations and 11 metrics, or features, as they are understood in the clustering process, where the metrics form a multi-dimensional space. 
We address the high-dimensional feature effects with a subspace-clustering analysis, generate a final labeled dataset of stations, and discuss the within-class statistical characteristics of each metric. Labeling these stations is the first step towards developing a unified metric to evaluate ground motion simulations in an application-independent manner.
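The cannot-link idea can be sketched with a simplified, COP-KMeans-style procedure: the assignment step takes the nearest centroid that does not violate a constraint. The 1-D toy data below stands in for the real 11-metric feature space; the four anchor values 3, 5, 7, 9 mirror the hypothetical stations, and all names and data are our illustrative assumptions.

```python
import random

def violates(i, c, labels, cannot_link):
    """True if assigning point i to cluster c would break a cannot-link pair."""
    for a, b in cannot_link:
        j = b if a == i else a if b == i else None
        if j is not None and labels[j] == c:
            return True
    return False

def constrained_kmeans(points, k, cannot_link, n_iter=50, seed=0):
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [None] * len(points)
    for _ in range(n_iter):
        # assignment step: nearest centroid that violates no cannot-link pair
        for i, x in enumerate(points):
            order = sorted(range(k), key=lambda c: abs(x - centroids[c]))
            labels[i] = next(c for c in order
                             if not violates(i, c, labels, cannot_link))
        # update step: each centroid becomes the mean of its members
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centroids[c] = sum(members) / len(members)
    return labels

# four anchor "stations" with scores 3, 5, 7, 9, pairwise cannot-linked,
# followed by ordinary stations that are free to join any cluster
points = [3.0, 5.0, 7.0, 9.0, 3.2, 4.8, 7.1, 8.9, 6.8]
cannot_link = [(a, b) for a in range(4) for b in range(a + 1, 4)]
labels = constrained_kmeans(points, k=4, cannot_link=cannot_link)
print(labels)   # the four anchors always occupy four distinct clusters
```

Because every pair of anchors is cannot-linked, each assignment pass is forced to spread them across all four clusters, which is what makes the anchors act as interpretable class prototypes.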

  18. Global velocity constrained cloud motion prediction for short-term solar forecasting

    NASA Astrophysics Data System (ADS)

    Chen, Yanjun; Li, Wei; Zhang, Chongyang; Hu, Chuanping

    2016-09-01

Cloud motion is the primary cause of short-term solar power output fluctuation. In this work, a new cloud motion estimation algorithm using a global velocity constraint is proposed. Compared to the widely used Particle Image Velocimetry (PIV) algorithm, which assumes homogeneity of the motion vectors, the proposed method can capture an accurate motion vector for each cloud block, including both its motional tendency and its morphological changes. Specifically, a global velocity derived from PIV is first calculated, and fine-grained cloud motion estimation is then achieved by global-velocity-based cloud block searching and multi-scale cloud block matching. Experimental results show that the proposed global velocity constrained cloud motion prediction achieves comparable performance to the existing PIV and filtered PIV algorithms, especially over short prediction horizons.
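The two-stage idea (a coarse global velocity, then per-block matching restricted to a window around it) can be sketched on 1-D toy frames. The sum-of-absolute-differences criterion, window sizes, and toy cloud profiles below are our illustrative assumptions, not the paper's actual implementation.

```python
def sad(a, b):
    """Mean absolute difference between two equal-length windows."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def best_shift(f0, f1, lo, hi, candidates):
    """Shift s minimising SAD between f0[lo:hi] and f1[lo+s:hi+s]."""
    valid = [s for s in candidates if lo + s >= 0 and hi + s <= len(f1)]
    return min(valid, key=lambda s: sad(f0[lo:hi], f1[lo + s:hi + s]))

def block_motion(f0, f1, blocks, gs=4, ls=1):
    # stage 1: one coarse global shift over the frame interior (PIV stand-in)
    g = best_shift(f0, f1, gs, len(f0) - gs, range(-gs, gs + 1))
    # stage 2: per-block matching constrained to a window around the global shift
    return g, [best_shift(f0, f1, lo, hi, range(g - ls, g + ls + 1))
               for lo, hi in blocks]

f0 = [0] * 20; f0[5:8] = [2, 6, 2]; f0[12:15] = [1, 3, 1]   # two cloud "blobs"
f1 = [0] * 20; f1[7:10] = [2, 6, 2]; f1[15:18] = [1, 3, 1]  # moved by 2 and 3
g, shifts = block_motion(f0, f1, blocks=[(5, 8), (12, 15)])
print(g, shifts)   # global shift 2; per-block shifts [2, 3]
```

The global estimate (2) anchors the search, while the constrained per-block pass still recovers the second blob's deviating shift of 3, which is the behaviour the abstract attributes to its method.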

  19. Novel Texture-based Visualization Methods for High-dimensional Multi-field Data Sets

    DTIC Science & Technology

    2013-07-06

...visualisation [18]. Novel image acquisition and simulation techniques have made it possible to record a large number of co-located data fields... function, structure, anatomical changes, metabolic activity, blood perfusion, and cellular remodelling. In this paper we investigate texture-based...

  20. Management of cryotherapy-ineligible women in a “screen-and-treat” cervical cancer prevention program targeting HIV-infected women in Zambia: Lessons from the field

    PubMed Central

    Pfaendler, Krista S.; Mwanahamuntu, Mulindi H.; Sahasrabuddhe, Vikrant V.; Mudenda, Victor; Stringer, Jeffrey S.A.; Parham, Groesbeck P.

    2009-01-01

Objective We demonstrate the feasibility of implementing a referral and management system for cryotherapy-ineligible women in a “screen-and-treat” cervical cancer prevention program targeting HIV-infected women in Zambia. Methods We established criteria for patient referral, developed a training program for loop electrosurgical excision procedure (LEEP) providers, and adapted LEEP to a resource-constrained setting. Results We successfully trained 15 nurses to perform visual inspection with acetic acid (VIA) followed by immediate cryotherapy. Women with positive tests but ineligible for cryotherapy were referred for further evaluation. We trained four Zambian physicians to evaluate referrals, perform punch biopsy and LEEP, and manage intra-operative and post-operative complications. From January 2006 through October 2007, a total of 8823 women (41.5% HIV seropositive) were evaluated by nurses in outlying prevention clinics; of these, 1477 (16.7%) were referred for physician evaluation based on established criteria. Of the 875 (59.2% of the 1477 referred) that presented for evaluation, 748 (8.4% of total screened) underwent histologic evaluation in the form of punch biopsy or LEEP. Complications associated with LEEP included anesthesia reaction (n=2), which resolved spontaneously; intra-operative (n=12) and post-operative (n=2) bleeding, managed by local measures; and post-operative infection (n=12), managed with antibiotics. Conclusion With adaptations for a resource-constrained environment, we have demonstrated that performing LEEP is feasible and safe, with low rates of complications that can be managed locally. It is important to establish referral and management systems using LEEP-based excisional evaluation for women with cryotherapy-ineligible lesions in VIA-based “screen-and-treat” protocols nested within HIV-care programs in resource-constrained settings. PMID:18556050

  1. Management of cryotherapy-ineligible women in a "screen-and-treat" cervical cancer prevention program targeting HIV-infected women in Zambia: lessons from the field.

    PubMed

    Pfaendler, Krista S; Mwanahamuntu, Mulindi H; Sahasrabuddhe, Vikrant V; Mudenda, Victor; Stringer, Jeffrey S A; Parham, Groesbeck P

    2008-09-01

We demonstrate the feasibility of implementing a referral and management system for cryotherapy-ineligible women in a "screen-and-treat" cervical cancer prevention program targeting HIV-infected women in Zambia. We established criteria for patient referral, developed a training program for loop electrosurgical excision procedure (LEEP) providers, and adapted LEEP to a resource-constrained setting. We successfully trained 15 nurses to perform visual inspection with acetic acid (VIA) followed by immediate cryotherapy. Women with positive tests but ineligible for cryotherapy were referred for further evaluation. We trained four Zambian physicians to evaluate referrals, perform punch biopsy and LEEP, and manage intra-operative and post-operative complications. From January 2006 through October 2007, a total of 8823 women (41.5% HIV seropositive) were evaluated by nurses in outlying prevention clinics; of these, 1477 (16.7%) were referred for physician evaluation based on established criteria. Of the 875 (59.2% of the 1477 referred) that presented for evaluation, 748 (8.4% of total screened) underwent histologic evaluation in the form of punch biopsy or LEEP. Complications associated with LEEP included anesthesia reaction (n=2), which resolved spontaneously; intra-operative (n=12) and post-operative (n=2) bleeding, managed by local measures; and post-operative infection (n=12), managed with antibiotics. With adaptations for a resource-constrained environment, we have demonstrated that performing LEEP is feasible and safe, with low rates of complications that can be managed locally. It is important to establish referral and management systems using LEEP-based excisional evaluation for women with cryotherapy-ineligible lesions in VIA-based "screen-and-treat" protocols nested within HIV-care programs in resource-constrained settings.

  2. Multi-Cultural Competency-Based Vocational Curricula. Food Service. Multi-Cultural Competency-Based Vocational/Technical Curricula Series.

    ERIC Educational Resources Information Center

    Hepburn, Larry; Shin, Masako

    This document, one of eight in a multi-cultural competency-based vocational/technical curricula series, is on food service. This program is designed to run 24 weeks and cover 15 instructional areas: orientation, sanitation, management/planning, preparing food for cooking, preparing beverages, cooking eggs, cooking meat, cooking vegetables,…

  3. Multi-Cultural Competency-Based Vocational Curricula. Clerical Clusters. Multi-Cultural Competency-Based Vocational/Technical Curricula Series.

    ERIC Educational Resources Information Center

    Hepburn, Larry; Shin, Masako

    This document, one of eight in a multi-cultural competency-based vocational/technical curricula series, is on clerical occupations. This program is designed to run 36 weeks and cover 10 instructional areas: beginning typing, typing I, typing II, duplicating, receptionist activities, general office procedures, operation of electronic calculator,…

  4. Remembering faces and scenes: The mixed-category advantage in visual working memory.

    PubMed

    Jiang, Yuhong V; Remington, Roger W; Asaad, Anthony; Lee, Hyejin J; Mikkalson, Taylor C

    2016-09-01

    We examined the mixed-category memory advantage for faces and scenes to determine how domain-specific cortical resources constrain visual working memory. Consistent with previous findings, visual working memory for a display of 2 faces and 2 scenes was better than that for a display of 4 faces or 4 scenes. This pattern was unaffected by manipulations of encoding duration. However, the mixed-category advantage was carried solely by faces: Memory for scenes was not better when scenes were encoded with faces rather than with other scenes. The asymmetry between faces and scenes was found when items were presented simultaneously or sequentially, centrally, or peripherally, and when scenes were drawn from a narrow category. A further experiment showed a mixed-category advantage in memory for faces and bodies, but not in memory for scenes and objects. The results suggest that unique category-specific interactions contribute significantly to the mixed-category advantage in visual working memory. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  5. Sun Safety at Work Canada: a multiple case-study protocol to develop sun safety and heat protection programs and policies for outdoor workers.

    PubMed

    Kramer, Desre M; Tenkate, Thomas; Strahlendorf, Peter; Kushner, Rivka; Gardner, Audrey; Holness, D Linn

    2015-07-10

CAREX Canada has identified solar ultraviolet radiation (UV) as the second most prominent carcinogenic exposure in Canada, and over 75% of Canadian outdoor workers fall within the highest exposure category. Heat stress also presents an important public health issue, particularly for outdoor workers. The most serious form of heat stress is heat stroke, which can cause irreversible damage to the heart, lungs, kidneys, and liver. Although the need for sun and heat protection has been identified, there is no Canada-wide heat and sun safety program for outdoor workers. Further, no prevention programs have addressed both skin cancer prevention and heat stress in an integrated approach. The aim of this partnered study is to evaluate whether a multi-implementation, multi-evaluation approach can help develop sustainable workplace-specific programs, policies, and procedures to increase the use of UV safety and heat protection. This 2-year study uses a theory-driven, multi-site, non-randomized design with a cross-case analysis of 13 workplaces across four provinces in Canada. The first phase of the study includes the development of workplace-specific programs with the support of the intensive engagement of knowledge brokers. There will be a three-points-in-time evaluation with process and impact components involving the occupational health and safety (OHS) director, management, and workers, with the goal of measuring changes in workplace policies, procedures, and practices. It will use mixed methods involving semi-structured key informant interviews, focus groups, surveys, site observations, and UV dosimetry assessment. Using the findings from phase 1, in phase 2 a web-based, interactive intervention planning tool for workplaces will be developed, as will the intensive engagement of intermediaries such as industry decision-makers to link to policymakers about the importance of heat and sun safety for outdoor workers. 
Solar UV and heat are both health and safety hazards. Using an occupational health and safety risk assessment and control framework, Sun Safety at Work Canada will support workplaces to assess their exposure risks, implement control strategies that build on their existing programs, and embed the controls into their existing occupational health and safety system.

  6. CFBDSIR 2149-0403: young isolated planetary-mass object or high-metallicity low-mass brown dwarf?

    NASA Astrophysics Data System (ADS)

    Delorme, P.; Dupuy, T.; Gagné, J.; Reylé, C.; Forveille, T.; Liu, M. C.; Artigau, E.; Albert, L.; Delfosse, X.; Allard, F.; Homeier, D.; Malo, L.; Morley, C.; Naud, M. E.; Bonnefoy, M.

    2017-06-01

Aims: We conducted a multi-wavelength, multi-instrument observational characterisation of the candidate free-floating planet CFBDSIR J214947.2-040308.9, a late T-dwarf with possible low-gravity features, in order to constrain its physical properties. Methods: We analysed nine hours of X-shooter spectroscopy with signal detectable from 0.8 to 2.3 μm, as well as additional photometry in the mid-infrared using the Spitzer Space Telescope. Combined with a VLT/HAWK-I astrometric parallax, this enabled a full characterisation of the absolute flux from the visible to 5 μm, encompassing more than 90% of the expected energy emitted by such a cool late T-type object. Our analysis of the spectrum also provided the radial velocity and therefore the determination of its full 3D kinematics. Results: While our new spectrum confirms the low gravity and/or high metallicity of CFBDSIR 2149, the parallax and kinematics safely rule out membership in any known young moving group, including AB Doradus. We use the equivalent width of the K I doublet at 1.25 μm as a promising tool to discriminate the effects of low gravity from the effects of high metallicity on the emission spectra of cool atmospheres. In the case of CFBDSIR 2149, the observed K I doublet clearly favours the low-gravity solution. Conclusions: CFBDSIR 2149 is therefore a peculiar late-T dwarf that is probably a young, planetary-mass object (2-13 MJup, <500 Myr) possibly similar to the exoplanet 51 Eri b, or perhaps a 2-40 MJup brown dwarf with super-solar metallicity. Based on observations obtained with X-shooter on VLT-UT2 at ESO-Paranal (run 091.D-0723). Based on observations obtained with HAWKI on VLT-UT4 (runs 089.C-0952, 090.C-0483, 091.C-0543, 092.C-0548, 293.C-5019(A) and run 086.C-0655(A)). Based on observations obtained with ISAAC on VLT-UT3 at ESO-Paranal (run 290.C-5083). Based on observations obtained with WIRCam at CFHT (program 2012BF12). Based on Spitzer Space Telescope DDT observations (program 10166).

  7. 42 CFR § 414.1350 - Cost performance category.

    Code of Federal Regulations, 2010 CFR

    2017-10-01

    ... SERVICES (CONTINUED) MEDICARE PROGRAM (CONTINUED) PAYMENT FOR PART B MEDICAL AND OTHER HEALTH SERVICES Merit-Based Incentive Payment System and Alternative Payment Model Incentive § 414.1350 Cost performance... category comprises: (1) 0 percent of a MIPS eligible clinician's final score for MIPS payment year 2019. (2...

  8. Algorithm Design of CPCI Backboard's Interrupts Management Based on VxWorks' Multi-Tasks

    NASA Astrophysics Data System (ADS)

    Cheng, Jingyuan; An, Qi; Yang, Junfeng

    2006-09-01

This paper begins with a brief introduction to the embedded real-time operating system VxWorks and the CompactPCI standard, then describes the VxWorks programming interfaces for Peripheral Component Interconnect (PCI) configuration, interrupt handling, and multi-task programming; emphasis is then placed on a software framework for CPCI interrupt management based on multiple tasks. The method is sound in design and easy to adapt, and it ensures that all possible interrupts are handled in time, which makes it suitable for high energy physics data acquisition systems with multiple channels, high data rates, and hard real-time requirements.
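The deferred-interrupt pattern such frameworks build on (the ISR does minimal work and posts a message to a queue; a separately spawned task drains the queue and does the real processing) can be sketched in Python, with threads standing in for VxWorks tasks. Under VxWorks this would be C code using routines such as `msgQSend`/`msgQReceive` and `taskSpawn`; all names below are ours.

```python
import queue
import threading

msg_q = queue.Queue()    # stand-in for a VxWorks message queue
handled = []             # record of processed interrupt messages

def isr(channel, data):
    """Interrupt service routine stand-in: do no real work, just enqueue."""
    msg_q.put((channel, data))

def handler_task():
    """Deferred-processing task that drains the queue in FIFO order."""
    while True:
        msg = msg_q.get()
        if msg is None:              # shutdown sentinel
            break
        handled.append(msg)          # the "real" interrupt processing

t = threading.Thread(target=handler_task)
t.start()
for i in range(3):                   # simulate a burst of interrupts
    isr(channel=i, data=i * 10)
msg_q.put(None)
t.join()
print(handled)                       # [(0, 0), (1, 10), (2, 20)]
```

Because the queue buffers the burst, no interrupt is lost even when the handler is momentarily busy, which is the "all possible interrupts are handled in time" property the abstract emphasizes.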

  9. Scoresum - A technique for displaying and evaluating multi-element geochemical information, with examples of its use in regional mineral assessment programs

    USGS Publications Warehouse

    Chaffee, M.A.

    1983-01-01

A technique called SCORESUM was developed to display a maximum of multi-element geochemical information on a minimum number of maps for mineral assessment purposes. The technique can be applied manually for a small analytical data set or with a computer for a large data set. SCORESUM can be used with highly censored data and can also weight samples so as to minimize the chemical differences of diverse lithologies in different parts of a given study area. The full range of reported analyses for each element of interest in a data set is divided into four categories. Anomaly scores - values of 0 (background), 1 (weakly anomalous), 2 (moderately anomalous), and 3 (strongly anomalous) - are substituted for all of the analyses falling into each of the four categories. A group of elements based on known or suspected association in altered or mineralized areas is selected for study, and the anomaly scores for these elements are summed for each sample site and then plotted on a map. Some of the results of geochemical studies conducted for mineral assessments in two areas are briefly described. The first area, the Mokelumne Wilderness and vicinity, is relatively small and geologically simple. The second, the Walker Lake 1° × 2° quadrangle, is a large area that has extremely complex geology and that contains a number of different mineral deposit environments. These two studies provide examples of how the SCORESUM technique has been used (1) to enhance relatively small but anomalous areas and (2) to delineate and rank areas containing geochemical signatures for specific suites of elements related to certain types of alteration or mineralization. © 1983.
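A minimal sketch of the SCORESUM calculation: each element's analysis is binned into an anomaly score of 0-3 using three threshold values, and scores are summed per sample site over a chosen element suite. The thresholds and element suite below are invented for illustration, not taken from the USGS studies.

```python
def anomaly_score(value, thresholds):
    """Bin one analysis: 0 = background, 1 = weak, 2 = moderate, 3 = strong."""
    t1, t2, t3 = thresholds
    if value >= t3:
        return 3
    if value >= t2:
        return 2
    if value >= t1:
        return 1
    return 0

def scoresum(sample, thresholds_by_element, suite):
    """Sum anomaly scores over a chosen suite of elements for one sample site."""
    return sum(anomaly_score(sample[el], thresholds_by_element[el])
               for el in suite)

# hypothetical ppm thresholds (weak / moderate / strong) for a Cu-Mo-Au suite
thresholds = {"Cu": (50, 100, 200), "Mo": (2, 5, 10), "Au": (0.02, 0.05, 0.1)}
site = {"Cu": 150, "Mo": 6, "Au": 0.01}
print(scoresum(site, thresholds, suite=["Cu", "Mo", "Au"]))   # 2 + 2 + 0 = 4
```

Plotting the per-site sums on a map then highlights sites where several suite elements are anomalous at once, which is the technique's stated purpose.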

  10. Attitudes and Lifestyle Changes Following Jog Your Mind: Results from a Multi-Factorial Community-Based Program Promoting Cognitive Vitality among Seniors

    ERIC Educational Resources Information Center

    Laforest, Sophie; Lorthios-Guilledroit, Agathe; Nour, Kareen; Parisien, Manon; Fournier, Michel; Ellemberg, Dave; Guay, Danielle; Desgagnés-Cyr, Charles-Émile; Bier, Nathalie

    2017-01-01

    This study examined the effects on attitudes and lifestyle behavior of "Jog your Mind," a multi-factorial community-based program promoting cognitive vitality among seniors with no known cognitive impairment. A quasi-experimental study was conducted. Twenty-three community organizations were assigned either to the experimental group…

  11. Predictors of Numeracy Performance in National Testing Programs: Insights from the Longitudinal Study of Australian Children

    ERIC Educational Resources Information Center

    Carmichael, Colin; MacDonald, Amy; McFarland-Piazza, Laura

    2014-01-01

    This article is based on an exploratory study that examines factors which predict children's performance on the numeracy component of the Australian National Assessment Program--Literacy and Numeracy (NAPLAN). Utilizing an ecological theoretical model, this study examines child, home and school variables which may enable or constrain NAPLAN…

  12. How Cognitive Processes Aid Program Understanding.

    DTIC Science & Technology

    1985-06-01

...Cognitive processes are used in conjunction with a programmer's knowledge base, and categories of information critical to program understanding are identified. The model... Further, the study contends that the effectiveness of these processes is dependent upon the extent of the programmer's knowledge base.

  13. Multi-level discriminative dictionary learning with application to large scale image classification.

    PubMed

    Shen, Li; Sun, Gang; Huang, Qingming; Wang, Shuhui; Lin, Zhouchen; Wu, Enhua

    2015-10-01

    The sparse coding technique has shown flexibility and capability in image representation and analysis. It is a powerful tool in many visual applications. Some recent work has shown that incorporating the properties of task (such as discrimination for classification task) into dictionary learning is effective for improving the accuracy. However, the traditional supervised dictionary learning methods suffer from high computation complexity when dealing with large number of categories, making them less satisfactory in large scale applications. In this paper, we propose a novel multi-level discriminative dictionary learning method and apply it to large scale image classification. Our method takes advantage of hierarchical category correlation to encode multi-level discriminative information. Each internal node of the category hierarchy is associated with a discriminative dictionary and a classification model. The dictionaries at different layers are learnt to capture the information of different scales. Moreover, each node at lower layers also inherits the dictionary of its parent, so that the categories at lower layers can be described with multi-scale information. The learning of dictionaries and associated classification models is jointly conducted by minimizing an overall tree loss. The experimental results on challenging data sets demonstrate that our approach achieves excellent accuracy and competitive computation cost compared with other sparse coding methods for large scale image classification.
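The hierarchical-inheritance structure described above can be sketched with a toy node class: each node owns dictionary atoms, and its effective dictionary prepends everything inherited from its ancestors. Coding is reduced here to a 1-sparse nearest-atom lookup purely for illustration; the paper's actual method learns discriminative dictionaries and classifiers jointly under a tree loss.

```python
class DictNode:
    """A node in a category hierarchy owning its own dictionary atoms."""
    def __init__(self, name, atoms, parent=None):
        self.name = name
        self.atoms = atoms        # list of plain vectors, toy stand-in
        self.parent = parent

    def effective_atoms(self):
        """Own atoms plus all inherited ancestor atoms (multi-scale dictionary)."""
        inherited = self.parent.effective_atoms() if self.parent else []
        return inherited + self.atoms

    def encode(self, x):
        """1-sparse code: index of the nearest effective atom."""
        atoms = self.effective_atoms()
        dists = [sum((xi - ai) ** 2 for xi, ai in zip(x, a)) for a in atoms]
        return min(range(len(atoms)), key=dists.__getitem__)


root = DictNode("animal", atoms=[[1.0, 0.0], [0.0, 1.0]])   # coarse layer
dog = DictNode("dog", atoms=[[0.9, 0.4]], parent=root)      # finer child layer
print(len(dog.effective_atoms()))   # 3: two inherited atoms plus one of its own
print(dog.encode([0.95, 0.35]))     # index 2: nearest is the child's own atom
```

The child node thus describes its category with both coarse (inherited) and fine (own) atoms, mirroring the paper's multi-scale description of lower-layer categories.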

  14. Evolving land cover classification algorithms for multispectral and multitemporal imagery

    NASA Astrophysics Data System (ADS)

    Brumby, Steven P.; Theiler, James P.; Bloch, Jeffrey J.; Harvey, Neal R.; Perkins, Simon J.; Szymanski, John J.; Young, Aaron C.

    2002-01-01

    The Cerro Grande/Los Alamos forest fire devastated over 43,000 acres (17,500 ha) of forested land, and destroyed over 200 structures in the town of Los Alamos and the adjoining Los Alamos National Laboratory. The need to measure the continuing impact of the fire on the local environment has led to the application of a number of remote sensing technologies. During and after the fire, remote-sensing data was acquired from a variety of aircraft- and satellite-based sensors, including Landsat 7 Enhanced Thematic Mapper (ETM+). We now report on the application of a machine learning technique to the automated classification of land cover using multi-spectral and multi-temporal imagery. We apply a hybrid genetic programming/supervised classification technique to evolve automatic feature extraction algorithms. We use a software package we have developed at Los Alamos National Laboratory, called GENIE, to carry out this evolution. We use multispectral imagery from the Landsat 7 ETM+ instrument from before, during, and after the wildfire. Using an existing land cover classification based on a 1992 Landsat 5 TM scene for our training data, we evolve algorithms that distinguish a range of land cover categories, and an algorithm to mask out clouds and cloud shadows. We report preliminary results of combining individual classification results using a K-means clustering approach. The details of our evolved classification are compared to the manually produced land-cover classification.

  15. Construction of social value or utility-based health indices: the usefulness of factorial experimental design plans.

    PubMed

    Cadman, D; Goldsmith, C

    1986-01-01

Global indices, which aggregate multiple health or function attributes into a single summary indicator, are useful measures in health research. Two key issues must be addressed in the initial stages of index construction: from the universe of possible health and function attributes, which ones should be included in a new index? And how simple can the statistical model that combines attributes into a single numeric index value be? Factorial experimental designs were used in the initial stages of developing a function index for evaluating a program for the care of young handicapped children. Beginning with eight attributes judged important to the goals of the program by clinicians, social preference values for different function states were obtained from 32 parents of handicapped children and 32 members of the community. Using category rating methods, each rater scored 16 written multi-attribute case descriptions which contained information about a child's status on all eight attributes. Each case described either a good or a poor level of each function attribute, at age 3 or 5 years. Thus, 2^8 = 256 different cases were rated. Two factorial design plans were selected and used to allocate case descriptions to raters. Analysis of variance determined that seven of the eight clinician-selected attributes were required in a social-value-based index for handicapped children. Most importantly, the subsequent steps of index construction could be greatly simplified by the finding that a simple additive statistical model without complex attribute interaction terms was adequate for the index. We conclude that factorial experimental designs are an efficient, feasible, and powerful tool for the initial stages of constructing a multi-attribute health index.
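The additive-model finding can be illustrated with a toy 2^3 full factorial (three attributes rather than eight): each main effect is the mean rating at the good level minus the mean at the poor level, and an additive model predicts each cell from the grand mean plus half-effects. The attribute names, effect sizes, and ratings below are invented and built to be perfectly additive, so the prediction error comes out exactly zero.

```python
from itertools import product

attrs = ["mobility", "speech", "self_care"]
true_effect = {"mobility": 30, "speech": 20, "self_care": 10}

# invented 0-100 ratings for all 2**3 = 8 cases, constructed to be additive
rating = {levels: 20 + sum(true_effect[a] for a, lv in zip(attrs, levels) if lv == 1)
          for levels in product([0, 1], repeat=3)}

grand_mean = sum(rating.values()) / len(rating)

def main_effect(a):
    """Mean rating at the good level minus mean rating at the poor level."""
    i = attrs.index(a)
    hi = [r for lv, r in rating.items() if lv[i] == 1]
    lo = [r for lv, r in rating.items() if lv[i] == 0]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {a: main_effect(a) for a in attrs}

def predict(levels):
    """Additive model: grand mean plus/minus half of each main effect."""
    return grand_mean + sum((0.5 if lv == 1 else -0.5) * effects[a]
                            for a, lv in zip(attrs, levels))

max_err = max(abs(predict(lv) - r) for lv, r in rating.items())
print(effects, max_err)   # effects recover 30/20/10; additive data -> zero error
```

With real ratings, a nonzero residual after fitting the main effects would signal the attribute interactions whose absence the authors verified by analysis of variance.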

  16. Twelve Years of Education and Public Outreach with the Fermi Gamma-ray Space Telescope

    NASA Astrophysics Data System (ADS)

    Cominsky, Lynn R.; McLin, K. M.; Simonnet, A.; Fermi E/PO Team

    2013-04-01

    During the past twelve years, NASA's Fermi Gamma-ray Space Telescope has supported a wide range of Education and Public Outreach (E/PO) activities, targeting K-14 students and the general public. The purpose of the Fermi E/PO program is to increase student and public understanding of the science of the high-energy Universe, through inspiring, engaging and educational activities linked to the mission’s science objectives. The E/PO program has additional more general goals, including increasing the diversity of students in the Science, Technology, Engineering and Mathematics (STEM) pipeline, and increasing public awareness and understanding of Fermi science and technology. Fermi's multi-faceted E/PO program includes elements in each major outcome category: ● Higher Education: Fermi E/PO promotes STEM careers through the use of NASA data including research experiences for students and teachers (Global Telescope Network), education through STEM curriculum development projects (Cosmology curriculum) and through enrichment activities (Large Area Telescope simulator). ● Elementary and Secondary education: Fermi E/PO links the science objectives of the Fermi mission to well-tested, customer-focused and NASA-approved standards-aligned classroom materials (Black Hole Resources, Active Galaxy Education Unit and Pop-up book, TOPS guides, Supernova Education Unit). These materials have been distributed through (Educator Ambassador and on-line) teacher training workshops and through programs involving under-represented students (after-school clubs and Astro 4 Girls). 
● Informal education and public outreach: Fermi E/PO engages the public in sharing the experience of exploration and discovery through high-leverage multi-media experiences (Black Holes planetarium and PBS NOVA shows), through popular websites (Gamma-ray Burst Skymap, Epo's Chronicles), social media (Facebook, MySpace), interactive web-based activities (Space Mysteries, Einstein@Home) and activities by amateur astronomers nation-wide (Supernova! Toolkit). This poster highlights various facets of the Fermi E/PO program.

  17. Understanding Attendance in a Community-Based Parenting Intervention for Immigrant Latino Families.

    PubMed

    Garcia-Huidobro, Diego; Allen, Michele; Rosas-Lee, Maira; Maldonado, Francisco; Gutierrez, Lois; Svetaz, Maria Veronica; Wieling, Elizabeth

    2016-01-01

    Community-based participatory research (CBPR) can help increase the attendance in community programs. Padres Informados, Jovenes Preparados (PIJP) is a program that aims to prevent tobacco and other substance use among Latino youth by promoting positive parenting. Although the trial used CBPR approaches, attendance was inconsistent. In the present study, factors associated with attendance and nonattendance and recommendations to maximize participation were explored in 12 brief feedback discussions (BFDs) with participants and in 10 in-depth interviews (IDIs) with facilitators who delivered PIJP. Content analysis guided two pairs of researchers, who independently coded emerging themes and categories (κ = .86 for BFDs and .73 for IDIs). Data from BFDs and IDIs were merged and interpreted together. We grouped factors that positively affected participation into three categories: individual and family (e.g., motivation), program (e.g., offering food and childcare and having facilitators who are trusted), and research (e.g., having incentives). Barriers to participation were grouped into four categories: individual and family (e.g., family conflicts), sociocultural (e.g., community and cultural beliefs), program (e.g., fixed schedules), and research (e.g., recruitment procedures). Participants provided recommendations to address all types of barriers. Although PIJP used CBPR, complete satisfaction of community needs is difficult. Effective community programs must address participants' needs and preferences. © 2015 Society for Public Health Education.

  18. Understanding Attendance in a Community-Based Parenting Intervention for Immigrant Latino Families

    PubMed Central

    Garcia-Huidobro, Diego; Allen, Michele; Rosas-Lee, Maira; Maldonado, Francisco; Gutierrez, Lois; Svetaz, Maria Veronica; Wieling, Elizabeth

    2017-01-01

Community-based participatory research (CBPR) can help increase attendance in community programs. Padres Informados, Jovenes Preparados (PIJP) is a program that aims to prevent tobacco and other substance use among Latino youth by promoting positive parenting. Although the trial used CBPR approaches, attendance was inconsistent. In the present study, factors associated with attendance and nonattendance, along with recommendations to maximize participation, were explored in 12 brief feedback discussions (BFDs) with participants and in 10 in-depth interviews (IDIs) with facilitators who delivered PIJP. Content analysis guided two pairs of researchers, who independently coded emerging themes and categories (κ = .86 for BFDs and .73 for IDIs). Data from BFDs and IDIs were merged and interpreted together. We grouped factors that positively affected participation into three categories: individual and family (e.g., motivation), program (e.g., offering food and childcare and having facilitators who are trusted), and research (e.g., having incentives). Barriers to participation were grouped into four categories: individual and family (e.g., family conflicts), sociocultural (e.g., community and cultural beliefs), program (e.g., fixed schedules), and research (e.g., recruitment procedures). Participants provided recommendations to address all types of barriers. Although PIJP used CBPR, fully satisfying community needs remains difficult. Effective community programs must address participants’ needs and preferences. PMID:25869496

  19. Discourse Connectives in L1 and L2 Argumentative Writing

    ERIC Educational Resources Information Center

    Hu, Chunyu; Li, Yuanyuan

    2015-01-01

    Discourse connectives (DCs) are multi-functional devices used to connect discourse segments and fulfill interpersonal levels of discourse. This study investigates the use of selected 80 DCs within 11 categories in the argumentative essays produced by L1 and L2 university students. The analysis is based on the International Corpus Network of Asian…

  20. Examining the Effectiveness of a Multi-Sensory Instructional Reading Program in One Rural Midwestern School District

    ERIC Educational Resources Information Center

    Waldvogel, Steven John

    2010-01-01

Scope and method of study: The purpose of this research study was to examine the effectiveness of an IMSE Orton-Gillingham-based multi-sensory instructional reading program when incorporated with kindergarten through first grade classroom reading instruction in one rural Midwestern school district. The IMSE supplemental reading program is…

  1. Heterogeneous scalable framework for multiphase flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, Karla Vanessa

    2013-09-01

Two categories of challenges confront the developer of computational spray models: those related to the computation and those related to the physics. Regarding the computation, the trend towards heterogeneous, multi- and many-core platforms will require considerable re-engineering of codes written for current supercomputing platforms. Regarding the physics, accurate methods for transferring mass, momentum and energy from the dispersed phase onto the carrier fluid grid have so far eluded modelers. Significant challenges also lie at the intersection between these two categories. To be competitive, any physics model must be expressible in a parallel algorithm that performs well on evolving computer platforms. This work created an application based on a software architecture in which the physics and software concerns are separated in a way that adds flexibility to both. The developed spray-tracking package includes an application programming interface (API) that abstracts away the platform-dependent parallelization concerns, enabling the scientific programmer to write serial code that the API resolves into parallel processes and threads of execution. The project also developed the infrastructure required to provide similar APIs to other applications. The API allows object-oriented Fortran applications to interact directly with Trilinos to support memory management of distributed objects on central processing unit (CPU) and graphics processing unit (GPU) nodes for applications using C++.

  2. Stakeholders' contributions to tailored implementation programs: an observational study of group interview methods.

    PubMed

    Huntink, Elke; van Lieshout, Jan; Aakhus, Eivind; Baker, Richard; Flottorp, Signe; Godycki-Cwirko, Maciek; Jäger, Cornelia; Kowalczyk, Anna; Szecsenyi, Joachim; Wensing, Michel

    2014-12-06

    Tailored strategies to implement evidence-based practice can be generated in several ways. In this study, we explored the usefulness of group interviews for generating these strategies, focused on improving healthcare for patients with chronic diseases. Participants included at least four categories of stakeholders (researchers, quality officers, health professionals, and external stakeholders) in five countries. Interviews comprised brainstorming followed by a structured interview and focused on different chronic conditions in each country. We compared the numbers and types of strategies between stakeholder categories and between interview phases. We also determined which strategies were actually used in tailored intervention programs. In total, 127 individuals participated in 25 group interviews across five countries. Brainstorming generated 8 to 120 strategies per group; structured interviews added 0 to 55 strategies. Healthcare professionals and researchers provided the largest numbers of strategies. The type of strategies for improving healthcare practice did not differ systematically between stakeholder groups in four of the five countries. In three out of five countries, all components of the chosen intervention programs were mentioned by the group of researchers. Group interviews with different stakeholder categories produced many strategies for tailored implementation of evidence-based practice, of which the content was largely similar across stakeholder categories.

  3. Provisional-Ideal-Point-Based Multi-objective Optimization Method for Drone Delivery Problem

    NASA Astrophysics Data System (ADS)

    Omagari, Hiroki; Higashino, Shin-Ichiro

    2018-04-01

In this paper, we propose a new evolutionary multi-objective optimization method for solving drone delivery problems (DDPs). A DDP can be formulated as a constrained multi-objective optimization problem. In our previous research, we proposed the "aspiration-point-based method" to solve multi-objective optimization problems. However, that method needs the optimal value of each objective function to be calculated in advance. Moreover, it does not consider constraint conditions other than the objective functions. Therefore, it cannot be applied to the DDP, which has many constraint conditions. To resolve these issues, we propose the "provisional-ideal-point-based method." The proposed method defines a "penalty value" to search for feasible solutions. It also defines a new reference solution, named the "provisional-ideal point," to search for the solution preferred by a decision maker. In this way, we eliminate the preliminary calculations and remove the limitation on application scope. The results on benchmark test problems show that the proposed method can generate the preferred solution efficiently. The usefulness of the proposed method is also demonstrated by applying it to a DDP. As a result, the delivery path combining one drone and one truck drastically reduces the traveling distance and the delivery time compared with the case of using only one truck.
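The "penalty value" mechanism described in this abstract can be sketched in a few lines. The following is a minimal illustration of penalty-based constraint handling in multi-objective search; the function names, the penalty weight `rho`, and the toy objectives/constraints are assumptions for illustration, not the paper's formulation:

```python
# Minimal sketch of penalty-based constraint handling for a
# multi-objective problem (hypothetical example, not the paper's code).

def penalty(x, constraints):
    """Sum of constraint violations; zero for a feasible solution.

    Each constraint g is satisfied when g(x) <= 0.
    """
    return sum(max(0.0, g(x)) for g in constraints)

def penalized_objectives(x, objectives, constraints, rho=1e3):
    """Add a shared penalty term to every objective so infeasible
    solutions are dominated by feasible ones during the search."""
    p = rho * penalty(x, constraints)
    return [f(x) + p for f in objectives]

# Toy drone-delivery-flavored example: two competing objectives,
# subject to one capacity-style constraint (all numbers invented).
objectives = [lambda x: x[0], lambda x: 2.0 - x[0]]
constraints = [lambda x: x[0] - 1.5]   # feasible when x[0] <= 1.5

print(penalized_objectives([1.0], objectives, constraints))  # feasible
print(penalized_objectives([2.0], objectives, constraints))  # penalized
```

With this construction, any evolutionary selection rule that compares penalized objective vectors will steer the population toward the feasible region without a separate constraint-handling phase.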

  4. Measuring short-term post-fire forest recovery across a burn severity gradient in a mixed pine-oak forest using multi-sensor remote sensing techniques

    DOE PAGES

    Meng, Ran; Wu, Jin; Zhao, Feng; ...

    2018-06-01

Understanding post-fire forest recovery is pivotal to the study of forest dynamics and the global carbon cycle. Field-based studies have indicated a convex response of forest recovery rate to burn severity at the individual tree level, related to fire-induced tree mortality; however, these findings were constrained in spatial/temporal extent, and were not detectable by traditional optical remote sensing studies, largely owing to the contaminating effect of understory recovery. For this work, we examined whether the combined use of multi-sensor remote sensing techniques (i.e., 1 m simultaneous airborne imaging spectroscopy and LiDAR and 2 m satellite multi-spectral imagery) to separate canopy recovery from understory recovery would enable quantification of post-fire forest recovery rate spanning a large gradient in burn severity over large scales. Our study was conducted in a mixed pine-oak forest in Long Island, NY, three years after a top-killing fire. We remotely detected an initial increase and then decline of forest recovery rate with burn severity across the burned area, with a maximum canopy area-based recovery rate of 10% per year at the moderate forest burn severity class. More intriguingly, such remotely detected convex relationships also held at the species level, with pine trees being more resilient to high burn severity and having a higher maximum recovery rate (12% per year) than oak trees (4% per year). These results are among the first quantitative evidence of the effects of fire-adaptive strategies on post-fire forest recovery derived from relatively large spatial-temporal domains. Our study thus provides a methodological advance in linking multi-sensor remote sensing techniques to monitor forest dynamics in a spatially explicit manner over large scales, with important implications for fire-related forest management, and for constraining/benchmarking fire effect schemes in ecological process models.

  5. Measuring short-term post-fire forest recovery across a burn severity gradient in a mixed pine-oak forest using multi-sensor remote sensing techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Ran; Wu, Jin; Zhao, Feng

Understanding post-fire forest recovery is pivotal to the study of forest dynamics and the global carbon cycle. Field-based studies have indicated a convex response of forest recovery rate to burn severity at the individual tree level, related to fire-induced tree mortality; however, these findings were constrained in spatial/temporal extent, and were not detectable by traditional optical remote sensing studies, largely owing to the contaminating effect of understory recovery. For this work, we examined whether the combined use of multi-sensor remote sensing techniques (i.e., 1 m simultaneous airborne imaging spectroscopy and LiDAR and 2 m satellite multi-spectral imagery) to separate canopy recovery from understory recovery would enable quantification of post-fire forest recovery rate spanning a large gradient in burn severity over large scales. Our study was conducted in a mixed pine-oak forest in Long Island, NY, three years after a top-killing fire. We remotely detected an initial increase and then decline of forest recovery rate with burn severity across the burned area, with a maximum canopy area-based recovery rate of 10% per year at the moderate forest burn severity class. More intriguingly, such remotely detected convex relationships also held at the species level, with pine trees being more resilient to high burn severity and having a higher maximum recovery rate (12% per year) than oak trees (4% per year). These results are among the first quantitative evidence of the effects of fire-adaptive strategies on post-fire forest recovery derived from relatively large spatial-temporal domains. Our study thus provides a methodological advance in linking multi-sensor remote sensing techniques to monitor forest dynamics in a spatially explicit manner over large scales, with important implications for fire-related forest management, and for constraining/benchmarking fire effect schemes in ecological process models.

  6. Multi-Cultural Competency-Based Vocational Curricula. Maintenance Mechanics. Multi-Cultural Competency-Based Vocational/Technical Curricula Series.

    ERIC Educational Resources Information Center

    Hepburn, Larry; Shin, Masako

    This document, one of eight in a multi-cultural competency-based vocational/technical curricula series, is on maintenance mechanics. This program is designed to run 40 weeks and cover 5 instructional areas: basic electricity (14 weeks); maintenance and repair of heating (4 weeks); maintenance and repair of air conditioning (12 weeks); maintenance…

  7. Multi-Cultural Competency-Based Vocational Curricula. Machine Trades. Multi-Cultural Competency-Based Vocational/Technical Curricula Series.

    ERIC Educational Resources Information Center

    Hepburn, Larry; Shin, Masako

    This document, one of eight in a multi-cultural competency-based vocational/technical curricula series, is on machine trades. This program is designed to run 36 weeks and cover 6 instructional areas: use of measuring tools; benchwork/tool bit grinding; lathe work; milling work; precision grinding; and combination machine work. A duty-task index…

  8. Achieving accuracy in first-principles calculations at extreme temperature and pressure

    NASA Astrophysics Data System (ADS)

    Mattsson, Ann; Wills, John

    2013-06-01

    First-principles calculations are increasingly used to provide EOS data at pressures and temperatures where experimental data is difficult or impossible to obtain. The lack of experimental data, however, also precludes validation of the calculations in those regimes. Factors influencing the accuracy of first-principles data include theoretical approximations, and computational approximations used in implementing and solving the underlying equations. The first category includes approximate exchange-correlation functionals and wave equations simplifying the Dirac equation. In the second category are, e.g., basis completeness and pseudo-potentials. While the first category is extremely hard to assess without experimental data, inaccuracies of the second type should be well controlled. We are using two rather different electronic structure methods (VASP and RSPt) to make explicit the requirements for accuracy of the second type. We will discuss the VASP Projector Augmented Wave potentials, with examples for Li and Mo. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  9. WE-AB-209-12: Quasi Constrained Multi-Criteria Optimization for Automated Radiation Therapy Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watkins, W.T.; Siebers, J.V.

Purpose: To introduce quasi-constrained Multi-Criteria Optimization (qcMCO) for unsupervised radiation therapy optimization, which generates alternative patient-specific plans emphasizing dosimetric tradeoffs and conformance to clinical constraints for multiple delivery techniques. Methods: For N Organs At Risk (OARs) and M delivery techniques, qcMCO generates M(N+1) alternative treatment plans per patient. Objective weight variations for OARs and targets are used to generate alternative qcMCO plans. For 30 locally advanced lung cancer patients, qcMCO plans were generated for dosimetric tradeoffs to four OARs: each lung, heart, and esophagus (N=4), and 4 delivery techniques (simple 4-field arrangements, 9-field coplanar IMRT, 27-field non-coplanar IMRT, and non-coplanar Arc IMRT). Quasi-constrained objectives included target prescription isodose to 95% (PTV-D95), maximum PTV dose (PTV-Dmax) < 110% of prescription, and spinal cord Dmax < 45 Gy. The algorithm’s ability to meet these constraints while simultaneously revealing dosimetric tradeoffs was investigated. Statistically significant dosimetric tradeoffs were defined such that the coefficient of determination between dosimetric indices which varied by at least 5 Gy between different plans was >0.8. Results: The qcMCO plans varied mean dose by >5 Gy to the ipsilateral lung for 24/30 patients, contralateral lung for 29/30 patients, esophagus for 29/30 patients, and heart for 19/30 patients. In the 600 plans computed without human interaction, average PTV-D95=67.4±3.3 Gy, PTV-Dmax=79.2±5.3 Gy, and spinal cord Dmax was >45 Gy in 93 plans (>50 Gy in 2/600 plans). Statistically significant dosimetric tradeoffs were evident in 19/30 plans, including multiple tradeoffs of at least 5 Gy between multiple OARs in 7/30 cases. The most common statistically significant tradeoff was increasing PTV-Dmax to reduce OAR dose (15/30 patients). Conclusion: The qcMCO method can conform to quasi-constrained objectives while revealing significant variations in OAR doses, including mean dose reductions >5 Gy. Clinical implementation will facilitate patient-specific decision making based on achievable dosimetry, as opposed to accept/reject models based on population-derived objectives.
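The M(N+1) plan count reported above can be made concrete with a small enumeration. This sketch assumes one baseline weighting plus one OAR-emphasized weighting per OAR, which is one plausible reading of "objective weight variations"; the weight values and names are invented:

```python
# Sketch of how M(N+1) plans per patient can arise from objective-weight
# variations (a reading of the abstract; weight values are invented).

from itertools import product

def plan_configurations(oars, techniques, boost=10.0):
    """One baseline weighting plus one OAR-emphasized weighting per
    OAR, crossed with every delivery technique."""
    weightings = [{oar: 1.0 for oar in oars}]     # 1 baseline weighting
    for emphasized in oars:                       # N emphasized weightings
        w = {oar: 1.0 for oar in oars}
        w[emphasized] = boost
        weightings.append(w)
    return [(t, w) for t, w in product(techniques, weightings)]

oars = ["left lung", "right lung", "heart", "esophagus"]          # N = 4
techniques = ["4-field", "9-field IMRT", "27-field IMRT", "Arc"]  # M = 4
plans = plan_configurations(oars, techniques)
print(len(plans))        # M * (N + 1) = 20 plans per patient
print(30 * len(plans))   # 600 plans across the 30-patient cohort
```

This reproduces the arithmetic in the abstract: 4 techniques × (4 OARs + 1) = 20 plans per patient, hence 600 plans for 30 patients.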

  10. A framework for implementing data services in multi-service mobile satellite systems

    NASA Technical Reports Server (NTRS)

    Ali, Mohammed O.; Leung, Victor C. M.; Spolsky, Andrew I.

    1988-01-01

Mobile satellite systems being planned for introduction in the early 1990s are expected to be invariably of the multi-service type. Mobile Telephone Service (MTS), Mobile Radio Service (MRS), and Mobile Data Service (MDS) are the major classifications used to categorize the many user applications to be supported. The MTS and MRS services encompass circuit-switched voice communication applications, and may be efficiently implemented using a centralized Demand-Assigned Multiple Access (DAMA) scheme. Applications under the MDS category are, on the other hand, message-oriented and expected to vary widely in characteristics, from simplex-mode short-messaging applications to long-duration, full-duplex interactive data communication and large file transfer applications. For some applications under this service category, the conventional circuit-based DAMA scheme may prove highly inefficient due to the long time required to set up and establish communication links relative to the actual message transmission time. It is proposed that by defining a set of basic bearer services to be supported in MDS and optimizing their transmission and access schemes independently of the MTS and MRS services, the MDS applications can be more efficiently integrated into the multi-service design of mobile satellite systems.

  11. A framework for implementing data services in multi-service mobile satellite systems

    NASA Astrophysics Data System (ADS)

    Ali, Mohammed O.; Leung, Victor C. M.; Spolsky, Andrew I.

    1988-05-01

Mobile satellite systems being planned for introduction in the early 1990s are expected to be invariably of the multi-service type. Mobile Telephone Service (MTS), Mobile Radio Service (MRS), and Mobile Data Service (MDS) are the major classifications used to categorize the many user applications to be supported. The MTS and MRS services encompass circuit-switched voice communication applications, and may be efficiently implemented using a centralized Demand-Assigned Multiple Access (DAMA) scheme. Applications under the MDS category are, on the other hand, message-oriented and expected to vary widely in characteristics, from simplex-mode short-messaging applications to long-duration, full-duplex interactive data communication and large file transfer applications. For some applications under this service category, the conventional circuit-based DAMA scheme may prove highly inefficient due to the long time required to set up and establish communication links relative to the actual message transmission time. It is proposed that by defining a set of basic bearer services to be supported in MDS and optimizing their transmission and access schemes independently of the MTS and MRS services, the MDS applications can be more efficiently integrated into the multi-service design of mobile satellite systems.

  12. Costs by Major Program Category--A Budgetary Guide to Program Replication.

    ERIC Educational Resources Information Center

    Smith, Gary E.; And Others

Intended as a budgetary guide for potential replicators, this document presents a detailed discussion of estimated costs involved in setting up and operating a program similar to the Mountain-Plains Career Education Model IV, a residential, family-based education program developed to improve the economic potential and lifestyle of selected student…

  13. Performance impact of mutation operators of a subpopulation-based genetic algorithm for multi-robot task allocation problems.

    PubMed

    Liu, Chun; Kroll, Andreas

    2016-01-01

Multi-robot task allocation determines the task sequence and distribution for a group of robots in multi-robot systems. It is a constrained combinatorial optimization problem that becomes more complex in the case of cooperative tasks, because these introduce additional spatial and temporal constraints. To solve multi-robot task allocation problems with cooperative tasks efficiently, a subpopulation-based genetic algorithm, i.e., a crossover-free genetic algorithm employing mutation operators and elitism selection in each subpopulation, is developed in this paper. Moreover, the impact of mutation operators (swap, insertion, inversion, displacement, and their various combinations) is analyzed when solving several industrial plant inspection problems. The experimental results show that: (1) the proposed genetic algorithm obtains better solutions than the tested binary tournament genetic algorithm with partially mapped crossover; (2) inversion mutation performs better than the other tested mutation operators when solving problems without cooperative tasks, and the swap-inversion combination performs better than the other tested mutation operators/combinations when solving problems with cooperative tasks. As it is difficult to produce all desired effects with a single mutation operator, using multiple mutation operators (including both inversion and swap) is suggested when solving similar combinatorial optimization problems.
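The four mutation operators named in the abstract have standard textbook forms on permutations. The sketch below shows generic implementations (not the authors' code) so the differences between swap, insertion, inversion, and displacement are explicit:

```python
# Generic permutation mutation operators (textbook forms, not the
# paper's implementation). Each returns a new permutation of `seq`.
import random

def swap(seq):
    """Exchange two randomly chosen positions."""
    s = seq[:]
    i, j = random.sample(range(len(s)), 2)
    s[i], s[j] = s[j], s[i]
    return s

def insertion(seq):
    """Remove one element and reinsert it at another position."""
    s = seq[:]
    i, j = random.sample(range(len(s)), 2)
    s.insert(j, s.pop(i))
    return s

def inversion(seq):
    """Reverse the order of a random contiguous segment."""
    s = seq[:]
    i, j = sorted(random.sample(range(len(s)), 2))
    s[i:j + 1] = reversed(s[i:j + 1])
    return s

def displacement(seq):
    """Move a random contiguous segment to a new position."""
    s = seq[:]
    i, j = sorted(random.sample(range(len(s)), 2))
    segment, rest = s[i:j + 1], s[:i] + s[j + 1:]
    k = random.randrange(len(rest) + 1)
    return rest[:k] + segment + rest[k:]

tour = list(range(8))   # a task sequence for one robot (toy example)
for op in (swap, insertion, inversion, displacement):
    child = op(tour)
    assert sorted(child) == tour   # every operator preserves the task set
```

Because all four operators return a valid permutation, they can be mixed freely within a crossover-free genetic algorithm, which is what makes combinations such as swap-inversion straightforward to test.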

  14. Model-based gene set analysis for Bioconductor.

    PubMed

    Bauer, Sebastian; Robinson, Peter N; Gagneur, Julien

    2011-07-01

Gene Ontology and other forms of gene-category analysis play a major role in the evaluation of high-throughput experiments in molecular biology. Single-category enrichment analysis procedures such as Fisher's exact test tend to flag large numbers of redundant categories as significant, which can complicate interpretation. We have recently developed an approach called model-based gene set analysis (MGSA) that substantially reduces the number of redundant categories returned by gene-category analysis. In this work, we present the Bioconductor package mgsa, which makes the MGSA algorithm available to users of the R language. Our package provides a simple and flexible application programming interface for applying the approach. The mgsa package has been made available as part of Bioconductor 2.8. It is released under the conditions of the Artistic License 2.0. Contact: peter.robinson@charite.de; julien.gagneur@embl.de.

  15. An inexact chance-constrained programming model for water quality management in Binhai New Area of Tianjin, China.

    PubMed

    Xie, Y L; Li, Y P; Huang, G H; Li, Y F; Chen, L R

    2011-04-15

In this study, an inexact chance-constrained water quality management (ICC-WQM) model is developed for planning regional environmental management under uncertainty. The method is based on an integration of interval linear programming (ILP) and chance-constrained programming (CCP) techniques. ICC-WQM allows uncertainties presented as both probability distributions and interval values to be incorporated within a general optimization framework. Complexities in environmental management systems can be systematically reflected, greatly enhancing the applicability of the modeling process. The developed method is applied to planning chemical-industry development in Binhai New Area of Tianjin, China. Interval solutions associated with different risk levels of constraint violation have been obtained. They can be used for generating decision alternatives and thus help decision makers identify desired policies under various system-reliability constraints on the water environment's capacity for pollutants. Tradeoffs between system benefits and constraint-violation risks can also be tackled. They are helpful for supporting (a) decisions on wastewater discharge and government investment, (b) formulation of local policies regarding water consumption, economic development and industry structure, and (c) analysis of interactions among economic benefits, system reliability and pollutant discharges. Copyright © 2011 Elsevier B.V. All rights reserved.
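The CCP component mentioned above conventionally replaces a stochastic constraint with a deterministic equivalent at a chosen violation probability. Below is a minimal sketch of that standard step, assuming a normally distributed right-hand side; the numbers are invented, not from the Binhai New Area case study:

```python
# Standard CCP step: a constraint a·x <= b with random right-hand side
# b ~ N(mu, sigma) is enforced at violation probability p via the
# deterministic equivalent a·x <= mu + sigma * Phi^{-1}(p).
# Generic textbook form, not the paper's model; numbers are invented.
from statistics import NormalDist

def deterministic_rhs(mu, sigma, p):
    """Right-hand side guaranteeing Pr(b < a·x) <= p for b ~ N(mu, sigma)."""
    return mu + sigma * NormalDist().inv_cdf(p)

# Hypothetical pollutant-discharge capacity with mean 100, std 10:
for p in (0.01, 0.05, 0.10):
    print(f"p = {p:.2f}: discharge limit = {deterministic_rhs(100, 10, p):.1f}")
```

Smaller violation probabilities give more conservative (tighter) discharge limits, which is exactly the tradeoff between system benefit and constraint-violation risk the abstract describes; the interval (ILP) part then carries the interval-valued coefficients through the same framework.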

  16. Inductive reasoning about causally transmitted properties.

    PubMed

    Shafto, Patrick; Kemp, Charles; Bonawitz, Elizabeth Baraff; Coley, John D; Tenenbaum, Joshua B

    2008-11-01

    Different intuitive theories constrain and guide inferences in different contexts. Formalizing simple intuitive theories as probabilistic processes operating over structured representations, we present a new computational model of category-based induction about causally transmitted properties. A first experiment demonstrates undergraduates' context-sensitive use of taxonomic and food web knowledge to guide reasoning about causal transmission and shows good qualitative agreement between model predictions and human inferences. A second experiment demonstrates strong quantitative and qualitative fits to inferences about a more complex artificial food web. A third experiment investigates human reasoning about complex novel food webs where species have known taxonomic relations. Results demonstrate a double-dissociation between the predictions of our causal model and a related taxonomic model [Kemp, C., & Tenenbaum, J. B. (2003). Learning domain structures. In Proceedings of the 25th annual conference of the cognitive science society]: the causal model predicts human inferences about diseases but not genes, while the taxonomic model predicts human inferences about genes but not diseases. We contrast our framework with previous models of category-based induction and previous formal instantiations of intuitive theories, and outline challenges in developing a complete model of context-sensitive reasoning.

  17. Optimal Energy Management for Microgrids

    NASA Astrophysics Data System (ADS)

    Zhao, Zheng

The microgrid is a novel concept arising from the development of the smart grid. A microgrid is a low-voltage, small-scale network containing both distributed energy resources (DERs) and load demands. Clean energy is encouraged in a microgrid for economic and sustainability reasons. A microgrid has two operational modes, the stand-alone mode and the grid-connected mode. In this research, day-ahead optimal energy management for a microgrid under both operational modes is studied. The objective of the optimization model is to minimize fuel cost, improve energy utilization efficiency and reduce gas emissions by scheduling the generation of DERs in each hour of the next day. Considering the dynamic behavior of the battery as the Energy Storage System (ESS), the model takes the form of a multi-objective, multi-parametric program with dynamic constraints, which is solved using the Advanced Dynamic Programming (ADP) method. Factors influencing battery life are then studied and included in the model in order to obtain an optimal battery usage pattern and reduce the associated cost. Moreover, since wind and solar generation are stochastic processes affected by weather changes, the proposed optimization model is run hourly to track weather changes. Simulation results are compared with the day-ahead energy management model. Finally, conclusions are presented and future research in microgrid energy management is discussed.
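The dynamic-programming structure implied by the battery state constraint can be sketched on a toy problem. This is a generic hour-by-hour DP over battery state of charge, not the thesis's ADP model; the prices, loads, and the lossless integer battery model are invented:

```python
# Minimal dynamic-programming sketch for day-ahead battery scheduling:
# choose charge/discharge each hour to minimize grid-purchase cost.
# Toy illustration of the DP structure only; prices, loads, and the
# lossless integer battery model are invented, not from the thesis.

def schedule_battery(prices, load, capacity=4, max_rate=1):
    """DP over hourly battery state of charge (integer kWh levels).

    Returns the minimum cost of meeting `load` given hourly `prices`,
    buying from the grid whatever the battery does not supply.
    """
    INF = float("inf")
    # cost[s] = best cost so far, ending the hour at state of charge s
    cost = [0.0 if s == 0 else INF for s in range(capacity + 1)]
    for t in range(len(prices)):
        nxt = [INF] * (capacity + 1)
        for s in range(capacity + 1):
            if cost[s] == INF:
                continue
            for delta in range(-max_rate, max_rate + 1):  # delta > 0: charge
                s2 = s + delta
                if 0 <= s2 <= capacity:
                    grid = load[t] + delta   # energy bought this hour
                    if grid < 0:
                        continue             # no selling back in this toy
                    nxt[s2] = min(nxt[s2], cost[s] + prices[t] * grid)
        cost = nxt
    return min(cost)

prices = [0.10, 0.10, 0.30, 0.30]   # cheap early, expensive later
load = [1, 1, 1, 1]
print(schedule_battery(prices, load))
```

The DP buys extra energy in the cheap hours to charge the battery and discharges it in the expensive hours, which is the basic mechanism any day-ahead ESS scheduler, including the ADP formulation, exploits.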

  18. A Pulse Rate Detection Method for Mouse Application Based on Multi-PPG Sensors

    PubMed Central

    Chen, Wei-Hao

    2017-01-01

Heart rate is an important physiological parameter for healthcare. Among measurement methods, photoplethysmography (PPG) is an easy and convenient method for pulse rate detection. However, since the PPG signal is susceptible to motion artifacts and is constrained by the measurement position chosen, the purpose of this paper is to implement a comfortable and easy-to-use multi-PPG sensor module, combined with a stable and accurate real-time pulse rate detection method, on a computer mouse. A weighted-average method for the multi-PPG sensors adjusts the weight of each signal channel in order to raise the accuracy and stability of the detected signal, thereby effectively and efficiently reducing noise disturbance while the hand is moving. According to the experimental results, the proposed method can increase the usability and probability of successful PPG signal detection on the palm. PMID:28708112
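The weighted-average idea can be sketched as follows. Note that the channel-quality metric used here (inverse variance of each channel) is an assumption for illustration; the paper's actual weighting rule may differ:

```python
# Sketch of weighted-average fusion for multi-PPG channels: weight each
# sensor channel by an estimate of its signal quality so noisy channels
# contribute less. The quality proxy here (inverse variance) is an
# assumption for illustration, not the paper's metric.
from statistics import mean, pvariance

def fuse_channels(channels):
    """Combine several equal-length PPG channels into one signal."""
    # Higher weight for channels with lower variance about their mean
    # (a crude proxy for less motion artifact).
    weights = []
    for ch in channels:
        v = pvariance(ch, mean(ch))
        weights.append(1.0 / (v + 1e-9))
    total = sum(weights)
    weights = [w / total for w in weights]
    n = len(channels[0])
    return [sum(w * ch[i] for w, ch in zip(weights, channels))
            for i in range(n)]

clean = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0]
noisy = [0.3, 2.1, -0.9, -1.8, 0.7, 1.5]
fused = fuse_channels([clean, noisy])
print(fused)
```

Because the weights are normalized to sum to one, each fused sample is a convex combination of the channel samples, so the output stays within the range of the inputs while leaning toward the steadier channel.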

  19. Performance Evaluation Tests for Environmental Research (PETER): evaluation of 114 measures

    NASA Technical Reports Server (NTRS)

    Bittner, A. C. Jr; Carter, R. C.; Kennedy, R. S.; Harbeson, M. M.; Krause, M.

    1986-01-01

The goal of the Performance Evaluation Tests for Environmental Research (PETER) Program was to identify a set of measures of human capabilities for use in the study of environmental and other time-course effects. The 114 measures studied in the PETER Program were evaluated and categorized into four groups based upon task stability and task definition. The Recommended category contained 30 measures that clearly attained total stabilization and had an acceptable level of reliability efficiency. The Acceptable-But-Redundant category contained 15 measures. The 37 measures in the Marginal category, which included an inordinate number of slope and other derived measures, usually had desirable features that were outweighed by faults. The 32 measures in the Unacceptable category had either differential instability or weak reliability efficiency. In our opinion, the 30 measures in the Recommended category should be given first consideration for environmental research applications. Further, it is recommended that information pertaining to preexperimental practice requirements and stabilized reliabilities be utilized in repeated-measures environmental studies.

  20. Multiple R&D projects scheduling optimization with improved particle swarm algorithm.

    PubMed

    Liu, Mengqi; Shan, Miyuan; Wu, Juan

    2014-01-01

For most enterprises, a key step toward winning the initiative in fierce market competition is to improve their R&D ability so as to meet the varied demands of customers in a more timely and less costly manner. This paper discusses the features of multiple R&D environments in large make-to-order enterprises under constrained human resources and budget, and puts forward a multi-project scheduling model for a given planning period. Furthermore, we make some improvements to an existing particle swarm algorithm and apply the version developed here to the resource-constrained multi-project scheduling model in a simulation experiment. The feasibility of the model and the validity of the algorithm are demonstrated in the experiment.
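As a point of reference for the algorithmic baseline, a generic particle swarm optimizer can be sketched as below, minimizing a toy stand-in cost. This is the textbook form, not the improved variant or the scheduling encoding developed in the paper:

```python
# Generic particle swarm optimization (textbook form, not the paper's
# improved variant), minimizing a toy objective as a stand-in for a
# project-scheduling cost.
import random

def pso(f, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    rng = random.Random(42)  # seeded for reproducibility
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in cost: sphere function, optimum 0 at the origin.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3)
print(best_val)
```

For an actual resource-constrained scheduling problem, each particle would encode an activity priority list or start-time vector rather than a point in R^3, and the objective would evaluate the resulting schedule's makespan or cost.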

  1. A Multi-Discipline, Multi-Genre Digital Library for Research and Education

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maly, Kurt; Shen, Stewart N. T.

    2004-01-01

    We describe NCSTRL+, a unified, canonical digital library for educational, scientific, and technical information (STI). NCSTRL+ is based on the Networked Computer Science Technical Report Library (NCSTRL), a World Wide Web (WWW) accessible digital library (DL) that provides access to over 100 university departments and laboratories. NCSTRL+ implements two new technologies: cluster functionality and publishing "buckets". We have extended the Dienst protocol, the protocol underlying NCSTRL, to provide the ability to "cluster" independent collections into a logically centralized digital library based upon subject category classification, type of organization, and genres of material. The concept of "buckets" provides a mechanism for publishing and managing logically linked entities with multiple data formats. The NCSTRL+ prototype DL contains the holdings of NCSTRL and the NASA Technical Report Server (NTRS). The prototype demonstrates the feasibility of publishing into a multi-cluster DL, searching across clusters, and storing and presenting buckets of information.

  2. NCSTRL+: Adding Multi-Discipline and Multi-Genre Support to the Dienst Protocol Using Clusters and Buckets

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maly, Kurt; Shen, Stewart N. T.; Zubair, Mohammad

    1998-01-01

    We describe NCSTRL+, a unified, canonical digital library for scientific and technical information (STI). NCSTRL+ is based on the Networked Computer Science Technical Report Library (NCSTRL), a World Wide Web (WWW) accessible digital library (DL) that provides access to over 100 university departments and laboratories. NCSTRL+ implements two new technologies: cluster functionality and publishing buckets. We have extended Dienst, the protocol underlying NCSTRL, to provide the ability to cluster independent collections into a logically centralized digital library based upon subject category classification, type of organization, and genres of material. The bucket construct provides a mechanism for publishing and managing logically linked entities with multiple data forms as a single object. The NCSTRL+ prototype DL contains the holdings of NCSTRL and the NASA Technical Report Server (NTRS). The prototype demonstrates the feasibility of publishing into a multi-cluster DL, searching across clusters, and storing and presenting buckets of information.

  3. The NEWS Water Cycle Climatology

    NASA Astrophysics Data System (ADS)

    Rodell, M.; Beaudoing, H. K.; L'Ecuyer, T.; Olson, W. S.

    2012-12-01

    NASA's Energy and Water Cycle Study (NEWS) program fosters collaborative research towards improved quantification and prediction of water and energy cycle consequences of climate change. In order to measure change, it is first necessary to describe current conditions. The goal of the first phase of the NEWS Water and Energy Cycle Climatology project was to develop "state of the global water cycle" and "state of the global energy cycle" assessments based on data from modern ground and space based observing systems and data integrating models. The project was a multi-institutional collaboration with more than 20 active contributors. This presentation will describe the results of the water cycle component of the first phase of the project, which include seasonal (monthly) climatologies of water fluxes over land, ocean, and atmosphere at continental and ocean basin scales. The requirement of closure of the water budget (i.e., mass conservation) at various scales was exploited to constrain the flux estimates via an optimization approach that will also be described. Further, error assessments were included with the input datasets, and we examine these in relation to inferred uncertainty in the optimized flux estimates in order to gauge our current ability to close the water budget within an expected uncertainty range.
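
    The closure-by-optimization idea described above can be sketched as a weighted least-squares adjustment: perturb each flux estimate as little as possible, relative to its assumed error, while forcing the budget residual P - ET - Q - dS to zero. This is a minimal illustration, not the NEWS method in detail; the flux values, uncertainties, and single-equation constraint are invented.

```python
import numpy as np

def close_budget(x0, sigma, A, b):
    """Minimize (x - x0)^T W (x - x0) with W = diag(1/sigma^2), s.t. A x = b.

    Closed-form Lagrange-multiplier solution of the equality-constrained
    weighted least-squares problem.
    """
    Winv = np.diag(sigma ** 2)                       # inverse weight matrix
    K = Winv @ A.T @ np.linalg.inv(A @ Winv @ A.T)   # gain matrix
    return x0 - K @ (A @ x0 - b)

# Fluxes: [precipitation, evapotranspiration, runoff, storage change]
x0 = np.array([100.0, 60.0, 35.0, 2.0])   # first-guess estimates (invented)
sigma = np.array([5.0, 8.0, 3.0, 1.0])    # assumed 1-sigma errors (invented)
A = np.array([[1.0, -1.0, -1.0, -1.0]])   # budget: P - ET - Q - dS
b = np.array([0.0])                       # closure: residual forced to zero

x = close_budget(x0, sigma, A, b)
# The initial 3-unit imbalance is distributed in proportion to each flux's
# error variance, so the most uncertain flux (ET) absorbs the largest share.
```

    The same structure generalizes to many simultaneous budget equations (multiple basins and scales) by adding rows to `A`.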

  4. The NEWS Water Cycle Climatology

    NASA Technical Reports Server (NTRS)

    Rodell, Matthew; Beaudoing, Hiroko Kato; L'Ecuyer, Tristan; William, Olson

    2012-01-01

    NASA's Energy and Water Cycle Study (NEWS) program fosters collaborative research towards improved quantification and prediction of water and energy cycle consequences of climate change. In order to measure change, it is first necessary to describe current conditions. The goal of the first phase of the NEWS Water and Energy Cycle Climatology project was to develop "state of the global water cycle" and "state of the global energy cycle" assessments based on data from modern ground and space based observing systems and data integrating models. The project was a multi-institutional collaboration with more than 20 active contributors. This presentation will describe the results of the water cycle component of the first phase of the project, which include seasonal (monthly) climatologies of water fluxes over land, ocean, and atmosphere at continental and ocean basin scales. The requirement of closure of the water budget (i.e., mass conservation) at various scales was exploited to constrain the flux estimates via an optimization approach that will also be described. Further, error assessments were included with the input datasets, and we examine these in relation to inferred uncertainty in the optimized flux estimates in order to gauge our current ability to close the water budget within an expected uncertainty range.

  5. 75 FR 51075 - National Registry of Evidence-Based Programs and Practices (NREPP): Open Submission Period for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-18

    ... and pre- and post-intervention assessments; this category includes longitudinal/multiple time series... Registry of Evidence-based Programs and Practices (NREPP) is a voluntary rating system designed to provide... acceptance status at that time. The number of reviews conducted will depend on the availability of funds...

  6. Comprehensive, Process-based Identification of Hydrologic Models using Satellite and In-situ Water Storage Data: A Multi-objective calibration Approach

    NASA Astrophysics Data System (ADS)

    Abdo Yassin, Fuad; Wheater, Howard; Razavi, Saman; Sapriza, Gonzalo; Davison, Bruce; Pietroniro, Alain

    2015-04-01

    The credible identification of vertical and horizontal hydrological components and their associated parameters is very challenging (if not impossible) when the model is constrained only to streamflow data, especially in regions where vertical processes strongly dominate horizontal processes. The prairie areas of the Saskatchewan River basin, a major water system in Canada, demonstrate such behavior, where hydrologic connectivity and vertical fluxes are mainly controlled by the amount of surface and sub-surface water storage. In this study, we develop a framework for distributed hydrologic model identification and calibration that jointly constrains the model response (i.e., streamflows) as well as a set of model state variables (i.e., water storages) to observations. This framework is set up in the form of multi-objective optimization, where multiple performance criteria are defined and used to simultaneously evaluate the fidelity of the model to streamflow observations and to observed (estimated) changes of water storage in the gridded landscape over daily and monthly time scales. The time series of estimated changes in total water storage (including soil, canopy, snow, and pond storages) used in this study were derived from an experimental study enhanced by information obtained from the GRACE satellite. We test this framework on the calibration of a land surface scheme-hydrology model, called MESH (Modélisation Environnementale Communautaire - Surface and Hydrology), for the Saskatchewan River basin. Pareto Archived Dynamically Dimensioned Search (PA-DDS) is used as the multi-objective optimization engine. The significance of the developed framework is demonstrated in comparison with the results obtained through a conventional calibration approach against streamflow observations. Incorporating water storage data into the model identification process can more tightly constrain the posterior parameter space, evaluate model fidelity more comprehensively, and yield more credible predictions.
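
    The multi-objective selection step can be illustrated with the basic Pareto-dominance filter that archive-based optimizers such as PA-DDS rely on. This is a generic sketch; the candidate scores are invented, and PA-DDS itself adds search dynamics not shown here.

```python
def dominates(a, b):
    """True if solution a is at least as good as b on every objective
    (lower is better) and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(scores):
    """Keep only the non-dominated solutions (the Pareto archive)."""
    return [s for s in scores
            if not any(dominates(o, s) for o in scores if o is not s)]

# Each tuple: (streamflow error, storage error) for one parameter set.
candidates = [(0.30, 0.50), (0.25, 0.60), (0.40, 0.20), (0.35, 0.55)]
front = pareto_front(candidates)
# (0.35, 0.55) is dominated by (0.30, 0.50); the other three trade off
# streamflow fit against storage fit and all survive in the archive.
```
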

  7. Teaching Programming by Emphasizing Self-Direction: How Did Students React to the Active Role Required of Them?

    ERIC Educational Resources Information Center

    Isomottonen, Ville; Tirronen, Ville

    2013-01-01

    Lecturing is known to be a controversial form of teaching. With massed classrooms, in particular, it tends to constrain the active participation of students. One of the remedies applied to programming education is to use technology that can vitalize interaction in the classroom, while another is to base teaching increasingly on programming…

  8. Refining multi-model projections of temperature extremes by evaluation against land-atmosphere coupling diagnostics

    NASA Astrophysics Data System (ADS)

    Sippel, Sebastian; Zscheischler, Jakob; Mahecha, Miguel D.; Orth, Rene; Reichstein, Markus; Vogel, Martha; Seneviratne, Sonia I.

    2017-05-01

    The Earth's land surface and the atmosphere are strongly interlinked through the exchange of energy and matter. This coupled behaviour causes various land-atmosphere feedbacks, and an insufficient understanding of these feedbacks contributes to uncertain global climate model projections. For example, a crucial role of the land surface in exacerbating summer heat waves in midlatitude regions has been identified empirically for high-impact heat waves, but individual climate models differ widely in their respective representation of land-atmosphere coupling. Here, we compile an ensemble of 54 combinations of observations-based temperature (T) and evapotranspiration (ET) benchmarking datasets and investigate coincidences of T anomalies with ET anomalies as a proxy for land-atmosphere interactions during periods of anomalously warm temperatures. First, we demonstrate that a large fraction of state-of-the-art climate models from the Coupled Model Intercomparison Project (CMIP5) archive produces systematically too frequent coincidences of high T anomalies with negative ET anomalies in midlatitude regions during the warm season and in several tropical regions year-round. These coincidences (high T, low ET) are closely related to the representation of temperature variability and extremes across the multi-model ensemble. Second, we derive a land-coupling constraint based on the spread of the T-ET datasets and consequently retain only a subset of CMIP5 models that produce a land-coupling behaviour that is compatible with these benchmark estimates. The constrained multi-model simulations exhibit more realistic temperature extremes of reduced magnitude in present climate in regions where models show substantial spread in T-ET coupling, i.e. biases in the model ensemble are consistently reduced. Also the multi-model simulations for the coming decades display decreased absolute temperature extremes in the constrained ensemble. 
On the other hand, the differences between projected and present-day climate extremes are affected to a lesser extent by the applied constraint, i.e. projected changes are reduced locally by around 0.5 to 1 °C - but this remains a local effect in regions that are highly sensitive to land-atmosphere coupling. In summary, our approach offers a physically consistent, diagnostic-based avenue to evaluate multi-model ensembles and subsequently reduce model biases in simulated and projected extreme temperatures.
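
    The constraint step described above amounts to retaining only those models whose coupling diagnostic lies within the spread of the observational benchmark ensemble. A minimal sketch, with invented coincidence rates and model names:

```python
# Benchmark T-ET coincidence rates from the observations-based ensemble
# (values invented for illustration).
obs_rates = [0.18, 0.22, 0.25, 0.20]
lo, hi = min(obs_rates), max(obs_rates)   # admissible benchmark range

# Diagnosed coincidence rates for hypothetical CMIP-style models.
model_rates = {"model_a": 0.21, "model_b": 0.35, "model_c": 0.19,
               "model_d": 0.28, "model_e": 0.24}

# Retain only models whose land-coupling behaviour is compatible with the
# benchmark spread; projections are then drawn from this subset.
constrained = {m for m, r in model_rates.items() if lo <= r <= hi}
```
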

  9. The CAnadian NIRISS Unbiased Cluster Survey (CANUCS)

    NASA Astrophysics Data System (ADS)

    Ravindranath, Swara; NIRISS GTO Team

    2017-06-01

    The CANUCS GTO program is a JWST spectroscopy and imaging survey of five massive galaxy clusters and ten parallel fields using the NIRISS low-resolution grisms, NIRCam imaging, and NIRSpec multi-object spectroscopy. The primary goal is to understand the evolution of low-mass galaxies across cosmic time. Resolved emission-line maps and line ratios for many galaxies, some at a resolution of 100 pc thanks to magnification by gravitational lensing, will enable determination of the spatial distribution of star formation, dust, and metals. Other science goals include the detection and characterization of galaxies within the reionization epoch, using multiply-imaged lensed galaxies to constrain cluster mass distributions and dark matter substructure, and understanding star-formation suppression in the most massive galaxy clusters. In this talk I will describe the science goals of the CANUCS program. The proposed prime and parallel observations will be presented with details of the implementation of the observation strategy using JWST proposal planning tools.

  10. A multi-wavelength database of water vapor in planet-forming regions

    NASA Astrophysics Data System (ADS)

    Pontoppidan, Klaus

    The inner few astronomical units of gas-rich protoplanetary disks are environments characterized by a rich and active gaseous chemistry. Primitive material left over from the formation of our own Solar System has long yielded tantalizing clues to a heterogeneous nebula with intricate dynamical, thermal, and chemical structure that ultimately led to a great diversity in the planets and planetesimals of the Solar System. The discovery of a rich chemistry in protoplanetary disks via a forest of strong 3-40 micron molecular emission lines (H2O, OH, CO2, HCN, C2H2, ...) allows us for the first time to investigate chemical diversity in other planet-forming environments (Salyk et al. 2008; Carr & Najita 2008). Further efforts, supported by the Origins program, have established that this molecular forest is seen in the disks surrounding most young solar-type stars (Pontoppidan et al. 2010). We propose a 3-year program to analyze our growing multi-wavelength database of observations of water, OH, and organic molecules in the surfaces of protoplanetary disks. The database includes high (R~25,000-100,000) and medium resolution (R~600-3000) 3-200 micron spectra from a wide range of facilities (Keck-NIRSPEC, VLT-CRIRES, Spitzer-IRS, VLT-VISIR, Gemini-Michelle and Herschel-PACS). Our previous efforts have focused on demonstrating feasibility for observing water and other molecules in planet-forming regions, building statistics to show that the molecular forest is ubiquitous in disks around low-mass and solar-type stars, and taking the first steps in understanding the implied chemical abundances. Now, as the next logical step, we will combine data from our unique multi-wavelength database to map the radial distribution of, in particular, water and its derivatives. 
1) We will use both line profile information from the high-resolution spectra, as well as line strengths, from a combination of high and low temperature lines to constrain the radial abundance of water vapor in the emitting surfaces of disks. Despite high water abundances inside ~1 AU, there is evidence that the disk surfaces are strongly depleted in water both from the gas and ice phases, by as much as 6 orders of magnitude, beyond 1-2 AU. This may be due to the settling of icy grains as part of the formation of icy planetesimals (Meijerink et al. 2009; Bergin et al. 2010). We wish to quantify the depletion factor and establish whether this is a common property of all protoplanetary disks. 2) We will pursue critical new datasets using upcoming observational facilities, including spectrally resolved rotational water lines in the mid-infrared. VLT-VISIR, with which we have successfully detected water lines at high resolution, is undergoing a significant hardware upgrade with a planned commissioning around January 2012. The upgrade includes a much larger and more sensitive detector based on technology developed for JWST-MIRI, which is expected to increase its efficiency by 1-2 orders of magnitude. On a longer time scale, SOFIA-EXES, JWST-NIRSpec and MIRI will become essential instruments for moving this field forward. Pontoppidan is a JWST-NIRSpec instrument scientist at STScI. 3) We will search for variability of water lines on time scales of months and compare them to variation already seen in CO gas to investigate its origin. One intriguing possibility is dynamical interaction with protoplanets. The proposed research is highly relevant for the Origins of Solar Systems program as described in the solicitation document. It falls into the categories dealing with "Observations related to understanding the formation and evolution of planetary systems" and "Studies of chemical processes related to the formation of planetary systems."

  12. How Does Professional Development Improve Teaching?

    ERIC Educational Resources Information Center

    Kennedy, Mary M.

    2016-01-01

    Professional development programs are based on different theories of how students learn and different theories of how teachers learn. Reviewers often sort programs according to design features such as program duration, intensity, or the use of specific techniques such as coaches or online lessons, but these categories do not illuminate the…

  13. Thermally Activated Composite with Two-Way and Multi-Shape Memory Effects

    PubMed Central

    Basit, Abdul; L’Hostis, Gildas; Pac, Marie José; Durand, Bernard

    2013-01-01

    The use of shape memory polymer composites is growing rapidly in smart structure applications. In this work, an active asymmetric composite called “controlled behavior composite material (CBCM)” is used as a shape memory polymer composite. The programming and the corresponding initial fixity of the composite structure are obtained during a bending test, by heating the CBCM above the glass transition temperature of the epoxy polymer used. The shape memory properties of these composites are investigated by a bending test. Three types of recoveries are conducted: two classical recovery tests, unconstrained recovery and constrained recovery, and a new test of partial recovery under load. During recovery, high recovery displacement and force are produced, which enables the composite to perform strong two-way actuation along with a multi-shape memory effect. The recovery force confirms full recovery with two-way actuation even under a high load. This unique property of CBCM is characterized by the recovered mechanical work. PMID:28788316

  14. The Use of a UNIX-Based Workstation in the Information Systems Laboratory

    DTIC Science & Technology

    1989-03-01

    system. The conclusions of the research and the resulting recommendations are presented in Chapter III. These recommendations include how to manage...required to run the program on a new system, these should not be significant changes. 2. Processing Environment The UNIX processing environment is...interactive with multi-tasking and multi-user capabilities. Multi-tasking refers to the fact that many programs can be run concurrently. This capability

  15. Building cancer nursing skills in a resource-constrained government hospital.

    PubMed

    Strother, R M; Fitch, Margaret; Kamau, Peter; Beattie, Kathy; Boudreau, Angela; Busakhalla, N; Loehrer, P J

    2012-09-01

    Cancer is a rising cause of morbidity and mortality in resource-constrained settings. Few places in the developing world have cancer care experts and infrastructure for caring for cancer patients; therefore, it is imperative to develop this infrastructure and expertise. A critical component of cancer care, rarely addressed in the published literature, is cancer nursing. This report describes an effort to develop cancer nursing subspecialty knowledge and skills in support of a growing resource-constrained comprehensive cancer care program in Western Kenya. This report highlights the context of cancer care delivery in a resource-constrained setting, and describes one targeted intervention to further develop the skill set and knowledge of cancer care providers, as part of collaboration between developed world academic institutions and a medical school and governmental hospital in Western Kenya. Based on observations of current practice, practice setting, and resource limitations, a pragmatic curriculum for cancer care nursing was developed and implemented.

  16. PROGRAM PARAMS USERS GUIDE

    EPA Science Inventory

    PARAMS is a Windows-based computer program that implements 30 methods for estimating the parameters in indoor emissions source models, which are an essential component of indoor air quality (IAQ) and exposure models. These methods fall into eight categories: (1) the properties o...

  17. Geochemical evolution processes and water-quality observations based on results of the National Water-Quality Assessment Program in the San Antonio segment of the Edwards aquifer, 1996-2006

    USGS Publications Warehouse

    Musgrove, MaryLynn; Fahlquist, Lynne; Houston, Natalie A.; Lindgren, Richard J.; Ging, Patricia B.

    2010-01-01

    As part of the National Water-Quality Assessment Program, the U.S. Geological Survey collected and analyzed groundwater samples during 1996-2006 from the San Antonio segment of the Edwards aquifer of central Texas, a productive karst aquifer developed in Cretaceous-age carbonate rocks. These National Water-Quality Assessment Program studies provide an extensive dataset of groundwater geochemistry and water quality, consisting of 249 groundwater samples collected from 136 sites (wells and springs), including (1) wells completed in the shallow, unconfined, and urbanized part of the aquifer in the vicinity of San Antonio (shallow/urban unconfined category), (2) wells completed in the unconfined (outcrop area) part of the regional aquifer (unconfined category), and (3) wells completed in and springs discharging from the confined part of the regional aquifer (confined category). This report evaluates these data to assess geochemical evolution processes, including local- and regional-scale processes controlling groundwater geochemistry, and to make water-quality observations pertaining to sources and distribution of natural constituents and anthropogenic contaminants, the relation between geochemistry and hydrologic conditions, and groundwater age tracers and travel time. Implications for monitoring water-quality trends in karst are also discussed. Geochemical and isotopic data are useful tracers of recharge, groundwater flow, fluid mixing, and water-rock interaction processes that affect water quality. Sources of dissolved constituents to Edwards aquifer groundwater include dissolution of and geochemical interaction with overlying soils and calcite and dolomite minerals that compose the aquifer. Geochemical tracers such as magnesium to calcium and strontium to calcium ratios and strontium isotope compositions are used to evaluate and constrain progressive fluid-evolution processes. 
Molar ratios of magnesium to calcium and strontium to calcium in groundwater typically increase along flow paths; results for samples of Edwards aquifer groundwater show an increase from shallow/urban unconfined, to unconfined, to confined groundwater categories. These differences are consistent with longer residence times and greater extents of water-rock interaction controlling fluid compositions as groundwater evolves from shallow unconfined groundwater to deeper confined groundwater. Results for stable isotopes of hydrogen and oxygen indicate that specific geochemical processes affect some groundwater samples, including mixing with downdip saline water, mixing with recent recharge associated with tropical cyclonic storms, or mixing with recharge water that has undergone evaporation. The composition of surface water recharging the aquifer, as well as mixing with downdip water from the Trinity aquifer or the saline zone, also might affect water quality. A time-series record (1938-2006) of discharge at Comal Springs, one of the major aquifer discharge points, indicates an upward trend for nitrate and chloride concentrations, which likely reflects anthropogenic activities. A small number of organic contaminants were routinely or frequently detected in Edwards aquifer groundwater samples. These were the pesticides atrazine, its degradate deethylatrazine, and simazine; the drinking-water disinfection byproduct chloroform; and the solvent tetrachloroethene. Detection of these contaminants was most frequent in samples of the shallow/urban unconfined groundwater category and least frequent in samples of the unconfined groundwater category. Results indicate that the shallow/urban unconfined part of the aquifer is most affected by anthropogenic contaminants and the unconfined part of the aquifer is the least affected. 
The high frequency of detection for these anthropogenic contaminants aquifer-wide and in samples of deep, confined groundwater indicates that the entire aquifer is susceptible to water-quality changes as a result of anthropogenic activities.

  18. Motions, efforts and actuations in constrained dynamic systems: a multi-link open-chain example

    NASA Astrophysics Data System (ADS)

    Duke Perreira, N.

    1999-08-01

    The effort-motion method, which describes the dynamics of open- and closed-chain topologies of rigid bodies interconnected with revolute and prismatic pairs, is interpreted geometrically. Systems are identified for which the simultaneous control of forces and velocities is desirable, and a representative open-chain system is selected for use in the ensuing analysis. Gauge-invariant transformations are used to recast the commonly used kinetic and kinematic equations into a dimensional gauge-invariant form. Constraint elimination techniques based on singular value decompositions then recast the invariant equations into orthogonal and reciprocal sets of motion and effort equations written in state-variable form. The ideal actuation is found that simultaneously achieves the obtainable portions of the desired constraining efforts and motions. The performance obtained when using the actuation closest to this ideal is then evaluated.
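
    The SVD-based constraint elimination mentioned above can be sketched with a toy Jacobian: the right singular vectors split velocity space into a constrained subspace and its free (null-space) complement. The 2-constraint, 4-DOF Jacobian below is invented for illustration and is not the paper's example system.

```python
import numpy as np

# Constraint equations A @ qdot = 0 for a hypothetical 4-DOF system in which
# DOF 1 must track DOF 3 and DOF 2 must track DOF 4.
A = np.array([[1.0, 0.0, -1.0, 0.0],
              [0.0, 1.0, 0.0, -1.0]])

U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-12))
range_basis = Vt[:rank].T   # directions removed by the constraints (efforts act here)
null_basis = Vt[rank:].T    # admissible motion directions (free subspace)

# Any velocity built from the null-space basis satisfies the constraints
# identically, so motion and constraining effort decouple by construction.
qdot = null_basis @ np.array([2.0, -1.0])
```
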

  19. Standin' tall: (De) criminalization and acts of resistance among boys of color in an elementary after school STEM program

    NASA Astrophysics Data System (ADS)

    Basile, Vincent

    The United States currently incarcerates more citizens than any other country in history, disproportionately targeting men and boys of color through mechanisms such as the school-to-prison pipeline. By better understanding the processes that fuel the school-to-prison pipeline, such as criminalizing practices and the ways boys of color resist them, we can begin to identify teaching practices and perspectives that work to disrupt those processes. Examining criminalization and acts of resistance in STEM education is particularly salient because of the high social and economic status STEM knowledge bears in dominant U.S. culture, and the ways access to STEM learning functions as a gateway in our education system. Through a longitudinal study in a multi-site elementary after-school STEM program, I examined what criminalization and acts of resistance look like, the ways they interact, and how staff in the program work to disrupt those cycles. I found that criminalization and acts of resistance are normal and ordinary occurrences, frequently interacting in response to each other in escalating patterns. I also found that staff engaged in multiple categories of decriminalizing practices based on highly respectful interactions and on viewing boys of color as brilliant students who engage in acts of resistance as a healthy response to oppressive measures.

  20. Comparison of Drug Benefits Provided by Veterans Affairs Canada and the Canadian Forces Health Services Group.

    PubMed

    Chow, Matthew; Wicks, Charles J; Ma, Janice; Grenier, Sylvain

    2017-05-23

    Drug benefits are provided at public expense to all actively serving Canadian Armed Forces (CAF) personnel, with ongoing drug coverage offered by Veterans Affairs Canada (VAC) for selected conditions following termination of employment. Differences in drug coverage between these programs could introduce risks for treatment disruption. Work was undertaken to establish a process that would allow systematic comparison of the entire VAC and CAF formularies, and to identify and explain discordant listings in 14 therapeutic categories that pose risk of adverse outcomes with sudden treatment interruption. Lists of medications were created for each program, including regular benefit and restricted use drugs, using files obtained from the claims processor in January 2015. Products were coded using the Anatomical Therapeutic Chemical (ATC) system. Degree of alignment within therapeutic categories was assessed based on the percentage of fifth-level ATCs that were covered in common. Discordantly listed drugs in 14 categories of concern were reviewed to identify similarities in product characteristics. A total of 1124 medications were identified in 80 therapeutic categories. Coverage of medications was identical in 11 categories, and overall, almost three-quarters of identified drugs (73.4%, n = 825) were covered in common by both plans. Many discordant listings reflected known differences in the programs' operating procedures. A number of discrepancies were also identified in newer therapeutic categories. There is significant overlap in the medications covered by the CAF and VAC drug benefit programs. Application of the ATC coding system allowed for discrepancies to be readily identified across the entire formulary, and in specific therapeutic categories of concern. © 2017 Journal of Population Therapeutics and Clinical Pharmacology. All rights reserved.
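    The comparison method described above reduces to set operations over fifth-level ATC codes. A minimal sketch of that computation; the ATC strings follow the real coding scheme, but the formulary contents below are invented for illustration, not the study's data:

    ```python
    # Hypothetical formularies, each drug represented by its fifth-level ATC code.
    vac = {"C09AA02", "C09AA05", "A10BA02", "N02BE01"}   # invented VAC listing
    caf = {"C09AA02", "A10BA02", "N02BE01", "R03AC02"}   # invented CAF listing

    common = vac & caf                 # covered by both plans
    union = vac | caf                  # all identified medications
    discordant = union - common        # covered by one plan but not the other

    pct_common = 100.0 * len(common) / len(union)
    print(f"{pct_common:.1f}% of {len(union)} ATCs covered in common")
    ```

    Discordant codes can then be grouped by their higher-level ATC prefixes to flag the therapeutic categories of concern.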

  1. A survey of physical examination skills taught in undergraduate nursing programs: are we teaching too much?

    PubMed

    Giddens, Jean Foret; Eddy, Linda

    2009-01-01

    Because content saturation is a growing concern, as reflected in the nursing literature, the content taught in undergraduate nursing curricula should be critically examined. The purpose of this descriptive cross-sectional research was to determine and analyze the physical assessment content currently taught in undergraduate nursing programs. A total of 198 individuals teaching in undergraduate nursing programs completed a Web-based survey. Of the 122 skills included on the survey, 81% were reportedly being taught in most of the nursing programs. Total scores for 18 systems-based assessment categories were significantly different between associate and baccalaureate nursing programs in all but three categories: assessment of integument, breast, and female genitals. Previous research has shown that nurses use less than 25% of these same skills regularly in clinical practice, regardless of their educational preparation. Findings from this research raise questions about the breadth to which physical examination content should be taught in undergraduate nursing education.

  2. An English language interface for constrained domains

    NASA Technical Reports Server (NTRS)

    Page, Brenda J.

    1989-01-01

    The Multi-Satellite Operations Control Center (MSOCC) Jargon Interpreter (MJI) demonstrates an English language interface for a constrained domain. A constrained domain is defined as one with a small and well delineated set of actions and objects. The set of actions chosen for the MJI is from the domain of MSOCC Applications Executive (MAE) Systems Test and Operations Language (STOL) directives and contains directives for signing a cathode ray tube (CRT) on or off, calling up or clearing a display page, starting or stopping a procedure, and controlling history recording. The set of objects chosen consists of CRTs, display pages, STOL procedures, and history files. Translation from English sentences to STOL directives is done in two phases. In the first phase, an augmented transition net (ATN) parser and dictionary are used for determining grammatically correct parsings of input sentences. In the second phase, grammatically typed sentences are submitted to a forward-chaining rule-based system for interpretation and translation into equivalent MAE STOL directives. Tests of the MJI show that it is able to translate individual clearly stated sentences into the subset of directives selected for the prototype. This approach to an English language interface may be used for similarly constrained situations by modifying the MJI's dictionary and rules to reflect the change of domain.
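    The two-phase pipeline described above can be caricatured in a few lines: a crude normalization pass standing in for the ATN parse, followed by rule-based translation. The directive strings below are invented placeholders, not actual MAE STOL syntax, and a real ATN parser and forward-chaining rule base are far richer than this sketch:

    ```python
    import re

    # Phase-2 rules: pattern over the normalized sentence -> directive builder.
    # Directive spellings here are hypothetical, for illustration only.
    RULES = [
        (r"sign (?P<crt>crt \d+) on",   lambda m: f"ON {m.group('crt').upper()}"),
        (r"sign (?P<crt>crt \d+) off",  lambda m: f"OFF {m.group('crt').upper()}"),
        (r"clear the display",          lambda m: "PAGE CLEAR"),
        (r"start procedure (?P<p>\w+)", lambda m: f"START {m.group('p').upper()}"),
    ]

    def translate(sentence: str) -> str:
        s = sentence.lower().strip(" .")      # phase 1: normalize (stand-in parse)
        for pattern, action in RULES:         # phase 2: rule interpretation
            m = re.search(pattern, s)
            if m:
                return action(m)
        raise ValueError(f"outside constrained domain: {sentence!r}")

    print(translate("Please sign CRT 3 on."))
    ```

    The point of the constrained-domain design is visible even here: extending coverage means adding rules and dictionary entries, not changing the translation machinery.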

  3. MIROS: A Hybrid Real-Time Energy-Efficient Operating System for the Resource-Constrained Wireless Sensor Nodes

    PubMed Central

    Liu, Xing; Hou, Kun Mean; de Vaulx, Christophe; Shi, Hongling; Gholami, Khalid El

    2014-01-01

    Operating system (OS) technology is significant for the proliferation of the wireless sensor network (WSN). With an outstanding OS, the constrained WSN resources (processor, memory and energy) can be utilized efficiently and user application development can be served soundly. In this article, a new hybrid, real-time, memory-efficient, energy-efficient, user-friendly and fault-tolerant WSN OS, MIROS, is designed and implemented. MIROS implements a hybrid scheduler and a dynamic memory allocator; real-time scheduling can thus be achieved with low memory consumption. In addition, it implements a mid-layer software, EMIDE (Efficient Mid-layer Software for User-Friendly Application Development Environment), to decouple the WSN application from the low-level system. The application programming process can consequently be simplified and the application reprogramming performance improved. Moreover, it combines both software and multi-core hardware techniques to conserve energy resources, improve node reliability, and achieve a new debugging method. To evaluate the performance of MIROS, it is compared with other WSN OSes (TinyOS, Contiki, SOS, openWSN and mantisOS) from different OS concerns. The final evaluation results prove that MIROS is suitable for use even on tightly resource-constrained WSN nodes. It can support real-time WSN applications, and it is energy efficient, user friendly and fault tolerant. PMID:25248069

  4. MIROS: a hybrid real-time energy-efficient operating system for the resource-constrained wireless sensor nodes.

    PubMed

    Liu, Xing; Hou, Kun Mean; de Vaulx, Christophe; Shi, Hongling; El Gholami, Khalid

    2014-09-22

    Operating system (OS) technology is significant for the proliferation of the wireless sensor network (WSN). With an outstanding OS, the constrained WSN resources (processor, memory and energy) can be utilized efficiently and user application development can be served soundly. In this article, a new hybrid, real-time, memory-efficient, energy-efficient, user-friendly and fault-tolerant WSN OS, MIROS, is designed and implemented. MIROS implements a hybrid scheduler and a dynamic memory allocator; real-time scheduling can thus be achieved with low memory consumption. In addition, it implements a mid-layer software, EMIDE (Efficient Mid-layer Software for User-Friendly Application Development Environment), to decouple the WSN application from the low-level system. The application programming process can consequently be simplified and the application reprogramming performance improved. Moreover, it combines both software and multi-core hardware techniques to conserve energy resources, improve node reliability, and achieve a new debugging method. To evaluate the performance of MIROS, it is compared with other WSN OSes (TinyOS, Contiki, SOS, openWSN and mantisOS) from different OS concerns. The final evaluation results prove that MIROS is suitable for use even on tightly resource-constrained WSN nodes. It can support real-time WSN applications, and it is energy efficient, user friendly and fault tolerant.

  5. Missile Guidance Law Based on Robust Model Predictive Control Using Neural-Network Optimization.

    PubMed

    Li, Zhijun; Xia, Yuanqing; Su, Chun-Yi; Deng, Jun; Fu, Jun; He, Wei

    2015-08-01

    In this brief, the use of robust model-based predictive control is investigated for the problem of missile interception. Treating the target acceleration as a bounded disturbance, a novel guidance law based on model predictive control is developed that incorporates the missile's internal constraints. The resulting model predictive control problem can be transformed into a constrained quadratic programming (QP) problem, which may be solved using a linear variational inequality-based primal-dual neural network over a finite receding horizon. Online solutions to multiple parametric QP problems are used so that constrained optimal control decisions can be made in real time. Simulation studies are conducted to illustrate the effectiveness and performance of the proposed guidance law for missile interception.
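    The core computation each receding-horizon step is a box-constrained QP, min 0.5 u'Hu + f'u subject to lo <= u <= hi. The abstract solves it with a primal-dual neural network; in the sketch below a plain projected-gradient iteration stands in, purely to show the QP structure, and H and f are made-up numbers rather than a missile model:

    ```python
    import numpy as np

    def box_qp(H, f, lo, hi, steps=2000):
        """Projected gradient descent for min 0.5 u'Hu + f'u, lo <= u <= hi."""
        u = np.clip(np.zeros_like(f), lo, hi)
        lr = 1.0 / np.linalg.norm(H, 2)          # step from the largest eigenvalue
        for _ in range(steps):
            u = np.clip(u - lr * (H @ u + f), lo, hi)   # gradient step + project
        return u

    H = np.array([[2.0, 0.5], [0.5, 1.0]])       # positive-definite Hessian
    f = np.array([-1.0, -2.0])
    u = box_qp(H, f, lo=-1.0, hi=1.0)            # control clipped at its bound
    ```

    The unconstrained minimizer here is (0, 2); the bound hi = 1 is active on the second component, so the constrained optimum moves to (0.25, 1), exactly the kind of saturation a real-time guidance law must respect.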

  6. Model-independent indirect detection constraints on hidden sector dark matter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elor, Gilly; Rodd, Nicholas L.; Slatyer, Tracy R.

    2016-06-10

    If dark matter inhabits an expanded "hidden sector", annihilations may proceed through sequential decays or multi-body final states. We map out the potential signals and current constraints on such a framework in indirect searches, using a model-independent setup based on multi-step hierarchical cascade decays. While remaining agnostic to the details of the hidden sector model, our framework captures the generic broadening of the spectrum of secondary particles (photons, neutrinos, e+e- and p̄p) relative to the case of direct annihilation to Standard Model particles. We explore how indirect constraints on dark matter annihilation limit the parameter space for such cascade/multi-particle decays. We investigate limits from the cosmic microwave background by Planck, the Fermi measurement of photons from the dwarf galaxies, and positron data from AMS-02. The presence of a hidden sector can change the constraints on the dark matter by up to an order of magnitude in either direction (although the effect can be much smaller). We find that generally the bound from the Fermi dwarfs is most constraining for annihilations to photon-rich final states, while AMS-02 is most constraining for electron and muon final states; however, in certain instances the CMB bounds overtake both, due to their approximate independence of the details of the hidden sector cascade. We provide the full set of cascade spectra considered here as publicly available code with examples at http://web.mit.edu/lns/research/CascadeSpectra.html.

  7. Model-independent indirect detection constraints on hidden sector dark matter

    DOE PAGES

    Elor, Gilly; Rodd, Nicholas L.; Slatyer, Tracy R.; ...

    2016-06-10

    If dark matter inhabits an expanded "hidden sector", annihilations may proceed through sequential decays or multi-body final states. We map out the potential signals and current constraints on such a framework in indirect searches, using a model-independent setup based on multi-step hierarchical cascade decays. While remaining agnostic to the details of the hidden sector model, our framework captures the generic broadening of the spectrum of secondary particles (photons, neutrinos, e+e- and p̄p) relative to the case of direct annihilation to Standard Model particles. We explore how indirect constraints on dark matter annihilation limit the parameter space for such cascade/multi-particle decays. We investigate limits from the cosmic microwave background by Planck, the Fermi measurement of photons from the dwarf galaxies, and positron data from AMS-02. The presence of a hidden sector can change the constraints on the dark matter by up to an order of magnitude in either direction (although the effect can be much smaller). We find that generally the bound from the Fermi dwarfs is most constraining for annihilations to photon-rich final states, while AMS-02 is most constraining for electron and muon final states; however, in certain instances the CMB bounds overtake both, due to their approximate independence of the details of the hidden sector cascade. We provide the full set of cascade spectra considered here as publicly available code with examples at http://web.mit.edu/lns/research/CascadeSpectra.html.

  8. Fingerprint recognition of alien invasive weeds based on the texture character and machine learning

    NASA Astrophysics Data System (ADS)

    Yu, Jia-Jia; Li, Xiao-Li; He, Yong; Xu, Zheng-Hao

    2008-11-01

    A multi-spectral imaging technique based on texture analysis and machine learning was proposed to discriminate alien invasive weeds that have similar outlines but belong to different categories. The objectives of this study were to investigate the feasibility of using multi-spectral imaging, especially the near-infrared (NIR) channel (800 nm +/- 10 nm), to find the weeds' fingerprints, and to validate the performance with specific eigenvalues from the co-occurrence matrix. Veronica polita Pries, Veronica persica Poir, longtube ground ivy, and Laminum amplexicaule Linn. were selected for this study; they have different effects in the field and are alien invasive species in China. Of the weed leaf images, 307 were randomly selected for the calibration set, while the remaining 207 samples formed the prediction set. All images were pretreated with a Wallis filter to adjust for noise from uneven lighting. A gray-level co-occurrence matrix was applied to extract texture characters, which show the density, randomness, correlation, contrast and homogeneity of texture through different algorithms. Three channels (green at 550 nm +/- 10 nm, red at 650 nm +/- 10 nm and NIR at 800 nm +/- 10 nm) were calculated separately to obtain the eigenvalues. Least-squares support vector machines (LS-SVM) were applied to discriminate the weed categories using the eigenvalues from the co-occurrence matrix. Finally, a recognition ratio of 83.35% was obtained with the NIR channel, better than the results for the green channel (76.67%) and red channel (69.46%). The prediction results of 81.35% indicated that the selected eigenvalues reflected the main characteristics of the weeds' fingerprints based on multi-spectral imaging (especially the NIR channel) and the LS-SVM model.
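    A toy version of the texture step: build a gray-level co-occurrence matrix for a horizontal neighbor offset and derive contrast and homogeneity, two of the co-occurrence features the abstract mentions. The 4x4 "leaf image" is synthetic, and the real pipeline's Wallis filtering and LS-SVM classification are omitted:

    ```python
    import numpy as np

    def glcm_features(img, levels):
        """Contrast and homogeneity from a horizontal-offset GLCM."""
        glcm = np.zeros((levels, levels))
        for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
            glcm[a, b] += 1                   # count horizontal neighbor pairs
        p = glcm / glcm.sum()                 # normalize to joint probabilities
        i, j = np.indices(p.shape)
        contrast = np.sum(p * (i - j) ** 2)
        homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
        return contrast, homogeneity

    img = np.array([[0, 0, 1, 1],
                    [0, 0, 1, 1],
                    [0, 2, 2, 2],
                    [2, 2, 3, 3]])            # synthetic 4-level gray image
    contrast, homogeneity = glcm_features(img, levels=4)
    ```

    Feature vectors like (contrast, homogeneity, ...) per spectral channel are what a classifier such as LS-SVM would then separate.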

  9. A Constraint-Based Planner for Data Production

    NASA Technical Reports Server (NTRS)

    Pang, Wanlin; Golden, Keith

    2005-01-01

    This paper presents a graph-based backtracking algorithm designed to support constraint-based planning in data production domains. The algorithm performs backtracking at two nested levels: the outer backtracking follows the structure of the planning graph to select planner subgoals and actions to achieve them, while the inner backtracking searches inside the subproblem associated with a selected action to find action parameter values. We show this algorithm works well in a planner applied to automating data production in an ecological forecasting system. We also discuss how the idea of multi-level backtracking may improve the efficiency of solving semi-structured constraint problems.
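    The two nested backtracking levels can be sketched in a skeletal recursive search: the outer loop picks an action for each subgoal, the inner loop searches that action's parameter domain under its constraints, and failure at either level backtracks. The domain contents below are invented; the actual planner operates over a planning graph, not this toy dictionary:

    ```python
    def plan(subgoals, actions, assignment=None):
        """actions: goal -> {action: (param_domain, constraint(value, partial))}."""
        assignment = assignment or {}
        if not subgoals:
            return assignment
        goal, rest = subgoals[0], subgoals[1:]
        for action, (domain, ok) in actions[goal].items():   # outer backtracking
            for value in domain:                             # inner backtracking
                if ok(value, assignment):
                    result = plan(rest, actions,
                                  {**assignment, goal: (action, value)})
                    if result is not None:
                        return result
        return None                                          # backtrack further up

    actions = {
        "fetch":  {"download": (range(1, 4), lambda v, a: v % 2 == 1)},
        "derive": {"resample": (range(1, 4),
                                lambda v, a: v > a["fetch"][1])},
    }
    solution = plan(["fetch", "derive"], actions)
    ```

    The inner constraint for "derive" depends on the value chosen for "fetch", which is exactly the coupling that forces the outer level to reconsider its choices when an inner subproblem is unsatisfiable.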

  10. Probing Inflation Using Galaxy Clustering On Ultra-Large Scales

    NASA Astrophysics Data System (ADS)

    Dalal, Roohi; de Putter, Roland; Dore, Olivier

    2018-01-01

    A detailed understanding of curvature perturbations in the universe is necessary to constrain theories of inflation. In particular, measurements of the local non-gaussianity parameter, f_NL^loc, enable us to distinguish between two broad classes of inflationary theories, single-field and multi-field inflation. While most single-field theories predict f_NL^loc ≈ -5/12 (n_s - 1), in multi-field theories f_NL^loc is not constrained to this value and is allowed to be observably large. Achieving σ(f_NL^loc) = 1 would give us discovery potential for detecting multi-field inflation, while finding f_NL^loc = 0 would rule out a good fraction of interesting multi-field models. We study the use of galaxy clustering on ultra-large scales to achieve this level of constraint on f_NL^loc. Upcoming surveys such as Euclid and LSST will give us galaxy catalogs from which we can construct the galaxy power spectrum and hence infer a value of f_NL^loc. We consider two possible methods of determining the galaxy power spectrum from a catalog of galaxy positions: the traditional Feldman-Kaiser-Peacock (FKP) power spectrum estimator, and an optimal quadratic estimator (OQE). We implemented and tested each method using mock galaxy catalogs, and compared the resulting constraints on f_NL^loc. We find that the FKP estimator can measure f_NL^loc in an unbiased way, but there remains room for improvement in its precision. We also find that the OQE is not computationally fast, but remains a promising option due to its ability to isolate the power spectrum at large scales. We plan to extend this research to study alternative methods, such as pixel-based likelihood functions. We also plan to study the impact of general relativistic effects at these scales on our ability to measure f_NL^loc.
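    Both estimators discussed above are built on the same elementary ingredient: binning per-mode power |delta_k|^2 from a gridded overdensity field into a 1D spectrum. The toy below shows only that ingredient on a white-noise field; it includes none of the FKP survey-window weighting, shot-noise handling, or OQE machinery:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 64
    delta = rng.standard_normal((n, n))       # toy white-noise overdensity field
    dk = np.fft.fftn(delta)
    power = np.abs(dk) ** 2 / delta.size      # per-mode power estimate

    freqs = np.fft.fftfreq(n)
    kx, ky = np.meshgrid(freqs, freqs, indexing="ij")
    k = np.sqrt(kx**2 + ky**2)                # radial wavenumber per mode

    bins = np.linspace(0, k.max(), 8)
    which = np.digitize(k.ravel(), bins)
    pk = np.array([power.ravel()[which == b].mean() for b in range(1, len(bins))])
    ```

    For unit-variance white noise the binned spectrum is flat near 1, which makes this a convenient sanity check before adding survey weights; the ultra-large-scale (smallest-k) bins contain the fewest modes, which is why those scales carry the largest sample variance in any f_NL^loc measurement.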

  11. Ahead of the game protocol: a multi-component, community sport-based program targeting prevention, promotion and early intervention for mental health among adolescent males.

    PubMed

    Vella, Stewart A; Swann, Christian; Batterham, Marijka; Boydell, Katherine M; Eckermann, Simon; Fogarty, Andrea; Hurley, Diarmuid; Liddle, Sarah K; Lonsdale, Chris; Miller, Andrew; Noetel, Michael; Okely, Anthony D; Sanders, Taren; Telenta, Joanne; Deane, Frank P

    2018-03-21

    There is a recognised need for targeted community-wide mental health strategies and interventions aimed specifically at prevention and early intervention in promoting mental health. Young males are a high need group who hold particularly negative attitudes towards mental health services, and these views are detrimental for early intervention and help-seeking. Organised sports provide a promising context to deliver community-wide mental health strategies and interventions to adolescent males. The aim of the Ahead of the Game program is to test the effectiveness of a multi-component, community-sport based program targeting prevention, promotion and early intervention for mental health among adolescent males. The Ahead of the Game program will be implemented within a sample drawn from community sporting clubs and evaluated using a sample drawn from a matched control community. Four programs are proposed, including two targeting adolescents, one for parents, and one for sports coaches. One adolescent program aims to increase mental health literacy, intentions to seek and/or provide help for mental health, and to decrease stigmatising attitudes. The second adolescent program aims to increase resilience. The goal of the parent program is to increase parental mental health literacy and confidence to provide help. The coach program is intended to increase coaches' supportive behaviours (e.g., autonomy supportive behaviours), and in turn facilitate high-quality motivation and wellbeing among adolescents. Programs will be complemented by a messaging campaign aimed at adolescents to enhance mental health literacy. The effects of the program on adolescent males' psychological distress and wellbeing will also be explored. Organised sports represent a potentially engaging avenue to promote mental health and prevent the onset of mental health problems among adolescent males. 
The community-based design, with samples drawn from an intervention community and a matched control community, enables evaluation of the program's incremental impacts on adolescent males' mental health literacy, help-seeking intentions, stigmatising attitudes, motivation, and resilience. Notable risks to the study include self-selection bias, the non-randomised design, and the translational nature of the program; however, strengths include extensive community input as well as the multi-level and multi-component design. Australian New Zealand Clinical Trials Registry ACTRN12617000709347. Date registered 17 May 2017. Retrospectively registered.

  12. LAWS simulation: Sampling strategies and wind computation algorithms

    NASA Technical Reports Server (NTRS)

    Emmitt, G. D. A.; Wood, S. A.; Houston, S. H.

    1989-01-01

    In general, work has continued on developing and evaluating algorithms designed to manage the Laser Atmospheric Wind Sounder (LAWS) lidar pulses and to compute the horizontal wind vectors from the line-of-sight (LOS) measurements. These efforts fall into three categories: Improvements to the shot management and multi-pair algorithms (SMA/MPA); observing system simulation experiments; and ground-based simulations of LAWS.

  13. Medication management strategies used by older adults with heart failure: A systems-based analysis.

    PubMed

    Mickelson, Robin S; Holden, Richard J

    2017-09-01

    Older adults with heart failure use strategies to cope with the constraining barriers impeding medication management. Strategies are behavioral adaptations that allow goal achievement despite these constraining conditions. When strategies do not exist, or are ineffective or maladaptive, medication performance and health outcomes are at risk. While constraints to medication adherence are described in the literature, the strategies patients use to manage medications are less well described or understood. Guided by cognitive engineering concepts, the aim of this study was to describe and analyze the strategies used by older adults with heart failure to achieve their medication management goals. This mixed-methods study employed an empirical strategies analysis method to elicit the medication management strategies used by older adults with heart failure. Observation and interview data collected from 61 older adults with heart failure and 31 caregivers were analyzed using qualitative content analysis to derive categories, patterns and themes within and across cases. Thematic sub-categories derived from the data described planned and ad hoc methods of strategic adaptation. Stable strategies proactively adjusted the medication management process, the environment, or the patients themselves. Patients applied situational strategies (planned or ad hoc) to irregular or unexpected situations. Medication non-adherence was a strategy employed when life goals conflicted with medication adherence. The health system was a source of constraints without providing commensurate strategies. Patients strove to control their medication system and achieve goals using adaptive strategies. Future patient self-management research can benefit from the methods and theories used to study professional work, such as strategies analysis.

  14. Prior image constrained image reconstruction in emerging computed tomography applications

    NASA Astrophysics Data System (ADS)

    Brunner, Stephen T.

    Advances have been made in computed tomography (CT), especially in the past five years, by incorporating prior images into the image reconstruction process. In this dissertation, we investigate prior image constrained image reconstruction in three emerging CT applications: dual-energy CT, multi-energy photon-counting CT, and cone-beam CT in image-guided radiation therapy. First, we investigate the application of Prior Image Constrained Compressed Sensing (PICCS) in dual-energy CT, which has been called "one of the hottest research areas in CT." Phantom and animal studies are conducted using a state-of-the-art 64-slice GE Discovery 750 HD CT scanner to investigate the extent to which PICCS can enable radiation dose reduction in material density and virtual monochromatic imaging. Second, we extend the application of PICCS from dual-energy CT to multi-energy photon-counting CT, which has been called "one of the 12 topics in CT to be critical in the next decade." Numerical simulations are conducted to generate multiple energy bin images for a photon-counting CT acquisition and to investigate the extent to which PICCS can enable radiation dose efficiency improvement. Third, we investigate the performance of a newly proposed prior image constrained scatter correction technique to correct scatter-induced shading artifacts in cone-beam CT, which, when used in image-guided radiation therapy procedures, can assist in patient localization, and potentially, dose verification and adaptive radiation therapy. Phantom studies are conducted using a Varian 2100 EX system with an on-board imager to investigate the extent to which the prior image constrained scatter correction technique can mitigate scatter-induced shading artifacts in cone-beam CT. 
Results show that these prior image constrained image reconstruction techniques can reduce radiation dose in dual-energy CT by 50% in phantom and animal studies in material density and virtual monochromatic imaging, can lead to radiation dose efficiency improvement in multi-energy photon-counting CT, and can mitigate scatter-induced shading artifacts in cone-beam CT in full-fan and half-fan modes.
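    As I understand the PICCS formulation referenced above, it balances sparsity of the image's difference from a prior image against sparsity of the image itself, weighted by a parameter alpha, subject to data consistency. A minimal evaluation of that objective, with a 1D finite difference standing in for the sparsifying transform and illustrative signals (this sketches the cost being minimized, not a reconstruction algorithm):

    ```python
    import numpy as np

    def piccs_objective(x, x_prior, alpha=0.5):
        """alpha*||D(x - x_prior)||_1 + (1 - alpha)*||Dx||_1, D = finite diff."""
        diff = np.diff                        # stand-in sparsifying transform
        return (alpha * np.abs(diff(x - x_prior)).sum()
                + (1 - alpha) * np.abs(diff(x)).sum())

    x_prior = np.array([0.0, 1.0, 1.0, 0.0])  # prior image (e.g. earlier scan)
    x_close = np.array([0.0, 1.1, 1.1, 0.0])  # candidate consistent with prior
    x_far   = np.array([1.0, 0.0, 2.0, 0.0])  # candidate far from the prior
    assert piccs_objective(x_close, x_prior) < piccs_objective(x_far, x_prior)
    ```

    The prior term is what lets the reconstruction tolerate fewer or noisier measurements, which is the mechanism behind the dose reductions reported above.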

  15. Multi-Tiered Systems of Support Preservice Residency: A Pilot Undergraduate Teacher Preparation Model

    ERIC Educational Resources Information Center

    Ross, Scott Warren; Lignugaris-Kraft, Ben

    2015-01-01

    This case study examined the implementation of a novel nontraditional teacher preparation program, "Multi-Tiered Systems of Support Preservice Residency Project" (MTSS-PR). The two-year program placed general and special education composite undergraduate majors full time in high-need schools implementing evidence-based systems of…

  16. Measuring Provider Performance for Physicians Participating in the Merit-Based Incentive Payment System.

    PubMed

    Squitieri, Lee; Chung, Kevin C

    2017-07-01

    In 2017, the Centers for Medicare and Medicaid Services began requiring all eligible providers to participate in the Quality Payment Program or face financial reimbursement penalty. The Quality Payment Program outlines two paths for provider participation: the Merit-Based Incentive Payment System and Advanced Alternative Payment Models. For the first performance period beginning in January of 2017, the Centers for Medicare and Medicaid Services estimates that approximately 83 to 90 percent of eligible providers will not qualify for participation in an Advanced Alternative Payment Model and therefore must participate in the Merit-Based Incentive Payment System program. The Merit-Based Incentive Payment System path replaces existing quality-reporting programs and adds several new measures to evaluate providers using four categories of data: (1) quality, (2) cost/resource use, (3) improvement activities, and (4) advancing care information. These categories will be combined to calculate a weighted composite score for each provider or provider group. Composite Merit-Based Incentive Payment System scores based on 2017 performance data will be used to adjust reimbursed payment in 2019. In this article, the authors provide relevant background for understanding value-based provider performance measurement. The authors also discuss Merit-Based Incentive Payment System reporting requirements and scoring methodology to provide plastic surgeons with the necessary information to critically evaluate their own practice capabilities in the context of current performance metrics under the Quality Payment Program.
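    The weighted composite described above is, numerically, a simple weighted sum over the four category scores. To the best of my knowledge the first-year weights were quality 60%, cost 0%, improvement activities 15%, and advancing care information 25%, but treat them as illustrative here; the provider's category scores below are invented:

    ```python
    # Illustrative 2017-style category weights (verify against the current
    # CMS final rule before relying on them).
    WEIGHTS = {"quality": 0.60, "cost": 0.00,
               "improvement_activities": 0.15, "advancing_care_info": 0.25}

    def composite(scores):
        """Weighted composite on a 0-100 scale from per-category scores."""
        return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

    provider = {"quality": 80, "cost": 50,
                "improvement_activities": 100, "advancing_care_info": 90}
    print(f"MIPS composite: {composite(provider):.1f}")
    ```

    With a zero cost weight, cost performance is collected but does not move the first-year composite, which is why the remaining three categories dominated 2017 scoring.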

  17. Outcome based state budget allocation for diabetes prevention programs using multi-criteria optimization with robust weights.

    PubMed

    Mehrotra, Sanjay; Kim, Kibaek

    2011-12-01

    We consider the problem of outcome-based budget allocation to chronic disease prevention programs across the United States (US) to achieve greater geographical healthcare equity. We use the Diabetes Prevention and Control Programs (DPCP) of the Centers for Disease Control and Prevention (CDC) as an example. We present a multi-criteria robust weighted sum model for such multi-criteria decision making in a group decision setting. Principal component analysis and an inverse linear programming technique are presented and used to study the actual 2009 budget allocation by the CDC. Our results show that the CDC budget allocation process for the DPCPs was not likely model based. In our empirical study, the relative weights for different prevalence and comorbidity factors and the corresponding budgets obtained under different weight regions are discussed. Parametric analysis suggests that money should be allocated to states to promote diabetes education and to increase patient-healthcare provider interactions to reduce disparity across the US.
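    One way to read the robust weighted-sum idea: score each state under several plausible weight vectors over the criteria and allocate in proportion to the worst-case score, so the allocation is defensible across a whole weight region rather than one weight choice. A tiny numerical illustration with invented numbers (the paper's actual model and weight regions are richer than this):

    ```python
    import numpy as np

    criteria = np.array([[0.9, 0.4],          # rows: states; cols: normalized
                         [0.5, 0.8],          # prevalence, comorbidity scores
                         [0.2, 0.3]])
    weight_region = np.array([[0.7, 0.3],     # candidate weight vectors
                              [0.5, 0.5],     # sampled from a weight region
                              [0.3, 0.7]])

    scores = criteria @ weight_region.T       # state score under each weighting
    robust_score = scores.min(axis=1)         # worst case over the region
    allocation = 100.0 * robust_score / robust_score.sum()   # budget shares (%)
    ```

    A state that looks best only under one extreme weighting gets pulled down by its worst case, which is the robustness property the group-decision setting is after.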

  18. Energy efficient LED layout optimization for near-uniform illumination

    NASA Astrophysics Data System (ADS)

    Ali, Ramy E.; Elgala, Hany

    2016-09-01

    In this paper, we consider the problem of designing an energy-efficient light-emitting diode (LED) layout while satisfying illumination constraints. Towards this objective, we present a simple approach to the illumination design problem based on the concept of the virtual LED. We formulate a constrained optimization problem for minimizing power consumption while maintaining near-uniform illumination throughout the room. By solving the resulting constrained linear program, we obtain the number of required LEDs and the optimal output luminous intensities that achieve the desired illumination constraints.
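    The constrained linear program described above can be sketched directly: choose nonnegative intensities x_j for candidate ("virtual") LED positions to minimize total output while every surface point meets a target illuminance. The 2x3 gain matrix G (lux contributed at each point per unit intensity of each LED) is invented for the example, not derived from a photometric model:

    ```python
    import numpy as np
    from scipy.optimize import linprog

    G = np.array([[0.8, 0.3, 0.1],            # G[i, j]: lux at point i per unit
                  [0.1, 0.3, 0.8]])           # intensity of candidate LED j
    target = np.array([300.0, 300.0])         # required illuminance (lux)

    # minimize sum(x)  s.t.  G @ x >= target,  x >= 0
    res = linprog(c=np.ones(3), A_ub=-G, b_ub=-target, bounds=(0, None))
    assert res.success and np.all(G @ res.x >= target - 1e-6)
    print("intensities:", res.x, "total:", res.fun)
    ```

    Candidate positions whose optimal intensity comes back zero are simply not installed, which is how the LP also yields the number of required LEDs.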

  19. Implementation of remote sensing data for flood forecasting

    NASA Astrophysics Data System (ADS)

    Grimaldi, S.; Li, Y.; Pauwels, V. R. N.; Walker, J. P.; Wright, A. J.

    2016-12-01

    Flooding is one of the most frequent and destructive natural disasters. A timely, accurate and reliable flood forecast can provide vital information for flood preparedness, warning delivery, and emergency response. An operational flood forecasting system typically consists of a hydrologic model, which simulates runoff generation and concentration, and a hydraulic model, which models riverine flood wave routing and floodplain inundation. However, these two types of models suffer from various sources of uncertainty, e.g., forcing data, initial conditions, model structure and parameters. To reduce those uncertainties, current forecasting systems are typically calibrated and/or updated using streamflow measurements, so such applications are limited to well-gauged areas. The recent increasing availability of spatially distributed remote sensing (RS) data offers new opportunities for flood event investigation and forecasting. Based on an Australian case study, this presentation will discuss the use of 1) RS soil moisture data to constrain a hydrologic model, and 2) RS-derived flood extent and levels to constrain a hydraulic model. The hydrologic model is based on a semi-distributed system coupling the two-soil-layer rainfall-runoff model GRKAL with a linear Muskingum routing model. Model calibration was performed using either 1) streamflow data only or 2) both streamflow and RS soil moisture data. The model was then further constrained through the integration of real-time soil moisture data. The hydraulic model is based on LISFLOOD-FP, which solves the 2D inertial approximation of the Shallow Water Equations. Streamflow data and RS-derived flood extent and levels were used to apply a multi-objective calibration protocol. The effectiveness with which each data source or combination of data sources constrained the parameter space was quantified and discussed.

  20. A comparison of acceleration methods for solving the neutron transport k-eigenvalue problem

    NASA Astrophysics Data System (ADS)

    Willert, Jeffrey; Park, H.; Knoll, D. A.

    2014-10-01

    Over the past several years, a number of papers have been written describing modern techniques for numerically computing the dominant eigenvalue of the neutron transport criticality problem. These methods fall into two distinct categories. The first category of methods rewrites the multi-group k-eigenvalue problem as a nonlinear system of equations and solves the resulting system using either a Jacobian-Free Newton-Krylov (JFNK) method or Nonlinear Krylov Acceleration (NKA), a variant of Anderson Acceleration. These methods are generally successful in significantly reducing the number of transport sweeps required to compute the dominant eigenvalue. The second category of methods utilizes Moment-Based Acceleration (or High-Order/Low-Order (HOLO) Acceleration). These methods solve a sequence of modified diffusion eigenvalue problems whose solutions converge to the solution of the original transport eigenvalue problem. This second class of methods is, in our experience, always superior to the first, as most of the computational work is eliminated by the acceleration from the LO diffusion system. In this paper, we review each of these methods. Our computational results support our claim that the choice of which nonlinear solver to use, JFNK or NKA, should be secondary. The primary computational savings result from the implementation of a HOLO algorithm. We display computational results for a series of challenging multi-dimensional test problems.
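    For context, the unaccelerated baseline that both categories of methods aim to beat is ordinary power iteration on the generalized eigenproblem M φ = (1/k) F φ. A small dense-matrix sketch (toy loss and fission operators standing in for a real transport sweep; not from the paper):

```python
import numpy as np

def power_iteration_k(M, F, tol=1e-10, max_iter=500):
    """Power iteration for M phi = (1/k) F phi.

    Each pass solves the loss operator against the lagged fission
    source -- the dense analogue of one transport sweep per iterate.
    """
    phi = np.ones(M.shape[0])
    k = 1.0
    for _ in range(max_iter):
        src = F @ phi / k                     # lagged fission source
        phi_new = np.linalg.solve(M, src)     # "sweep" against loss operator
        k_new = k * (F @ phi_new).sum() / (F @ phi).sum()
        if abs(k_new - k) < tol:
            return k_new, phi_new
        k, phi = k_new, phi_new / np.linalg.norm(phi_new)
    return k, phi
```

    JFNK/NKA reformulate this fixed-point map as a nonlinear residual, while HOLO replaces most sweeps with cheap low-order diffusion solves; both reduce the iteration count this loop would need.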

  1. Precision Spectroscopy, Diode Lasers, and Optical Frequency Measurement Technology

    NASA Technical Reports Server (NTRS)

    Hollberg, Leo (Editor); Fox, Richard (Editor); Waltman, Steve (Editor); Robinson, Hugh

    1998-01-01

    This compilation is a selected set of reprints from the Optical Frequency Measurement Group of the Time and Frequency Division of the National Institute of Standards and Technology, and consists of work published between 1987 and 1997. The two main programs represented here are (1) development of tunable diode-laser technology for scientific applications and precision measurements, and (2) research toward the goal of realizing optical-frequency measurements and synthesis. The papers are organized chronologically in five, somewhat arbitrarily chosen categories: Diode Laser Technology, Tunable Laser Systems, Laser Spectroscopy, Optical Synthesis and Extended Wavelength Coverage, and Multi-Photon Interactions and Optical Coherences.

  2. Attitudes and lifestyle changes following Jog your Mind: results from a multi-factorial community-based program promoting cognitive vitality among seniors

    PubMed Central

    Laforest, Sophie; Lorthios-Guilledroit, Agathe; Nour, Kareen; Parisien, Manon; Fournier, Michel; Ellemberg, Dave; Guay, Danielle; Desgagnés-Cyr, Charles-Émile; Bier, Nathalie

    2017-01-01

    Abstract This study examined the effects on attitudes and lifestyle behavior of Jog your Mind, a multi-factorial community-based program promoting cognitive vitality among seniors with no known cognitive impairment. A quasi-experimental study was conducted. Twenty-three community organizations were assigned either to the experimental group (offering the program) or to the control group (creating a waiting list). They recruited 294 community-dwelling seniors. The aims of the study were to verify the effects of the program on attitudes and behaviors related to cognitive vitality and to explore its effects on cognitive vitality. Data were collected at baseline and after the program. Regression analyses revealed that, following their participation in the program, experimental group participants reported: (i) in terms of attitudes, having a greater feeling of control concerning their cognitive capacities, (ii) in terms of behaviors, using significantly more memory strategies and practicing more physical activity and stimulating activities than control group participants. However, the program had no significant effects on measures of cognitive vitality. This study supports the fact that a multi-factorial community-based program can have significant effects on seniors’ attitudes and lifestyle behaviors related to cognitive vitality, although in the short term no effects on cognitive vitality itself were found. PMID:28334988

  3. Exploring dangerous neighborhoods: Latent Semantic Analysis and computing beyond the bounds of the familiar

    PubMed Central

    Cohen, Trevor; Blatter, Brett; Patel, Vimla

    2005-01-01

    Certain applications require computer systems to approximate intended human meaning. This is achievable in constrained domains with a finite number of concepts. Areas such as psychiatry, however, draw on concepts from the world-at-large. A knowledge structure with broad scope is required to comprehend such domains. Latent Semantic Analysis (LSA) is an unsupervised corpus-based statistical method that derives quantitative estimates of the similarity between words and documents from their contextual usage statistics. The aim of this research was to evaluate the ability of LSA to derive meaningful associations between concepts relevant to the assessment of dangerousness in psychiatry. An expert reference model of dangerousness was used to guide the construction of a relevant corpus. Derived associations between words in the corpus were evaluated qualitatively. A similarity-based scoring function was used to assign dangerousness categories to discharge summaries. LSA was shown to derive intuitive relationships between concepts and correlated significantly better than random with human categorization of psychiatric discharge summaries according to dangerousness. The use of LSA to derive a simulated knowledge structure can extend the scope of computer systems beyond the boundaries of constrained conceptual domains. PMID:16779020
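    The core of LSA as described, deriving similarity estimates from contextual usage statistics, can be sketched as a truncated SVD of a term-document count matrix. The tiny corpus, rank, and scoring below are illustrative assumptions, not the study's clinical materials:

```python
import numpy as np

# hypothetical miniature corpus (stand-in for the dangerousness corpus)
docs = [
    "patient threatened staff with violence",
    "patient expressed violent thoughts toward staff",
    "patient discharged home in stable condition",
]
vocab = sorted({w for d in docs for w in d.split()})

# term-document count matrix (terms are rows, documents are columns)
X = np.array([[d.split().count(w) for d in docs] for w in vocab], float)

# truncated SVD projects documents into a low-rank "semantic" space
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T   # one latent vector per document

def cos(a, b):
    """Cosine similarity, the usual LSA scoring function."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
```

    In this latent space the two threat-related documents score as more similar to each other than either does to the benign discharge summary, which is the kind of similarity-based category assignment the study evaluates.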

  4. An algorithm for the solution of dynamic linear programs

    NASA Technical Reports Server (NTRS)

    Psiaki, Mark L.

    1989-01-01

    The algorithm's objective is to efficiently solve Dynamic Linear Programs (DLP) by taking advantage of their special staircase structure. This algorithm constitutes a stepping stone to an improved algorithm for solving Dynamic Quadratic Programs, which, in turn, would make the nonlinear programming method of Successive Quadratic Programs more practical for solving trajectory optimization problems. The ultimate goal is to bring trajectory optimization solution speeds into the realm of real-time control. The algorithm exploits the staircase nature of the large constraint matrix of the equality-constrained DLPs encountered when solving inequality-constrained DLPs by an active set approach. A numerically stable, staircase QL factorization of the staircase constraint matrix is carried out starting from its last rows and columns. The resulting recursion is like the time-varying Riccati equation from multi-stage LQR theory. The resulting factorization increases the efficiency of all of the typical LP solution operations over that of a dense matrix LP code, while numerical stability is ensured. The algorithm also takes advantage of dynamic programming ideas about the cost-to-go by relaxing active pseudo constraints in a backwards sweeping process. This further decreases the cost per update of the LP rank-1 updating procedure, although it may result in more changes of the active set than if pseudo constraints were relaxed in a non-stagewise fashion. The usual stability of closed-loop Linear/Quadratic optimally-controlled systems, if it carries over to strictly linear cost functions, implies that the savings due to reduced factor update effort may outweigh the cost of an increased number of updates. An aerospace example is presented in which a ground-to-ground rocket's distance is maximized. This example demonstrates the applicability of this class of algorithms to aerospace guidance. It also sheds light on the efficacy of the proposed pseudo constraint relaxation scheme.
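    For illustration, the staircase structure the algorithm exploits arises when the stage dynamics are stacked as equality constraints. A sketch of assembling such a matrix (symbols `A`, `B`, `T` are assumptions for this example, not taken from the paper):

```python
import numpy as np

def dlp_staircase(A, B, T):
    """Assemble the equality-constraint matrix of a T-stage dynamic LP
    with dynamics x_{t+1} = A x_t + B u_t; decision variables are
    stacked as [x_0, u_0, x_1, u_1, ..., x_T].  The nonzero blocks
    form the staircase pattern a specialized QL factorization exploits.
    """
    n, m = A.shape[0], B.shape[1]
    M = np.zeros((T * n, (T + 1) * n + T * m))
    for t in range(T):
        r, c = t * n, t * (n + m)
        M[r:r + n, c:c + n] = A                        # x_t block
        M[r:r + n, c + n:c + n + m] = B                # u_t block
        M[r:r + n, c + n + m:c + 2 * n + m] = -np.eye(n)  # -x_{t+1} block
    return M
```

    Each block row overlaps only its neighbors, so a factorization sweeping from the last rows and columns, as the abstract describes, never fills in the zero blocks a dense LP code would carry.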

  5. "Talkin' about a revolution": How electronic health records can facilitate the scale-up of HIV care and treatment and catalyze primary care in resource-constrained settings.

    PubMed

    Braitstein, Paula; Einterz, Robert M; Sidle, John E; Kimaiyo, Sylvester; Tierney, William

    2009-11-01

    Health care for patients with HIV infection in developing countries has increased substantially in response to major international funding. Scaling up treatment programs requires timely data on the type, quantity, and quality of care being provided. Increasingly, such programs are turning to electronic health records (EHRs) to provide these data. We describe how a medical school in the United States and another in Kenya collaborated to develop and implement an EHR in a large HIV/AIDS care program in western Kenya. These data were used to manage patients, providers, and the program itself as it grew to encompass 18 sites serving more than 90,000 patients. Lessons learned have been applicable beyond HIV/AIDS to include primary care, chronic disease management, and community-based health screening and disease prevention programs. EHRs will be key to providing the highest possible quality of care for the funds developing countries can commit to health care. Public, private, and academic partnerships can facilitate the development and implementation of EHRs in resource-constrained settings.

  6. Situation analysis of the National Comprehensive Cancer Control Program (2013) in the I. R. of Iran; assessment and recommendations based on the IAEA imPACT mission.

    PubMed

    Rouhollahi, Mohammad Reza; Mohagheghi, Mohammad Ali; Mohammadrezai, Narges; Ghiasvand, Reza; Ghanbari Motlagh, Ali; Harirchi, Iraj; Zendehdel, Kazem

    2014-04-01

    Iran was engaged in the Program of Action for Cancer Therapy (PACT) in 2012, and delegates from the International Atomic Energy Agency (IAEA) and the World Health Organization (WHO) evaluated the National Cancer Control Program (NCCP) status (the imPACT mission), based on which they provided recommendations for improvement of the NCCP in the I.R. of Iran. We report the results of this situational analysis and discuss the recommendations and their implications for the promotion of the NCCP in the I.R. of Iran. International delegates visited the I.R. of Iran and evaluated different aspects and capacities of the NCCP in Iran. In addition, a Farsi version of the WHO/IAEA self-assessment tool was completed by local experts and stakeholders, including experts from different departments of the Ministry of Health and Medical Education (MOHME) and representatives from the National Cancer Research Network (NCRN). Following these evaluations, the PACT office provided recommendations for improving the NCCP in Iran. Almost all the recommendations were endorsed by MOHME. The PACT program provided 31 recommendations for improvement of the NCCP in Iran in six categories, including planning, cancer registration and information, prevention, early detection, diagnosis and treatment, and palliative care. The most important recommendation was to establish a strong, multi-sectoral NCCP committee and develop an updated national cancer control program. The imPACT mission report provided a comprehensive view of the NCCP status in Iran. An appropriate response to these recommendations and filling the observed gaps will improve the NCCP status in the I.R. of Iran.

  7. Robust, Efficient Depth Reconstruction With Hierarchical Confidence-Based Matching.

    PubMed

    Sun, Li; Chen, Ke; Song, Mingli; Tao, Dacheng; Chen, Gang; Chen, Chun

    2017-07-01

    In recent years, taking photos and capturing videos with mobile devices have become increasingly popular. Emerging applications based on the depth reconstruction technique have been developed, such as Google lens blur. However, depth reconstruction is difficult due to occlusions, non-diffuse surfaces, repetitive patterns, and textureless surfaces, and it has become more difficult due to the unstable image quality and uncontrolled scene condition in the mobile setting. In this paper, we present a novel hierarchical framework with multi-view confidence-based matching for robust, efficient depth reconstruction in uncontrolled scenes. Particularly, the proposed framework combines local cost aggregation with global cost optimization in a complementary manner that increases efficiency and accuracy. A depth map is efficiently obtained in a coarse-to-fine manner by using an image pyramid. Moreover, confidence maps are computed to robustly fuse multi-view matching cues, and to constrain the stereo matching on a finer scale. The proposed framework has been evaluated with challenging indoor and outdoor scenes, and has achieved robust and efficient depth reconstruction.

  8. Resource Management in Constrained Dynamic Situations

    NASA Astrophysics Data System (ADS)

    Seok, Jinwoo

    Resource management is considered in this dissertation for systems with limited resources, possibly combined with other system constraints, in unpredictably dynamic environments. Resources may represent fuel, power, capabilities, energy, and so on. Resource management is important for many practical systems; usually, resources are limited, and their use must be optimized. Furthermore, systems are often constrained, and constraints must be satisfied for safe operation. Simplistic resource management can result in poor use of resources and failure of the system. Furthermore, many real-world situations involve dynamic environments. Many traditional problems are formulated based on the assumptions of given probabilities or perfect knowledge of future events. However, in many cases, the future is completely unknown, and information on or probabilities about future events are not available. In other words, we operate in unpredictably dynamic situations. Thus, a method is needed to handle dynamic situations without knowledge of the future, but few formal methods have been developed to address them. Thus, the goal is to design resource management methods for constrained systems, with limited resources, in unpredictably dynamic environments. To this end, resource management is organized hierarchically into two levels: 1) planning, and 2) control. In the planning level, the set of tasks to be performed is scheduled based on limited resources to maximize resource usage in unpredictably dynamic environments. In the control level, the system controller is designed to follow the schedule by considering all the system constraints for safe and efficient operation. Consequently, this dissertation is mainly divided into two parts: 1) planning level design, based on finite state machines, and 2) control level methods, based on model predictive control. 
We define a recomposable restricted finite state machine to handle limited resource situations and unpredictably dynamic environments for the planning level. To obtain a policy, dynamic programming is applied, and to obtain a solution, limited breadth-first search is applied to the recomposable restricted finite state machine. A multi-function phased array radar resource management problem and an unmanned aerial vehicle patrolling problem are treated using recomposable restricted finite state machines. Then, we use model predictive control for the control level, because it allows constraint handling and setpoint tracking for the schedule. An aircraft power system management problem is treated that aims to develop an integrated control system for an aircraft gas turbine engine and electrical power system using rate-based model predictive control. Our results indicate that at the planning level, limited breadth-first search for recomposable restricted finite state machines generates good scheduling solutions in limited resource situations and unpredictably dynamic environments. The importance of cooperation in the planning level is also verified. At the control level, a rate-based model predictive controller allows good schedule tracking and safe operations. The importance of considering the system constraints and interactions between the subsystems is indicated. For the best resource management in constrained dynamic situations, the planning level and the control level need to be considered together.
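    A limited breadth-first search over schedules can be sketched generically; the task set, costs, rewards, and beam width below are illustrative assumptions and not the dissertation's recomposable restricted finite state machine:

```python
def limited_bfs_schedule(tasks, budget, beam_width=3):
    """Bounded breadth-first search over task orderings.

    tasks      -- dict: name -> (resource_cost, reward); a stand-in for
                  FSM transitions consuming a shared resource
    budget     -- total resource available
    beam_width -- partial schedules kept per level (the "limited" part)
    """
    frontier = [((), 0, 0)]          # (schedule, resource spent, reward)
    best = ((), 0, 0)
    while frontier:
        nxt = []
        for sched, spent, reward in frontier:
            for name, (cost, rew) in tasks.items():
                if name in sched or spent + cost > budget:
                    continue         # infeasible transition: skip
                cand = (sched + (name,), spent + cost, reward + rew)
                nxt.append(cand)
                if cand[2] > best[2]:
                    best = cand
        # keep only the most promising partial schedules at each depth
        frontier = sorted(nxt, key=lambda c: -c[2])[:beam_width]
    return best
```

    Capping the frontier keeps the search tractable under unpredictable dynamics: when the environment changes, the search is simply rerun from the new state rather than relying on a precomputed probability model.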

  9. Energy Operation Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Energy Operation Model (EOM) simulates the operation of the electric grid at the zonal scale, including inter-zonal transmission constraints. It generates the production cost, power generation by plant and category, fuel usage, and locational marginal price (LMP), with a flexible way to constrain power production by environmental conditions (e.g., heat waves, drought conditions). Unlike commercial software such as PROMOD IV, where generator capacity and heat rate efficiency can only be adjusted on a monthly basis, EOM calculates capacity impacts and plant efficiencies based on hourly ambient conditions (air temperature and humidity) and cooling water availability for thermal plants. What is still missing is a hydropower dispatch capability.

  10. High-throughput screening of a diversity collection using biodefense category A and B priority pathogens.

    PubMed

    Barrow, Esther W; Clinkenbeard, Patricia A; Duncan-Decocq, Rebecca A; Perteet, Rachel F; Hill, Kimberly D; Bourne, Philip C; Valderas, Michelle W; Bourne, Christina R; Clarkson, Nicole L; Clinkenbeard, Kenneth D; Barrow, William W

    2012-08-01

    One of the objectives of the National Institute of Allergy and Infectious Diseases (NIAID) Biodefense Program is to identify or develop broad-spectrum antimicrobials for use against bioterrorism pathogens and emerging infectious agents. As a part of that program, our institution has screened the 10,000-compound MyriaScreen Diversity Collection of high-purity druglike compounds against three NIAID category A and one category B priority pathogens in an effort to identify potential compound classes for further drug development. The effective use of a Clinical and Laboratory Standards Institute-based high-throughput screening (HTS) 96-well-based format allowed for the identification of 49 compounds that had in vitro activity against all four pathogens with minimum inhibitory concentration values of ≤16 µg/mL. Adaptation of the HTS process was necessary to conduct the work in higher-level containment, in this case, biosafety level 3. Examination of chemical scaffolds shared by some of the 49 compounds and assessment of available chemical databases indicates that several may represent broad-spectrum antimicrobials whose activity is based on novel mechanisms of action.

  11. Brief Report: Coaching Adolescents with Autism Spectrum Disorder in a School-Based Multi-Sport Program

    ERIC Educational Resources Information Center

    Rosso, Edoardo G.

    2016-01-01

    While physical activity (PA) is often overwhelming for people with ASD, appropriate engagement strategies can result in increased motivation to participate and associated physical and psychosocial benefits. In this framework, the multi-sport Supporting Success program aims to inform good-practice coaching strategies for community coaches to engage…

  12. A Standardization Evaluation Potential Study of the Common Multi-Mode Radar Program.

    DTIC Science & Technology

    1979-11-01

    Radar, the RX (RF-16 etc.), Enhanced Tactical Fighter (ETF), and A-7. Candidate radar systems applicable to the Common Multi-Mode Radar Program... Parameter definitions from the report include: RSTC, resupply time to overseas located bases (hours); RSTO, depot stock safety factor (standard deviations); DLY, shipping time to depot from CONUS.

  13. Technical Assistance as a Prevention Capacity-Building Tool: A Demonstration Using the Getting to Outcomes[R] Framework

    ERIC Educational Resources Information Center

    Hunter, Sarah B.; Chinman, Matthew; Ebener, Patricia; Imm, Pam; Wandersman, Abraham; Ryan, Gery W.

    2009-01-01

    Demands on community-based prevention programs for performance accountability and positive outcomes are ever increasing in the face of constrained resources. Relatively little is known about how technical assistance (TA) should be structured to benefit community-based organizations and to lead to better outcomes. In this study, data from multiple…

  14. Memory color assisted illuminant estimation through pixel clustering

    NASA Astrophysics Data System (ADS)

    Zhang, Heng; Quan, Shuxue

    2010-01-01

    The under-constrained nature of illuminant estimation means that certain assumptions, such as the gray-world theory, are needed to resolve the problem. Including more constraints in this process may help exploit the useful information in an image and improve the accuracy of the estimated illuminant, provided that the constraints hold. Based on the observation that most personal images have contents of one or more of the following categories: neutral objects, human beings, sky, and plants, we propose a method for illuminant estimation through the clustering of pixels of gray and three dominant memory colors: skin tone, sky blue, and foliage green. Analysis shows that samples of the above colors cluster around small areas under different illuminants, and their characteristics can be used to effectively detect pixels falling into each of the categories. The algorithm requires knowledge of the spectral sensitivity response of the camera, and a spectral database consisting of the CIE standard illuminants and reflectance or radiance databases of samples of the above colors.
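    As a hedged sketch of the gray-world assumption the abstract invokes (not the proposed memory-color algorithm, which additionally clusters skin, sky and foliage pixels), an illuminant estimate restricted to near-neutral pixels might look like:

```python
import numpy as np

def grayworld_illuminant(img, mask=None):
    """Gray-world illuminant estimate, optionally restricted to pixels
    classified as near-neutral -- a crude stand-in for the paper's
    gray/skin/sky/foliage pixel clusters."""
    px = img.reshape(-1, 3).astype(float)
    if mask is not None:
        px = px[mask.reshape(-1)]
    rgb = px.mean(axis=0)
    return rgb / rgb.sum()          # normalized illuminant chromaticity

def near_neutral_mask(img, tol=0.08):
    """Flag pixels whose chromaticity is close to neutral (1/3, 1/3, 1/3)."""
    px = img.reshape(-1, 3).astype(float)
    chrom = px / np.maximum(px.sum(axis=1, keepdims=True), 1e-9)
    return np.abs(chrom - 1 / 3).max(axis=1) < tol
```

    Restricting the average to pixels believed to be gray (or, as in the paper, to memory-color clusters with known chromaticity loci) is what makes the otherwise under-constrained estimate usable.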

  15. Handheld Multi-Gas Meters Assessment Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Gustavious; Wald-Hopkins, Mark David; Obrey, Stephen J.

    2016-06-27

    Handheld multi-gas meters (MGMs) are equipped with sensors to monitor oxygen (O2) levels and additional sensors to detect the presence of combustible or toxic gases in the environment. This report is limited to operational response-type MGMs that include at least four different sensors. These sensors can vary by type and by the monitored chemical. In real time, the sensors report the concentration of monitored gases in the atmosphere near the MGM. In April 2016, the System Assessment and Validation for Emergency Responders (SAVER) Program conducted an operationally-oriented assessment of MGMs. Five MGMs were assessed by emergency responders. The criteria and scenarios used in this assessment were derived from the results of a focus group of emergency responders with experience in using MGMs. The assessment addressed 16 evaluation criteria in four SAVER categories: Usability, Capability, Maintainability, and Deployability.

  16. Implementation of multi-professional healthcare residency at a federal university: historical trajectory.

    PubMed

    Martins, Gabriela Del Mestre; Caregnato, Rita Catalina Aquino; Barroso, Véra Lucia Maciel; Ribas, Daniela Celiva Pedrotti

    2016-08-25

    To retrieve the historical trajectory of the implementation of a multi-professional healthcare residency at the Universidade Federal de Ciências da Saúde de Porto Alegre, in partnership with the Santa Casa de Misericórdia de Porto Alegre. Historical research based on oral history. Interviews were conducted with six professionals of both institutions from October to December 2013. The data were subjected to content analysis. The oral histories led to three thematic categories, as follows: Strengthening the involved institutions; Professional qualification for intensive care; and Programme implementation. The historical trajectory of a multi-professional healthcare residency programme revealed the efforts of linking teaching and service to better qualify healthcare professionals and strengthen healthcare teams, and consequently change the hegemonic medical assistance model.

  17. Uncoordinated MAC for Adaptive Multi-Beam Directional Networks: Analysis and Evaluation

    DTIC Science & Technology

    2016-04-10

    transmission times, hence traditional CSMA approaches are not appropriate. We first present our model of these multi-beamforming capabilities and the... resulting wireless interference. We then derive an upper bound on multi-access performance for an idealized version of this physical layer. We then present... transmissions and receptions in a mobile ad-hoc network has in practice led to very constrained topologies. As mentioned, one approach for system design is to de

  18. Multi Robot Path Planning for Budgeted Active Perception with Self-Organising Maps

    DTIC Science & Technology

    2016-10-04

    Multi-Robot Path Planning for Budgeted Active Perception with Self-Organising Maps. Graeme Best, Jan Faigl and Robert Fitch. Abstract: We propose a... optimise paths for a multi-robot team that aims to maximally observe a set of nodes in the environment. The selected nodes are observed by visiting... regions, each node has an observation reward, and the robots are constrained by travel budgets. The SOM algorithm jointly selects and allocates nodes

  19. Building Multi-Discipline, Multi-Format Digital Libraries Using Clusters and Buckets. Degree awarded by Old Dominion Univ. on Aug. 1997

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.

    1997-01-01

    Our objective was to study the feasibility of extending the Dienst protocol to enable a multi-discipline, multi-format digital library. We implemented two new technologies: cluster functionality and publishing buckets. We have designed a possible implementation of clusters and buckets, and have prototyped some aspects of the resultant digital library. Currently, digital libraries are segregated by the disciplines they serve (computer science, aeronautics, etc.) and by the format of their holdings (reports, software, datasets, etc.). NCSTRL+ is a multi-discipline, multi-format digital library (DL) prototype created to explore the design and implementation issues involved with creating a unified, canonical scientific and technical information (STI) DL. NCSTRL+ is based on the Networked Computer Science Technical Report Library (NCSTRL), a World Wide Web (WWW) accessible DL that provides access to over 80 university departments and laboratories. We have extended the Dienst protocol (version 4.1.8), the protocol underlying NCSTRL, to provide the ability to cluster independent collections into a logically centralized DL based upon subject category classification, type of organization, and genre of material. The concept of buckets provides a mechanism for publishing and managing logically linked entities with multiple data formats.

  20. Taylor O(h³) Discretization of ZNN Models for Dynamic Equality-Constrained Quadratic Programming With Application to Manipulators.

    PubMed

    Liao, Bolin; Zhang, Yunong; Jin, Long

    2016-02-01

    In this paper, a new Taylor-type numerical differentiation formula is first presented to discretize the continuous-time Zhang neural network (ZNN), and obtain higher computational accuracy. Based on the Taylor-type formula, two Taylor-type discrete-time ZNN models (termed Taylor-type discrete-time ZNNK and Taylor-type discrete-time ZNNU models) are then proposed and discussed to perform online dynamic equality-constrained quadratic programming. For comparison, Euler-type discrete-time ZNN models (called Euler-type discrete-time ZNNK and Euler-type discrete-time ZNNU models) and Newton iteration, with interesting links being found, are also presented. It is proved herein that the steady-state residual errors of the proposed Taylor-type discrete-time ZNN models, Euler-type discrete-time ZNN models, and Newton iteration have the patterns of O(h³), O(h²), and O(h), respectively, with h denoting the sampling gap. Numerical experiments, including the application examples, are carried out, of which the results further substantiate the theoretical findings and the efficacy of Taylor-type discrete-time ZNN models. Finally, the comparisons with the Taylor-type discrete-time derivative model and other Lagrange-type discrete-time ZNN models for dynamic equality-constrained quadratic programming substantiate the superiority of the proposed Taylor-type discrete-time ZNN models once again.
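    The O(h), O(h²), O(h³) error patterns can be checked empirically for any discretization formula by halving h and measuring the observed convergence order. The sketch below uses generic one-sided differences (not the paper's specific Taylor-type ZNN formula) purely to illustrate the measurement:

```python
import math

def fwd_diff(f, x, h):
    """Forward (Euler-type) difference: truncation error O(h)."""
    return (f(x + h) - f(x)) / h

def bwd3_diff(f, x, h):
    """Three-point backward difference: truncation error O(h^2)."""
    return (3 * f(x) - 4 * f(x - h) + f(x - 2 * h)) / (2 * h)

def emp_order(rule, f, df, x, h):
    """Observed convergence order from errors at h and h/2."""
    e1 = abs(rule(f, x, h) - df(x))
    e2 = abs(rule(f, x, h / 2) - df(x))
    return math.log2(e1 / e2)
```

    A higher-order formula buys a faster-shrinking steady-state residual for the same sampling gap, which is exactly the advantage the paper proves for its O(h³) Taylor-type models over Euler-type ones.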

  1. Landslide Susceptibility Assessment Using Spatial Multi-Criteria Evaluation Model in Rwanda.

    PubMed

    Nsengiyumva, Jean Baptiste; Luo, Geping; Nahayo, Lamek; Huang, Xiaotao; Cai, Peng

    2018-01-31

    Landslide susceptibility assessment must be conducted to identify prone areas and guide risk management. Landslides are very deadly disasters in Rwanda. The current research aimed to conduct landslide susceptibility assessment by applying a Spatial Multi-Criteria Evaluation Model with eight layers of causal factors: slope, distance to roads, lithology, precipitation, soil texture, soil depth, altitude and land cover. In total, 980 past landslide locations were mapped. The relationship between landslide factors and the inventory map was calculated using the Spatial Multi-Criteria Evaluation. The results revealed that susceptibility is spatially distributed countrywide, with 42.3% of the region classified from moderate to very high susceptibility; this area is inhabited by 49.3% of the total population. In addition, the Provinces with high to very high susceptibility are West, North and South (40.4%, 22.8% and 21.5%, respectively). The Eastern Province has the largest share in the low susceptibility category (87.8%), with no very high susceptibility (0%). Based on these findings, the employed model produced accurate and reliable susceptibility estimates: 49.5% of past landslides fell within the very high susceptibility category, which confirms the model's performance. The outcomes of this study will be useful for future initiatives related to landslide risk reduction and management.
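    A weighted linear combination of standardized factor rasters is the usual core of a spatial multi-criteria evaluation; a minimal sketch (layer names and weights below are assumptions for illustration, not the study's calibrated values):

```python
import numpy as np

def smce_susceptibility(layers, weights):
    """Weighted linear combination of min-max standardized causal-factor
    rasters -- the core aggregation step of a spatial multi-criteria
    evaluation.  Returns a susceptibility index in [0, 1]."""
    stack = []
    for name, w in weights.items():
        layer = layers[name].astype(float)
        lo, hi = layer.min(), layer.max()
        # standardize each factor to [0, 1] before weighting
        std = (layer - lo) / (hi - lo) if hi > lo else np.zeros_like(layer)
        stack.append(w * std)
    return sum(stack) / sum(weights.values())
```

    The continuous index is then sliced into classes (e.g., low to very high) and validated against the landslide inventory, as done with the 980 mapped locations in the study.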

  2. Landslide Susceptibility Assessment Using Spatial Multi-Criteria Evaluation Model in Rwanda

    PubMed Central

    Nsengiyumva, Jean Baptiste; Luo, Geping; Nahayo, Lamek; Huang, Xiaotao; Cai, Peng

    2018-01-01

    Landslide susceptibility assessment must be conducted to identify prone areas and guide risk management. Landslides are very deadly disasters in Rwanda. The current research aimed to conduct landslide susceptibility assessment by applying a Spatial Multi-Criteria Evaluation Model with eight layers of causal factors: slope, distance to roads, lithology, precipitation, soil texture, soil depth, altitude and land cover. In total, 980 past landslide locations were mapped. The relationship between landslide factors and the inventory map was calculated using the Spatial Multi-Criteria Evaluation. The results revealed that susceptibility is spatially distributed countrywide, with 42.3% of the region classified from moderate to very high susceptibility; this area is inhabited by 49.3% of the total population. In addition, the Provinces with high to very high susceptibility are West, North and South (40.4%, 22.8% and 21.5%, respectively). The Eastern Province has the largest share in the low susceptibility category (87.8%), with no very high susceptibility (0%). Based on these findings, the employed model produced accurate and reliable susceptibility estimates: 49.5% of past landslides fell within the very high susceptibility category, which confirms the model’s performance. The outcomes of this study will be useful for future initiatives related to landslide risk reduction and management. PMID:29385096

  3. Multi-target parallel processing approach for gene-to-structure determination of the influenza polymerase PB2 subunit.

    PubMed

    Armour, Brianna L; Barnes, Steve R; Moen, Spencer O; Smith, Eric; Raymond, Amy C; Fairman, James W; Stewart, Lance J; Staker, Bart L; Begley, Darren W; Edwards, Thomas E; Lorimer, Donald D

    2013-06-28

    Pandemic outbreaks of highly virulent influenza strains can cause widespread morbidity and mortality in human populations worldwide. In the United States alone, an average of 41,400 deaths and 1.86 million hospitalizations are caused by influenza virus infection each year (1). Point mutations in the polymerase basic protein 2 subunit (PB2) have been linked to the adaptation of the viral infection in humans (2). Findings from such studies have revealed the biological significance of PB2 as a virulence factor, thus highlighting its potential as an antiviral drug target. The structural genomics program put forth by the National Institute of Allergy and Infectious Disease (NIAID) provides funding to Emerald Bio and three other Pacific Northwest institutions that together make up the Seattle Structural Genomics Center for Infectious Disease (SSGCID). The SSGCID is dedicated to providing the scientific community with three-dimensional protein structures of NIAID category A-C pathogens. Making such structural information available to the scientific community serves to accelerate structure-based drug design. Structure-based drug design plays an important role in drug development. Pursuing multiple targets in parallel greatly increases the chance of success for new lead discovery by targeting a pathway or an entire protein family. Emerald Bio has developed a high-throughput, multi-target parallel processing pipeline (MTPP) for gene-to-structure determination to support the consortium. Here we describe the protocols used to determine the structure of the PB2 subunit from four different influenza A strains.

  4. Quantum field theory of interacting dark matter and dark energy: Dark monodromies

    DOE PAGES

    D’Amico, Guido; Hamill, Teresa; Kaloper, Nemanja

    2016-11-28

    We discuss how to formulate a quantum field theory of dark energy interacting with dark matter. We show that the proposals based on the assumption that dark matter is made up of heavy particles with masses which are very sensitive to the value of dark energy are strongly constrained. Quintessence-generated long-range forces and radiative stability of the quintessence potential require that such dark matter and dark energy are completely decoupled. However, if dark energy and a fraction of dark matter are very light axions, they can have significant mixings which are radiatively stable and perfectly consistent with quantum field theory. Such models can naturally occur in multi-axion realizations of monodromies. The mixings yield interesting signatures which are observable and are within current cosmological limits but could be constrained further by future observations.

  5. Cooperative parallel adaptive neighbourhood search for the disjunctively constrained knapsack problem

    NASA Astrophysics Data System (ADS)

    Quan, Zhe; Wu, Lei

    2017-09-01

    This article investigates the use of parallel computing for solving the disjunctively constrained knapsack problem. The proposed parallel computing model can be viewed as a cooperative algorithm based on a multi-neighbourhood search. The cooperation system is composed of a team manager and a crowd of team members. The team members aim at applying their own search strategies to explore the solution space. The team manager collects the solutions from the members and shares the best one with them. The performance of the proposed method is evaluated on a group of benchmark data sets. The results obtained are compared to those reached by the best methods from the literature. The results show that the proposed method is able to provide the best solutions in most cases. In order to highlight the robustness of the proposed parallel computing model, a new set of large-scale instances is introduced. Encouraging results have been obtained.
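
    The manager/member cooperation scheme can be sketched sequentially: each member runs its own neighbourhood search, and the manager collects the results and shares the incumbent best as the next seed. The toy instance, single-flip neighbourhood, and team sizes below are illustrative assumptions, not the article's strategies:

```python
import random

# Toy disjunctively constrained knapsack: item values/weights, a capacity,
# and pairs of items that must not be packed together.
values   = [10, 7, 6, 5, 4]
weights  = [5, 4, 3, 3, 2]
capacity = 8
conflicts = {(0, 1), (2, 4)}          # disjunctive constraints (i < j)

def feasible(sol):
    if sum(weights[i] for i in sol) > capacity:
        return False
    return not any((i, j) in conflicts
                   for i in sol for j in sol if i < j)

def value(sol):
    return sum(values[i] for i in sol)

def member_search(seed, rng, iters=200):
    """One team member: random single-item flips around a shared seed."""
    cur, best = set(seed), set(seed)
    for _ in range(iters):
        cand = set(cur)
        cand.symmetric_difference_update({rng.randrange(len(values))})
        if feasible(cand):
            cur = cand
            if value(cur) > value(best):
                best = set(cur)
    return best

# Team manager: collect members' solutions, share the best as the next seed.
rng = random.Random(0)
best = set()
for _ in range(3):                                    # cooperation rounds
    member_sols = [member_search(best, rng) for _ in range(4)]   # 4 members
    best = max(member_sols + [best], key=value)
print(sorted(best), value(best))
```

    A parallel version would run the members concurrently and have the manager synchronize the incumbent between rounds; the sequential loop above only shows the cooperation logic.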

  6. Quantum field theory of interacting dark matter and dark energy: Dark monodromies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D’Amico, Guido; Hamill, Teresa; Kaloper, Nemanja

    We discuss how to formulate a quantum field theory of dark energy interacting with dark matter. We show that the proposals based on the assumption that dark matter is made up of heavy particles with masses which are very sensitive to the value of dark energy are strongly constrained. Quintessence-generated long-range forces and radiative stability of the quintessence potential require that such dark matter and dark energy are completely decoupled. However, if dark energy and a fraction of dark matter are very light axions, they can have significant mixings which are radiatively stable and perfectly consistent with quantum field theory. Such models can naturally occur in multi-axion realizations of monodromies. The mixings yield interesting signatures which are observable and are within current cosmological limits but could be constrained further by future observations.

  7. A sensor data format incorporating battery charge information for smartphone-based mHealth applications

    NASA Astrophysics Data System (ADS)

    Escobar, Rodrigo; Akopian, David; Boppana, Rajendra

    2015-03-01

    Remote health monitoring systems involve energy-constrained devices, such as sensors and mobile gateways. Current data formats for communication of health data, such as DICOM and HL7, were not designed for multi-sensor applications or to enable the management of power-constrained devices in health monitoring processes. In this paper, a data format suitable for the collection of multiple sensor data, including readings and other operational parameters, is presented. Using this data format, system managers can assess energy consumption and plan realistic monitoring scenarios. The proposed data format not only outperforms other known data formats in terms of readability, flexibility, interoperability and validation of compliant documents, but also enables energy assessment for realistic data collection scenarios while maintaining or even reducing the overhead introduced by formatting. Additionally, we provide analytical methods to estimate the incremental energy consumption of various sensors, and experiments to measure the actual battery drain on smartphones.
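
    A format of this kind can be sketched as a JSON record in which each sensor reading carries its own energy-cost metadata alongside the device's battery state. All field names and numbers below are illustrative assumptions, not the format proposed in the paper:

```python
import json

# Hypothetical multi-sensor record with battery-charge metadata.
record = {
    "device_id": "phone-01",
    "battery": {"level_pct": 62, "charging": False},
    "sensors": [
        {"type": "heart_rate", "value": 71, "unit": "bpm",
         "battery_cost_mwh": 0.8},
        {"type": "accelerometer", "value": [0.1, 0.0, 9.8], "unit": "m/s^2",
         "battery_cost_mwh": 0.2},
    ],
}
payload = json.dumps(record)

# A gateway can sum per-sensor energy costs to plan a monitoring schedule.
total_mwh = sum(s["battery_cost_mwh"] for s in json.loads(payload)["sensors"])
print(total_mwh)
```

    Carrying the per-sensor cost in the record itself is what lets the receiving side reason about energy without querying the device again.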

  8. Write Proposals. Module CG B-2 of Category B--Supporting. Competency-Based Career Guidance Modules.

    ERIC Educational Resources Information Center

    Gustafson, Richard A.

    This module is intended to help guidance personnel in a variety of educational and agency settings plan and develop successful proposals to assist in financing the improvement of existing or future career guidance programs. The module is one of a series of competency-based guidance program training packages focusing upon specific professional and…

  9. Reliable design of a closed loop supply chain network under uncertainty: An interval fuzzy possibilistic chance-constrained model

    NASA Astrophysics Data System (ADS)

    Vahdani, Behnam; Tavakkoli-Moghaddam, Reza; Jolai, Fariborz; Baboli, Arman

    2013-06-01

    This article seeks to offer a systematic approach to establishing a reliable network of facilities in closed loop supply chains (CLSCs) under uncertainties. Facilities that are located in this article concurrently satisfy both traditional objective functions and reliability considerations in CLSC network designs. To attack this problem, a novel mathematical model is developed that integrates the network design decisions in both forward and reverse supply chain networks. The model also utilizes an effective reliability approach to find a robust network design. In order to make the results of this article more realistic, a CLSC for a case study in the iron and steel industry has been explored. The considered CLSC is multi-echelon, multi-facility, multi-product and multi-supplier. Furthermore, multiple facilities exist in the reverse logistics network leading to high complexities. Since the collection centres play an important role in this network, the reliability concept of these facilities is taken into consideration. To solve the proposed model, a novel interactive hybrid solution methodology is developed by combining a number of efficient solution approaches from the recent literature. The proposed solution methodology is a bi-objective interval fuzzy possibilistic chance-constraint mixed integer linear programming (BOIFPCCMILP). Finally, computational experiments are provided to demonstrate the applicability and suitability of the proposed model in a supply chain environment and to help decision makers facilitate their analyses.

  10. Clustering header categories extracted from web tables

    NASA Astrophysics Data System (ADS)

    Nagy, George; Embley, David W.; Krishnamoorthy, Mukkai; Seth, Sharad

    2015-01-01

    Revealing related content among heterogeneous web tables is part of our long term objective of formulating queries over multiple sources of information. Two hundred HTML tables from institutional web sites are segmented and each table cell is classified according to the fundamental indexing property of row and column headers. The categories that correspond to the multi-dimensional data cube view of a table are extracted by factoring the (often multi-row/column) headers. To reveal commonalities between tables from diverse sources, the Jaccard distances between pairs of category headers (and also table titles) are computed. We show how about one third of our heterogeneous collection can be clustered into a dozen groups that exhibit table-title and header similarities that can be exploited for queries.
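
    The pairwise comparison described above can be sketched directly: each table contributes a set of header categories, and the Jaccard distance between sets measures dissimilarity. The three tables below are hypothetical examples, not items from the 200-table collection:

```python
# Illustrative header-category sets from three hypothetical web tables.
tables = {
    "enrollment": {"year", "department", "students"},
    "admissions": {"year", "department", "applicants"},
    "weather":    {"month", "temperature", "rainfall"},
}

def jaccard_distance(a, b):
    """1 - |A ∩ B| / |A ∪ B|; 0 means identical header sets."""
    return 1.0 - len(a & b) / len(a | b)

# Pairwise distances; a simple threshold would group similar tables.
names = sorted(tables)
for i, x in enumerate(names):
    for y in names[i + 1:]:
        print(f"{x} vs {y}: {jaccard_distance(tables[x], tables[y]):.2f}")
```

    Feeding such a distance matrix to any standard clustering routine (e.g. hierarchical clustering with a distance cutoff) yields the table groups.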

  11. Supervised Semantic Classification for Nuclear Proliferation Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vatsavai, Raju; Cheriyadat, Anil M; Gleason, Shaun Scott

    2010-01-01

    Existing feature extraction and classification approaches are not suitable for monitoring proliferation activity using high-resolution multi-temporal remote sensing imagery. In this paper we present a supervised semantic labeling framework based on the Latent Dirichlet Allocation method. This framework is used to analyze over 120 images collected under different spatial and temporal settings over the globe representing three major semantic categories: airports, nuclear, and coal power plants. Initial experimental results show a reasonable discrimination of these three categories even though coal and nuclear images share highly common and overlapping objects. This research also identified several research challenges associated with nuclear proliferation monitoring using high-resolution remote sensing images.

  12. Word-level information influences phonetic learning in adults and infants

    PubMed Central

    Feldman, Naomi H.; Myers, Emily B.; White, Katherine S.; Griffiths, Thomas L.; Morgan, James L.

    2013-01-01

    Infants begin to segment words from fluent speech during the same time period that they learn phonetic categories. Segmented words can provide a potentially useful cue for phonetic learning, yet accounts of phonetic category acquisition typically ignore the contexts in which sounds appear. We present two experiments to show that, contrary to the assumption that phonetic learning occurs in isolation, learners are sensitive to the words in which sounds appear and can use this information to constrain their interpretation of phonetic variability. Experiment 1 shows that adults use word-level information in a phonetic category learning task, assigning acoustically similar vowels to different categories more often when those sounds consistently appear in different words. Experiment 2 demonstrates that eight-month-old infants similarly pay attention to word-level information and that this information affects how they treat phonetic contrasts. These findings suggest that phonetic category learning is a rich, interactive process that takes advantage of many different types of cues that are present in the input. PMID:23562941

  13. Ionized Outflows in 3-D Insights from Herbig-Haro Objects and Applications to Nearby AGN

    NASA Technical Reports Server (NTRS)

    Cecil, Gerald

    1999-01-01

    HST shows that the gas distributions of these objects are complex and clumpy at the limit of resolution. HST spectra have lumpy emission-line profiles, indicating unresolved sub-structure. The advantages of 3D over slits on gas so distributed are: robust flux estimates of various dynamical systems projected along lines of sight, sensitivity to fainter spectral lines that are physical diagnostics (reddening, gas density, temperature, excitation mechanisms, abundances), and improved prospects for recovery of unobserved dimensions of phase-space. These advantages allow more confident modeling for more profound inquiry into underlying dynamics. The main complication is the effort required to link multi-frequency datasets that optimally track the energy flow through various phases of the ISM. This tedium has limited the number of objects that have been thoroughly analyzed to the a priori most spectacular systems. For HHOs, proper motions constrain the ambient B-field, shock velocity, gas abundances, mass-loss rates, source duty cycle, and tie-ins with molecular flows. If the shock speed, hence ionization fraction, is indeed small then the ionized gas is a significant part of the flow energetics. For AGNs, nuclear beaming is a source of ionization ambiguity. Establishing the energetics of the outflow is critical to determining how the accretion disk loses its energy. CXO will provide new constraints (especially spectral) on AGN outflows, and STIS UV spectroscopy is also constraining cloud properties (although limited by extinction). HHOs show some of the things that we will find around AGNs. I illustrate these points with results from ground-based and HST programs being pursued with collaborators.

  14. Classification of hydrocephalus: critical analysis of classification categories and advantages of "Multi-categorical Hydrocephalus Classification" (Mc HC).

    PubMed

    Oi, Shizuo

    2011-10-01

    Hydrocephalus is a complex pathophysiology with disturbed cerebrospinal fluid (CSF) circulation. Numerous classification schemes have been published, focusing on various criteria such as associated anomalies/underlying lesions, CSF circulation/intracranial pressure patterns, clinical features, and other categories. However, no definitive classification exists that comprehensively covers all of these aspects. The new classification of hydrocephalus, "Multi-categorical Hydrocephalus Classification" (Mc HC), was devised to cover the entire spectrum of hydrocephalus with all considerable classification items and categories. The ten "Mc HC" categories are I: onset (age, phase), II: cause, III: underlying lesion, IV: symptomatology, V: pathophysiology 1-CSF circulation, VI: pathophysiology 2-ICP dynamics, VII: chronology, VIII: post-shunt, IX: post-endoscopic third ventriculostomy, and X: others. From a 100-year search of publications related to the classification of hydrocephalus, 14 representative publications were reviewed and divided into the 10 categories. The Baumkuchen classification graph made from the round-the-clock classification demonstrated the historical tendency of deviation toward the categories in pathophysiology, either CSF or ICP dynamics. In the preliminary clinical application, it was concluded that "Mc HC" is extremely effective in expressing the individual state with various categories in the past and present condition or among the compatible cases of hydrocephalus, along with possible chronological change in the future.
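
    Such a multi-categorical scheme maps naturally onto a record whose ten fields may each hold a value or remain unset. The sketch below uses hypothetical placeholder labels, not values from the paper:

```python
# A hypothetical patient record under the ten "Mc HC" categories; the labels
# are illustrative placeholders only.
mc_hc = {
    "I_onset": "infantile",
    "II_cause": "post-hemorrhagic",
    "III_underlying_lesion": "intraventricular hemorrhage",
    "IV_symptomatology": "macrocephaly",
    "V_csf_circulation": "communicating",
    "VI_icp_dynamics": "high pressure",
    "VII_chronology": "active",
    "VIII_post_shunt": None,          # not yet shunted
    "IX_post_etv": None,              # no endoscopic third ventriculostomy
    "X_others": [],
}
# Any subset of categories can describe the present state; unset items stay None.
known = [k for k, v in mc_hc.items() if v not in (None, [])]
print(len(known))
```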

  15. On the comparison of stochastic model predictive control strategies applied to a hydrogen-based microgrid

    NASA Astrophysics Data System (ADS)

    Velarde, P.; Valverde, L.; Maestre, J. M.; Ocampo-Martinez, C.; Bordons, C.

    2017-03-01

    In this paper, a performance comparison among three well-known stochastic model predictive control approaches, namely multi-scenario, tree-based, and chance-constrained model predictive control, is presented. To this end, three predictive controllers have been designed and implemented in a real renewable-hydrogen-based microgrid. The experimental set-up includes a PEM electrolyzer, lead-acid batteries, and a PEM fuel cell as main equipment. The experimental results show significant differences among the techniques in the behavior of the plant components, mainly in terms of energy use. Effectiveness, performance, advantages, and disadvantages of these techniques are extensively discussed and analyzed to give valid criteria for selecting an appropriate stochastic predictive controller.

  16. Transitioning Together: A Multi-Family Group Psychoeducation Program for Adolescents with ASD and Their Parents

    ERIC Educational Resources Information Center

    DaWalt, Leann Smith; Greenberg, Jan S.; Mailick, Marsha R.

    2018-01-01

    Currently there are few evidence-based programs available for families of individuals with ASD during the transition to adulthood. The present study provided a preliminary evaluation of a multi-family group psychoeducation intervention using a randomized waitlist control design (n = 41). Families in the intervention condition participated in…

  17. System-Wide Water Resources Program Nutrient Sub-Model (SWWRP-NSM) Version 1.1

    DTIC Science & Technology

    2008-09-01

    species including crops, native grasses, and trees. The process descriptions utilize a single plant growth model to simulate all types of land covers...characteristics: • Multi-species, multi-phase, and multi-reaction system • Fast (equilibrium-based) and slow (non-equilibrium-based or rate-based...Transformation and loading of N and P species in the overland flow • Simulation of the N and P cycle in the water column (both overland and

  18. Multi-objective evolutionary algorithms for fuzzy classification in survival prediction.

    PubMed

    Jiménez, Fernando; Sánchez, Gracia; Juárez, José M

    2014-03-01

    This paper presents a novel rule-based fuzzy classification methodology for survival/mortality prediction in severely burnt patients. Due to the ethical aspects involved in this medical scenario, physicians tend not to accept a computer-based evaluation unless they understand why and how such a recommendation is given. Therefore, any fuzzy classifier model must be both accurate and interpretable. The proposed methodology is a three-step process: (1) multi-objective constrained optimization of a patient data set, using Pareto-based elitist multi-objective evolutionary algorithms to maximize accuracy and minimize the complexity (number of rules) of classifiers, subject to interpretability constraints; this step produces a set of alternative (Pareto) classifiers; (2) linguistic labeling, which assigns a linguistic label to each fuzzy set of the classifiers; this step is essential to the interpretability of the classifiers; (3) decision making, whereby a classifier is chosen, if it is satisfactory, according to the preferences of the decision maker. If no classifier is satisfactory for the decision maker, the process starts again in step (1) with a different input parameter set. The performance of three multi-objective evolutionary algorithms, the niched pre-selection multi-objective algorithm, the elitist Pareto-based multi-objective evolutionary algorithm for diversity reinforcement (ENORA) and the non-dominated sorting genetic algorithm (NSGA-II), was tested using a patient data set from an intensive care burn unit and a standard machine learning data set from a standard machine learning repository. The results are compared using the hypervolume multi-objective metric. In addition, the results have been compared with other non-evolutionary techniques and validated with a multi-objective cross-validation technique. 
Our proposal improves the classification rate obtained by other non-evolutionary techniques (decision trees, artificial neural networks, Naive Bayes, and case-based reasoning), obtaining with ENORA a classification rate of 0.9298, specificity of 0.9385, and sensitivity of 0.9364, with 14.2 interpretable fuzzy rules on average. Our proposal improves the accuracy and interpretability of the classifiers compared with other non-evolutionary techniques. We also conclude that ENORA outperforms the niched pre-selection and NSGA-II algorithms. Moreover, given that our multi-objective evolutionary methodology is non-combinational, based on real-parameter optimization, the time cost is significantly reduced compared with other evolutionary approaches in the literature based on combinational optimization. Copyright © 2014 Elsevier B.V. All rights reserved.
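
    The accuracy-versus-complexity trade-off in step (1) reduces to Pareto dominance over (accuracy, rule count) pairs: one classifier dominates another if it is no worse on both criteria and strictly better on at least one. A minimal sketch with hypothetical candidate classifiers (not figures from the study):

```python
# Minimal Pareto-front extraction for two criteria: accuracy (maximize)
# and number of rules (minimize). Candidates are hypothetical.
candidates = [(0.93, 14), (0.90, 8), (0.88, 5), (0.91, 14), (0.85, 5)]

def dominates(a, b):
    """a dominates b: accuracy no worse, rules no more, strictly better once."""
    return (a[0] >= b[0] and a[1] <= b[1]) and (a[0] > b[0] or a[1] < b[1])

# Keep every candidate that no other candidate dominates.
pareto = [c for c in candidates
          if not any(dominates(o, c) for o in candidates if o != c)]
print(sorted(pareto))
```

    Evolutionary algorithms such as ENORA and NSGA-II apply this dominance test inside a population loop; the decision maker in step (3) then picks one point from the resulting front.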

  19. Concepts and Categories: A Cognitive Neuropsychological Perspective

    PubMed Central

    Mahon, Bradford Z.; Caramazza, Alfonso

    2010-01-01

    One of the most provocative and exciting issues in cognitive science is how neural specificity for semantic categories of common objects arises in the functional architecture of the brain. More than two decades of research on the neuropsychological phenomenon of category-specific semantic deficits has generated detailed claims about the organization and representation of conceptual knowledge. More recently, researchers have sought to test hypotheses developed on the basis of neuropsychological evidence with functional imaging. From those two fields, the empirical generalization emerges that object domain and sensory modality jointly constrain the organization of knowledge in the brain. At the same time, research within the embodied cognition framework has highlighted the need to articulate how information is communicated between the sensory and motor systems, and processes that represent and generalize abstract information. Those developments point toward a new approach for understanding category specificity in terms of the coordinated influences of diverse regions and cognitive systems. PMID:18767921

  20. Hierarchical Bayesian Model Averaging for Chance Constrained Remediation Designs

    NASA Astrophysics Data System (ADS)

    Chitsazan, N.; Tsai, F. T.

    2012-12-01

    Groundwater remediation designs rely heavily on simulation models, which are subject to various sources of uncertainty in their predictions. To develop a robust remediation design, it is crucial to understand the effect of these uncertainty sources. In this research, we introduce a hierarchical Bayesian model averaging (HBMA) framework to segregate and prioritize sources of uncertainty in a multi-layer frame, where each layer targets a source of uncertainty. The HBMA framework provides insight into uncertainty priorities and propagation. In addition, HBMA allows evaluating model weights at different hierarchy levels and assessing the relative importance of models at each level. To account for uncertainty, we employ chance-constrained (CC) programming for stochastic remediation design. Chance-constrained programming has traditionally been used to account for parameter uncertainty. Recently, many studies have suggested that model structure uncertainty is not negligible compared to parameter uncertainty. Using chance-constrained programming along with HBMA can provide a rigorous tool for groundwater remediation designs under uncertainty. In this research, HBMA-CC was applied to a remediation design in a synthetic aquifer. The design was to develop a scavenger-well approach to mitigate saltwater intrusion toward production wells. HBMA was employed to assess uncertainties from model structure, parameter estimation and kriging interpolation. An improved harmony search optimization method was used to find the optimal location of the scavenger well. We evaluated prediction variances of chloride concentration at the production wells through the HBMA framework. The results showed that choosing the single best model may lead to a significant error in evaluating prediction variances, for two reasons. First, considering only the single best model, variances that stem from uncertainty in the model structure are ignored. 
Second, considering the best model with a non-dominant model weight may underestimate or overestimate prediction variances by ignoring other plausible propositions. Chance constraints allow developing a remediation design with a desired reliability. However, considering only the single best model, the calculated reliability will differ from the desired reliability. We calculated the reliability of the design for the models at different levels of HBMA. The results showed that, moving toward the top layers of HBMA, the calculated reliability converges to the chosen reliability. We employed chance-constrained optimization along with the HBMA framework to find the optimal location and pumpage for the scavenger well. The results showed that, using models at different levels in the HBMA framework, the optimal location of the scavenger well remained the same, but the optimal extraction rate was altered. Thus, we concluded that the optimal pumping rate is sensitive to the prediction variance. Also, the prediction variance changed with the extraction rate: a very high extraction rate drives the prediction variances of chloride concentration at the production wells toward zero regardless of which HBMA models are used.
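
    A chance constraint of this kind can be checked by Monte Carlo sampling: estimate the probability that the concentration limit is met and keep only designs whose estimated reliability reaches the target. The toy response model and all numbers below are illustrative assumptions, not the study's aquifer model:

```python
import random

# Find the smallest pumping rate q such that
# P(concentration(q, K) <= limit) >= 0.9, with K an uncertain parameter.
random.seed(1)
limit, target = 250.0, 0.90
samples = [random.gauss(1.0, 0.2) for _ in range(2000)]   # uncertain K

def concentration(q, k):
    # Toy response: higher extraction q lowers chloride at the well.
    return 400.0 * k / (1.0 + 0.01 * q)

def reliability(q):
    ok = sum(concentration(q, k) <= limit for k in samples)
    return ok / len(samples)

for q in range(0, 501, 25):
    if reliability(q) >= target:
        print("smallest reliable rate:", q)
        break
```

    Embedding such a reliability check in the optimizer's feasibility test is the sampling-based counterpart of an analytical chance constraint; averaging the check over the HBMA model ensemble extends it to structural uncertainty.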

  1. Informing Aerosol Transport Models With Satellite Multi-Angle Aerosol Measurements

    NASA Technical Reports Server (NTRS)

    Limbacher, J.; Patadia, F.; Petrenko, M.; Martin, M. Val; Chin, M.; Gaitley, B.; Garay, M.; Kalashnikova, O.; Nelson, D.; Scollo, S.

    2011-01-01

    As the aerosol products from the NASA Earth Observing System's Multi-angle Imaging SpectroRadiometer (MISR) mature, we are placing greater focus on ways of using the aerosol amount and type data products, and aerosol plume heights, to constrain aerosol transport models. We have demonstrated the ability to map aerosol air-mass types regionally, and have identified the product upgrades required to apply them globally, including the need for a quality flag indicating the aerosol type information content, which varies depending upon retrieval conditions. We have shown that MISR aerosol type can distinguish smoke from dust, volcanic ash from sulfate and water particles, and can identify qualitative differences in mixtures of smoke, dust, and pollution aerosol components in urban settings. We demonstrated the use of stereo imaging to map smoke, dust, and volcanic effluent plume injection heights, and the combination of MISR and MODIS aerosol optical depth maps to constrain wildfire smoke source strength. This talk will briefly highlight where we stand on these applications, with emphasis on the steps we are taking to apply these capabilities to constraining aerosol transport models planet-wide.

  2. Taking Risk Assessment and Management to the Next Level: Program-Level Risk Analysis to Enable Solid Decision-Making on Priorities and Funding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, J. G.; Morton, R. L.; Castillo, C.

    2011-02-01

    A multi-level (facility and programmatic) risk assessment was conducted for the facilities in the Nevada National Security Site (NNSS) Readiness in Technical Base and Facilities (RTBF) Program, and the results were included in a new Risk Management Plan (RMP), which was incorporated into the fiscal year (FY) 2010 Integrated Plans. Risks, risk events, probability, consequence(s), and mitigation strategies were identified and captured for most scope areas (i.e., risk categories) during the facilitated risk workshops. Risk mitigations (i.e., efforts in addition to existing controls) were identified during the facilitated risk workshops when the risk event was identified. Risk mitigation strategies fell into two broad categories: threats or opportunities. Improvement projects were identified and linked to the specific risks they mitigate, making the connection of risk reduction through investments for the annual Site Execution Plan. Due to the amount of data collected, the analysis to be performed, and the reports to be generated, a Risk Assessment/Management Tool (RAMtool) database was developed to analyze the risks in real time, at multiple levels, which reinforced the site-level risk management process and procedures. The RAMtool database was designed to assist in capturing and analyzing the key elements of risk: probability, consequence, and impact. The RAMtool calculates the facility-level and programmatic-level risk factors to enable a side-by-side comparison, showing where the facility manager and program manager should focus their risk reduction efforts and funding. This enables them to make solid decisions on priorities and funding to maximize risk reduction. A more active risk management process was developed in which risks and opportunities are actively managed, monitored, and controlled by each facility more aggressively and frequently. 
Risk owners have the responsibility and accountability to manage their assigned risks in real time, using the RAMtool database.

  3. Modeling small-scale dairy farms in central Mexico using multi-criteria programming.

    PubMed

    Val-Arreola, D; Kebreab, E; France, J

    2006-05-01

    Milk supply from Mexican dairy farms does not meet demand and small-scale farms can contribute toward closing the gap. Two multi-criteria programming techniques, goal programming and compromise programming, were used in a study of small-scale dairy farms in central Mexico. To build the goal and compromise programming models, 4 ordinary linear programming models were also developed, which had objective functions to maximize metabolizable energy for milk production, to maximize margin of income over feed costs, to maximize metabolizable protein for milk production, and to minimize purchased feedstuffs. Neither multi-criteria approach was significantly better than the other; however, by applying both models it was possible to perform a more comprehensive analysis of these small-scale dairy systems. The multi-criteria programming models affirm findings from previous work and suggest that a forage strategy based on alfalfa, ryegrass, and corn silage would meet nutrient requirements of the herd. Both models suggested that there is an economic advantage in rescheduling the calving season to the second and third calendar quarters to better synchronize higher demand for nutrients with the period of high forage availability.
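
    Compromise programming of the kind used here selects the alternative closest to the ideal point across normalized criteria. A minimal sketch with hypothetical feed plans, criterion values, and weights (not data from the study):

```python
# Two criteria per plan: income over feed cost (maximize, in arbitrary
# currency units) and purchased feed (minimize, in kg/day). All hypothetical.
plans = {
    "alfalfa+corn silage":  (120.0, 30.0),
    "ryegrass+concentrate": (135.0, 70.0),
    "pasture only":         (90.0, 10.0),
}
weights = (0.6, 0.4)

incomes = [p[0] for p in plans.values()]
feeds   = [p[1] for p in plans.values()]
ideal = (max(incomes), min(feeds))   # best achievable value of each criterion
nadir = (min(incomes), max(feeds))   # worst value, used for normalization

def distance(p):
    """Weighted L1 distance to the ideal point, criteria normalized to [0, 1]."""
    d_income = (ideal[0] - p[0]) / (ideal[0] - nadir[0])
    d_feed   = (p[1] - ideal[1]) / (nadir[1] - ideal[1])
    return weights[0] * d_income + weights[1] * d_feed

best = min(plans, key=lambda name: distance(plans[name]))
print(best, round(distance(plans[best]), 3))
```

    Goal programming works analogously but minimizes weighted deviations from target levels set for each goal rather than distance to the ideal point.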

  4. Joint fMRI analysis and subject clustering using sparse dictionary learning

    NASA Astrophysics Data System (ADS)

    Kim, Seung-Jun; Dontaraju, Krishna K.

    2017-08-01

    Multi-subject fMRI data analysis methods based on sparse dictionary learning are proposed. In addition to identifying the component spatial maps by exploiting the sparsity of the maps, clusters of the subjects are learned by postulating that the fMRI volumes admit a subspace clustering structure. Furthermore, in order to tune the associated hyper-parameters systematically, a cross-validation strategy is developed based on entry-wise sampling of the fMRI dataset. Efficient algorithms for solving the proposed constrained dictionary learning formulations are developed. Numerical tests performed on synthetic fMRI data show promising results and provide insights into the proposed technique.

  5. Reliability analysis and utilization of PEMs in space application

    NASA Astrophysics Data System (ADS)

    Jiang, Xiujie; Wang, Zhihua; Sun, Huixian; Chen, Xiaomin; Zhao, Tianlin; Yu, Guanghua; Zhou, Changyi

    2009-11-01

    More and more plastic encapsulated microcircuits (PEMs) are used in space missions to achieve high performance. Since PEMs are designed for use in terrestrial operating conditions, the successful usage of PEMs in the harsh space environment is closely tied to reliability issues, which must be considered first. However, there is no ready-made methodology for PEMs in space applications. This paper discusses the reliability aspects of using PEMs in space. This reliability analysis can be divided into five categories: radiation test, radiation hardness, screening test, reliability calculation and reliability assessment. One case study is also presented to illustrate the details of the process, in which a PEM part is used in Double-Star Project, a joint space program between the European Space Agency (ESA) and China. The influence of environmental constraints, including radiation, humidity, temperature and mechanics, on the PEM part has been considered. Both Double-Star Project satellites are still running well in space.

  6. Searching for X-ray Pulsations from Neutron Stars Using NICER

    NASA Astrophysics Data System (ADS)

    Ray, Paul S.; Arzoumanian, Zaven; Bogdanov, Slavko; Bult, Peter; Chakrabarty, Deepto; Guillot, Sebastien; Kust Harding, Alice; Ho, Wynn C. G.; Lamb, Frederick K.; Mahmoodifar, Simin; Miller, M. Coleman; Strohmayer, Tod E.; Wilson-Hodge, Colleen A.; Wolff, Michael Thomas

    2017-08-01

    The Neutron Star Interior Composition Explorer (NICER) presents an exciting new capability for discovering new modulation properties of X-ray emitting neutron stars, including large area, low background, extremely precise absolute time stamps, superb low-energy response and flexible scheduling. The Pulsation Searches and Multiwavelength Coordination working group has designed a 2.5 Ms observing program to search for pulsations and characterize the modulation properties of about 30 known or suspected neutron star sources across a number of source categories. A key early goal will be to search for pulsations from millisecond pulsars that might exhibit thermal pulsations from the surface suitable for pulse profile modeling to constrain the neutron star equation of state. In addition, we will search for pulsations from transitional millisecond pulsars, isolated neutron stars, LMXBs, accretion-powered millisecond pulsars, central compact objects and other sources. We will present our science plan and initial results from the first months of the NICER mission.
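The abstract does not name a specific search algorithm; purely as an illustration, one standard pulsation-search technique, epoch folding with a chi-squared test against a flat profile, can be sketched as follows. The event times are synthetic toy data, not NICER observations.

```python
import random

def fold_chi2(times, period, nbins=8):
    """Fold event arrival times at a trial period and measure, via Pearson's
    chi-squared, how far the binned pulse profile departs from a flat one."""
    counts = [0] * nbins
    for t in times:
        phase = (t / period) % 1.0
        counts[int(phase * nbins) % nbins] += 1
    mean = len(times) / nbins
    return sum((c - mean) ** 2 / mean for c in counts)

# synthetic toy events: a source pulsing at a 0.1 s period plus background
random.seed(1)
times = [i * 0.1 + random.gauss(0.0, 0.005) for i in range(200)]
times += [random.uniform(0.0, 20.0) for _ in range(100)]

trials = [0.07, 0.09, 0.10, 0.11, 0.13]
best = max(trials, key=lambda p: fold_chi2(times, p))
print(best)  # the true 0.1 s period gives by far the largest chi-squared
```

At the wrong trial periods the pulsed events smear across phase bins and the profile looks flat; at the true period they pile up in a few bins and the statistic spikes.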

  7. Searching for X-ray Pulsations from Neutron Stars Using NICER

    NASA Astrophysics Data System (ADS)

    Ray, Paul S.; Arzoumanian, Zaven; Gendreau, Keith C.; Bogdanov, Slavko; Bult, Peter; Chakrabarty, Deepto; Guillot, Sebastien; Harding, Alice; Ho, Wynn C. G.; Lamb, Frederick; Mahmoodifar, Simin; Miller, Cole; Strohmayer, Tod; Wilson-Hodge, Colleen; Wolff, Michael T.; NICER Science Team Working Group on Pulsation Searches and Multiwavelength Coordination

    2018-01-01

    The Neutron Star Interior Composition Explorer (NICER) presents an exciting new capability for discovering new modulation properties of X-ray emitting neutron stars, including large area, low background, extremely precise absolute time stamps, superb low-energy response and flexible scheduling. The Pulsation Searches and Multiwavelength Coordination working group has designed a 2.5 Ms observing program to search for pulsations and characterize the modulation properties of about 30 known or suspected neutron star sources across a number of source categories. A key early goal will be to search for pulsations from millisecond pulsars that might exhibit thermal pulsations from the surface suitable for pulse profile modeling to constrain the neutron star equation of state. In addition, we will search for pulsations from transitional millisecond pulsars, isolated neutron stars, LMXBs, accretion-powered millisecond pulsars, central compact objects and other sources. We present our science plan and initial results from the first months of the NICER mission.

  8. Feasibility study analysis for multi-function dual energy oven (case study: tapioca crackers small medium enterprise)

    NASA Astrophysics Data System (ADS)

    Soraya, N. W.; El Hadi, R. M.; Chumaidiyah, E.; Tripiawan, W.

    2017-12-01

    The conventional drying process is constrained by weather (cloudy or rainy days), requires a wide drying area, and yields a low-quality product. The multi-function dual-energy oven is an appropriate technology to solve these problems. The oven uses solar thermal energy or gas heat for drying various types of products, including tapioca crackers. Investment analysis of the technical, operational, and financial aspects shows that the multi-function dual-energy oven is feasible to implement for a small and medium enterprise (SME) processing tapioca crackers.

  9. Probing the Milky Way electron density using multi-messenger astronomy

    NASA Astrophysics Data System (ADS)

    Breivik, Katelyn; Larson, Shane

    2015-04-01

    Multi-messenger observations of ultra-compact binaries in both gravitational waves and electromagnetic radiation supply highly complementary information, providing new ways of characterizing the internal dynamics of these systems, as well as new probes of the galaxy itself. Electron density models, used in pulsar distance measurements via the electron dispersion measure, are currently not well constrained. Simultaneous radio and gravitational wave observations of pulsars in binaries provide a method of measuring the average electron density along the line of sight to the pulsar, thus giving a new method for constraining current electron density models. We present this method and assess its viability with simulations of the compact binary component of the Milky Way using the public domain binary evolution code, BSE. This work is supported by NASA Award NNX13AM10G.
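The measurement described above reduces, in its simplest form, to the definition of the dispersion measure, DM = ∫ n_e dl: given an independently derived distance (here, from gravitational-wave observations of the binary), the line-of-sight mean electron density follows by division. The numbers below are hypothetical.

```python
def mean_electron_density(dm_pc_cm3, distance_pc):
    """DM = integral of n_e along the line of sight, so dividing the radio
    dispersion measure by an independently known distance gives the
    line-of-sight mean electron density."""
    return dm_pc_cm3 / distance_pc

# hypothetical pulsar: DM = 50 pc cm^-3, GW-derived distance = 1250 pc
print(mean_electron_density(50.0, 1250.0))  # 0.04 electrons cm^-3
```

Comparing such per-line-of-sight means against a Galactic electron density model is what constrains that model.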

  10. Alberta's provincial take-home naloxone program: A multi-sectoral and multi-jurisdictional response to overdose.

    PubMed

    Freeman, Lisa K; Bourque, Stacey; Etches, Nick; Goodison, Karin; O'Gorman, Claire; Rittenbach, Kay; Sikora, Christopher A; Yarema, Mark

    2017-11-09

    Alberta is a prairie province located in western Canada, with a population of approximately 4.3 million. In 2016, 363 Albertans died from apparent drug overdoses related to fentanyl, an opioid 50-100 times more toxic than morphine. This surpassed the number of deaths from motor vehicle collisions and homicides combined. Naloxone is a safe, effective, opioid antagonist that may quickly reverse an opioid overdose. In July 2015, a committee of community-based harm reduction programs in Alberta implemented a geographically restricted take-home naloxone (THN) program. The successes and limitations of this program demonstrated the need for an expanded, multi-sectoral, multi-jurisdictional response. The provincial health authority, Alberta Health Services (AHS), used previously established incident command system processes to coordinate implementation of a provincial THN program. Alberta's provincial THN program was implemented on December 23, 2015. This collaborative program resulted in a coordinated response across jurisdictional levels with wide geographical reach. Between December 2015 and December 2016, 953 locations, including many community pharmacies, registered to dispense THN kits, 9572 kits were distributed, and 472 reversals were reported. The provincial supply of THN kits more than tripled from 3000 to 10 000. Alberta was uniquely poised to deliver a large, province-wide, multi-sectoral and multi-jurisdictional THN program as part of a comprehensive response to increasing opioid-related morbidity and mortality. The speed at which AHS was able to roll out the program was made possible by work done previously and the willingness of multiple jurisdictions to work together to build on and expand the program.

  11. Worldwide Portals to Classroom Research on Light Pollution

    NASA Astrophysics Data System (ADS)

    Walker, C. E.; Pompea, S. M.; Buxner, S.

    2016-12-01

    Issues affecting society can provide stimulus for scientific research relevant to students' lives and, hence, of interest to them. These multi-disciplinary, non-traditional science topics often need foundational instruction for both students and instructors that steers students to and through research using Problem-Based or Project-Based Learning and provides more of a comfort zone for the instructor in terms of content and execution. A program created by the National Optical Astronomy Observatory's Education and Public Outreach staff (NOAO EPO) during the International Year of Light (2015) offers real-life challenges for students to solve and leads them to further research. The program is called the Quality Lighting Teaching (QLT) program (www.noao.edu/education/qltkit.php). For instructors, the impact of the program is amplified by providing professional development using tutorial videos created at NOAO on each of 6 activities and by conducting Q&A sessions via 14 Google+ Hangouts. Hangouts make communication possible with groups from 30 countries, which have received 88 QLT Kits. The central issue is poor quality lighting. It not only impedes astronomy research and seeing a starry night sky, but creates safety issues, affects human circadian sensitivities, disrupts ecosystems, and wastes billions of dollars/year in energy consumption. It also leads to excess carbon emissions. In this problem-based scenario, the city mayor (e.g., instructor) has received complaints from citizens about streetlights. Students are assembled into task forces to determine the underlying problems in the 6 complaint categories, as well as come up with feasible solutions. By exploring the concepts and practices of quality lighting, students will solve realistic cases on how light pollution affects wildlife, the night sky, our eyes, energy consumption, safety, and light trespass into buildings. The QLT Kit has all the materials for the explorations. Join us for our assessment of the program, success stories and lessons learned.

  12. Multi-state Markov model for disability: A case of Malaysia Social Security (SOCSO)

    NASA Astrophysics Data System (ADS)

    Samsuddin, Shamshimah; Ismail, Noriszura

    2016-06-01

    Studies of SOCSO contributor outcomes such as disability are usually restricted to a single outcome. This study instead applies a multi-state Markov model to estimate the yearly transition probabilities of SOCSO contributors in Malaysia between four states: work, temporary disability, permanent disability and death, stratified by age, gender, year and disability category. The model ignores duration and past disability experience; that is, it does not consider how or when someone arrived in a given state. These states represent different outcomes that depend on the health status of the workers.
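The multi-state machinery itself is a small computation. A minimal sketch, assuming a purely hypothetical yearly transition matrix over the four states (the paper estimates these probabilities from SOCSO data; none of the numbers below come from the study):

```python
def step(dist, P):
    """One yearly transition: multiply a state distribution by matrix P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# hypothetical yearly transition matrix over the four states:
# work, temporary disability, permanent disability, death
P = [
    [0.95, 0.03, 0.01, 0.01],  # from work
    [0.60, 0.30, 0.08, 0.02],  # from temporary disability
    [0.00, 0.00, 0.97, 0.03],  # permanent disability: absorbing except death
    [0.00, 0.00, 0.00, 1.00],  # death is absorbing
]

dist = [1.0, 0.0, 0.0, 0.0]    # start: an active worker
for _ in range(5):             # five-year horizon
    dist = step(dist, P)
print(dist)
```

Because the model is Markovian, the five-year distribution depends only on repeated application of P, which is exactly the "ignoring duration and past disability experience" assumption noted above.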

  13. Paradigm shift: contribution of field epidemiology training in advancing the “One Health” approach to strengthen disease surveillance and outbreak investigations in Africa

    PubMed Central

    Monday, Busuulwa; Gitta, Sheba Nakacubo; Wasswa, Peter; Namusisi, Olivia; Bingi, Aloysius; Musenero, Monica; Mukanga, David

    2011-01-01

    The occurrence of major zoonotic disease outbreaks in Sub-Saharan Africa has had a significant impact on already constrained public health systems. This has justified the need to identify creative strategies to address threats from emerging and re-emerging infectious diseases at the human-animal-environmental interface, and to implement robust multi-disease public health surveillance systems that will enhance early detection and response. Additionally, enhanced reporting and timely investigation of all suspected notifiable infectious disease threats within the health system is vital. Field epidemiology and laboratory training programs (FELTPs) have made significant contributions to public health systems for more than 10 years by producing highly skilled field epidemiologists. These epidemiologists have not only improved disease surveillance and response to outbreaks, but also improved the management of health systems. Furthermore, the FETPs/FELTPs have laid an excellent foundation that brings clinicians, veterinarians, and environmental health professionals drawn from different governmental sectors to work with a common purpose of disease control and prevention. The emergence of the One Health approach in the last decade has coincided with the present paradigm shift that calls for multi-sectoral and cross-sectoral collaboration in disease surveillance, detection, reporting and timely response. The positive impact of integrating FETP/FELTP and the One Health approach in selected programs in Africa has demonstrated the importance of multi-sectoral collaboration in addressing threats to humans, animals and the environment from infectious and non-infectious causes. PMID:22359701

  14. How landmark suitability shapes recognition memory signals for objects in the medial temporal lobes.

    PubMed

    Martin, Chris B; Sullivan, Jacqueline A; Wright, Jessey; Köhler, Stefan

    2018-02-01

    A role of perirhinal cortex (PrC) in recognition memory for objects has been well established. Contributions of parahippocampal cortex (PhC) to this function, while documented, remain less well understood. Here, we used fMRI to examine whether the organization of item-based recognition memory signals across these two structures is shaped by object category, independent of any difference in representing episodic context. Guided by research suggesting that PhC plays a critical role in processing landmarks, we focused on three categories of objects that differ from each other in their landmark suitability as confirmed with behavioral ratings (buildings > trees > aircraft). Participants made item-based recognition-memory decisions for novel and previously studied objects from these categories, which were matched in accuracy. Multi-voxel pattern classification revealed category-specific item-recognition memory signals along the long axis of PrC and PhC, with no sharp functional boundaries between these structures. Memory signals for buildings were observed in the mid to posterior extent of PhC, signals for trees in anterior to posterior segments of PhC, and signals for aircraft in mid to posterior aspects of PrC and the anterior extent of PhC. Notably, item-based memory signals for the category with highest landmark suitability ratings were observed only in those posterior segments of PhC that also allowed for classification of landmark suitability of objects when memory status was held constant. These findings provide new evidence in support of the notion that item-based memory signals for objects are not limited to PrC, and that the organization of these signals along the longitudinal axis that crosses PrC and PhC can be captured with reference to landmark suitability. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Training and retaining staff to competently deliver an evidence-based practice: the role of staff attributes and perceptions of organizational functioning.

    PubMed

    Garner, Bryan R; Hunter, Brooke D; Godley, Susan H; Godley, Mark D

    2012-03-01

    Within the context of an initiative to implement evidence-based practices (EBPs) for adolescents with substance use disorders, this study examined the extent to which staff factors measured at an initial EBP training workshop were predictive of EBP competence and turnover status of staff (N = 121) measured 6, 9, and 12 months posttraining. By the final assessment point, 52.3% of staff transitioned to the employed/EBP-competent category, 26.6% transitioned to the not employed/not EBP-competent category, 4.6% transitioned to the not employed/EBP-competent category, and 16.5% had not transitioned out of the initial category. Multilevel multinomial regression analysis identified several measures that were significant predictors of staff transitions to the not employed/not EBP-competent category (e.g., program needs, job satisfaction, burnout) and transitions to the employed/EBP-competent category (e.g., months in position, pressures for change, influence). Findings have implications for the development and testing of strategies to train and retain staff to deliver EBPs in practice settings. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Coastal aquifer management under parameter uncertainty: Ensemble surrogate modeling based simulation-optimization

    NASA Astrophysics Data System (ADS)

    Janardhanan, S.; Datta, B.

    2011-12-01

    Surrogate models are widely used to develop computationally efficient simulation-optimization models for solving complex groundwater management problems. Artificial intelligence based models are most often used for this purpose, trained using predictor-predictand data obtained from a numerical simulation model. Most often this is implemented with the assumption that the parameters and boundary conditions used in the numerical simulation model are perfectly known; however, in most practical situations these values are uncertain, which limits the applicability of such approximation surrogates. In this study we develop a surrogate-model-based coupled simulation-optimization methodology for determining optimal pumping strategies for coastal aquifers under parameter uncertainty. An ensemble surrogate modeling approach is used along with multiple-realization optimization. The methodology is applied to a multi-objective coastal aquifer management problem with two conflicting objectives. Hydraulic conductivity and aquifer recharge are treated as uncertain. The three-dimensional coupled flow and transport simulation model FEMWATER is used to simulate the aquifer responses for a number of scenarios, corresponding to Latin hypercube samples of pumping rates and uncertain parameters, to generate input-output patterns for training the surrogate models. Non-parametric bootstrap sampling of this original data set is used to generate multiple data sets belonging to different regions of the multi-dimensional decision and parameter space. These data sets are used to train and test multiple surrogate models based on genetic programming. The ensemble of surrogate models is then linked to a multi-objective genetic algorithm to solve the pumping optimization problem. The two conflicting objectives are maximizing total pumping from beneficial wells and minimizing total pumping from barrier wells used for hydraulic control of saltwater intrusion. The salinity levels resulting at strategic locations due to this pumping are predicted using the ensemble surrogates and constrained to be within pre-specified levels. Different realizations of the concentration values are obtained from the ensemble predictions corresponding to each candidate pumping solution, and reliability is incorporated as the percentage of surrogate models that satisfy the imposed constraints. The methodology was applied to a realistic coastal aquifer system in the Burdekin delta area of Australia. All optimal solutions corresponding to a reliability level of 0.99 were found to satisfy all the constraints, and constraint violation increases as the reliability level decreases. Ensemble-surrogate-based simulation-optimization was thus found useful for deriving multi-objective optimal pumping strategies for coastal aquifers under parameter uncertainty.
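The ensemble reliability measure is simple to state in code. This stdlib-only sketch uses stand-in linear "surrogates"; the study itself uses trained genetic-programming models built from FEMWATER simulations.

```python
def reliability(pumping, surrogates, limit):
    """Fraction of the surrogate ensemble whose predicted salinity at the
    monitoring location stays within the pre-specified limit."""
    preds = [s(pumping) for s in surrogates]
    ok = sum(1 for c in preds if c <= limit)
    return ok / len(preds)

# hypothetical stand-ins for trained surrogates: each maps a pumping rate
# to a predicted salinity concentration (linear, for illustration only)
surrogates = [lambda q, a=a: a * q for a in (0.8, 0.9, 1.0, 1.1, 1.2)]

print(reliability(10.0, surrogates, limit=10.0))  # 3 of 5 predictions pass
```

During optimization, a candidate pumping plan is feasible at reliability level r only if this fraction is at least r, which is why the 0.99 solutions satisfied every constraint.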

  17. Multi-task learning with group information for human action recognition

    NASA Astrophysics Data System (ADS)

    Qian, Li; Wu, Song; Pu, Nan; Xu, Shulin; Xiao, Guoqiang

    2018-04-01

    Human action recognition is an important and challenging task in computer vision research, due to the variations in human motion performance, interpersonal differences and recording settings. In this paper, we propose a novel multi-task learning framework with group information (MTL-GI) for accurate and efficient human action recognition. Specifically, we first obtain group information by calculating the mutual information between Gaussian components and action categories according to their latent relationship, and cluster similar action categories into the same group by affinity propagation clustering. Additionally, in order to exploit the relationships among related tasks, we incorporate this group information into multi-task learning. Experimental results on two popular benchmarks (the UCF50 and HMDB51 datasets) demonstrate the superiority of our proposed MTL-GI framework.
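The grouping step rests on a plain mutual-information computation between components and categories. A stdlib-only sketch with an invented joint count table (the paper derives such statistics from Gaussian components over real action data, and then applies affinity propagation clustering, which is omitted here):

```python
import math

def mutual_information(joint):
    """Mutual information (in bits) from a joint count table whose rows are
    Gaussian components and whose columns are action categories."""
    total = sum(sum(row) for row in joint)
    px = [sum(row) / total for row in joint]
    py = [sum(joint[i][j] for i in range(len(joint))) / total
          for j in range(len(joint[0]))]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, c in enumerate(row):
            if c:
                pxy = c / total
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# hypothetical counts: component 0 fires mostly for category A, component 1 for B
print(mutual_information([[40, 10], [10, 40]]))
```

High mutual information means the components discriminate between the categories; categories with similar component profiles end up clustered into the same group.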

  18. Maximized gust loads for a nonlinear airplane using matched filter theory and constrained optimization

    NASA Technical Reports Server (NTRS)

    Scott, Robert C.; Pototzky, Anthony S.; Perry, Boyd, III

    1991-01-01

    Two matched filter theory based schemes are described and illustrated for obtaining maximized and time correlated gust loads for a nonlinear aircraft. The first scheme is computationally fast because it uses a simple 1-D search procedure to obtain its answers. The second scheme is computationally slow because it uses a more complex multi-dimensional search procedure to obtain its answers, but it consistently provides slightly higher maximum loads than the first scheme. Both schemes are illustrated with numerical examples involving a nonlinear control system.

  19. Sustainable Cooperative Robotic Technologies for Human and Robotic Outpost Infrastructure Construction and Maintenance

    NASA Technical Reports Server (NTRS)

    Stroupe, Ashley W.; Okon, Avi; Robinson, Matthew; Huntsberger, Terry; Aghazarian, Hrand; Baumgartner, Eric

    2004-01-01

    Robotic Construction Crew (RCC) is a heterogeneous multi-robot system for autonomous acquisition, transport, and precision mating of components in construction tasks. RCC minimizes the use of resources that are constrained in a space environment, such as computation, power, communication, and sensing. A behavior-based architecture provides adaptability and robustness despite low computational requirements. RCC successfully performs several construction-related tasks in an emulated outdoor environment despite high levels of uncertainty in motion and sensing. Quantitative results are provided for formation keeping in component transport, precision instrument placement, and construction tasks.

  20. Analysis of the Multi Strategy Goal Programming for Micro-Grid Based on Dynamic ant Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Qiu, J. P.; Niu, D. X.

    The micro-grid is one of the key technologies of future energy supply. Taking the economy, reliability and environmental protection of the micro-grid as a basis, a multi-strategy goal programming problem is analyzed for a micro-grid containing wind power, solar power, a battery and a micro gas turbine. Mathematical models of each source's generation characteristics and energy dissipation are established, and the micro-grid's multi-objective planning function under different operating strategies is converted to a single-objective model based on the AHP method. An example analysis shows that a dynamic ant hybrid genetic algorithm can obtain the optimal power output of this model.
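The AHP step that collapses the multi-objective plan into a single objective can be sketched with stdlib power iteration; the pairwise judgments below are invented for illustration, not taken from the paper.

```python
def ahp_weights(pairwise, iters=100):
    """Power iteration toward the principal eigenvector of an AHP
    pairwise-comparison matrix; normalized, it yields the objective weights."""
    n = len(pairwise)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# hypothetical judgments: economy vs reliability vs environmental protection
pairwise = [
    [1.0, 3.0, 5.0],         # economy: 3x reliability, 5x environment
    [1 / 3.0, 1.0, 2.0],
    [1 / 5.0, 1 / 2.0, 1.0],
]
w = ahp_weights(pairwise)
print([round(x, 3) for x in w])  # economy receives the largest weight
# scalarization: cost = w[0]*economy + w[1]*reliability + w[2]*environment
```

The weighted sum of the normalized objectives is then a single-objective function that a genetic algorithm can optimize directly.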

  1. Performance evaluation of throughput computing workloads using multi-core processors and graphics processors

    NASA Astrophysics Data System (ADS)

    Dave, Gaurav P.; Sureshkumar, N.; Blessy Trencia Lincy, S. S.

    2017-11-01

    The current trend in processor manufacturing focuses on multi-core architectures rather than increasing clock speed for performance improvement. Graphics processors have become commodity hardware for providing fast co-processing in computer systems. Developments in IoT, social networking web applications and big data have created huge demand for data processing, and such throughput-intensive applications inherently contain data-level parallelism, which is well suited to SIMD-architecture-based GPUs. This paper reviews the architectural aspects of multi-/many-core processors and graphics processors. Several case studies compare the performance of throughput computing applications using shared-memory programming in OpenMP and CUDA API based programming.

  2. Development of a Multi-Disciplinary Intervention for the Treatment of Childhood Obesity Based on Cognitive Behavioral Therapy

    ERIC Educational Resources Information Center

    Bathrellou, Eirini; Yannakoulia, Mary; Papanikolaou, Katerina; Pehlivanidis, Artemios; Pervanidou, Panagiota; Kanaka-Gantenbein, Christina; Tsiantis, John; Chrousos, George P.; Sidossis, Labros S.

    2010-01-01

    Along the lines of the evidence-based recommendations, we developed a multi-disciplinary intervention for overweight children 7- to 12-years-old, primarily aiming at helping children to adopt healthier eating habits and a physically active lifestyle. The program combined nutrition intervention, based on a non-dieting approach, with physical…

  3. Multi-agents and learning: Implications for Webusage mining.

    PubMed

    Lotfy, Hewayda M S; Khamis, Soheir M S; Aboghazalah, Maie M

    2016-03-01

    Characterization of user activities is an important issue in the design and maintenance of websites. Server weblog files hold abundant information about users' current interests. This information can be mined and analyzed so that administrators can guide users in their browsing activity, helping them reach relevant information in a shorter span of time and improving user satisfaction. Web-based technology facilitates the creation of personally meaningful and socially useful knowledge through supportive interactions, communication and collaboration among educators, learners and information. This paper suggests a new methodology based on learning techniques for a Web-based multi-agent application that discovers the hidden patterns in users' visited links. It presents a new approach involving unsupervised learning, reinforcement learning, and cooperation between agents, utilized to discover patterns representing user profiles in a sample website and classify them into specific categories of materials using significance percentages. These profiles are used to recommend interesting links and categories to the user. The experimental results showed successful user pattern recognition and cooperative learning among agents to obtain user profiles, indicating that combining different learning algorithms can improve user satisfaction as measured by precision, recall, the progressive category weight and the F1-measure.

  4. Multi-agents and learning: Implications for Webusage mining

    PubMed Central

    Lotfy, Hewayda M.S.; Khamis, Soheir M.S.; Aboghazalah, Maie M.

    2015-01-01

    Characterization of user activities is an important issue in the design and maintenance of websites. Server weblog files hold abundant information about users' current interests. This information can be mined and analyzed so that administrators can guide users in their browsing activity, helping them reach relevant information in a shorter span of time and improving user satisfaction. Web-based technology facilitates the creation of personally meaningful and socially useful knowledge through supportive interactions, communication and collaboration among educators, learners and information. This paper suggests a new methodology based on learning techniques for a Web-based multi-agent application that discovers the hidden patterns in users' visited links. It presents a new approach involving unsupervised learning, reinforcement learning, and cooperation between agents, utilized to discover patterns representing user profiles in a sample website and classify them into specific categories of materials using significance percentages. These profiles are used to recommend interesting links and categories to the user. The experimental results showed successful user pattern recognition and cooperative learning among agents to obtain user profiles, indicating that combining different learning algorithms can improve user satisfaction as measured by precision, recall, the progressive category weight and the F1-measure. PMID:26966569

  5. Mars Technology Program: Planetary Protection Technology Development

    NASA Technical Reports Server (NTRS)

    Lin, Ying

    2006-01-01

    This slide presentation reviews the development of Planetary Protection technology in the Mars Technology Program. The goal of the program is to develop technologies that will enable NASA to build, launch, and operate a mission that has subsystems with different Planetary Protection (PP) classifications, specifically for operating a Category IVb-equivalent subsystem from a Category IVa platform. Category IVa requires bioburden reduction but no terminal sterilization; Category IVb requires, in addition to the IVa requirements, terminal sterilization of the spacecraft. The differences between the categories are reviewed further.

  6. Social adaptation in multi-agent model of linguistic categorization is affected by network information flow.

    PubMed

    Zubek, Julian; Denkiewicz, Michał; Barański, Juliusz; Wróblewski, Przemysław; Rączaszek-Leonardi, Joanna; Plewczynski, Dariusz

    2017-01-01

    This paper explores how information flow properties of a network affect the formation of categories shared between individuals, who are communicating through that network. Our work is based on the established multi-agent model of the emergence of linguistic categories grounded in external environment. We study how network information propagation efficiency and the direction of information flow affect categorization by performing simulations with idealized network topologies optimizing certain network centrality measures. We measure dynamic social adaptation when either network topology or environment is subject to change during the experiment, and the system has to adapt to new conditions. We find that both decentralized network topology efficient in information propagation and the presence of central authority (information flow from the center to peripheries) are beneficial for the formation of global agreement between agents. Systems with central authority cope well with network topology change, but are less robust in the case of environment change. These findings help to understand which network properties affect processes of social adaptation. They are important to inform the debate on the advantages and disadvantages of centralized systems.
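One propagation-efficiency measure that such simulations rely on is the mean shortest-path length of the network. A stdlib-only sketch comparing two idealized topologies of the kind discussed above, a centralized star and a decentralized ring (the actual model layers agents, language games, and an environment on top of such graphs):

```python
from collections import deque

def avg_path_length(adj):
    """Mean shortest-path length over all ordered node pairs (BFS from each
    node); lower means more efficient information propagation."""
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(d for n, d in dist.items() if n != src)
        pairs += len(dist) - 1
    return total / pairs

n = 10
star = {0: list(range(1, n))}       # central authority: hub node 0
for i in range(1, n):
    star[i] = [0]
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}  # decentralized ring

print(avg_path_length(star), avg_path_length(ring))  # star ~1.8, ring ~2.78
```

The hub makes every node at most two hops from every other, illustrating why a central authority can accelerate the spread of category conventions even though it concentrates influence.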

  7. Social adaptation in multi-agent model of linguistic categorization is affected by network information flow

    PubMed Central

    Denkiewicz, Michał; Barański, Juliusz; Wróblewski, Przemysław; Rączaszek-Leonardi, Joanna; Plewczynski, Dariusz

    2017-01-01

    This paper explores how information flow properties of a network affect the formation of categories shared between individuals, who are communicating through that network. Our work is based on the established multi-agent model of the emergence of linguistic categories grounded in external environment. We study how network information propagation efficiency and the direction of information flow affect categorization by performing simulations with idealized network topologies optimizing certain network centrality measures. We measure dynamic social adaptation when either network topology or environment is subject to change during the experiment, and the system has to adapt to new conditions. We find that both decentralized network topology efficient in information propagation and the presence of central authority (information flow from the center to peripheries) are beneficial for the formation of global agreement between agents. Systems with central authority cope well with network topology change, but are less robust in the case of environment change. These findings help to understand which network properties affect processes of social adaptation. They are important to inform the debate on the advantages and disadvantages of centralized systems. PMID:28809957

  8. Repeat Customer Success in Extension

    ERIC Educational Resources Information Center

    Bess, Melissa M.; Traub, Sarah M.

    2013-01-01

    Four multi-session research-based programs were offered by two Extension specialists in one rural Missouri county. Eleven participants who came to multiple Extension programs could be called "repeat customers." Based on the total number of participants for all four programs, 25% could be deemed repeat customers. Repeat customers had…

  9. A Comprehensive Multi-Media Program to Prevent Smoking among Black Students.

    ERIC Educational Resources Information Center

    Kaufman, Joy S.; And Others

    1994-01-01

    Implemented program to decrease incidence of new smokers among black adolescents. Program combined school-based curriculum with comprehensive media intervention. There were two experimental conditions: one group participated in school-based intervention and was prompted to participate in multimedia intervention; other group had access to…

  10. Variable-Metric Algorithm For Constrained Optimization

    NASA Technical Reports Server (NTRS)

    Frick, James D.

    1989-01-01

Variable Metric Algorithm for Constrained Optimization (VMACO) is a nonlinear computer program developed to calculate the least value of a function of n variables subject to general constraints, both equality and inequality. The first set of constraints comprises equalities and the remaining constraints are inequalities. The program utilizes an iterative method in seeking the optimal solution. Written in ANSI Standard FORTRAN 77.
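
VMACO itself is FORTRAN 77 and not reproduced here, but the problem class it addresses (iteratively minimizing a function of n variables under equality and inequality constraints) can be sketched with a quadratic-penalty iteration; the objective, constraints, and schedule below are illustrative and do not reproduce VMACO's variable-metric update:

```python
def solve_constrained(mu_schedule=(1.0, 10.0, 100.0, 1000.0), steps=2000):
    """Minimize x^2 + y^2 subject to x + y = 1 (equality) and x >= 0.2
    (inequality) by gradient descent on an increasingly stiff penalty."""
    def penalized(x, y, mu):
        eq = (x + y - 1.0) ** 2        # squared equality-constraint violation
        ineq = max(0.0, 0.2 - x) ** 2  # squared inequality-constraint violation
        return x * x + y * y + mu * (eq + ineq)

    x = y = 0.0
    h = 1e-6  # step for central finite-difference gradients
    for mu in mu_schedule:
        lr = 1e-3 / mu  # smaller steps as the penalty stiffens, for stability
        for _ in range(steps):
            gx = (penalized(x + h, y, mu) - penalized(x - h, y, mu)) / (2 * h)
            gy = (penalized(x, y + h, mu) - penalized(x, y - h, mu)) / (2 * h)
            x -= lr * gx
            y -= lr * gy
    return x, y
```

As mu grows the iterates approach the constrained optimum (0.5, 0.5); the inequality x >= 0.2 is inactive there, which the penalty handles automatically.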

  11. Fine-grained leukocyte classification with deep residual learning for microscopic images.

    PubMed

    Qin, Feiwei; Gao, Nannan; Peng, Yong; Wu, Zizhao; Shen, Shuying; Grudtsin, Artur

    2018-08-01

Leukocyte classification and cytometry have wide applications in the medical domain; previous research usually exploits machine learning techniques to classify leukocytes automatically. However, these methods were constrained by the state of machine learning at the time: extracting distinctive features from raw microscopic images is difficult, and the widely used SVM classifier has relatively few parameters to tune. As a result, such methods cannot efficiently handle fine-grained classification cases in which the white blood cells fall into up to 40 categories. Based on deep learning theory, a systematic study is conducted on finer leukocyte classification in this paper. A deep residual neural network based leukocyte classifier is constructed first; it imitates the domain expert's cell recognition process and extracts salient features robustly and automatically. Then the deep neural network classifier's topology is adjusted according to prior knowledge of the white blood cell test. After that, a microscopic image dataset with almost one hundred thousand labeled leukocytes belonging to 40 categories is built, and combined training strategies are adopted to give the designed classifier good generalization ability. The proposed deep residual neural network based classifier was tested on this 40-category microscopic image dataset. It achieves top-1 accuracy of 77.80% and top-5 accuracy of 98.75% during the training procedure, and the average accuracy on the test set is 76.84%. This paper presents a fine-grained leukocyte classification method for microscopic images based on deep residual learning theory and medical domain knowledge. Experimental results validate the feasibility and effectiveness of our approach. Extended experiments support that the fine-grained leukocyte classifier could be used in real medical applications, assisting doctors in diagnosing diseases and significantly reducing human labor. Copyright © 2018 Elsevier B.V. All rights reserved.
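
The building block behind "deep residual learning" is a unit whose skip connection lets the layers learn only a correction F(x) on top of the identity. A plain-Python sketch of one such unit follows (illustrative only; the paper's classifier uses convolutional layers in a deep-learning framework):

```python
def relu(v):
    return [max(0.0, t) for t in v]

def dense(v, W, b):
    # one fully connected layer: W.v + b
    return [sum(w * t for w, t in zip(row, v)) + bi for row, bi in zip(W, b)]

def residual_block(x, W1, b1, W2, b2):
    # output = relu(x + F(x)), where F(x) = W2.relu(W1.x + b1) + b2;
    # the skip connection means F only has to learn a residual correction
    h = relu(dense(x, W1, b1))
    f = dense(h, W2, b2)
    return relu([xi + fi for xi, fi in zip(x, f)])
```

With all weights at zero the block reduces to relu(x), i.e. the identity on non-negative activations, which is why very deep stacks of such blocks remain trainable.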

  12. [Management accounting in hospital setting].

    PubMed

    Brzović, Z; Richter, D; Simunić, S; Bozić, R; Hadjina, N; Piacun, D; Harcet, B

    1998-12-01

The periodic income and expenditure accounts produced at the hospital and departmental level enable successful short-term management but, in the long run, do not help remove tensions between health care demand and limited resources, nor do they enable optimal medical planning within the limited financial resources. We are trying to establish disease category costs based on case mix according to diagnostic categories (diagnosis related groups, DRG, or health care resource groups, HRG) and the calculation of hospital standard product costs, e.g., radiology cost, preoperative nursing cost, etc. The average DRG cost is composed of standard product costs plus any costs specific to a diagnostic category. As an example, the current costing procedure for hip arthroplasty in the University Hospital Center Zagreb is compared to the management accounting approach based on the British Health Care Resource experience. The knowledge of disease category costs based on management accounting requirements facilitates the implementation of medical programs within the given financial resources and devolves managerial responsibility closer to the clinical level where medical decisions take place.

  13. Design forms of total knee replacement.

    PubMed

    Walker, P S; Sathasivam, S

    2000-01-01

    The starting point of this article is a general design criterion applicable to all types of total knee replacement. This criterion is then expanded upon to provide more specifics of the required kinematics, and the forces which the total knee must sustain. A characteristic which differentiates total knees is the amount of constraint which is required, and whether the constraint is translational or rotational. The different forms of total knee replacement are described in terms of these constraints, starting with the least constrained unicompartments to the almost fully constrained fixed and rotating hinges. Much attention is given to the range of designs in between these two extreme types, because they constitute by far the largest in usage. This category includes condylar replacements where the cruciate ligaments are preserved or resected, posterior cruciate substituting designs and mobile bearing knees. A new term, 'guided motion knees', is applied to the growing number of designs which control the kinematics by the use of intercondylar cams or specially shaped and even additional bearing surfaces. The final section deals with the selection of an appropriate design of total knee for specific indications based on the design characteristics.

  14. The "Motherese" of Mr. Rogers: A Description of the Dialogue of Educational Television Programs.

    ERIC Educational Resources Information Center

    Rice, Mabel L.; Haight, Patti L.

    Dialogue from 30-minute samples from "Sesame Street" and "Mr. Rogers' Neighborhood" was coded for grammar, content, and discourse. Grammatical analysis used the LINGQUEST computer-assisted language assessment program (Mordecai, Palen, and Palmer 1982). Content coding was based on categories developed by Rice (1984) and…

  15. Formalizing Knowledge in Multi-Scale Agent-Based Simulations

    PubMed Central

    Somogyi, Endre; Sluka, James P.; Glazier, James A.

    2017-01-01

    Multi-scale, agent-based simulations of cellular and tissue biology are increasingly common. These simulations combine and integrate a range of components from different domains. Simulations continuously create, destroy and reorganize constituent elements causing their interactions to dynamically change. For example, the multi-cellular tissue development process coordinates molecular, cellular and tissue scale objects with biochemical, biomechanical, spatial and behavioral processes to form a dynamic network. Different domain specific languages can describe these components in isolation, but cannot describe their interactions. No current programming language is designed to represent in human readable and reusable form the domain specific knowledge contained in these components and interactions. We present a new hybrid programming language paradigm that naturally expresses the complex multi-scale objects and dynamic interactions in a unified way and allows domain knowledge to be captured, searched, formalized, extracted and reused. PMID:29338063

  16. Formalizing Knowledge in Multi-Scale Agent-Based Simulations.

    PubMed

    Somogyi, Endre; Sluka, James P; Glazier, James A

    2016-10-01

    Multi-scale, agent-based simulations of cellular and tissue biology are increasingly common. These simulations combine and integrate a range of components from different domains. Simulations continuously create, destroy and reorganize constituent elements causing their interactions to dynamically change. For example, the multi-cellular tissue development process coordinates molecular, cellular and tissue scale objects with biochemical, biomechanical, spatial and behavioral processes to form a dynamic network. Different domain specific languages can describe these components in isolation, but cannot describe their interactions. No current programming language is designed to represent in human readable and reusable form the domain specific knowledge contained in these components and interactions. We present a new hybrid programming language paradigm that naturally expresses the complex multi-scale objects and dynamic interactions in a unified way and allows domain knowledge to be captured, searched, formalized, extracted and reused.

  17. Analysis of Prospective Mathematics Teachers’ Basic Teaching Skills (a Study of Mathematics Education Departement Students’ Field Experience Program at STKIP Garut)

    NASA Astrophysics Data System (ADS)

    Rahayu, D. V.

    2017-02-01

This study was intended to figure out the basic teaching skills of Mathematics Education Department students of STKIP Garut in the Field Experience Program in academic year 2014/2015. This study was qualitative research with a descriptive analysis technique. The instrument used in this study was an observation sheet to measure basic mathematics teaching skills. The results showed that content mastery and explaining skills were in the average category; questioning, conducting-variation, and assessment skills were in the good category; and classroom management and motivating skills were in the poor category. Based on these results, it can be concluded that the students' basic teaching skills were not optimal. It is recommended that the students receive lessons with an appropriate strategy so that they can optimize their basic teaching skills.

  18. Simulating secondary organic aerosol in a regional air quality model using the statistical oxidation model - Part 1: Assessing the influence of constrained multi-generational ageing

    NASA Astrophysics Data System (ADS)

    Jathar, S. H.; Cappa, C. D.; Wexler, A. S.; Seinfeld, J. H.; Kleeman, M. J.

    2015-09-01

    Multi-generational oxidation of volatile organic compound (VOC) oxidation products can significantly alter the mass, chemical composition and properties of secondary organic aerosol (SOA) compared to calculations that consider only the first few generations of oxidation reactions. However, the most commonly used state-of-the-science schemes in 3-D regional or global models that account for multi-generational oxidation (1) consider only functionalization reactions but do not consider fragmentation reactions, (2) have not been constrained to experimental data; and (3) are added on top of existing parameterizations. The incomplete description of multi-generational oxidation in these models has the potential to bias source apportionment and control calculations for SOA. In this work, we used the Statistical Oxidation Model (SOM) of Cappa and Wilson (2012), constrained by experimental laboratory chamber data, to evaluate the regional implications of multi-generational oxidation considering both functionalization and fragmentation reactions. SOM was implemented into the regional UCD/CIT air quality model and applied to air quality episodes in California and the eastern US. The mass, composition and properties of SOA predicted using SOM are compared to SOA predictions generated by a traditional "two-product" model to fully investigate the impact of explicit and self-consistent accounting of multi-generational oxidation. Results show that SOA mass concentrations predicted by the UCD/CIT-SOM model are very similar to those predicted by a two-product model when both models use parameters that are derived from the same chamber data. Since the two-product model does not explicitly resolve multi-generational oxidation reactions, this finding suggests that the chamber data used to parameterize the models captures the majority of the SOA mass formation from multi-generational oxidation under the conditions tested. 
Consequently, the use of low and high NOx yields perturbs SOA concentrations by a factor of two and is probably a much stronger determinant in 3-D models than constrained multi-generational oxidation. While total predicted SOA mass is similar for the SOM and two-product models, the SOM model predicts increased SOA contributions from anthropogenic precursors (alkanes, aromatics) and sesquiterpenes and decreased SOA contributions from isoprene and monoterpenes relative to the two-product model calculations. The SOA predicted by SOM has a much lower volatility than that predicted by the traditional model, resulting in better qualitative agreement with volatility measurements of ambient OA. On account of its lower volatility, the SOA mass produced by SOM does not appear to be as strongly influenced by the inclusion of oligomerization reactions, whereas the two-product model relies heavily on oligomerization to form low-volatility SOA products. Finally, an unconstrained contemporary hybrid scheme to model multi-generational oxidation within the framework of a two-product model, in which "ageing" reactions are added on top of the existing two-product parameterization, is considered. This hybrid scheme formed at least three times more SOA than the SOM during regional simulations as a result of excessive transformation of semi-volatile vapors into lower volatility material that strongly partitions to the particle phase. This finding suggests that these "hybrid" multi-generational schemes should be used with great caution in regional models.

  19. Iowa Child Care Quality Rating System: QRS Profile. The Child Care Quality Rating System (QRS) Assessment

    ERIC Educational Resources Information Center

    Child Trends, 2010

    2010-01-01

    This paper presents a profile of Iowa's Child Care Quality Rating System prepared as part of the Child Care Quality Rating System (QRS) Assessment Study. The profile is divided into the following categories: (1) Program Information; (2) Rating Details; (3) Quality Indicators for Center-Based Programs; (4) Indicators for Family Child Care Programs;…

  20. Gap analysis: Concepts, methods, and recent results

    USGS Publications Warehouse

    Jennings, M.D.

    2000-01-01

Rapid progress is being made in the conceptual, technical, and organizational requirements for generating synoptic multi-scale views of the earth's surface and its biological content. Using the spatially comprehensive data that are now available, researchers, land managers, and land-use planners can, for the first time, quantitatively place landscape units - from general categories such as 'Forests' or 'Cold-Deciduous Shrubland Formation' to more specific categories such as 'Picea glauca-Abies balsamea-Populus spp. Forest Alliance' - in their large-area contexts. The National Gap Analysis Program (GAP) has developed the technical and organizational capabilities necessary for the regular production and analysis of such information. This paper provides a brief overview of concepts and methods as well as some recent results from the GAP projects. Clearly, new frameworks for biogeographic information and organizational cooperation are needed if we are to have any hope of documenting the full range of species occurrences and ecological processes in ways meaningful to their management. The GAP experience provides one model for achieving these new frameworks.

  1. Low-rank regularization for learning gene expression programs.

    PubMed

    Ye, Guibo; Tang, Mengfan; Cai, Jian-Feng; Nie, Qing; Xie, Xiaohui

    2013-01-01

Learning gene expression programs directly from a set of observations is challenging due to the complexity of gene regulation, high noise of experimental measurements, and insufficient number of experimental measurements. Imposing additional constraints with strong and biologically motivated regularizations is critical in developing reliable and effective algorithms for inferring gene expression programs. Here we propose a new form of regularization that constrains the number of independent connectivity patterns between regulators and targets, motivated by the modular design of gene regulatory programs and the belief that the total number of independent regulatory modules should be small. We formulate a multi-target linear regression framework to incorporate this type of regularization, in which the number of independent connectivity patterns is expressed as the rank of the connectivity matrix between regulators and targets. We then generalize the linear framework to nonlinear cases, and prove that the generalized low-rank regularization model is still convex. Efficient algorithms are derived to solve both the linear and nonlinear low-rank regularized problems. Finally, we test the algorithms on three gene expression datasets, and show that the low-rank regularization improves the accuracy of gene expression prediction in these three datasets.
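
The premise that a regulator-target connectivity matrix with few independent patterns is (approximately) low-rank can be made concrete with a power-iteration sketch that extracts the dominant rank-1 pattern. This pure-Python illustration does not reproduce the paper's nuclear-norm-regularized regression; the example matrix is invented:

```python
def matvec(M, v):
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in M]

def transpose(M):
    return [list(col) for col in zip(*M)]

def norm(v):
    return sum(t * t for t in v) ** 0.5

def rank1_approx(A, iters=100):
    """Dominant singular triple (sigma, u, v) of A via power iteration on A^T A."""
    At = transpose(A)
    v = [1.0] * len(A[0])
    for _ in range(iters):
        w = matvec(At, matvec(A, v))
        n = norm(w)
        v = [t / n for t in w]          # dominant right-singular vector
    Av = matvec(A, v)
    sigma = norm(Av)                    # dominant singular value
    u = [t / sigma for t in Av]         # dominant left-singular vector
    return sigma, u, v
```

For a genuinely rank-1 connectivity matrix the single factor pair sigma*u*v^T reconstructs the matrix; repeating the procedure on the deflated residual would recover further "regulatory module" patterns.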

  2. Context-aware and locality-constrained coding for image categorization.

    PubMed

    Xiao, Wenhua; Wang, Bin; Liu, Yu; Bao, Weidong; Zhang, Maojun

    2014-01-01

Improving the coding strategy for BOF (Bag-of-Features) based feature design has drawn increasing attention in recent image categorization works. However, the ambiguity in the coding procedure still impedes its further development. In this paper, we introduce a context-aware and locality-constrained coding (CALC) approach that uses context information to describe objects in a discriminative way. It is achieved by learning a word-to-word co-occurrence prior and imposing context information over locality-constrained coding. First, the local context of each category is evaluated by learning a word-to-word co-occurrence matrix representing the spatial distribution of local features in a neighbor region. Then, the learned co-occurrence matrix is used to measure the context distance between local features and code words. Finally, a coding strategy that simultaneously considers locality in feature space and context space, while introducing feature weighting, is proposed. This novel coding strategy not only semantically preserves information in coding, but also alleviates the noise distortion within each class. Extensive experiments on several available datasets (Scene-15, Caltech101, and Caltech256) are conducted to validate the superiority of our algorithm by comparing it with baselines and recently published methods. Experimental results show that our method significantly improves the performance of baselines and achieves comparable and even better performance than the state of the art.
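
A stripped-down version of locality-constrained coding, activating only the k codewords nearest a descriptor and weighting them by distance, can be sketched as follows (this omits CALC's learned co-occurrence matrix and the least-squares reconstruction step; the Gaussian weighting and all parameters are illustrative):

```python
import math

def locality_code(x, codebook, k=2, sigma=1.0):
    """Code descriptor x over its k nearest codewords, weighted by distance."""
    d = [math.dist(x, b) for b in codebook]
    nearest = sorted(range(len(codebook)), key=lambda i: d[i])[:k]
    w = {i: math.exp(-d[i] ** 2 / sigma) for i in nearest}
    z = sum(w.values())
    code = [0.0] * len(codebook)       # distant codewords stay at zero (locality)
    for i, wi in w.items():
        code[i] = wi / z               # codes sum to 1 (shift-invariance constraint)
    return code
```

The zeros for far-away codewords are what make the code "locality-constrained"; CALC additionally biases these distances with the learned context term.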

  3. Putting time into proof outlines

    NASA Technical Reports Server (NTRS)

    Schneider, Fred B.; Bloom, Bard; Marzullo, Keith

    1991-01-01

    A logic for reasoning about timing of concurrent programs is presented. The logic is based on proof outlines and can handle maximal parallelism as well as resource-constrained execution environments. The correctness proof for a mutual exclusion protocol that uses execution timings in a subtle way illustrates the logic in action.

  4. [Governance, sustainability, and equity in the health program for the municipality of São José dos Pinhais, Brazil].

    PubMed

    Bueno, Roberto Eduardo; Moysés, Simone Tetu; Bueno, Paula Alexandra Reis; Moysés, Samuel Jorge

    2013-12-01

To analyze the Final Report of the VIII Health Conference and the São José dos Pinhais City Health Program for 2010-2013 and investigate whether these documents addressed the themes of sustainability, governance, and equity and the interfaces between these themes--government policies, power balance, and inclusive processes/impacting results--that make up the Concept Model for Human Development and Health Promotion developed by the authors. This case study analyzed 331 proposals approved for incorporation in the City Health Program. The six thematic categories of the Concept Model were analyzed using ATLAS Ti 5.0 software. The proposals were classified according to the number of themes and interfaces of the Concept Model: full health proposals contained all six categories; partial proposals contained three categories; and incipient proposals contained one category. Of the 331 proposals approved, 162 (49%) contemplated the six thematic categories and were classified as full health promotion proposals. Ninety-five (29%) contemplated three categories (partial health promotion). Of these, 38 (12%) addressed governance, sustainability, and government policies; 33 (10%) addressed governance, power balance, and equity; and 24 (7%) addressed equity, inclusive processes/impact results, and sustainability. Finally, 74 (22%) proposals contemplated only one category and were classified as incipient: 36 (11%) addressed governance, 27 (8%) addressed sustainability, and 11 (3%) addressed equity. Based on the fact that 49% of the proposals approved were classified as full health promotion, it is considered that the effectiveness of social control and popular participation in the construction of health policies at the local level contribute to the promotion of health in the city.

  5. Factors related to leader implementation of a nationally disseminated community-based exercise program: a cross-sectional study

    PubMed Central

    Seguin, Rebecca A; Palombo, Ruth; Economos, Christina D; Hyatt, Raymond; Kuder, Julia; Nelson, Miriam E

    2008-01-01

    Background The benefits of community-based health programs are widely recognized. However, research examining factors related to community leaders' characteristics and roles in implementation is limited. Methods The purpose of this cross-sectional study was to use a social ecological framework of variables to explore and describe the relationships between socioeconomic, personal/behavioral, programmatic, leadership, and community-level social and demographic characteristics as they relate to the implementation of an evidence-based strength training program by community leaders. Eight-hundred fifty-four trained program leaders in 43 states were invited to participate in either an online or mail survey. Corresponding community-level characteristics were also collected. Programmatic details were obtained from those who implemented. Four-hundred eighty-seven program leaders responded to the survey (response rate = 57%), 78% online and 22% by mail. Results Of the 487 respondents, 270 implemented the program (55%). One or more factors from each category – professional, socioeconomic, personal/behavioral, and leadership characteristics – were significantly different between implementers and non-implementers, determined by chi square or student's t-tests as appropriate. Implementers reported higher levels of strength training participation, current and lifetime physical activity, perceived support, and leadership competence (all p < 0.05). Logistic regression analysis revealed a positive association between implementation and fitness credentials/certification (p = 0.003), program-specific self-efficacy (p = 0.002), and support-focused leadership (p = 0.006), and a negative association between implementation and educational attainment (p = 0.002). 
Conclusion Among this sample of trained leaders, several factors within the professional, socioeconomic, personal/behavioral, and leadership categories were related to whether they implemented a community-based exercise program. It may benefit future community-based physical activity program disseminations to consider these factors when selecting and training leaders. PMID:19055821

  6. The Higher Education Act and Minority Serving Institutions: Towards a Typology of Title III and V Funded Programs

    ERIC Educational Resources Information Center

    Boland, William Casey

    2018-01-01

    To date, there has been little analysis of MSI Title III and V grant-funded programs across all MSI categories. For researchers, practitioners, and policymakers, it is imperative to explore the contributions of MSIs as manifested in Title III and V grant-funded programs. The purpose of this study is to analyze MSI Title III and V programs based on…

  7. Infants' Discrimination of Consonants: Interplay between Word Position and Acoustic Saliency

    ERIC Educational Resources Information Center

    Archer, Stephanie L.; Zamuner, Tania; Engel, Kathleen; Fais, Laurel; Curtin, Suzanne

    2016-01-01

    Research has shown that young infants use contrasting acoustic information to distinguish consonants. This has been used to argue that by 12 months, infants have homed in on their native language sound categories. However, this ability seems to be positionally constrained, with contrasts at the beginning of words (onsets) discriminated earlier.…

  8. Condom use as situated in a risk context: women's experiences in the massage parlour industry in Vancouver, Canada.

    PubMed

    Handlovsky, Ingrid; Bungay, Vicky; Kolar, Kat

    2012-10-01

Investigation into condom use in sex work has aroused interest in health promotion and illness prevention. Yet there remains a dearth of inquiry into condom use practices in the indoor sex industry, particularly in North America. We performed a thematic analysis of one aspect of indoor sex work by drawing on data from a larger mixed-methods study that investigated women's health issues in the massage parlour industry in Vancouver, Canada. Using a risk context framework, condom use was approached as a socially situated practice constituted by supportive and constraining dynamics. Three analytic categories were identified: (1) the process of condom negotiation, (2) the availability of condoms and the accessibility of information on STIs and (3) financial vulnerability. Within these categories, several supportive dynamics (industry experience and personal ingenuity) and constraining dynamics (lack of agency support, client preferences, limited language proficiency and the legal system) were explored as interfacing influences on condom use. Initiatives to encourage condom use must recognise the role of context in order to more effectively support the health-promoting efforts of women in sex work.

  9. Show me the data: advances in multi-model benchmarking, assimilation, and forecasting

    NASA Astrophysics Data System (ADS)

    Dietze, M.; Raiho, A.; Fer, I.; Cowdery, E.; Kooper, R.; Kelly, R.; Shiklomanov, A. N.; Desai, A. R.; Simkins, J.; Gardella, A.; Serbin, S.

    2016-12-01

Researchers want their data to inform carbon cycle predictions, but there are considerable bottlenecks between data collection and the use of data to calibrate and validate earth system models and inform predictions. This talk highlights recent advancements in the PEcAn project aimed at making it easier for individual researchers to confront models with their own data: (1) The development of an easily extensible site-scale benchmarking system aimed at ensuring that models capture process rather than just reproducing pattern; (2) Efficient emulator-based Bayesian parameter data assimilation to constrain model parameters; (3) A novel, generalized approach to ensemble data assimilation to estimate carbon pools and fluxes and quantify process error; (4) automated processing and downscaling of CMIP climate scenarios to support forecasts that include driver uncertainty; (5) a large expansion in the number of models supported, with new tools for conducting multi-model and multi-site analyses; and (6) a network-based architecture that allows analyses to be shared with model developers and other collaborators. Application of these methods is illustrated with data across a wide range of time scales, from eddy-covariance to forest inventories to tree rings to paleoecological pollen proxies.

  10. Hurricane Earl Multi-level Winds

    NASA Image and Video Library

    2010-09-02

NASA's Multi-angle Imaging SpectroRadiometer (MISR) instrument captured this image of Hurricane Earl on Aug. 30, 2010. At this time, Hurricane Earl was a Category 3 storm. The hurricane's eye is just visible on the right edge of the MISR image swath.

  11. Tank waste remediation system multi-year work plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

The Tank Waste Remediation System (TWRS) Multi-Year Work Plan (MYWP) documents the detailed total Program baseline and was constructed to guide Program execution. The TWRS MYWP is one of two elements that comprise the TWRS Program Management Plan. The TWRS MYWP fulfills the Hanford Site Management System requirement for a Multi-Year Program Plan and a Fiscal-Year Work Plan. The MYWP addresses program vision, mission, objectives, strategy, functions and requirements, risks, decisions, assumptions, constraints, structure, logic, schedule, resource requirements, and waste generation and disposition. Sections 1 through 6, Section 8, and the appendixes provide program-wide information. Section 7 includes a subsection for each of the nine program elements that comprise the TWRS Program. The foundation of any program baseline is base planning data (e.g., defendable product definition, logic, schedules, cost estimates, and bases of estimates). The TWRS Program continues to improve base data. As data improve, so will program element planning, integration between program elements, integration outside of the TWRS Program, and the overall quality of the TWRS MYWP. The MYWP establishes the TWRS baseline objectives to store, treat, and immobilize highly radioactive Hanford waste in an environmentally sound, safe, and cost-effective manner. The TWRS Program will complete the baseline mission in 2040 and will incur costs totalling approximately 40 billion dollars. The summary strategy is to meet the above objectives by using a robust systems engineering effort, placing the highest possible priority on safety and environmental protection; encouraging "outsourcing" of the work to the extent practical; and managing significant but limited resources to move toward final disposition of tank wastes, while openly communicating with all interested stakeholders.

  12. Tank waste remediation system multi-year work plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-09-01

The Tank Waste Remediation System (TWRS) Multi-Year Work Plan (MYWP) documents the detailed total Program baseline and was constructed to guide Program execution. The TWRS MYWP is one of two elements that comprise the TWRS Program Management Plan. The TWRS MYWP fulfills the Hanford Site Management System requirement for a Multi-Year Program Plan and a Fiscal-Year Work Plan. The MYWP addresses program vision, mission, objectives, strategy, functions and requirements, risks, decisions, assumptions, constraints, structure, logic, schedule, resource requirements, and waste generation and disposition. Sections 1 through 6, Section 8, and the appendixes provide program-wide information. Section 7 includes a subsection for each of the nine program elements that comprise the TWRS Program. The foundation of any program baseline is base planning data (e.g., defendable product definition, logic, schedules, cost estimates, and bases of estimates). The TWRS Program continues to improve base data. As data improve, so will program element planning, integration between program elements, integration outside of the TWRS Program, and the overall quality of the TWRS MYWP. The MYWP establishes the TWRS baseline objectives to store, treat, and immobilize highly radioactive Hanford waste in an environmentally sound, safe, and cost-effective manner. The TWRS Program will complete the baseline mission in 2040 and will incur costs totalling approximately 40 billion dollars. The summary strategy is to meet the above objectives by using a robust systems engineering effort, placing the highest possible priority on safety and environmental protection; encouraging "outsourcing" of the work to the extent practical; and managing significant but limited resources to move toward final disposition of tank wastes, while openly communicating with all interested stakeholders.

  13. Chance Constrained Programming Methods in Probabilistic Programming.

    DTIC Science & Technology

    1982-03-01

    Financial and Quantitative Analysis 2, 1967. Also reproduced in R. F. Byrne et al., eds., Studies in Budgeting (Amsterdam: North Holland, 1971). [3...Rules for the E-Model of Chance-Constrained Programming," Management Science, 17, 1971. [23] Garstka, S. J. "The Economic Equivalence of Several...Iowa City: The University of Iowa College of Business Administration, 1981). [29] Kall, P. and A. Prekopa, eds., Recent Results in Stochastic

  14. Competency-Based Adult Education: Florida Model.

    ERIC Educational Resources Information Center

    Singer, Elizabeth

    This compilation of program materials serves as an introduction to Florida's Brevard Community College's (BCC's) Competency-Based Adult High School Completion Project, a multi-year project designed to teach adult administrators, counselors, and teachers how to organize and implement a competency-based adult education (CBAE) program; to critique…

  15. Multi-Agent Simulation of Allocating and Routing Ambulances Under Condition of Street Blockage after Natural Disaster

    NASA Astrophysics Data System (ADS)

    Azimi, S.; Delavar, M. R.; Rajabifard, A.

    2017-09-01

    In response to natural disasters, efficient planning for optimum allocation of medical assistance to the wounded as fast as possible, and immediate wayfinding for first responders to minimize the risk of natural disasters, are of prime importance. This paper proposes multi-agent based modeling for optimum allocation of space to emergency centers according to the population, the street network, and the number of ambulances in each emergency center using constrained network Voronoi diagrams; wayfinding of ambulances from emergency centers to the wounded and back, based on minimum ambulance travel time and path length, implemented with NSGA; and the use of smart-city facilities to accelerate the rescue operation. A simulated annealing algorithm was used to minimize the difference between demands and supplies of the constrained network Voronoi diagrams. In the proposed multi-agent system, after the locations of the wounded and their symptoms are delivered, the constrained network Voronoi diagram for each emergency center is determined. This process was performed simultaneously for multiple injuries in different Voronoi diagrams. The proposed multi-agent system considers the priority of the injured for receiving medical assistance and uses smart-city facilities for reporting blocked streets. Tehran Municipality District 5 was the study area, and volunteers reported blocked streets at 3-minute intervals. The difference between supply and demand divided by the supply in each Voronoi diagram decreased to 0.1601, and the response time of the ambulances decreased by about 36.7%.
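
    The balancing step described above can be sketched with simulated annealing driving each service area's demand toward its supply. This is an illustrative sketch under simplified assumptions: the objective sums |demand - supply| / supply over regions, and any demand point may be reassigned to any region (the paper's version works on a street network with network Voronoi diagrams).

```python
import math, random

def imbalance(assign, demand, supply):
    # Sum over regions of |assigned demand - supply| / supply (illustrative
    # stand-in for the paper's demand/supply criterion).
    load = {r: 0.0 for r in supply}
    for point, region in assign.items():
        load[region] += demand[point]
    return sum(abs(load[r] - s) / s for r, s in supply.items())

def anneal(assign, demand, supply, t0=1.0, cooling=0.995, steps=20000):
    """Simulated annealing: reassign one demand point per step, accept worse
    moves with a temperature-dependent probability."""
    random.seed(0)
    best = dict(assign)
    cur_cost = best_cost = imbalance(assign, demand, supply)
    t = t0
    regions = list(supply)
    for _ in range(steps):
        point = random.choice(list(assign))
        old = assign[point]
        assign[point] = random.choice(regions)      # propose a move
        new_cost = imbalance(assign, demand, supply)
        if new_cost < cur_cost or random.random() < math.exp((cur_cost - new_cost) / t):
            cur_cost = new_cost
            if new_cost < best_cost:
                best, best_cost = dict(assign), new_cost
        else:
            assign[point] = old                     # reject the move
        t *= cooling
    return best, best_cost
```

    With equal supplies and unit demands, the schedule typically drives the imbalance toward zero.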

  16. Classification of Initial conditions required for Substorm prediction.

    NASA Astrophysics Data System (ADS)

    Patra, S.; Spencer, E. A.

    2014-12-01

    We investigate different classes of substorms that occur as a result of various drivers, such as conditions in the solar wind and the internal state of the magnetosphere-ionosphere system during geomagnetic activity. In performing our study, we develop and use our low-order physics-based nonlinear model of the magnetosphere, called WINDMI, to establish the global energy exchange between the solar wind, magnetosphere, and ionosphere by constraining the model results to satellite and ground measurements. In addition, we make quantitative and qualitative comparisons between our low-order model and available MHD, multi-fluid, and ring current simulations in terms of the energy transfer between the geomagnetic tail, plasma sheet, field-aligned currents, ionospheric currents, and ring current during isolated substorms, storm-time substorms, and sawtooth events. We use high-resolution solar wind data from the ACE satellite, measurements from the CLUSTER and THEMIS mission satellites, and ground-based magnetometer measurements from SUPERMAG and WDC Kyoto to further develop our low-order physics-based model. Finally, we attempt to answer the following questions: 1) What conditions in the solar wind influence the type of substorm event? This includes the IMF strength and orientation; the particle densities, velocities, and temperatures; and the timing of changes such as shocks and southward or northward turnings of the IMF. 2) What is the state of the magnetosphere-ionosphere system before an event begins? These are the steady-state conditions prior to an event, if they exist, which produce the satellite and ground-based measurements matched to the WINDMI model. 3) How does the prior state of the magnetosphere influence the transition into a particular mode of behavior under solar wind forcing? 4) Is it possible to classify the states of the magnetosphere into distinct categories depending on pre-conditioning and solar wind forcing conditions? 5) Can we predict the occurrence of substorms with any confidence?

  17. Interacting agricultural pests and their effect on crop yield: application of a Bayesian decision theory approach to the joint management of Bromus tectorum and Cephus cinctus.

    PubMed

    Keren, Ilai N; Menalled, Fabian D; Weaver, David K; Robison-Cox, James F

    2015-01-01

    Worldwide, the landscape homogeneity of extensive monocultures that characterizes conventional agriculture has resulted in the development of specialized and interacting multitrophic pest complexes. While integrated pest management emphasizes the need to consider the ecological context where multiple species coexist, management recommendations are often based on single-species tactics. This approach may not provide satisfactory solutions when confronted with the complex interactions occurring between organisms at the same or different trophic levels. Replacement of the single-species management model with more sophisticated, multi-species programs requires an understanding of the direct and indirect interactions occurring between the crop and all categories of pests. We evaluated a modeling framework to make multi-pest management decisions taking into account direct and indirect interactions among species belonging to different trophic levels. We adopted a Bayesian decision theory approach in combination with path analysis to evaluate interactions between Bromus tectorum (downy brome, cheatgrass) and Cephus cinctus (wheat stem sawfly) in wheat (Triticum aestivum) systems. We assessed their joint responses to weed management tactics, seeding rates, and cultivar tolerance to insect stem boring or competition. Our results indicated that C. cinctus oviposition behavior varied as a function of B. tectorum pressure. Crop responses were more readily explained by the joint effects of management tactics on both categories of pests and their interactions than by the direct impact of any particular management scheme on yield. Accordingly, a C. cinctus-tolerant variety should be planted at a low seeding rate under high insect pressure. However, as B. tectorum levels increase, the C. cinctus-tolerant variety should be replaced by a competitive and drought-tolerant cultivar at high seeding rates despite C. cinctus infestation. This study exemplifies the necessity of accounting for direct and indirect biological interactions occurring within agroecosystems and of propagating this information from the statistical analysis stage to the management stage.

  18. U.S. Department of Energy Isotope Program

    ScienceCinema

    None

    2018-01-16

    The National Isotope Development Center (NIDC) interfaces with the User Community and manages the coordination of isotope production across the facilities and business operations involved in the production, sale, and distribution of isotopes. A virtual center, the NIDC is funded by the Isotope Development and Production for Research and Applications (IDPRA) subprogram of the Office of Nuclear Physics in the U.S. Department of Energy Office of Science. PNNL’s Isotope Program operates in a multi-program category-2 nuclear facility, the Radiochemical Processing Laboratory (RPL), which contains 16 hot cells and 20 gloveboxes. As part of the DOE Isotope Program, the Pacific Northwest National Laboratory dispenses strontium-90, neptunium-237, radium-223, and thorium-227. PNNL’s Isotope Program uses a dedicated hot cell for strontium-90 dispensing and a dedicated glovebox for radium-223 and thorium-227 dispensing. PNNL’s Isotope Program has access to state-of-the-art analytical equipment in the RPL to support its research and production activities. DOE Isotope Program-funded research at PNNL has advanced the application of automated radiochemistry for isotopes such as zirconium-89 and astatine-211 in partnership with the University of Washington.

  19. U.S. Department of Energy Isotope Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The National Isotope Development Center (NIDC) interfaces with the User Community and manages the coordination of isotope production across the facilities and business operations involved in the production, sale, and distribution of isotopes. A virtual center, the NIDC is funded by the Isotope Development and Production for Research and Applications (IDPRA) subprogram of the Office of Nuclear Physics in the U.S. Department of Energy Office of Science. PNNL’s Isotope Program operates in a multi-program category-2 nuclear facility, the Radiochemical Processing Laboratory (RPL), which contains 16 hot cells and 20 gloveboxes. As part of the DOE Isotope Program, the Pacific Northwest National Laboratory dispenses strontium-90, neptunium-237, radium-223, and thorium-227. PNNL’s Isotope Program uses a dedicated hot cell for strontium-90 dispensing and a dedicated glovebox for radium-223 and thorium-227 dispensing. PNNL’s Isotope Program has access to state-of-the-art analytical equipment in the RPL to support its research and production activities. DOE Isotope Program-funded research at PNNL has advanced the application of automated radiochemistry for isotopes such as zirconium-89 and astatine-211 in partnership with the University of Washington.

  20. Tabu search algorithm for the distance-constrained vehicle routing problem with split deliveries by order.

    PubMed

    Xia, Yangkun; Fu, Zhuo; Pan, Lijun; Duan, Fenghua

    2018-01-01

    The vehicle routing problem (VRP) has a wide range of applications in logistics distribution. To reduce the cost of logistics distribution, the distance-constrained and capacitated VRP with split deliveries by order (DCVRPSDO) was studied. We show that customer demand, which cannot be split in the classical VRP model, can be split only into discrete deliveries by order. A double-objective programming model is constructed by taking the minimum number of vehicles used and the minimum vehicle traveling cost as the first and second objectives, respectively. The model contains a series of constraints, such as a single depot, a single vehicle type, distance and load-capacity limits, and split delivery by order. DCVRPSDO is a new type of VRP. A new tabu search algorithm is designed to solve the problem, and test examples show the efficiency of the proposed algorithm. This paper focuses on constructing a double-objective mathematical programming model for DCVRPSDO and designing an adaptive tabu search algorithm (ATSA) with good performance for solving it. The performance of the ATSA is improved by adding several strategies to the search process: (a) a strategy of discrete split deliveries by order is used to split the customer demand; (b) a multi-neighborhood structure is designed to enhance the ability of global optimization; (c) two levels of evaluation objectives are set to select the current solution and the best solution; (d) a discriminating strategy, whereby the best solution must be feasible while the current solution may be infeasible, helps balance solution quality and the diversity of neighborhood solutions; (e) an adaptive penalty mechanism helps candidate solutions move closer to the neighborhood of feasible solutions; and (f) a tabu-releasing strategy is used to transfer the current solution into a new neighborhood of a better solution.
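
    The core tabu search loop behind strategies such as these can be sketched generically. This is a minimal bit-flip version with a tabu list and an aspiration rule, not the paper's ATSA; the `cost` function is a caller-supplied assumption standing in for the routing objective.

```python
import random

def tabu_search(cost, n_bits, iters=200, tenure=7, seed=0):
    """Minimal tabu-search skeleton: bit-flip neighborhood, a tabu list on
    recently flipped bits, and an aspiration rule that lets a tabu move
    through if it beats the best solution found so far."""
    rng = random.Random(seed)
    cur = [rng.randint(0, 1) for _ in range(n_bits)]
    best, best_cost = cur[:], cost(cur)
    tabu = {}                                   # bit index -> iteration it frees up
    for it in range(iters):
        candidates = []
        for i in range(n_bits):
            nb = cur[:]
            nb[i] ^= 1
            c = cost(nb)
            is_tabu = tabu.get(i, -1) > it
            if not is_tabu or c < best_cost:    # aspiration criterion
                candidates.append((c, i, nb))
        if not candidates:
            continue
        c, i, nb = min(candidates)              # best admissible neighbor
        cur = nb
        tabu[i] = it + tenure                   # forbid flipping bit i again soon
        if c < best_cost:
            best, best_cost = nb[:], c
    return best, best_cost
```

    For example, minimizing the Hamming distance to a target bit string drives `best_cost` to zero within a few iterations.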

  1. Tabu search algorithm for the distance-constrained vehicle routing problem with split deliveries by order

    PubMed Central

    Xia, Yangkun; Pan, Lijun; Duan, Fenghua

    2018-01-01

    The vehicle routing problem (VRP) has a wide range of applications in logistics distribution. To reduce the cost of logistics distribution, the distance-constrained and capacitated VRP with split deliveries by order (DCVRPSDO) was studied. We show that customer demand, which cannot be split in the classical VRP model, can be split only into discrete deliveries by order. A double-objective programming model is constructed by taking the minimum number of vehicles used and the minimum vehicle traveling cost as the first and second objectives, respectively. The model contains a series of constraints, such as a single depot, a single vehicle type, distance and load-capacity limits, and split delivery by order. DCVRPSDO is a new type of VRP. A new tabu search algorithm is designed to solve the problem, and test examples show the efficiency of the proposed algorithm. This paper focuses on constructing a double-objective mathematical programming model for DCVRPSDO and designing an adaptive tabu search algorithm (ATSA) with good performance for solving it. The performance of the ATSA is improved by adding several strategies to the search process: (a) a strategy of discrete split deliveries by order is used to split the customer demand; (b) a multi-neighborhood structure is designed to enhance the ability of global optimization; (c) two levels of evaluation objectives are set to select the current solution and the best solution; (d) a discriminating strategy, whereby the best solution must be feasible while the current solution may be infeasible, helps balance solution quality and the diversity of neighborhood solutions; (e) an adaptive penalty mechanism helps candidate solutions move closer to the neighborhood of feasible solutions; and (f) a tabu-releasing strategy is used to transfer the current solution into a new neighborhood of a better solution. PMID:29763419

  2. A Survey of Recent Advances in Particle Filters and Remaining Challenges for Multitarget Tracking

    PubMed Central

    Wang, Xuedong; Sun, Shudong; Corchado, Juan M.

    2017-01-01

    We review advances in the particle filtering (PF) algorithm achieved in the last decade in the context of target tracking, with regard to either a single target or multiple targets in the presence of false or missing data. The first part of our review covers remarkable achievements made for the single-target PF in several respects, including the importance proposal, computing efficiency, particle degeneracy/impoverishment, and constrained/multi-modal systems. The second part analyzes the intractable challenges raised within general multitarget (multi-sensor) tracking due to random target birth and termination, false alarms, misdetection, measurement-to-track (M2T) uncertainty, and track uncertainty. The mainstream multitarget PF approaches fall into two main classes: those based on M2T association approaches and those that are not, such as the finite-set-statistics-based PF. In either case, significant challenges remain due to unknown tracking scenarios and integrated tracking management. PMID:29168772
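
    A minimal bootstrap (SIR) particle filter illustrates two of the surveyed themes: importance weighting and the standard remedy for particle degeneracy, resampling when the effective sample size drops. The scalar random-walk model and noise values below are assumptions chosen for illustration.

```python
import math, random

def bootstrap_pf(observations, n=500, q=1.0, r=1.0, seed=0):
    """Bootstrap (SIR) particle filter for the toy model
        x_t = x_{t-1} + N(0, q),   y_t = x_t + N(0, r).
    Resamples when the effective sample size falls below n/2."""
    rng = random.Random(seed)
    parts = [rng.gauss(0.0, 1.0) for _ in range(n)]
    weights = [1.0 / n] * n
    means = []
    for y in observations:
        # propagate particles through the transition model
        parts = [x + rng.gauss(0.0, math.sqrt(q)) for x in parts]
        # reweight by the Gaussian likelihood of the observation
        weights = [w * math.exp(-0.5 * (y - x) ** 2 / r) for w, x in zip(weights, parts)]
        s = sum(weights) or 1e-300
        weights = [w / s for w in weights]
        means.append(sum(w * x for w, x in zip(weights, parts)))
        ess = 1.0 / sum(w * w for w in weights)   # effective sample size
        if ess < n / 2:                           # degeneracy remedy: resample
            parts = rng.choices(parts, weights=weights, k=n)
            weights = [1.0 / n] * n
    return means
```

    On a slowly drifting signal the filtered means track the observations closely.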

  3. Using Promotores Programs to Improve Latino Health Outcomes: Implementation Challenges for Community-based Nonprofit Organizations

    PubMed Central

    Twombly, Eric C.; Holtz, Kristen D.; Stringer, Kimberly

    2012-01-01

    Promotores are community lay health workers who provide outreach and services to Latinos. Little research on promotores programs exists, and the focus of this article is to identify the challenges faced by community-based nonprofits when implementing promotores programs. To explore this type of program, telephone interviews were conducted with ten promotores academic experts and nonprofit executives. The results suggest that implementation challenges fall into three major categories: the lack of standardized information on promotores programs, labor issues, and organizational costs. Recommendations highlight promotores recruitment and retention strategies and the development of a clearinghouse of programmatic implementation information for community-based nonprofits. PMID:23188929

  4. Using Petri nets for experimental design in a multi-organ elimination pathway.

    PubMed

    Reshetova, Polina; Smilde, Age K; Westerhuis, Johan A; van Kampen, Antoine H C

    2015-08-01

    Genistein is a soy metabolite with estrogenic activity that may result in (un)favorable effects on human health. Elucidation of the mechanisms through which food additives such as genistein exert their beneficial effects is a major challenge for the food industry. A better understanding of the genistein elimination pathway could shed light on such mechanisms. We developed a Petri net model that represents this multi-organ elimination pathway and assists in the design of future experiments. Using this model we show that metabolic profiles measured solely in venous blood are not sufficient to uniquely parameterize the model. Based on simulations we suggest two solutions that provide better results: parameterize the model using gut epithelium profiles, or add additional biological constraints to the model. Copyright © 2015 Elsevier Ltd. All rights reserved.
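
    The basic mechanics of a place/transition Petri net — a marking of tokens in places, and transitions that consume and produce tokens — can be sketched briefly. The compartment names ("gut", "blood", "urine") are hypothetical illustrations, not the places of the published multi-organ model.

```python
def enabled(marking, pre):
    # A transition is enabled if every input place holds enough tokens.
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire one transition: consume `pre` tokens, produce `post` tokens,
    returning a new marking (the old one is left unchanged)."""
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Hypothetical elimination steps: a metabolite moves gut -> blood -> urine.
transitions = {
    "absorb":  ({"gut": 1}, {"blood": 1}),
    "excrete": ({"blood": 1}, {"urine": 1}),
}
```

    Firing "absorb" and then "excrete" moves one token along the pathway, which is the kind of token flow such a model uses to represent elimination.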

  5. Promoting the University Social Responsibility in the Capacity Development Program for Landslide Risk Reduction in Indonesia

    NASA Astrophysics Data System (ADS)

    Karnawati, D.; Wilopo, W.; Verrier, M.; Fathani, T. F.; Andayani, B.

    2011-12-01

    One of the most challenging efforts for landslide disaster risk reduction in Indonesia is to provide an effective program for capacity development of communities living in vulnerable areas. Limited access to appropriate information and knowledge about geology and landslide phenomena, as well as social-security constraints, are the major challenges for capacity development programs in landslide-prone areas. Accordingly, a community-based research and education program for landslide mitigation and disaster risk reduction at the village level was established by implementing the University Social Responsibility Program. The program has been conducted regularly every academic semester as part of the formal academic program at Universitas Gadjah Mada, Indonesia. Twenty students with multi-disciplinary backgrounds, supported by their lecturers/advisers, are deployed to a village for two months to carry out the mission. This action is conducted in coordination with local and national government, the local community, and in some cases the private sector. A series of research actions, such as landslide investigation and hazard-risk mapping, social mapping, and development of a landslide early warning system, were carried out in parallel with public education and evacuation drills for community empowerment and landslide risk reduction. A Community Task Force for Disaster Risk Reduction was also established during the community empowerment program to guarantee the effectiveness and sustainability of the disaster risk reduction program at the village level. This program is not only beneficial for empowering village communities to tackle landslide problems, but also important for supporting education for sustainable development in disaster-prone areas. Indeed, this capacity development program may also be considered a best practice for transforming knowledge into action, and action into knowledge enhancement, with respect to landslide disaster risk reduction.
    (Figure: Scope of problems and the actions conducted by Universitas Gadjah Mada as the University Social Responsibility Program for Landslide Disaster Risk Reduction in Indonesia.)

  6. Air Quality Conformity Determination Of the Constrained Long Range Plan And The FY99-2004 Transportation Improvement Program For The Washington Metropolitan Region

    DOT National Transportation Integrated Search

    1998-07-15

    This report documents the assessment of the Constrained Long Range Plan (CLRP) and the FY99-2004 Transportation Improvement Program (TIP) with respect to air quality conformity requirements under the 1990 Clean Air Act Amendments. The assessment used...

  7. NEWSUMT: A FORTRAN program for inequality constrained function minimization, users guide

    NASA Technical Reports Server (NTRS)

    Miura, H.; Schmit, L. A., Jr.

    1979-01-01

    A computer program, written in FORTRAN subroutine form, for the solution of linear and nonlinear, constrained and unconstrained function minimization problems is presented. The algorithm is a sequence of unconstrained minimizations, using Newton's method for the unconstrained function minimizations. The use of NEWSUMT and the definitions of all parameters are described.

  8. LINKS: learning-based multi-source IntegratioN frameworK for Segmentation of infant brain images.

    PubMed

    Wang, Li; Gao, Yaozong; Shi, Feng; Li, Gang; Gilmore, John H; Lin, Weili; Shen, Dinggang

    2015-03-01

    Segmentation of infant brain MR images is challenging due to insufficient image quality, severe partial volume effects, and ongoing maturation and myelination processes. In the first year of life, the image contrast between white and gray matter of the infant brain undergoes dramatic changes. In particular, the image contrast is inverted around 6-8 months of age, when white and gray matter tissues are isointense in both T1- and T2-weighted MR images and thus exhibit extremely low tissue contrast, which poses significant challenges for automated segmentation. Most previous studies used a multi-atlas label fusion strategy, which has the limitation of treating the different available image modalities equally and is often computationally expensive. To cope with these limitations, we propose a novel learning-based multi-source integration framework for segmentation of infant brain images. Specifically, we employ the random forest technique to effectively integrate features from multi-source images for tissue segmentation. Here, the multi-source images initially include only the multi-modality (T1, T2 and FA) images and later also the iteratively estimated and refined tissue probability maps of gray matter, white matter, and cerebrospinal fluid. Experimental results on 119 infants show that the proposed method achieves better performance than other state-of-the-art automated segmentation methods. Further validation was performed on the MICCAI grand challenge, where the proposed method was ranked top among all competing methods. Moreover, to alleviate possible anatomical errors, our method can also be combined with an anatomically-constrained multi-atlas labeling approach to further improve segmentation accuracy. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. LINKS: Learning-based multi-source IntegratioN frameworK for Segmentation of infant brain images

    PubMed Central

    Wang, Li; Gao, Yaozong; Shi, Feng; Li, Gang; Gilmore, John H.; Lin, Weili; Shen, Dinggang

    2014-01-01

    Segmentation of infant brain MR images is challenging due to insufficient image quality, severe partial volume effects, and ongoing maturation and myelination processes. In the first year of life, the image contrast between white and gray matter of the infant brain undergoes dramatic changes. In particular, the image contrast is inverted around 6-8 months of age, when white and gray matter tissues are isointense in both T1- and T2-weighted MR images and thus exhibit extremely low tissue contrast, which poses significant challenges for automated segmentation. Most previous studies used a multi-atlas label fusion strategy, which has the limitation of treating the different available image modalities equally and is often computationally expensive. To cope with these limitations, we propose a novel learning-based multi-source integration framework for segmentation of infant brain images. Specifically, we employ the random forest technique to effectively integrate features from multi-source images for tissue segmentation. Here, the multi-source images initially include only the multi-modality (T1, T2 and FA) images and later also the iteratively estimated and refined tissue probability maps of gray matter, white matter, and cerebrospinal fluid. Experimental results on 119 infants show that the proposed method achieves better performance than other state-of-the-art automated segmentation methods. Further validation was performed on the MICCAI grand challenge, where the proposed method was ranked top among all competing methods. Moreover, to alleviate possible anatomical errors, our method can also be combined with an anatomically-constrained multi-atlas labeling approach to further improve segmentation accuracy. PMID:25541188

  10. Resolution of singularities for multi-loop integrals

    NASA Astrophysics Data System (ADS)

    Bogner, Christian; Weinzierl, Stefan

    2008-04-01

    We report on a program for the numerical evaluation of divergent multi-loop integrals. The program is based on iterated sector decomposition. We improve the original algorithm of Binoth and Heinrich such that the program is guaranteed to terminate. The program can be used to compute numerically the Laurent expansion of divergent multi-loop integrals regulated by dimensional regularisation. The symbolic and the numerical steps of the algorithm are combined into one program.
    Program summary
    Program title: sector_decomposition
    Catalogue identifier: AEAG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAG_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 47 506
    No. of bytes in distributed program, including test data, etc.: 328 485
    Distribution format: tar.gz
    Programming language: C++
    Computer: all
    Operating system: Unix
    RAM: depending on the complexity of the problem
    Classification: 4.4
    External routines: GiNaC, available from http://www.ginac.de; GNU Scientific Library, available from http://www.gnu.org/software/gsl
    Nature of problem: computation of divergent multi-loop integrals.
    Solution method: sector decomposition.
    Restrictions: only limited by the available memory and CPU time.
    Running time: depending on the complexity of the problem.

  11. JuPOETs: a constrained multiobjective optimization approach to estimate biochemical model ensembles in the Julia programming language.

    PubMed

    Bassen, David M; Vilkhovoy, Michael; Minot, Mason; Butcher, Jonathan T; Varner, Jeffrey D

    2017-01-25

    Ensemble modeling is a promising approach for obtaining robust predictions and coarse-grained population behavior in deterministic mathematical models. Ensemble approaches address model uncertainty by using parameter or model families instead of single best-fit parameters or fixed model structures. Parameter ensembles can be selected based upon simulation error, along with other criteria such as diversity or steady-state performance. Simulations using parameter ensembles can estimate confidence intervals on model variables and robustly constrain model predictions, despite having many poorly constrained parameters. In this software note, we present a multiobjective technique to estimate parameter or model ensembles, the Pareto Optimal Ensemble Technique in the Julia programming language (JuPOETs). JuPOETs integrates simulated annealing with Pareto optimality to estimate ensembles on or near the optimal tradeoff surface between competing training objectives. We demonstrate JuPOETs on a suite of multiobjective problems, including test functions with parameter bounds and system constraints, as well as on the identification of a proof-of-concept biochemical model with four conflicting training objectives. JuPOETs identified optimal or near-optimal solutions approximately six-fold faster than a corresponding implementation in Octave for the suite of test functions. For the proof-of-concept biochemical model, JuPOETs produced an ensemble of parameters that captured the mean of the conflicting training data sets while simultaneously estimating parameter sets that performed well on each individual objective function. JuPOETs is a promising approach for the estimation of parameter and model ensembles using multiobjective optimization. JuPOETs can be adapted to solve many problem types, including mixed binary and continuous variable types, bilevel optimization problems, and constrained problems, without altering the base algorithm. JuPOETs is open source, available under an MIT license, and can be installed using the Julia package manager from the JuPOETs GitHub repository.
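
    The core idea — simulated annealing whose acceptance test is Pareto dominance against an archive of non-dominated solutions — can be sketched as follows. This is an illustrative Python sketch of the technique, not the Julia package's API; `objectives`, `sample`, and `perturb` are hypothetical caller-supplied functions.

```python
import math, random

def dominates(a, b):
    # a Pareto-dominates b: no worse on every objective, strictly better on one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_anneal(objectives, sample, perturb, iters=3000, t0=1.0, seed=0):
    """Simulated annealing toward the Pareto front. `objectives(p)` returns a
    tuple of objective values, `sample()` draws a starting point, and
    `perturb(p, rng)` proposes a neighbor; all three are assumptions here.
    Returns an archive of mutually non-dominated (point, objectives) pairs."""
    rng = random.Random(seed)
    cur = sample()
    archive = [(cur, objectives(cur))]
    t = t0
    for _ in range(iters):
        cand = perturb(cur, rng)
        f = objectives(cand)
        worse_than = sum(dominates(g, f) for _, g in archive)
        # accept non-dominated candidates; occasionally accept dominated ones
        if worse_than == 0 or rng.random() < math.exp(-worse_than / max(t, 1e-9)):
            cur = cand
            if worse_than == 0:
                archive = [(p, g) for p, g in archive if not dominates(f, g)]
                archive.append((cand, f))
        t *= 0.999
    return archive
```

    On two conflicting scalar objectives such as (x-1)^2 and (x+1)^2, the archive accumulates points along the tradeoff between them.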

  12. Developing Systems Engineering Experience Accelerator (SEEA) Prototype and Roadmap -- Increment 4

    DTIC Science & Technology

    2017-08-08

    of an acquisition program, two categories of new capabilities were added to the UAV experience. Based on a student project at Stevens Institute of...program for a new unmanned aerial vehicle (UAV) system. It was based on the concept of the learners assuming this role shortly after preliminary...University curriculum for systems engineers. First, several new capabilities have been added. These include a trade study for additional technical

  13. Hazard Interactions and Interaction Networks (Cascades) within Multi-Hazard Methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel; Malamud, Bruce D.

    2016-04-01

    Here we combine research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between 'multi-layer single hazard' approaches and 'multi-hazard' approaches that integrate such interactions. This synthesis suggests that ignoring interactions could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. We proceed to present an enhanced multi-hazard framework, through the following steps: (i) describe and define three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment; (ii) outline three types of interaction relationship (triggering, increased probability, and catalysis/impedance); and (iii) assess the importance of networks of interactions (cascades) through case-study examples (based on literature, field observations and semi-structured interviews). We further propose visualisation frameworks to represent these networks of interactions. Our approach reinforces the importance of integrating interactions between natural hazards, anthropogenic processes and technological hazards/disasters into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential, and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
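
    The notion of an interaction network (cascade) can be made concrete as reachability in a directed graph whose edges are triggering or probability-increasing relationships. The hazard names and edges below are illustrative assumptions, not the paper's case-study data.

```python
from collections import deque

def cascade(triggers, initial):
    """Breadth-first enumeration of a hazard interaction network: `triggers`
    maps each hazard to the hazards it can trigger or make more probable.
    Returns every hazard reachable from `initial`."""
    reached, queue = {initial}, deque([initial])
    while queue:
        h = queue.popleft()
        for nxt in triggers.get(h, []):
            if nxt not in reached:
                reached.add(nxt)
                queue.append(nxt)
    return reached

# Hypothetical triggering relationships for illustration.
triggers = {
    "earthquake": ["landslide", "tsunami"],
    "landslide":  ["flood"],           # e.g. landslide dam failure
    "tsunami":    ["technological"],   # e.g. damage to coastal infrastructure
}
```

    A 'multi-layer single hazard' view would assess each node in isolation; traversing the edges is what exposes the cascades the paper argues must inform management priorities.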

  14. How can a successful multi-family residential recycling programme be initiated within Baltimore City, Maryland?

    PubMed

    Schwebel, Michael B

    2012-07-01

Baltimore City formally began recycling in 1989, with all neighbourhoods having residential collection by 1992. Although the city of 637 000 has recycled for approximately 20 years, almost all residents in multi-family residential (MFR) housing have been, and still are, barred from participating at their residences. Discussions with City officials and residents have confirmed this antiquated policy of exclusion for MFR housing. Yet the policy is still observed by the Department of Public Works, even though the updated single-stream Code states that the 'Director of Public Works must collect all. . .recyclable materials. . .from all dwellings, including multiple-family dwellings'. The purpose of this study is to provide policies, regulations, and recommendations for implementing requisite MFR recycling within Baltimore City. The study follows a case study methodology, examining three cities in the United States that currently mandate MFR recycling: Chicago, Illinois; Boston, Massachusetts; and Arlington, Virginia. The analysis suggests that while some cities' MFR programmes perform poorly, each city's strengths aid in creating specific proposals that can produce a successful MFR recycling programme in Baltimore City. These tenets form the basis of a future MFR recycling programme that will allow all city residents to participate, via initiatives in the categories of programme accessibility and of informing and self-review.

  15. Monolithic, multi-bandgap, tandem, ultra-thin, strain-counterbalanced, photovoltaic energy converters with optimal subcell bandgaps

    DOEpatents

Wanlass, Mark W [Golden, CO]; Mascarenhas, Angelo [Lakewood, CO]

    2012-05-08

    Modeling a monolithic, multi-bandgap, tandem, solar photovoltaic converter or thermophotovoltaic converter by constraining the bandgap value for the bottom subcell to no less than a particular value produces an optimum combination of subcell bandgaps that provide theoretical energy conversion efficiencies nearly as good as unconstrained maximum theoretical conversion efficiency models, but which are more conducive to actual fabrication to achieve such conversion efficiencies than unconstrained model optimum bandgap combinations. Achieving such constrained or unconstrained optimum bandgap combinations includes growth of a graded layer transition from larger lattice constant on the parent substrate to a smaller lattice constant to accommodate higher bandgap upper subcells and at least one graded layer that transitions back to a larger lattice constant to accommodate lower bandgap lower subcells and to counter-strain the epistructure to mitigate epistructure bowing.

  16. When students can choose easy, medium, or hard homework problems

    NASA Astrophysics Data System (ADS)

    Teodorescu, Raluca E.; Seaton, Daniel T.; Cardamone, Caroline N.; Rayyan, Saif; Abbott, Jonathan E.; Barrantes, Analia; Pawl, Andrew; Pritchard, David E.

    2012-02-01

We investigate student-chosen, multi-level homework in our Integrated Learning Environment for Mechanics [1], built using the LON-CAPA [2] open-source learning system. Multi-level refers to problems categorized as easy, medium, and hard. Problem levels were determined a priori based on the knowledge needed to solve them [3]. We analyze these problems using three measures: time-per-problem, LON-CAPA difficulty, and item difficulty measured by item response theory. Our analysis of student behavior in this environment suggests that time-per-problem is strongly dependent on problem category, unlike either of the score-based measures. We also found trends in student choice of problems, overall effort, and efficiency across the student population. Allowing students choice in problem solving seems to improve their motivation; 70% of students worked additional problems for which no credit was given.
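The two score-based difficulty measures contrasted above can be sketched in a few lines. This is only an illustrative approximation (classical proportion-correct difficulty and a crude logit-based, Rasch-style difficulty), not the authors' actual IRT analysis, and the response vectors below are hypothetical:

```python
import math

def item_difficulty(responses):
    """Classical difficulty: proportion of correct (1) responses."""
    return sum(responses) / len(responses)

def logit_difficulty(responses, eps=0.5):
    """Crude Rasch-style difficulty: negative logit of the proportion
    correct, with a small continuity correction to avoid log(0)."""
    n = len(responses)
    p = (sum(responses) + eps) / (n + 2 * eps)
    return -math.log(p / (1 - p))

# Hypothetical response vectors (1 = correct) for an easy and a hard item.
easy = [1, 1, 1, 0, 1, 1, 1, 1]
hard = [0, 1, 0, 0, 1, 0, 0, 0]
assert logit_difficulty(hard) > logit_difficulty(easy)
```

A full IRT fit would estimate item and student parameters jointly; the point here is only that difficulty is a function of scores, whereas time-per-problem is measured independently of them.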

  17. Genetic algorithm-based multi-objective optimal absorber system for three-dimensional seismic structures

    NASA Astrophysics Data System (ADS)

    Ren, Wenjie; Li, Hongnan; Song, Gangbing; Huo, Linsheng

    2009-03-01

The problem of optimizing an absorber system for three-dimensional seismic structures is addressed. The objective is to determine the number and position of absorbers that minimize the coupling effects of translation-torsion of structures at minimum cost. A procedure for this multi-objective optimization problem is developed by integrating a dominance-based selection operator and a dominance-based penalty function method. Based on the two-branch tournament genetic algorithm, the selection operator is constructed by evaluating individuals according to their dominance in one run. The technique guarantees that the better-performing individual wins its competition, provides a slight selection pressure toward better individuals, and maintains diversity in the population. Moreover, because the evaluation of individuals in each generation is completed in one run, less computational effort is required. Penalty function methods are generally used to transform a constrained optimization problem into an unconstrained one. The dominance-based penalty function contains the necessary information on the non-dominated character and infeasible position of an individual, which is essential for success in seeking a Pareto optimal set. The proposed approach is used to obtain a set of non-dominated designs for a six-storey three-dimensional building with shape memory alloy dampers subjected to earthquake excitation.
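The dominance-based selection described above can be illustrated with a minimal Pareto-dominance check and a binary tournament in which a dominating individual always wins. This sketch simplifies the paper's two-branch tournament scheme, omits the dominance-based penalty term, and uses hypothetical objective values:

```python
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation):
    no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def tournament_select(population, objectives, rng=random):
    """Binary tournament: a dominating individual always wins its
    competition; mutually non-dominated pairs are broken at random."""
    i, j = rng.sample(range(len(population)), 2)
    if dominates(objectives[i], objectives[j]):
        return population[i]
    if dominates(objectives[j], objectives[i]):
        return population[j]
    return population[i] if rng.random() < 0.5 else population[j]

# Hypothetical objectives: (torsional coupling response, damper cost).
population = ["design-A", "design-B", "design-C"]
objectives = [(0.8, 3.0), (0.5, 2.0), (0.9, 4.0)]  # design-B dominates both others
winner = tournament_select(population, objectives)
```

Because non-dominated pairs tie, this rule exerts only the "slight selection pressure" the abstract mentions while preserving diversity among Pareto-equivalent designs.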

  18. Constraining geostatistical models with hydrological data to improve prediction realism

    NASA Astrophysics Data System (ADS)

    Demyanov, V.; Rojas, T.; Christie, M.; Arnold, D.

    2012-04-01

Geostatistical models reproduce spatial correlation based on the available on-site data and more general concepts about the modelled patterns, e.g. training images. One of the problems of modelling natural systems with geostatistics is maintaining realistic spatial features, so that they agree with the physical processes in nature. Tuning the model parameters to the data may lead to geostatistical realisations with unrealistic spatial patterns which nevertheless honour the data. Such a model would give poor predictions, even though it fits the available data well. Conditioning the model to a wider range of relevant data provides a remedy that avoids producing unrealistic features in spatial models. For instance, there are vast amounts of information about the geometries of river channels that can be used in describing a fluvial environment. Relations between the geometrical channel characteristics (width, depth, wavelength, amplitude, etc.) are complex and non-parametric, and exhibit a great deal of uncertainty, which it is important to propagate rigorously into the predictive model. These relations can be described within a Bayesian approach as multi-dimensional prior probability distributions. We propose a way to constrain multi-point statistics models with intelligent priors obtained from analysing a vast collection of contemporary river patterns based on previously published works. We applied machine learning techniques, namely neural networks and support vector machines, to extract multivariate non-parametric relations between the geometrical characteristics of fluvial channels from the available data. An example demonstrates how ensuring geological realism helps to deliver a more reliable prediction of a subsurface oil reservoir in a fluvial depositional environment.
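As a lightweight stand-in for the neural-network and support-vector fits mentioned above, a non-parametric relation between channel width and depth can be sketched with Nadaraya-Watson kernel regression. The (width, depth) pairs below are hypothetical, not data from the study:

```python
import math

def kernel_regress(x_train, y_train, x, bandwidth=1.0):
    """Nadaraya-Watson kernel regression with a Gaussian kernel: a
    weighted average of training responses, weighted by proximity of
    their inputs to the query point x."""
    weights = [math.exp(-0.5 * ((x - xi) / bandwidth) ** 2) for xi in x_train]
    total = sum(weights)
    return sum(w * yi for w, yi in zip(weights, y_train)) / total

# Hypothetical (width, depth) pairs for contemporary river channels (m).
widths = [10.0, 20.0, 40.0, 80.0, 160.0]
depths = [0.8, 1.3, 2.1, 3.4, 5.5]
depth_estimate = kernel_regress(widths, depths, 50.0, bandwidth=20.0)
```

A Bayesian treatment, as in the abstract, would carry the scatter around this regression forward as a prior distribution rather than a point estimate.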

  19. Effectiveness of interventions that assist caregivers to support people with dementia living in the community: a systematic review.

    PubMed

    Parker, Deborah; Mills, Sandra; Abbey, Jennifer

The objective of this review was to assess the effectiveness of interventions that assist caregivers to provide support for people living with dementia in the community. Types of participants: Adult caregivers who provide support for people with dementia living in the community (non-institutional care). Types of interventions: Interventions designed to support caregivers in their role, such as skills training, education to assist in caring for a person living with dementia, and support groups/programs; also interventions of formal approaches to care designed to support caregivers in their role, care planning, case management and specially designated members of the healthcare team - for example a dementia nurse specialist or volunteers trained in caring for someone with dementia. Types of studies: This review considered any meta-analyses, systematic reviews, randomised controlled trials, quasi-experimental studies, cohort studies, case control studies and observational studies without control groups that addressed the effectiveness of interventions that assist caregivers to provide support for people living with dementia in the community. The search sought to identify published studies from 2000 to 2005 through the use of electronic databases. Only studies in English were considered for inclusion. The initial search was conducted of the databases CINAHL, MEDLINE and PsychINFO using search strategies adapted from the Cochrane Dementia and Cognitive Improvement Group. A second, more extensive search was then conducted using the appropriate Medical Subject Headings (MeSH) and keywords for other available databases. Finally, hand searching of reference lists of articles retrieved and of core dementia, geriatric and psychogeriatric journals was undertaken.
Methodological quality of each of the articles was assessed by two independent reviewers using an appraisal checklist developed by the Joanna Briggs Institute and based on the work of the Cochrane Collaboration and Centre for Reviews and Dissemination. Standardised mean differences or weighted mean differences and their 95% confidence intervals were calculated for each included study reported in the meta-analysis. Results from comparable groups of studies were pooled in statistical meta-analysis using Review Manager Software from the Cochrane Collaboration. Heterogeneity between combined studies was tested using a standard chi-square test. Where statistical pooling was not appropriate or possible, the findings are summarised in narrative form. A comprehensive search of relevant databases, hand searching and cross referencing found 685 articles that were assessed for relevance to the review. Eighty-five papers appeared to meet the inclusion criteria based on title and abstract, and the full paper was retrieved. Of the 85 full papers reviewed, 40 were accepted for inclusion: three were systematic reviews, three were meta-analyses, and the remaining 34 were randomised controlled trials. For the randomised controlled trials that could be included in a meta-analysis, standardised mean differences or weighted mean differences and their 95% confidence intervals were calculated for each. Results from comparable groups of studies were pooled in statistical meta-analysis using Review Manager Software, and heterogeneity between combined studies was assessed using the chi-square test. Where statistical pooling was not appropriate or possible, the findings are summarised in narrative form. The results are discussed in two main sections. Firstly, it was possible to assess the effectiveness of different types of caregiver interventions on the outcome categories of depression, health, subjective well-being, self-efficacy and burden.
Secondly, results are reported by main outcome category. For each of these sections, meta-analysis was conducted where possible; otherwise, a narrative summary describes the findings. Four categories of intervention were included in the review: psycho-educational, support, multi-component and other. Psycho-educational: Thirteen studies used psycho-educational interventions, and all but one showed positive results across a range of outcomes. Eight studies were entered in a meta-analysis. No significant impact of psycho-educational interventions was found for the outcome categories of subjective well-being, self-efficacy or health. However, small but significant results were found for the categories of depression and burden. Support: Seven studies discussed support-only interventions and two of these showed significant results. These two studies were suitable for meta-analysis and demonstrated a small but significant improvement in caregiver burden. Multi-component: Twelve of the studies report multi-component interventions and 10 of these report significant outcomes across a broad range of outcome measures including self-efficacy, depression, subjective well-being and burden. Unfortunately, because of the heterogeneity of study designs and outcome measures, no meta-analysis was possible. Other: Other interventions included the use of exercise or nutrition, which resulted in improvements in psychological distress and health benefits. Case management and a computer-aided support intervention provided mixed results. One cognitive behavioural therapy study reported a reduction in anxiety and positive impacts on patient behaviour. In addition to analysis by type of intervention, it was possible to analyse results based on some outcome categories that were used across the studies. In particular, the impact of interventions on caregiver depression was available for meta-analysis from eight studies.
This indicated that multi-component and psycho-educational interventions showed a small but significant positive effect on caregiver depression. Five studies using the outcome category of caregiver burden were entered into a meta-analysis, and the findings indicated no significant effect of any of the interventions. No meta-analysis was possible for the outcome categories of health, self-efficacy or subjective well-being. From this review there is evidence to support the use of well-designed psycho-educational or multi-component interventions for caregivers of people with dementia who live in the community. The review also identified factors that appear to contribute positively to effective interventions, and factors that appear to offer no benefit.

  20. 101 Criteria for Appraising Interactive Video. A Futuremedia Guide.

    ERIC Educational Resources Information Center

    Copeland, Peter

    The criteria in this guide for evaluating interactive video instructional programs are based on principles of learning and motivation, and emphasize the design, production, presentation, and usage of interactive video programs. Presented in the format of a rating scale, the criteria are grouped into nine broad categories: (1) information about the…

  1. Energy Conservation: Policies, Programs, and General Studies. 1979-July, 1980 (Citations from the NTIS Data Base).

    ERIC Educational Resources Information Center

    Hundemann, Audrey S.

    The 135 abstracts presented pertain to national policies, programs, and general strategies for conserving energy. In addition to the abstract, each citation lists the title, author, sponsoring agency, subject categories, number of pages, date, descriptors, identifiers, and ordering information for each document. Topics covered in this compilation…

  2. 76 FR 36926 - National Institute for Occupational Safety and Health (NIOSH); Request for Nominations To Serve...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-23

    ... industrial hygienist; 1 toxicologist; 1 epidemiologist; and, at least 1 mental health professional. For the mental health professional category, specific expertise is sought in trauma-related psychiatry or... WTC Program Administrator based on program needs. Meetings may occur up to four times a year. Members...

  3. From the Bottom Up: Toward a Strategy for Income and Employment Generation among the Disadvantaged. An Interim Report.

    ERIC Educational Resources Information Center

    O'Regan, Fred; Conway, Maureen

    The Aspen Institute's ongoing action-research program, Local Employment Approaches for the Disadvantaged (LEAD), assessed 60 programs nationally. Local initiatives fell into four general categories, with numerous subcategories: self-employment, job training and placement, job creation and retention, and community-based finance. A second breakdown…

  4. "Horses for Courses": Categories of Computer-Based Learning Program and Their Uses in Pharmacology Courses.

    ERIC Educational Resources Information Center

    Hughes, Ian E.

    1998-01-01

    Describes the pharma-CAL-ogy project, funded by Teaching and Learning Technology Programme (TLTP), which has developed various types of software for use in pharmacology courses. Topics include course organization and delivery software, drill and practice software, tutorial-type programs, simulations, and the need to integrate computer-assisted…

  5. Developmental Specificity in Targeting and Teaching Play Activities to Children with Pervasive Developmental Disorders

    ERIC Educational Resources Information Center

    Lifter, Karin; Ellis, James; Cannon, Barbara; Anderson, Stephen R.

    2005-01-01

    Developmentally specific play programs were designed for three children with pervasive developmental disorders being served in a home-based program. Using the Developmental Play Assessment, six activities for each of three adjacent developmentally sequenced play categories were targeted for direct instruction using different toy sets. A modified…

  6. Multi-representation ability of students on the problem solving physics

    NASA Astrophysics Data System (ADS)

    Theasy, Y.; Wiyanto; Sujarwata

    2018-03-01

Accuracy in representing knowledge indicates the level of a student's understanding. The multi-representation ability of students in physics problem solving was studied using a qualitative, grounded-theory method with physics education students of Unnes in the 2016/2017 academic year. The forms of representation used are verbal (V), images/diagrams (D), graphs (G), and mathematical (M). Students in the high and low categories used graphical representation (G) accurately in 83% and 77.78% of cases, respectively, while students in the medium category used image representation (D) accurately in 66% of cases.

  7. Review of Current Aided/Automatic Target Acquisition Technology for Military Target Acquisition Tasks

    DTIC Science & Technology

    2011-07-01

radar [e.g., synthetic aperture radar (SAR)]. EO/IR includes multi- and hyperspectral imaging. Signal processing of data from nonimaging sensors, such...enhanced recognition ability. Other nonimage-based techniques, such as category theory,45 hierarchical systems,46 and gradient index flow,47 are possible...the battlefield. There is a plethora of imaging and nonimaging sensors on the battlefield that are being networked together for transmission of

  8. An enhanced export coefficient based optimization model for supporting agricultural nonpoint source pollution mitigation under uncertainty.

    PubMed

    Rong, Qiangqiang; Cai, Yanpeng; Chen, Bing; Yue, Wencong; Yin, Xin'an; Tan, Qian

    2017-02-15

In this research, an export coefficient based dual inexact two-stage stochastic credibility constrained programming (ECDITSCCP) model was developed through integrating an improved export coefficient model (ECM), interval linear programming (ILP), fuzzy credibility constrained programming (FCCP) and a fuzzy expected value equation within a general two-stage programming (TSP) framework. The proposed ECDITSCCP model can effectively address multiple uncertainties expressed as random variables, fuzzy numbers, and pure and dual intervals. Also, the model can provide a direct linkage between pre-regulated management policies and the associated economic implications. Moreover, solutions under multiple credibility levels can be obtained, providing potential decision alternatives for decision makers. The proposed model was then applied to identify optimal land use structures for agricultural NPS pollution mitigation in a representative upstream subcatchment of the Miyun Reservoir watershed in north China. Optimal solutions of the model were successfully obtained, indicating desired land use patterns and nutrient discharge schemes that maximize agricultural system benefits under a limited discharge permit. Also, the numerous results under multiple credibility levels could provide policy makers with several options, which could help strike an appropriate balance between system benefits and pollution mitigation. The developed ECDITSCCP model can be effectively applied to address the uncertain information in agricultural systems and shows great applicability to land use adjustment for agricultural NPS pollution mitigation. Copyright © 2016 Elsevier B.V. All rights reserved.
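The interval linear programming component can be illustrated in miniature: with interval-valued benefit coefficients and a fixed non-negative decision vector, the objective is only known to within bounds. This toy function uses hypothetical numbers and is not the ECDITSCCP model itself:

```python
def interval_objective(x, c_lower, c_upper):
    """Bounds on a linear benefit sum(c_i * x_i) when each coefficient
    is only known as an interval [c_lower_i, c_upper_i] and the
    decisions x_i (e.g. land-use areas) are non-negative."""
    lo = sum(cl * xi for cl, xi in zip(c_lower, x))
    hi = sum(cu * xi for cu, xi in zip(c_upper, x))
    return lo, hi

# Hypothetical areas (ha) for two land uses and interval unit benefits.
areas = [2.0, 3.0]
lo, hi = interval_objective(areas, c_lower=[1.0, 2.0], c_upper=[2.0, 4.0])
```

The full model layers stochastic recourse and fuzzy credibility constraints on top of this interval structure, so its solutions are themselves intervals indexed by credibility level.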

  9. Evaluation of Atlas-Based White Matter Segmentation with Eve.

    PubMed

    Plassard, Andrew J; Hinton, Kendra E; Venkatraman, Vijay; Gonzalez, Christopher; Resnick, Susan M; Landman, Bennett A

    2015-03-20

Multi-atlas labeling has come into widespread use for whole-brain labeling on magnetic resonance imaging. Recent challenges have shown that leading techniques are near (or at) human expert reproducibility for cortical gray matter labels. However, these approaches tend to treat white matter as essentially homogeneous (as white matter exhibits isointense signal on structural MRI). The state of the art for white matter atlases is the single-subject Johns Hopkins Eve atlas. Numerous approaches have attempted to use tractography and/or orientation information to identify homologous white matter structures across subjects. Despite success with large tracts, these approaches have been plagued by difficulties with subtle differences in course, low signal-to-noise ratio, and complex structural relationships for smaller tracts. Here, we investigate the use of atlas-based labeling to propagate the Eve atlas to unlabeled datasets. We evaluate single-atlas labeling and multi-atlas labeling using synthetic atlases derived from the single manually labeled atlas. On 5 representative tracts for 10 subjects, we demonstrate that (1) single-atlas labeling generally provides segmentations within 2 mm mean surface distance, (2) morphologically constraining DTI labels within structural MRI white matter reduces variability, and (3) multi-atlas labeling did not improve accuracy. These efforts present a preliminary indication that single-atlas labeling with correction is reasonable, but caution should be applied. To pursue multi-atlas labeling and more fully characterize overall performance, more labeled datasets would be necessary.
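Majority voting, the simplest label-fusion rule used in multi-atlas pipelines like the one evaluated above, can be sketched per voxel as follows; the label arrays are toy data, not the authors' pipeline:

```python
from collections import Counter

def majority_vote(labels_per_atlas):
    """Fuse per-voxel labels from several registered atlases by
    majority vote: each voxel receives the most frequent label
    proposed for it across atlases."""
    n_voxels = len(labels_per_atlas[0])
    fused = []
    for v in range(n_voxels):
        votes = Counter(atlas[v] for atlas in labels_per_atlas)
        fused.append(votes.most_common(1)[0][0])
    return fused

# Three hypothetical atlases, each proposing labels for two voxels.
fused = majority_vote([[1, 2], [1, 2], [0, 2]])
```

With synthetic atlases derived from a single manual labeling, the votes are highly correlated, which is one plausible reason multi-atlas fusion gave no accuracy gain in this evaluation.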

  10. Classifying publications from the clinical and translational science award program along the translational research spectrum: a machine learning approach.

    PubMed

    Surkis, Alisa; Hogle, Janice A; DiazGranados, Deborah; Hunt, Joe D; Mazmanian, Paul E; Connors, Emily; Westaby, Kate; Whipple, Elizabeth C; Adamus, Trisha; Mueller, Meridith; Aphinyanaphongs, Yindalon

    2016-08-05

    Translational research is a key area of focus of the National Institutes of Health (NIH), as demonstrated by the substantial investment in the Clinical and Translational Science Award (CTSA) program. The goal of the CTSA program is to accelerate the translation of discoveries from the bench to the bedside and into communities. Different classification systems have been used to capture the spectrum of basic to clinical to population health research, with substantial differences in the number of categories and their definitions. Evaluation of the effectiveness of the CTSA program and of translational research in general is hampered by the lack of rigor in these definitions and their application. This study adds rigor to the classification process by creating a checklist to evaluate publications across the translational spectrum and operationalizes these classifications by building machine learning-based text classifiers to categorize these publications. Based on collaboratively developed definitions, we created a detailed checklist for categories along the translational spectrum from T0 to T4. We applied the checklist to CTSA-linked publications to construct a set of coded publications for use in training machine learning-based text classifiers to classify publications within these categories. The training sets combined T1/T2 and T3/T4 categories due to low frequency of these publication types compared to the frequency of T0 publications. We then compared classifier performance across different algorithms and feature sets and applied the classifiers to all publications in PubMed indexed to CTSA grants. To validate the algorithm, we manually classified the articles with the top 100 scores from each classifier. The definitions and checklist facilitated classification and resulted in good inter-rater reliability for coding publications for the training set. 
Very good performance was achieved for the classifiers as represented by the area under the receiver operating curves (AUC), with an AUC of 0.94 for the T0 classifier, 0.84 for T1/T2, and 0.92 for T3/T4. The combination of definitions agreed upon by five CTSA hubs, a checklist that facilitates more uniform definition interpretation, and algorithms that perform well in classifying publications along the translational spectrum provide a basis for establishing and applying uniform definitions of translational research categories. The classification algorithms allow publication analyses that would not be feasible with manual classification, such as assessing the distribution and trends of publications across the CTSA network and comparing the categories of publications and their citations to assess knowledge transfer across the translational research spectrum.
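AUC values like those reported above can be computed directly from classifier scores and binary labels via the rank (Mann-Whitney) formulation; this is a sketch of the standard definition, not the authors' evaluation code:

```python
def auc(scores, labels):
    """AUC as the probability that a randomly chosen positive example
    scores higher than a randomly chosen negative (ties count 0.5)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical classifier scores for 2 positive and 2 negative papers.
value = auc([0.9, 0.8, 0.3, 0.1], [1, 1, 0, 0])  # perfect ranking
```

An AUC of 0.94, as for the T0 classifier, means a randomly chosen T0 publication outranks a randomly chosen non-T0 publication 94% of the time.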

  11. Describing a nurse case manager intervention to empower low-income men with prostate cancer.

    PubMed

    Maliski, Sally L; Clerkin, Barbara; Litwin, Mark S

    2004-01-01

    Describe and categorize nurse case manager (NCM) interventions for low-income, uninsured men with prostate cancer. Descriptive, retrospective record review. Statewide free prostate cancer treatment program in which each patient is assigned an NCM. 7 NCMs who developed interventions based on empowerment through increasing self-efficacy. NCM entries were extracted and coded from 10 electronic patient records, line by line, to reveal initial themes. Themes were grouped under categories. Investigators then reviewed and expanded these categories and their descriptions and postulated linkages. Linkages and relationships among categories were empirically verified with the original data. NCM entries from another 20 records were prepared in the same manner as the original records. Modifications were made until the categories contained all of the data and no new categories emerged. Categories were verified for content validity with the NCMs and reviewed for completeness and representation. NCM interventions. Categories of NCM interventions emerged as assessment, coordination, advocacy, facilitation, teaching, support, collaborative problem solving, and keeping track. Categories overlapped and supported each other. NCMs tailored interventions by combining categories for each patient. The skillful tailoring and execution of intervention strategies depended on the knowledge, experience, and skill that each NCM brought to the clinical situation. NCM categories were consistent with the tenets of the self-efficacy theory. The model, based on NCM interventions, provides a guide for the care of underserved men with prostate cancer. Components of the model need to be tested.

  12. The Control Based on Internal Average Kinetic Energy in Complex Environment for Multi-robot System

    NASA Astrophysics Data System (ADS)

    Yang, Mao; Tian, Yantao; Yin, Xianghua

In this paper, a reference trajectory is designed to minimise energy consumption for a multi-robot system, using nonlinear programming and cubic spline interpolation. The control strategy is composed of two levels: the lower level is a simple PD controller, and the upper level is based on the internal average kinetic energy of the multi-robot system in a complex environment with velocity damping. Simulation tests verify the effectiveness of this control strategy.
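A smooth reference trajectory through a handful of waypoints can be produced with natural cubic spline interpolation, one of the two tools the abstract names. The waypoints below are hypothetical, and the nonlinear-programming step (choosing waypoints to minimise energy) is omitted:

```python
def natural_cubic_spline(t, y):
    """Return an evaluator for the natural cubic spline through the
    waypoints (t[i], y[i]); second derivatives vanish at both ends."""
    n = len(t)
    h = [t[i + 1] - t[i] for i in range(n - 1)]
    # Tridiagonal system for the interior second derivatives m[1..n-2];
    # the boundary rows encode the natural conditions m[0] = m[n-1] = 0.
    a = [0.0] * n   # sub-diagonal
    b = [1.0] * n   # diagonal
    c = [0.0] * n   # super-diagonal
    d = [0.0] * n   # right-hand side
    for i in range(1, n - 1):
        a[i] = h[i - 1]
        b[i] = 2.0 * (h[i - 1] + h[i])
        c[i] = h[i]
        d[i] = 6.0 * ((y[i + 1] - y[i]) / h[i] - (y[i] - y[i - 1]) / h[i - 1])
    # Thomas algorithm: forward sweep then back substitution.
    for i in range(1, n):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    m = [0.0] * n
    for i in range(n - 2, 0, -1):
        m[i] = (d[i] - c[i] * m[i + 1]) / b[i]

    def s(x):
        # Locate the interval containing x (clamped to the last one).
        i = next((j for j in range(n - 1) if x <= t[j + 1]), n - 2)
        hi, dt0, dt1 = h[i], x - t[i], t[i + 1] - x
        return (m[i] * dt1 ** 3 + m[i + 1] * dt0 ** 3) / (6.0 * hi) \
            + (y[i] / hi - m[i] * hi / 6.0) * dt1 \
            + (y[i + 1] / hi - m[i + 1] * hi / 6.0) * dt0
    return s

# Hypothetical waypoints (time s, position m) for one robot's path.
traj = natural_cubic_spline([0.0, 1.0, 2.0, 4.0], [0.0, 0.5, 1.8, 3.0])
x_mid = traj(1.5)
```

The resulting trajectory is C2-continuous, so the lower-level PD controller tracks a reference with continuous velocity and acceleration.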

  13. Resources for Performance-Based Education.

    ERIC Educational Resources Information Center

    Houston, W. Robert; And Others

    This volume presents annotations of resources on performance-based teacher education. The materials, produced after 1967, include films, slide/tapes, modules, programmed texts, and multimedia kits for training pre- and in-service educational personnel. The materials are indexed according to both competency categories and key words, descriptions,…

  14. Multiwavelength studies of the gas and dust disc of IRAS 04158+2805

    NASA Astrophysics Data System (ADS)

    Glauser, A. M.; Ménard, F.; Pinte, C.; Duchêne, G.; Güdel, M.; Monin, J.-L.; Padgett, D. L.

    2008-07-01

    We present a study of the circumstellar environment of IRAS 04158+2805 based on multi-wavelength observations and models. Images in the optical and near-infrared, a polarisation map in the optical, and mid-infrared spectra were obtained with VLT-FORS1, CFHT-IR, and Spitzer-IRS. Additionally we used an X-ray spectrum observed with Chandra. We interpret the observations in terms of a central star surrounded by an axisymmetric circumstellar disc, but without an envelope, to test the validity of this simple geometry. We estimate the structural properties of the disc and its gas and dust content. We modelled the dust disc with a 3D continuum radiative transfer code, MCFOST, based on a Monte-Carlo method that provides synthetic scattered light images and polarisation maps, as well as spectral energy distributions. We find that the disc images and spectral energy distribution narrowly constrain many of the disc model parameters, such as a total dust mass of 1.0-1.75×10-4 M_⊙ and an inclination of 62°-63°. The maximum grain size required to fit all available data is of the order of 1.6-2.8 μm although the upper end of this range is loosely constrained. The observed optical polarisation map is reproduced well by the same disc model, suggesting that the geometry we find is adequate and the optical properties are representative of the visible dust content. We compare the inferred dust column density to the gas column density derived from the X-ray spectrum and find a gas-to-dust ratio along the line of sight that is consistent with the ISM value. To our knowledge, this measurement is the first to directly compare dust and gas column densities in a protoplanetary disc. Based on observations obtained at the Canada-France-Hawaii Telescope (CFHT) which is operated by the National Research Council of Canada, the Institut National des Sciences de l'Univers of the Centre National de la Recherche Scientifique of France, and the University of Hawaii. 
Based also on data collected at ESO/VLT during observation program 68-C.0171.

  15. Direct Care Workers in the National Drug Abuse Treatment Clinical Trials Network: Characteristics, Opinions, and Beliefs

    PubMed Central

    McCarty, Dennis; Fuller, Bret E.; Arfken, Cynthia; Miller, Michael; Nunes, Edward V.; Edmundson, Eldon; Copersino, Marc; Floyd, Anthony; Forman, Robert; Laws, Reesa; Magruder, Kathy M.; Oyama, Mark; Sindelar, Jody; Wendt, William W.

    2010-01-01

Objective: Individuals with direct care responsibilities in 348 drug abuse treatment units were surveyed to obtain a description of the workforce and to assess support for evidence-based therapies. Methods: Surveys were distributed to 112 programs participating in the National Drug Abuse Treatment Clinical Trials Network (CTN). Descriptive analyses characterized the workforce. Analyses of covariance tested the effects of job category (counselors, medical staff, manager-supervisors, and support staff) on opinions about evidence-based practices and controlled for the effects of education, modality (outpatient or residential), race, and gender. Results: Women made up two-thirds of the CTN workforce. One-third of the workforce had a master’s or doctoral degree. Responses from 1,757 counselors, 908 support staff, 522 managers-supervisors, and 511 medical staff (71% of eligible participants) suggested that the variables that most consistently influenced responses were job category (19 of 22 items) and education (20 of 22 items). Managers-supervisors were the most supportive of evidence-based therapies, and support staff were the least supportive. Generally, individuals with graduate degrees had more positive opinions about evidence-based therapies. Support for using medications and contingency management was modest across job categories. Conclusions: The relatively traditional beliefs of support staff could inhibit the introduction of evidence-based practices. Programs initiating changes in therapeutic approaches may benefit from including all employees in change efforts. PMID:17287373

  16. A knowledge-based expert system for scheduling of airborne astronomical observations

    NASA Technical Reports Server (NTRS)

    Nachtsheim, P. R.; Gevarter, W. B.; Stutz, J. C.; Banda, C. P.

    1985-01-01

The Kuiper Airborne Observatory Scheduler (KAOS) is a knowledge-based expert system developed at NASA Ames Research Center to assist in route planning of a C-141 flying astronomical observatory. This program determines a sequence of flight legs that enables sequential observations of a set of heavenly bodies derived from a list of desirable objects. The possible flight legs are constrained by observability requirements, avoidance of warning and restricted military zones, and fuel limits. A significant contribution of the KAOS program is that it couples computational capability with a reasoning system.

  17. The Evaluation and Research of Multi-Project Programs: Program Component Analysis.

    ERIC Educational Resources Information Center

    Baker, Eva L.

    1977-01-01

    It is difficult to base evaluations on concepts irrelevant to state policy making. Evaluation of a multiproject program requires both time and differentiation of method. Data from the California Early Childhood Program illustrate process variables for program component analysis, and research questions for intraprogram comparison. (CP)

  18. Application-oriented programming model for sensor networks embedded in the human body.

    PubMed

    Barbosa, Talles M G de A; Sene, Iwens G; da Rocha, Adson F; Nascimento, Fransisco A de O; Carvalho, Hervaldo S; Camapum, Juliana F

    2006-01-01

This work presents a new programming model for sensor networks embedded in the human body, based on the concept of multi-programming application-oriented software. The model was conceived with a top-down approach of four layers, and its main goal is to allow healthcare professionals to program and reconfigure the network locally or over the Internet. To evaluate this hypothesis, a benchmark was run to assess the mean time spent programming a multi-functional sensor node used for the measurement and transmission of the electrocardiogram.

  19. The Efficiency of Split Panel Designs in an Analysis of Variance Model

    PubMed Central

    Wang, Wei-Guo; Liu, Hai-Jun

    2016-01-01

We consider the efficiency of split panel designs in analysis of variance models, that is, determining the optimal proportion of cross-section series in the full sample so as to minimize the variances of best linear unbiased estimators of linear combinations of the parameters. An orthogonal matrix is constructed to obtain manageable expressions for the variances. On this basis, we derive a theorem for analyzing split panel design efficiency irrespective of the interest and budget parameters. Additionally, the efficiency of an estimator based on the split panel relative to an estimator based on a pure panel or a pure cross-section is presented. The analysis shows that the gains from the split panel design can be quite substantial. We further consider the efficiency of the split panel design under a given budget and transform the problem into a constrained nonlinear integer program. Specifically, an efficient algorithm is designed to solve this constrained nonlinear integer program. Moreover, we combine one-at-a-time designs and factorial designs to illustrate the algorithm's efficiency with an empirical example concerning monthly consumer expenditure on food in 1985 in the Netherlands, and efficient ranges of the algorithm parameters are given to ensure a good solution. PMID:27163447
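The budget-constrained design problem in this abstract is cast as a constrained nonlinear integer program. As a rough illustration (not the paper's model: the variance function, unit costs, and budget below are invented), a small instance of such a program can be solved by exhaustive enumeration over the feasible integer grid:

```python
from itertools import product

# Hypothetical split panel design: choose integer counts of panel
# units (x) and cross-section units (y) to minimize an estimator
# variance, subject to a budget constraint. The variance model and
# cost figures are illustrative only, not those of the paper.

def variance(x, y):
    # Toy nonlinear variance: panel units are costlier but each panel
    # unit contributes less due to serial correlation (rho).
    rho = 0.5
    return 1.0 / (x * (1 - rho)) + 1.0 / y

def solve(budget, cost_panel, cost_cross):
    best = None
    for x, y in product(range(budget // cost_panel + 1),
                        range(budget // cost_cross + 1)):
        if x == 0 or y == 0:
            continue  # a split panel needs both components
        if x * cost_panel + y * cost_cross > budget:
            continue  # infeasible under the budget constraint
        v = variance(x, y)
        if best is None or v < best[0]:
            best = (v, x, y)
    return best

v, x, y = solve(budget=100, cost_panel=3, cost_cross=1)
print(f"panel={x}, cross-section={y}, variance={v:.4f}")
```

Enumeration only scales to tiny instances; the paper's contribution is precisely an efficient algorithm that avoids this brute force.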

  20. Effectiveness of interventions that assist caregivers to support people with dementia living in the community: a systematic review.

    PubMed

    Parker, Deborah; Mills, Sandra; Abbey, Jennifer

    2008-06-01

Objectives  The objective of this review was to assess the effectiveness of interventions that assist caregivers to provide support for people living with dementia in the community. Inclusion criteria  Types of participants  Adult caregivers who provide support for people with dementia living in the community (non-institutional care). Types of interventions  Interventions designed to support caregivers in their role such as skills training, education to assist in caring for a person living with dementia and support groups/programs. Interventions of formal approaches to care designed to support caregivers in their role, care planning, case management and specially designated members of the healthcare team - for example dementia nurse specialist or volunteers trained in caring for someone with dementia. Types of studies  This review considered any meta-analyses, systematic reviews, randomised controlled trials, quasi-experimental studies, cohort studies, case control studies and observational studies without control groups that addressed the effectiveness of interventions that assist caregivers to provide support for people living with dementia in the community. Search strategy  The search sought to identify published studies from 2000 to 2005 through the use of electronic databases. Only studies in English were considered for inclusion. The initial search was conducted in the databases CINAHL, MEDLINE and PsycINFO using search strategies adapted from the Cochrane Dementia and Cognitive Improvement Group. A second more extensive search was then conducted using the appropriate Medical Subject Headings (MeSH) and keywords for other available databases. Finally, hand searching of reference lists of articles retrieved and of core dementia, geriatric and psychogeriatric journals was undertaken. 
Assessment of quality  Methodological quality of each of the articles was assessed by two independent reviewers using an appraisal checklist developed by the Joanna Briggs Institute and based on the work of the Cochrane Collaboration and Centre for Reviews and Dissemination. Data collection and analysis  Standardised mean differences or weighted mean differences and their 95% confidence intervals were calculated for each included study reported in the meta-analysis. Results from comparable groups of studies were pooled in statistical meta-analysis using Review Manager Software from the Cochrane Collaboration. Heterogeneity between combined studies was tested using a standard chi-square test. Where statistical pooling was not appropriate or possible, the findings are summarised in narrative form. Results  A comprehensive search of relevant databases, hand searching and cross referencing found 685 articles that were assessed for relevance to the review. Eighty-five papers appeared to meet the inclusion criteria based on title and abstract, and the full paper was retrieved. Of the 85 full papers reviewed, 40 were accepted for inclusion: three were systematic reviews, three were meta-analyses, and the remaining 34 were randomised controlled trials. For the randomised controlled trials that were able to be included in a meta-analysis, standardised mean differences or weighted mean differences and their 95% confidence intervals were calculated for each. Results from comparable groups of studies were pooled in statistical meta-analysis using Review Manager Software and heterogeneity between combined studies was assessed by using the chi-square test. Where statistical pooling was not appropriate or possible, the findings are summarised in narrative form. The results are discussed in two main sections. 
Firstly it was possible to assess the effectiveness of different types of caregiver interventions on the outcome categories of depression, health, subjective well-being, self-efficacy and burden. Secondly, results are reported by main outcome category. For each of these sections, meta-analysis was conducted where it was possible; otherwise, a narrative summary describes the findings. Effectiveness of intervention type  Four categories of intervention were included in the review - psycho-educational, support, multi-component and other. Psycho-educational Thirteen studies used psycho-educational interventions, and all but one showed positive results across a range of outcomes. Eight studies were entered in a meta-analysis. No significant impact of psycho-educational interventions was found for the outcome categories of subjective well-being, self-efficacy or health. However, small but significant results were found for the categories of depression and burden. Support Seven studies discussed support only interventions and two of these showed significant results. These two studies were suitable for meta-analysis and demonstrated a small but significant improvement on caregiver burden. Multi-component Twelve of the studies report multi-component interventions and 10 of these report significant outcomes across a broad range of outcome measures including self-efficacy, depression, subjective well-being and burden. Unfortunately because of the heterogeneity of study designs and outcome measures, no meta-analysis was possible. Other interventions Other interventions included the use of exercise or nutrition which resulted in improvements in psychological distress and health benefits. Case management and a computer aided support intervention provided mixed results. One cognitive behavioural therapy study reported a reduction in anxiety and positive impacts on patient behaviour. 
Effectiveness of interventions using specific outcome categories  In addition to analysis by type of intervention it was possible to analyse results based on some outcome categories that were used across the studies. In particular, the impact of interventions on caregiver depression was available for meta-analysis from eight studies. This indicated that multi-component and psycho-educational interventions showed a small but significant positive effect on caregiver depression. Five studies using the outcome category of caregiver burden were entered into a meta-analysis and findings indicated that there were no significant effects of any of the interventions. No meta-analysis was possible for the outcome categories of health, self-efficacy or subjective well-being. Implications for practice  From this review there is evidence to support the use of well-designed psycho-educational or multi-component interventions for caregivers of people with dementia who live in the community. Factors that appear to positively contribute to effective interventions are those which: •  Provide opportunities within the intervention for the person with dementia as well as the caregiver to be involved •  Encourage active participation in educational interventions for caregivers •  Offer individualised programs rather than group sessions •  Provide information on an ongoing basis, with specific information about services and coaching regarding their new role •  Target the care recipient particularly by reduction in behaviours Factors which do not appear to have benefit in interventions are those which: •  Simply refer caregivers to support groups •  Only provide self help materials •  Only offer peer support. © 2008 The Authors. Journal Compilation © Blackwell Publishing Asia Pty Ltd.

  1. Microgravity Science and Applications Program tasks, 1987 revision

    NASA Technical Reports Server (NTRS)

    1988-01-01

A compilation is presented of the active research tasks, as of the end of fiscal year 1987, of the Microgravity Science and Applications Program, NASA Office of Space Science and Applications, involving several NASA centers and other organizations. An overview is provided of the program scope for managers and scientists in industry, university, and government communities. An introductory description of the program is provided, along with the strategy and overall goal, identification of the organizational structures and people involved, and a description of each task. A list of recent publications is also provided. The tasks are grouped into six major categories: Electronic Materials; Solidification of Metals, Alloys, and Composites; Fluid Dynamics and Transport Phenomena; Biotechnology; Glasses and Ceramics; and Combustion. Other categories include Experimental Technology; General Studies and Surveys; Foreign Government Affiliations; Industrial Affiliations; and Physics and Chemistry Experiments (PACE). The tasks are divided into ground-based and flight experiments.

  2. Microgravity Science and Applications Program tasks, 1988 revision

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The active research tasks as of the end of the fiscal year 1988 of the Microgravity Science and Applications Program, NASA-Office of Space Science and Applications, involving several NASA centers and other organizations are compiled. The purpose is to provide an overview of the program scope for managers and scientists in industry, university, and government communities. Also included are an introductory description of the program, the strategy and overall goal, identification of the organizational structures and people involved, and a description of each task. A list of recent publications is provided. The tasks are grouped into six major categories: electronic materials; solidification of metals, alloys, and composites; fluid dynamics and transport phenomena; biotechnology; glasses and ceramics; and combustion. Other categories include experimental technology, general studies and surveys; foreign government affiliations; industrial affiliations; and Physics And Chemistry Experiments (PACE). The tasks are divided into ground-based and flight experiments.

  3. Design optimization of axial flow hydraulic turbine runner: Part II - multi-objective constrained optimization method

    NASA Astrophysics Data System (ADS)

    Peng, Guoyi; Cao, Shuliang; Ishizuka, Masaru; Hayama, Shinji

    2002-06-01

This paper is concerned with the design optimization of axial flow hydraulic turbine runner blade geometry. In order to obtain a better design plan with good performance, a new comprehensive performance optimization procedure is presented by combining a multi-variable, multi-objective constrained optimization model with a Q3D inverse computation and a performance prediction procedure. With careful analysis of the inverse design of the axial hydraulic turbine runner, the total hydraulic loss and the cavitation coefficient are taken as optimization objectives, and a comprehensive objective function is defined using weight factors. Parameters of a newly proposed blade bound circulation distribution function and parameters describing the positions of the blade leading and trailing edges in the meridional flow passage are taken as optimization variables. The optimization procedure has been applied to the design optimization of a Kaplan runner with a specific speed of 440 kW. Numerical results show that the performance of the designed runner is successfully improved through optimization computation. The optimization model is validated and exhibits good convergence. With the multi-objective optimization model, it is possible to control the performance of the designed runner by adjusting the values of the weight factors defining the comprehensive objective function.
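The comprehensive objective described above is a weighted sum of competing objectives. A generic sketch of this idea follows; the two objective functions, the single design variable, and the weights are placeholders, not the paper's hydraulic loss or cavitation models:

```python
# Weighted-sum comprehensive objective for two competing design goals.
# f1 and f2 stand in for quantities like total hydraulic loss and the
# cavitation coefficient; both are illustrative toy functions.

def f1(x):
    return (x - 1.0) ** 2   # placeholder "loss" objective

def f2(x):
    return (x + 1.0) ** 2   # placeholder "cavitation" objective

def comprehensive(x, w1, w2):
    return w1 * f1(x) + w2 * f2(x)

def minimize_1d(obj, lo, hi, steps=10001):
    # Simple grid search over one design variable within its bounds.
    best_x, best_v = lo, obj(lo)
    for i in range(1, steps):
        x = lo + (hi - lo) * i / (steps - 1)
        v = obj(x)
        if v < best_v:
            best_x, best_v = x, v
    return best_x

# Shifting weight toward f1 pulls the optimum toward f1's minimizer,
# which is how weight factors "control the performance" of a design.
x_a = minimize_1d(lambda x: comprehensive(x, 0.9, 0.1), -2.0, 2.0)
x_b = minimize_1d(lambda x: comprehensive(x, 0.1, 0.9), -2.0, 2.0)
print(x_a, x_b)
```

For these quadratics the weighted optimum is (w1 - w2) / (w1 + w2), so the two weightings land near +0.8 and -0.8 respectively.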

  4. Planarity constrained multi-view depth map reconstruction for urban scenes

    NASA Astrophysics Data System (ADS)

    Hou, Yaolin; Peng, Jianwei; Hu, Zhihua; Tao, Pengjie; Shan, Jie

    2018-05-01

    Multi-view depth map reconstruction is regarded as a suitable approach for 3D generation of large-scale scenes due to its flexibility and scalability. However, there are challenges when this technique is applied to urban scenes where apparent man-made regular shapes may present. To address this need, this paper proposes a planarity constrained multi-view depth (PMVD) map reconstruction method. Starting with image segmentation and feature matching for each input image, the main procedure is iterative optimization under the constraints of planar geometry and smoothness. A set of candidate local planes are first generated by an extended PatchMatch method. The image matching costs are then computed and aggregated by an adaptive-manifold filter (AMF), whereby the smoothness constraint is applied to adjacent pixels through belief propagation. Finally, multiple criteria are used to eliminate image matching outliers. (Vertical) aerial images, oblique (aerial) images and ground images are used for qualitative and quantitative evaluations. The experiments demonstrated that the PMVD outperforms the popular multi-view depth map reconstruction with an accuracy two times better for the aerial datasets and achieves an outcome comparable to the state-of-the-art for ground images. As expected, PMVD is able to preserve the planarity for piecewise flat structures in urban scenes and restore the edges in depth discontinuous areas.
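The planarity constraint at the heart of PMVD amounts to preferring depths that lie on locally fitted planes. A generic least-squares plane fit, a stand-in sketch rather than the paper's extended PatchMatch or adaptive-manifold filtering, illustrates the underlying computation:

```python
# Fit a local plane z = a*x + b*y + c to depth samples by least
# squares; depths in a piecewise-flat region could then be snapped
# to the fitted plane. Generic illustration, not the PMVD pipeline.

def fit_plane(points):
    # Accumulate the 3x3 normal equations A [a b c]^T = rhs.
    sxx = sxy = sx = syy = sy = n = 0.0
    sxz = syz = sz = 0.0
    for x, y, z in points:
        sxx += x * x
        sxy += x * y
        syy += y * y
        sx += x
        sy += y
        n += 1.0
        sxz += x * z
        syz += y * z
        sz += z
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    rhs = [sxz, syz, sz]
    # Gaussian elimination followed by back substitution.
    for i in range(3):
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            for k in range(3):
                A[j][k] -= f * A[i][k]
            rhs[j] -= f * rhs[i]
    c = rhs[2] / A[2][2]
    b = (rhs[1] - A[1][2] * c) / A[1][1]
    a = (rhs[0] - A[0][1] * b - A[0][2] * c) / A[0][0]
    return a, b, c

# Noise-free samples from z = 2x - y + 3 recover the plane exactly.
pts = [(0, 0, 3), (1, 0, 5), (0, 1, 2), (1, 1, 4), (2, 1, 6)]
a, b, c = fit_plane(pts)
print(round(a, 6), round(b, 6), round(c, 6))
```

In practice such fits are run robustly (e.g. with outlier rejection) since depth candidates near discontinuities violate the planar model.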

  5. Multi-target Parallel Processing Approach for Gene-to-structure Determination of the Influenza Polymerase PB2 Subunit

    PubMed Central

    Moen, Spencer O.; Smith, Eric; Raymond, Amy C.; Fairman, James W.; Stewart, Lance J.; Staker, Bart L.; Begley, Darren W.; Edwards, Thomas E.; Lorimer, Donald D.

    2013-01-01

Pandemic outbreaks of highly virulent influenza strains can cause widespread morbidity and mortality in human populations worldwide. In the United States alone, an average of 41,400 deaths and 1.86 million hospitalizations are caused by influenza virus infection each year [1]. Point mutations in the polymerase basic protein 2 subunit (PB2) have been linked to the adaptation of the viral infection in humans [2]. Findings from such studies have revealed the biological significance of PB2 as a virulence factor, thus highlighting its potential as an antiviral drug target. The structural genomics program put forth by the National Institute of Allergy and Infectious Disease (NIAID) provides funding to Emerald Bio and three other Pacific Northwest institutions that together make up the Seattle Structural Genomics Center for Infectious Disease (SSGCID). The SSGCID is dedicated to providing the scientific community with three-dimensional protein structures of NIAID category A-C pathogens. Making such structural information available to the scientific community serves to accelerate structure-based drug design. Structure-based drug design plays an important role in drug development. Pursuing multiple targets in parallel greatly increases the chance of success for new lead discovery by targeting a pathway or an entire protein family. Emerald Bio has developed a high-throughput, multi-target parallel processing pipeline (MTPP) for gene-to-structure determination to support the consortium. Here we describe the protocols used to determine the structure of the PB2 subunit from four different influenza A strains. PMID:23851357

  6. A Globally Convergent Augmented Lagrangian Pattern Search Algorithm for Optimization with General Constraints and Simple Bounds

    NASA Technical Reports Server (NTRS)

    Lewis, Robert Michael; Torczon, Virginia

    1998-01-01

    We give a pattern search adaptation of an augmented Lagrangian method due to Conn, Gould, and Toint. The algorithm proceeds by successive bound constrained minimization of an augmented Lagrangian. In the pattern search adaptation we solve this subproblem approximately using a bound constrained pattern search method. The stopping criterion proposed by Conn, Gould, and Toint for the solution of this subproblem requires explicit knowledge of derivatives. Such information is presumed absent in pattern search methods; however, we show how we can replace this with a stopping criterion based on the pattern size in a way that preserves the convergence properties of the original algorithm. In this way we proceed by successive, inexact, bound constrained minimization without knowing exactly how inexact the minimization is. So far as we know, this is the first provably convergent direct search method for general nonlinear programming.
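The inner subproblem of the method above is a bound-constrained minimization solved by pattern search, with the shrinking pattern size standing in for a derivative-based stopping test. A minimal coordinate-direction sketch on a toy quadratic (omitting the augmented-Lagrangian outer loop) looks like:

```python
# Minimal coordinate pattern search with simple bounds: poll along
# +/- each coordinate direction, clip to the bounds, and contract
# the pattern when no poll point improves. The pattern size delta
# doubles as the derivative-free stopping criterion.

def pattern_search(f, x, lower, upper, delta=1.0, tol=1e-6):
    fx = f(x)
    while delta > tol:
        improved = False
        for i in range(len(x)):
            for sign in (+1.0, -1.0):
                y = list(x)
                y[i] = min(max(y[i] + sign * delta, lower[i]), upper[i])
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            delta *= 0.5  # contract the pattern: no poll step helped
    return x, fx

# Toy problem: minimize (x-3)^2 + (y+1)^2 subject to 0 <= x, y <= 2.
# The unconstrained minimizer (3, -1) is infeasible, so the solution
# sits on the bounds at (2, 0).
f = lambda p: (p[0] - 3.0) ** 2 + (p[1] + 1.0) ** 2
x_opt, f_opt = pattern_search(f, [1.0, 1.0], [0.0, 0.0], [2.0, 2.0])
print(x_opt, f_opt)
```

The clipping step is what makes the simple bounds "easy" here; the general nonlinear constraints of the paper's method live in the augmented Lagrangian, not in the poll step.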

  7. 42 CFR § 414.1365 - Subcategories for the improvement activities performance category.

    Code of Federal Regulations, 2010 CFR

    2017-10-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICARE PROGRAM (CONTINUED) PAYMENT FOR PART B MEDICAL AND OTHER HEALTH SERVICES Merit-Based Incentive Payment System and Alternative Payment Model Incentive...

  8. Science educators' perceptions of problems facing science education: A report of five surveys

    NASA Astrophysics Data System (ADS)

    Gallagher, James Joseph; Yager, Robert E.

Five groups of science educators representing faculty at graduate institutions, graduate students, teachers, supervisors, and leadership conferees were surveyed concerning their perceptions of current problems facing science education. A total of 144 participants provided an average of 4.7 responses. The responses were tabulated using an emergent set of categories that resulted in six major groupings: conceptual, organizational, teacher-related, student-related, university, and societal. The category with the most problems identified was conceptual problems; university-related problems and organizational problems were the next two most frequently mentioned categories. Specific problems cited most often across all categories include the following: (1) confusion and uncertainty in goals and objectives; (2) lack of vision and leadership in schools and universities; (3) absence of a theoretical base for science education; (4) poor quality teacher education programs; (5) inappropriate avenues for continuing education of teachers; as well as limited dialogue between researchers and practitioners, declining enrollments, poor quality teaching and counseling, insufficient programs in science for the wide spectrum of students, and public and parental apathy towards science.

  9. Use of Self-Monitoring to Maintain Program Fidelity of Multi-Tiered Interventions

    ERIC Educational Resources Information Center

    Nelson, J. Ron; Oliver, Regina M.; Hebert, Michael A.; Bohaty, Janet

    2015-01-01

    Multi-tiered system of supports represents one of the most significant advancements in improving the outcomes of students for whom typical instruction is not effective. While many practices need to be in place to make multi-tiered systems of support effective, accurate implementation of evidence-based practices by individuals at all tiers is…

  10. Church-Based ESL Adult Programs: Social Mediators for Empowering "Family Literacy Ecology of Communities"

    ERIC Educational Resources Information Center

    Chao, Xia; Mantero, Miguel

    2014-01-01

    This multi-sited ethnographic study examines the ways in which Latino and Asian immigrant parents' English learning through two church-based ESL programs in a Southeastern U.S. city affects their family literacy and home language practices. It demonstrates that the parents' participation in the programs is an empowering experience promoting ESL…

  11. Quantified Objectives for Assessing the Contribution of Low Clouds to Climate Sensitivity and Variability

    NASA Astrophysics Data System (ADS)

    Del Genio, A. D.; Platnick, S. E.; Bennartz, R.; Klein, S. A.; Marchand, R.; Oreopoulos, L.; Pincus, R.; Wood, R.

    2016-12-01

    Low clouds are central to leading-order questions in climate and subseasonal weather predictability, and are key to the NRC panel report's goals "to understand the signals of the Earth system under a changing climate" and "for improved models and model projections." To achieve both goals requires a mix of continuity observations to document the components of the changing climate and improvements in retrievals of low cloud and boundary layer dynamical/thermodynamic properties to ensure process-oriented observations that constrain the parameterized physics of the models. We discuss four climate/weather objectives that depend sensitively on understanding the behavior of low clouds: 1. Reduce uncertainty in GCM-inferred climate sensitivity by 50% by constraining subtropical low cloud feedbacks. 2. Eliminate the GCM Southern Ocean shortwave flux bias and its effect on cloud feedback and the position of the midlatitude storm track. 3. Eliminate the double Intertropical Convergence Zone bias in GCMs and its potential effects on tropical precipitation over land and the simulation and prediction of El Niño. 4. Increase the subseasonal predictability of tropical warm pool precipitation from 20 to 30 days. We envision advances in three categories of observations that would be highly beneficial for reaching these goals: 1. More accurate observations will facilitate more thorough evaluation of clouds in GCMs. 2. Better observations of the links between cloud properties and the environmental state will be used as the foundation for parameterization improvements. 3. Sufficiently long and higher quality records of cloud properties and environmental state will constrain low cloud feedback purely observationally. To accomplish this, the greatest need is to replace A-Train instruments, which are nearing end-of-life, with enhanced versions. 
The requirements are sufficient horizontal and vertical resolution to capture boundary layer cloud and thermodynamic spatial structure; more accurate determination of cloud condensate profiles and optical properties; near-coincident observations to permit multi-instrument retrievals and association with dynamic and thermodynamic structure; global coverage; and, for long-term monitoring, measurement and orbit stability and sufficient mission duration.

  12. Advanced Tactical Booster Technologies: Applications for Long-Range Rocket Systems

    DTIC Science & Technology

    2016-09-07

Authors: Matthew McKinna, Jason Mossman. ...technology advantages currently under development for tactical rocket motors which have direct application to land-based long-range rocket systems... increased rocket payload capacity, improved rocket range or increased rocket loadout from the volumetrically constrained environment of a land-based

  13. Improved multi-objective ant colony optimization algorithm and its application in complex reasoning

    NASA Astrophysics Data System (ADS)

    Wang, Xinqing; Zhao, Yang; Wang, Dong; Zhu, Huijie; Zhang, Qing

    2013-09-01

The problem of fault reasoning has aroused great concern in scientific and engineering fields. However, fault investigation and reasoning for a complex system is not a simple reasoning decision-making problem; it has become a typical multi-constraint, multi-objective reticulate optimization decision-making problem under many influencing factors and constraints, and so far little research has been carried out in this field. This paper transforms the fault reasoning problem of a complex system into a path-searching problem from known symptoms to fault causes. Three optimization objectives are considered simultaneously: maximum average fault probability, maximum average importance, and minimum average test complexity. Under the constraints of both the known symptoms and the causal relationships among components, a multi-objective optimization mathematical model is set up, taking minimization of the cost of fault reasoning as the target function. Since the problem is non-deterministic polynomial-hard (NP-hard), a modified multi-objective ant colony algorithm is proposed, in which a reachability matrix constrains the feasible search nodes of the ants, and a new pseudo-random-proportional rule and a pheromone adjustment mechanism are constructed to balance conflicts between the optimization objectives. Finally, a Pareto optimal set is acquired. Evaluation functions based on the validity and tendency of reasoning paths are defined to refine the noninferior set, from which the final fault causes can be identified according to decision-making demands, thus realizing fault reasoning for the multi-constraint, multi-objective complex system. 
Reasoning results demonstrate that the improved multi-objective ant colony optimization (IMACO) can locate fault positions precisely by solving the multi-objective fault diagnosis model, providing a new method for multi-constraint, multi-objective fault diagnosis and reasoning of complex systems.
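A single ant step under the two mechanisms the abstract names, a reachability matrix masking infeasible successors and a pseudo-random-proportional transition rule, might be sketched as follows. The graph, pheromone, and heuristic values are hypothetical, and the rule shown is the standard ant-colony-system form rather than the authors' modified one:

```python
import random

# One constrained ant step: the reachability matrix masks infeasible
# successor nodes; the pseudo-random-proportional rule then either
# takes the best-scoring feasible node (exploitation, probability q0)
# or samples one in proportion to pheromone * heuristic (exploration).

def ant_step(current, reachable, pheromone, heuristic, q0=0.9, beta=2.0):
    candidates = [j for j, ok in enumerate(reachable[current]) if ok]
    if not candidates:
        return None  # dead end: no feasible successor
    scores = [pheromone[current][j] * heuristic[current][j] ** beta
              for j in candidates]
    if random.random() < q0:
        return candidates[scores.index(max(scores))]  # exploit
    r = random.uniform(0.0, sum(scores))              # explore
    acc = 0.0
    for j, s in zip(candidates, scores):
        acc += s
        if acc >= r:
            return j
    return candidates[-1]

# Tiny 4-node symptom-to-cause graph with hypothetical values:
# node 0 can reach 1 or 2, both of which reach the sink node 3.
reachable = [[0, 1, 1, 0],
             [0, 0, 0, 1],
             [0, 0, 0, 1],
             [0, 0, 0, 0]]
pheromone = [[1.0] * 4 for _ in range(4)]
heuristic = [[1.0, 2.0, 1.0, 1.0]] + [[1.0] * 4 for _ in range(3)]
random.seed(0)
print(ant_step(0, reachable, pheromone, heuristic))
```

Repeating such steps builds complete reasoning paths, after which pheromone updates and Pareto filtering (not shown) drive the multi-objective search.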

  14. Population-based evaluation of a suggested anatomic and clinical classification of congenital heart defects based on the International Paediatric and Congenital Cardiac Code.

    PubMed

    Houyel, Lucile; Khoshnood, Babak; Anderson, Robert H; Lelong, Nathalie; Thieulin, Anne-Claire; Goffinet, François; Bonnet, Damien

    2011-10-03

    Classification of the overall spectrum of congenital heart defects (CHD) has always been challenging, in part because of the diversity of the cardiac phenotypes, but also because of the oft-complex associations. The purpose of our study was to establish a comprehensive and easy-to-use classification of CHD for clinical and epidemiological studies based on the long list of the International Paediatric and Congenital Cardiac Code (IPCCC). We coded each individual malformation using six-digit codes from the long list of IPCCC. We then regrouped all lesions into 10 categories and 23 subcategories according to a multi-dimensional approach encompassing anatomic, diagnostic and therapeutic criteria. This anatomic and clinical classification of congenital heart disease (ACC-CHD) was then applied to data acquired from a population-based cohort of patients with CHD in France, made up of 2867 cases (82% live births, 1.8% stillbirths and 16.2% pregnancy terminations). The majority of cases (79.5%) could be identified with a single IPCCC code. The category "Heterotaxy, including isomerism and mirror-imagery" was the only one that typically required more than one code for identification of cases. The two largest categories were "ventricular septal defects" (52%) and "anomalies of the outflow tracts and arterial valves" (20% of cases). Our proposed classification is not new, but rather a regrouping of the known spectrum of CHD into a manageable number of categories based on anatomic and clinical criteria. The classification is designed to use the code numbers of the long list of IPCCC but can accommodate ICD-10 codes. Its exhaustiveness, simplicity, and anatomic basis make it useful for clinical and epidemiologic studies, including those aimed at assessment of risk factors and outcomes.

  15. The benefits of international rotations to resource-limited settings for U.S. surgery residents.

    PubMed

    Henry, Jaymie A; Groen, Reinou S; Price, Raymond R; Nwomeh, Benedict C; Kingham, T Peter; Hardy, Mark A; Kushner, Adam L

    2013-04-01

U.S. surgery residents increasingly are interested in international experiences. Recently, the Residency Review Committee approved international surgery rotations for credit toward graduation. Despite this growing interest, few U.S. surgery residency programs offer formal international rotations. We aimed to present the benefits of international surgery rotations and how these rotations contribute to the attainment of the 6 Accreditation Council for Graduate Medical Education (ACGME) competencies. An e-mail-based survey was sent in November 2011 to the 188 members of Surgeons OverSeas, a group of surgeons, residents, fellows, and medical students with experience working in resource-limited settings. They were asked to list 5 benefits of international rotations for surgery residents. The frequency of benefits was qualitatively grouped into 4 major categories: educational, personal, benefits to the foreign institution/Global Surgery, and benefits to the home institution. The themes were correlated with the 6 ACGME competencies. The 58 respondents (31% response rate) provided a total of 295 responses. Fifty themes were identified. Top benefits included learning to optimally function with limited resources, exposure to a wide variety of operative pathology, exposure to a foreign culture, and forming relationships with local counterparts. All ACGME competencies were covered by the themes. International surgery rotations to locations in which resources are constrained, operative diseases vary, and patient diversity abounds provide unique opportunities for surgery residents to attain the 6 ACGME competencies. General surgery residency programs should be encouraged to establish formal international rotations as part of surgery training to promote resident education and assist with necessary oversight. Copyright © 2013 Mosby, Inc. All rights reserved.

  16. An interval chance-constrained fuzzy modeling approach for supporting land-use planning and eco-environment planning at a watershed level.

    PubMed

    Ou, Guoliang; Tan, Shukui; Zhou, Min; Lu, Shasha; Tao, Yinghui; Zhang, Zuo; Zhang, Lu; Yan, Danping; Guan, Xingliang; Wu, Gang

    2017-12-15

An interval chance-constrained fuzzy land-use allocation (ICCF-LUA) model is proposed in this study to support solving land resource management problems associated with various environmental and ecological constraints at a watershed level. The ICCF-LUA model is based on the ICCF (interval chance-constrained fuzzy) model, which couples an interval mathematical model, a chance-constrained programming model and a fuzzy linear programming model, and can be used to deal with uncertainties expressed as intervals, probabilities and fuzzy sets. Therefore, the ICCF-LUA model can reflect the tradeoff between decision makers and land stakeholders and the tradeoff between economic benefits and eco-environmental demands. The ICCF-LUA model has been applied to the land-use allocation of Wujiang watershed, Guizhou Province, China. The results indicate that under highly land-suitable conditions, the optimized areas of cultivated land, forest land, grass land, construction land, water land, unused land and landfill in Wujiang watershed will be [5015, 5648] hm², [7841, 7965] hm², [1980, 2056] hm², [914, 1423] hm², [70, 90] hm², [50, 70] hm² and [3.2, 4.3] hm², and the corresponding system economic benefit will be between 6831 and 7219 billion yuan. Consequently, the ICCF-LUA model can effectively support optimized land-use allocation under complicated conditions involving uncertainties, risks, economic objectives and eco-environmental constraints. Copyright © 2017 Elsevier Ltd. All rights reserved.
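The core device in record 16, replacing a random constraint with its deterministic equivalent at a chosen risk level and re-solving at each end of an interval coefficient, can be sketched in a few lines. This is a hedged toy, not the authors' ICCF-LUA formulation: the single land-use activity, the normally distributed land-availability limit, and every number below are invented for illustration.

```python
# Toy chance-constrained allocation: maximize c*x subject to
# Pr(a*x <= b) >= 1 - alpha, with b ~ Normal(b_mean, b_std), x >= 0.
# Deterministic equivalent: a*x <= alpha-quantile of b.
from statistics import NormalDist

def chance_constrained_max(c, a, b_mean, b_std, alpha):
    """Return (optimal x, objective value) for the one-variable model."""
    b_alpha = NormalDist(b_mean, b_std).inv_cdf(alpha)  # tightened limit
    x_max = max(b_alpha / a, 0.0)
    return x_max, c * x_max

# Interval uncertainty on the benefit coefficient c: solve the lower- and
# upper-bound sub-models to get an interval-valued objective.
c_lo, c_hi = 1.2, 1.5          # benefit per hm^2 (hypothetical)
a = 1.0                        # land consumed per unit activity
b_mean, b_std = 8000.0, 400.0  # available land, hm^2 (hypothetical)
alpha = 0.05                   # accept a 5% risk of constraint violation

x, z_lo = chance_constrained_max(c_lo, a, b_mean, b_std, alpha)
_, z_hi = chance_constrained_max(c_hi, a, b_mean, b_std, alpha)
```

Because the constraint must hold with 95% probability, the usable land is tightened below its mean (here to about 7342 hm²), and the interval coefficients propagate into an interval objective [z_lo, z_hi].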

  17. A multi-level examination of how the organizational context relates to readiness to implement prevention and evidence-based programming in community settings.

    PubMed

    Chilenski, Sarah M; Olson, Jonathan R; Schulte, Jill A; Perkins, Daniel F; Spoth, Richard

    2015-02-01

    Prior theoretical and empirical research suggests that multiple aspects of an organization's context are likely related to a number of factors, from their interest and ability to adopt new programming, to client outcomes. A limited amount of the prior research has taken a more community-wide perspective by examining factors that associate with community readiness for change, leaving how these findings generalize to community organizations that conduct prevention or positive youth development programs unknown. Thus for the current study, we examined how the organizational context of the Cooperative Extension System (CES) associates with current attitudes and practices regarding prevention and evidence-based programming. Attitudes and practices have been found in the empirical literature to be key indicators of an organization's readiness to adopt prevention and evidence-based programming. Based on multi-level mixed models, results indicate that organizational management practices distinct from program delivery may affect an organization's readiness to adopt and implement new prevention and evidence-based youth programs, thereby limiting the potential public health impact of evidence-based programs. Openness to change, openness of leadership, and communication were the strongest predictors identified within this study. An organization's morale was also found to be a strong predictor of an organization's readiness. The findings of the current study are discussed in terms of implications for prevention and intervention.

  18. A Multi-level Examination of how the Organizational Context Relates to Readiness to Implement Prevention and Evidence-Based Programming in Community Settings

    PubMed Central

    Chilenski, Sarah M.; Olson, Jonathan R.; Schulte, Jill A.; Perkins, Daniel F.; Spoth, Richard

    2015-01-01

    Prior theoretical and empirical research suggests that multiple aspects of an organization’s context are likely related to a number of factors, from their interest and ability to adopt new programming, to client outcomes. A limited amount of the prior research has taken a more community-wide perspective by examining factors that associate with community readiness for change, leaving how these findings generalize to community organizations that conduct prevention or positive youth development programs unknown. Thus for the current study, we examined how the organizational context of the Cooperative Extension System (CES) associates with current attitudes and practices regarding prevention and evidence-based programming. Attitudes and practices have been found in the empirical literature to be key indicators of an organization’s readiness to adopt prevention and evidence-based programming. Based on multi-level mixed models, results indicate that organizational management practices distinct from program delivery may affect an organization’s readiness to adopt and implement new prevention and evidence-based youth programs, thereby limiting the potential public health impact of evidence-based programs. Openness to change, openness of leadership, and communication were the strongest predictors identified within this study. An organization’s morale was also found to be a strong predictor of an organization’s readiness. The findings of the current study are discussed in terms of implications for prevention and intervention. PMID:25463014
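The multi-level logic behind the mixed models in records 17-18, variance shared within organizations versus variance between them, can be illustrated with a toy intraclass correlation. This is only a sketch with invented readiness scores, not the study's actual analysis or data.

```python
# Toy ICC(1) from a one-way ANOVA decomposition of grouped scores.
def icc_oneway(groups):
    """groups: list of equally sized lists of scores, one list per group."""
    k = len(groups)
    n = len(groups[0])                       # assume balanced groups
    grand = sum(sum(g) for g in groups) / (k * n)
    means = [sum(g) / n for g in groups]
    msb = n * sum((m - grand) ** 2 for m in means) / (k - 1)   # between
    msw = sum((x - m) ** 2
              for g, m in zip(groups, means) for x in g) / (k * (n - 1))  # within
    return (msb - msw) / (msb + (n - 1) * msw)

# Three hypothetical organizations, readiness ratings from 4 staff each
orgs = [[4, 5, 4, 5], [2, 3, 2, 3], [3, 4, 3, 4]]
rho = icc_oneway(orgs)
```

A large ICC (here about 0.73) means staff within the same organization answer alike, which is exactly the clustering that makes multi-level mixed models, rather than ordinary regression, appropriate.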

  19. Bayesian Atmospheric Radiative Transfer (BART) Code and Application to WASP-43b

    NASA Astrophysics Data System (ADS)

    Blecic, Jasmina; Harrington, Joseph; Cubillos, Patricio; Bowman, Oliver; Rojo, Patricio; Stemm, Madison; Lust, Nathaniel B.; Challener, Ryan; Foster, Austin James; Foster, Andrew S.; Blumenthal, Sarah D.; Bruce, Dylan

    2016-01-01

We present a new open-source Bayesian radiative-transfer framework, Bayesian Atmospheric Radiative Transfer (BART, https://github.com/exosports/BART), and its application to WASP-43b. BART initializes a model for the atmospheric retrieval calculation, generates thousands of theoretical model spectra using parametrized pressure and temperature profiles and line-by-line radiative-transfer calculations, and employs a statistical package to compare the models with the observations. It consists of three self-sufficient modules available to the community under the reproducible-research license: the Thermochemical Equilibrium Abundances module (TEA, https://github.com/dzesmin/TEA, Blecic et al. 2015), the radiative-transfer module (Transit, https://github.com/exosports/transit), and the Multi-core Markov-chain Monte Carlo statistical module (MCcubed, https://github.com/pcubillos/MCcubed, Cubillos et al. 2015). We applied BART to all available WASP-43b secondary-eclipse data from space- and ground-based observations, constraining the temperature-pressure profile and molecular abundances of the dayside atmosphere of WASP-43b. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
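The MCMC comparison at the heart of such a retrieval, propose parameters, evaluate a model spectrum against the data, accept or reject, can be sketched with a toy one-parameter "spectrum." This is not BART's code (which uses MCcubed and line-by-line radiative transfer); the model, data, prior bounds, and all numbers below are synthetic.

```python
# Toy Metropolis-Hastings retrieval of one "temperature" parameter.
import math
import random

random.seed(0)
wav = [1.0 + 0.1 * i for i in range(20)]          # arbitrary wavelength grid

def model(T):                                     # toy 1-parameter "spectrum"
    return [T * math.exp(-w) for w in wav]

T_true, sigma = 1200.0, 5.0
data = [m + random.gauss(0, sigma) for m in model(T_true)]  # synthetic obs

def chi2(T):
    return sum((d - m) ** 2 / sigma ** 2 for d, m in zip(data, model(T)))

chain, T, accepted = [], 1000.0, 0                # start away from the truth
for _ in range(5000):
    T_prop = T + random.gauss(0, 20.0)            # Gaussian random-walk proposal
    # Metropolis rule with a flat prior on [500, 2000]
    if 500.0 <= T_prop <= 2000.0 and \
            random.random() < math.exp(min(0.0, 0.5 * (chi2(T) - chi2(T_prop)))):
        T, accepted = T_prop, accepted + 1
    chain.append(T)

posterior_mean = sum(chain[1000:]) / len(chain[1000:])  # discard burn-in
```

The post-burn-in samples concentrate near the true value, illustrating how thousands of model-versus-data comparisons turn into posterior constraints on atmospheric parameters.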

  20. The European community and its standardization efforts in medical informatics

    NASA Astrophysics Data System (ADS)

    Mattheus, Rudy A.

    1992-07-01

A summary of the CEN TC 251/4 "Medical Imaging and Multi-Media" activities will be given. CEN is the European standardization institute; TC 251 deals with medical informatics. Standardization is a condition for the wide-scale use of health care and medical informatics and for the creation of a common market. In the last two years, three important groups have shown awareness of this problem and taken action: the Commission of the European Communities with their programs and mandates, the medical informaticians through their European professional federation, and the national normalization institutes through the European committee. As a result, a number of AIM (Advanced Informatics in Medicine) CEC-sponsored projects, the CEC mandates to CEN and EWOS, the EFMI working group on standardization, the technical committee of CEN, and the working groups and project teams of CEN and EWOS are working on the subject. An overview of the CEN TC 251/4 "Medical Imaging and Multi-Media" activities will be given, including their relation to other work.

  1. Toward Data-Driven Radiology Education-Early Experience Building Multi-Institutional Academic Trainee Interpretation Log Database (MATILDA).

    PubMed

    Chen, Po-Hao; Loehfelm, Thomas W; Kamer, Aaron P; Lemmon, Andrew B; Cook, Tessa S; Kohli, Marc D

    2016-12-01

    The residency review committee of the Accreditation Council of Graduate Medical Education (ACGME) collects data on resident exam volume and sets minimum requirements. However, this data is not made readily available, and the ACGME does not share their tools or methodology. It is therefore difficult to assess the integrity of the data and determine if it truly reflects relevant aspects of the resident experience. This manuscript describes our experience creating a multi-institutional case log, incorporating data from three American diagnostic radiology residency programs. Each of the three sites independently established automated query pipelines from the various radiology information systems in their respective hospital groups, thereby creating a resident-specific database. Then, the three institutional resident case log databases were aggregated into a single centralized database schema. Three hundred thirty residents and 2,905,923 radiologic examinations over a 4-year span were catalogued using 11 ACGME categories. Our experience highlights big data challenges including internal data heterogeneity and external data discrepancies faced by informatics researchers.
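The aggregation step described above, several institutional feeds landing in one centralized schema and being counted by ACGME category, can be sketched with the stdlib sqlite3 module. The table and column names here are hypothetical, not MATILDA's actual schema, and the rows stand in for the normalized output of each site's query pipeline.

```python
# Toy centralized case-log schema aggregating two institutional feeds.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE case_log (
    institution TEXT, resident_id TEXT,
    exam_date TEXT, acgme_category TEXT)""")

# Rows from two (hypothetical) institutional pipelines, already normalized
rows = [
    ("A", "A-r1", "2015-07-02", "Chest"),
    ("A", "A-r1", "2015-07-02", "Neuro"),
    ("B", "B-r7", "2015-07-03", "Chest"),
]
con.executemany("INSERT INTO case_log VALUES (?, ?, ?, ?)", rows)

# Exam volume per ACGME category across all institutions
counts = dict(con.execute(
    "SELECT acgme_category, COUNT(*) FROM case_log GROUP BY acgme_category"))
```

The hard part the authors highlight, reconciling heterogeneous RIS exports into one consistent set of columns and category labels, happens before this insert step; once the feeds agree on a schema, cross-institutional counts reduce to a GROUP BY.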

  2. In silico predicted reproductive endocrine transcriptional regulatory networks during zebrafish (Danio rerio) development.

    PubMed

    Hala, D

    2017-03-21

The interconnected topology of transcriptional regulatory networks (TRNs) readily lends itself to mathematical (or in silico) representation and analysis as a stoichiometric matrix. Such a matrix can be 'solved' using the mathematical method of extreme pathway (ExPa) analysis, which identifies uniquely activated genes subject to transcription factor (TF) availability. In this manuscript, in silico multi-tissue TRN models of brain, liver and gonad were used to study reproductive endocrine developmental programming in zebrafish (Danio rerio) from 0.25 h post-fertilization (hpf; zygote) to 90 days post-fertilization (dpf; adult life stage). First, properties of the TRN models were studied by sequentially activating all genes in the multi-tissue models. This analysis showed the brain to exhibit the lowest proportion of co-regulated genes (19%) relative to liver (23%) and gonad (32%). This was surprising given that the brain comprised 75% and 25% more TFs than liver and gonad, respectively. Such 'hierarchy' of co-regulatory capability (brain
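The availability logic the abstract rests on, a gene is activated only when the transcription factors it requires are present, is easy to show on a toy regulatory table. This is a sketch of that idea only, not the ExPa computation on a real stoichiometric matrix, and the gene and TF names are invented.

```python
# Toy TRN: each gene lists the transcription factors it requires.
requirements = {
    "geneA": {"TF1"},
    "geneB": {"TF1", "TF2"},
    "geneC": {"TF3"},
}

def activated(requirements, available):
    """Genes whose full TF requirement set is present in `available`."""
    return sorted(g for g, tfs in requirements.items() if tfs <= available)

# A hypothetical tissue state in which TF1 and TF2 are expressed
tissue_state = activated(requirements, {"TF1", "TF2"})
```

Sequentially toggling TF availability in this way, tissue by tissue and time point by time point, is the discrete analogue of the authors' gene-activation scans across the multi-tissue models.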

  3. Benefits of CMM-Based Software Process Improvement: Initial Results

    DTIC Science & Technology

    1994-08-01

Software Engineering Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania 15213. This report was prepared for the SEI Joint Program Office, HQ ESC/ENS, 5 Eglin Street, Hanscom AFB...Miller, Lt Col, USAF, SEI Joint Program Office. This work is sponsored by the U.S. Department of Defense. Copyright © 1994 by Carnegie Mellon University...categories: descriptive information about the organizations, information about their process improvement and measurement programs, and data about the

  4. American Chemical Society. 23rd Great Lakes Regional Meeting. Program and abstracts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1990-01-01

    The technical program includes some 250 papers in 38 sessions, featuring 16 symposia with 99 invited speakers. Program highlights include a plenary lecture, The Origin and Consequences of Scientific Illiteracy, by Jon D. Miller. Sessions for general technical papers are scheduled in the following categories: analytical chemistry; biochemistry; inorganic chemistry; organic chemistry; and physical chemistry. Papers have been processed for inclusion on the data base.

  5. Effect of geometrical constraint condition on the formation of nanoscale twins in the Ni-based metallic glass composite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, M H; Kim, B S; Kim, D H

    2014-04-25

We investigated the effect of geometrically constrained stress-strain conditions on the formation of nanotwins in alpha-brass phase reinforced Ni59Zr20Ti16Si2Sn3 metallic glass (MG) matrix deformed under macroscopic uniaxial compression. The specific geometrically constrained conditions in the samples lead to a deviation from a simple uniaxial state to a multi-axial stress state, for which nanocrystallization in the MG matrix together with nanoscale twinning of the brass reinforcement is observed in localized regions during plastic flow. The nanocrystals in the MG matrix and the appearance of the twinned structure in the reinforcements indicate that the strain energy is highly confined and the local stress reaches a very high level upon yielding. Both the effective distribution of reinforcements on the strain enhancement of composite and the effects of the complicated stress states on the development of nanotwins in the second-phase brass particles are discussed.

  6. School-based programmes for preventing smoking.

    PubMed

    Thomas, R; Perera, R

    2006-07-19

Smoking rates in adolescents are rising in some countries. Helping young people to avoid starting smoking is a widely endorsed goal of public health, but there is uncertainty about how to do this. Schools provide a route for communicating with a large proportion of young people, and school-based programmes for smoking prevention have been widely developed and evaluated. To review all randomized controlled trials of behavioural interventions in schools to prevent children (aged 5 to 12) and adolescents (aged 13 to 18) starting smoking. We searched the Cochrane Central Register of Controlled Trials (CENTRAL) and the Cochrane Tobacco Addiction Group's Specialized Register, MEDLINE, EMBASE, PsycINFO, ERIC, CINAHL, HealthSTAR, Dissertation Abstracts and studies identified in the bibliographies of articles. Individual MEDLINE searches were made for 133 authors who had undertaken randomized controlled trials in this area. Types of studies: those in which individual students, classes, schools, or school districts were randomized to the intervention or control groups and followed for at least six months. Children (aged 5 to 12) or adolescents (aged 13 to 18) in school settings. Types of interventions: Classroom programmes or curricula, including those with associated family and community interventions, intended to deter use of tobacco. We included programmes or curricula that provided information, those that used social influences approaches, those that taught generic social competence, and those that included interventions beyond the school into the community. We included programmes with a drug or alcohol focus if outcomes for tobacco use were reported. Types of outcome measures: Prevalence of non-smoking at follow up among those not smoking at baseline. We did not require biochemical validation of self-reported tobacco use for study inclusion. We assessed whether identified citations were randomized controlled trials. 
We assessed the quality of design and execution, and abstracted outcome data. Because of the marked heterogeneity of design and outcomes, we computed pooled estimates only for those trials that could be analyzed together and for which statistical data were available. We predominantly synthesized the data using narrative systematic review. We grouped studies by intervention method (information; social competence; social influences; combined social influences/social competence; multi-modal programmes). Within each group, we placed them into three categories (low, medium and high risk of bias) according to validity using quality criteria for reported study design. Of the 94 randomized controlled trials identified, we classified 23 as category one (most valid). There was one category one study of information-giving and two of teaching social competence. There were thirteen category one studies of social influences interventions. Of these, nine found some positive effect of intervention on smoking prevalence, and four failed to detect an effect on smoking prevalence. The largest and most rigorous study, the Hutchinson Smoking Prevention Project, found no long-term effect of an intensive eight-year programme on smoking behaviour. There were three category one RCTs of combined social influences and social competence interventions: one provided significant results and one only for instruction by health educators compared to self-instruction. There was a lack of high quality evidence about the effectiveness of combinations of social influences and social competence approaches. There was one category one study providing data on social influences compared with information giving. There were four category one studies of multi-modal approaches but they provided limited evidence about the effectiveness of multi-modal approaches including community initiatives. There is one rigorous test of the effects of information-giving about smoking. 
There are well-conducted randomized controlled trials to test the effects of social influences interventions: in half of the group of best quality studies those in the intervention group smoke less than those in the control, but many studies failed to detect an effect of the intervention. There are only three high quality RCTs which test the effectiveness of combinations of social influences and social competence interventions, and four which test multi-modal interventions; half showed significant positive results.
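The pooling step mentioned in the methods ("we computed pooled estimates only for those trials that could be analyzed together") is conventionally a fixed-effect inverse-variance average. Here is a generic sketch of that calculation with invented trial effects, not the review's actual data.

```python
# Fixed-effect inverse-variance pooling of per-trial effect estimates.
def pooled_fixed_effect(estimates, variances):
    """Return (pooled estimate, pooled variance): each trial is weighted
    by the reciprocal of its variance, so precise trials count more."""
    w = [1.0 / v for v in variances]
    est = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    var = 1.0 / sum(w)
    return est, var

# Three hypothetical trial log-odds-ratios with their variances
est, var = pooled_fixed_effect([-0.30, -0.10, 0.05], [0.04, 0.02, 0.08])
```

The pooled variance is smaller than any single trial's, which is the statistical payoff of combining trials; heterogeneity of design, as the review notes, is what limits when this combination is legitimate.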

  7. A Social-Ecological Framework of Theory, Assessment, and Prevention of Suicide

    PubMed Central

    Cramer, Robert J.; Kapusta, Nestor D.

    2017-01-01

    The juxtaposition of increasing suicide rates with continued calls for suicide prevention efforts begs for new approaches. Grounded in the Centers for Disease Control and Prevention (CDC) framework for tackling health issues, this personal views work integrates relevant suicide risk/protective factor, assessment, and intervention/prevention literatures. Based on these components of suicide risk, we articulate a Social-Ecological Suicide Prevention Model (SESPM) which provides an integration of general and population-specific risk and protective factors. We also use this multi-level perspective to provide a structured approach to understanding current theories and intervention/prevention efforts concerning suicide. Following similar multi-level prevention efforts in interpersonal violence and Human Immunodeficiency Virus (HIV) domains, we offer recommendations for social-ecologically informed suicide prevention theory, training, research, assessment, and intervention programming. Although the SESPM calls for further empirical testing, it provides a suitable backdrop for tailoring of current prevention and intervention programs to population-specific needs. Moreover, the multi-level model shows promise to move suicide risk assessment forward (e.g., development of multi-level suicide risk algorithms or structured professional judgments instruments) to overcome current limitations in the field. Finally, we articulate a set of characteristics of social-ecologically based suicide prevention programs. These include the need to address risk and protective factors with the strongest degree of empirical support at each multi-level layer, incorporate a comprehensive program evaluation strategy, and use a variety of prevention techniques across levels of prevention. PMID:29062296

  8. Voluntary organ donation system adapted to Chinese cultural values and social reality.

    PubMed

    Huang, Jiefu; Millis, J Michael; Mao, Yilei; Millis, M Andrew; Sang, Xinting; Zhong, Shouxian

    2015-04-01

Organ donation and transplant systems have unique characteristics based on the local culture and socioeconomic context. China's transplant and organ donation systems developed without regulatory oversight until 2006, when regulation and policy were developed and then implemented over the next several years. Most recently, the pilot project of establishing a voluntary citizen-based deceased donor program was established. The pilot program addressed the legal, financial, and cultural barriers to organ donation in China. The pilot program has evolved into a national program. Significantly, it established a uniquely Chinese donor classification system. The Chinese donor classification system recognizes donation after brain death (category I), donation after circulatory death (category II), and donation after brain death followed by circulatory death (category III). Through August 2014, the system has identified 2326 donors and provided 6416 organs that have been allocated though a transparent organ allocation system. The estimated number of donors in 2014 is 1147. As China's attitudes toward organ donation have matured and evolved and as China, as a nation, is taking its place on the world stage, it is recognizing that its past practice of using organs from executed prisoners is not sustainable. It is time to recognize that the efforts to regulate transplantation and provide voluntary citizen-based deceased organ donation have been successful and that China should use this system to provide organs for all transplants in every province and hospital in China. At the national organ transplant congress on October 30, 2014, the Chairman of China's national organ donation and transplantation committee, Jiefu Huang, required all hospitals to stop using organs from executed prisoners immediately, and civilian organ donation will be the sole source for organ transplants in China starting January 2015. © 2015 American Association for the Study of Liver Diseases.

  9. Best Practices for Serving Students with Special Food and/or Nutrition Needs in School Nutrition Programs

    ERIC Educational Resources Information Center

    Castillo, Alexandra; Carr, Deborah; Nettles, Mary Frances

    2010-01-01

    Purpose/Objectives: The purpose of this research project was to identify goals and establish best practices for school nutrition (SN) programs that serve students with special food and/or nutrition needs based on the four practice categories identified in previous National Food Service Management Institute, Applied Research Division (NFSMI, ARD)…

  10. The Case for Diabetes Population Health Improvement: Evidence-Based Programming for Population Outcomes in Diabetes.

    PubMed

    Golden, Sherita Hill; Maruthur, Nisa; Mathioudakis, Nestoras; Spanakis, Elias; Rubin, Daniel; Zilbermint, Mihail; Hill-Briggs, Felicia

    2017-07-01

The goal of this review is to describe diabetes within a population health improvement framework and to review the evidence for a diabetes population health continuum of intervention approaches, including diabetes prevention and chronic and acute diabetes management, to improve clinical and economic outcomes. Recent studies have shown that, compared to usual care, lifestyle interventions in prediabetes lower diabetes risk at the population level and that group-based programs have a low incremental medical cost-effectiveness ratio for health systems. Effective outpatient interventions that improve diabetes control and process outcomes are multi-level, targeting the patient, provider, and healthcare system simultaneously, and integrate community health workers as a liaison between the patient and community-based healthcare resources. A multi-faceted approach to diabetes management is also effective in the inpatient setting. Interventions shown to promote safe and effective glycemic control and use of evidence-based glucose management practices include provider reminder and clinical decision support systems, automated computer order entry, provider education, and organizational change. Future studies should examine the cost-effectiveness of multi-faceted outpatient and inpatient diabetes management programs to determine the best financial models for incorporating them into diabetes population health strategies.

  11. Investigations of flowfields found in typical combustor geometries

    NASA Technical Reports Server (NTRS)

    Lilley, D. G.

    1982-01-01

    Measurements and computations are being applied to an axisymmetric swirling flow, emerging from swirl vanes at angle phi, entering a large chamber test section via a sudden expansion of various side-wall angles alpha. New features are: the turbulence measurements are being performed on swirling as well as nonswirling flow; and all measurements and computations are also being performed on a confined jet flowfield with realistic downstream blockage. Recent activity falls into three categories: (1) Time-mean flowfield characterization by five-hole pitot probe measurements and by flow visualization; (2) Turbulence measurements by a variety of single- and multi-wire hot-wire probe techniques; and (3) Flowfield computations using the computer code developed during the previous year's research program.

  12. Employing Machine-Learning Methods to Study Young Stellar Objects

    NASA Astrophysics Data System (ADS)

    Moore, Nicholas

    2018-01-01

    Vast amounts of data exist in the astronomical data archives, and yet a large number of sources remain unclassified. We developed a multi-wavelength pipeline to classify infrared sources. The pipeline uses supervised machine learning methods to classify objects into the appropriate categories. The program is fed data that is already classified to train it, and is then applied to unknown catalogues. The primary use for such a pipeline is the rapid classification and cataloging of data that would take a much longer time to classify otherwise. While our primary goal is to study young stellar objects (YSOs), the applications extend beyond the scope of this project. We present preliminary results from our analysis and discuss future applications.
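As a sketch of the supervised step such a pipeline performs, train on already-classified sources, then label unknowns, here is a minimal nearest-centroid classifier on invented two-color features. The real pipeline presumably uses richer multi-wavelength features and stronger models; everything below is a toy.

```python
# Minimal nearest-centroid classifier: fit per-class mean feature vectors,
# then assign each unknown source to the closest class centroid.
def centroids(train):
    """train: list of (feature_vector, label) pairs."""
    sums, counts = {}, {}
    for x, y in train:
        s = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            s[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def classify(x, cents):
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(cents, key=lambda y: dist(x, cents[y]))

# Two infrared "colors" per source (hypothetical): YSOs redder than stars
train = [([1.2, 0.9], "YSO"), ([1.0, 1.1], "YSO"),
         ([0.1, 0.0], "star"), ([0.2, 0.1], "star")]
cents = centroids(train)
label = classify([1.1, 1.0], cents)   # an "unknown" catalogue source
```

The design mirrors the workflow in the abstract: the classified training catalogue defines the categories once, and the cheap per-source prediction is what makes rapid classification of large archives feasible.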

  13. Effectiveness of Workplace Interventions in Return-to-Work for Musculoskeletal, Pain-Related and Mental Health Conditions: An Update of the Evidence and Messages for Practitioners.

    PubMed

    Cullen, K L; Irvin, E; Collie, A; Clay, F; Gensby, U; Jennings, P A; Hogg-Johnson, S; Kristman, V; Laberge, M; McKenzie, D; Newnam, S; Palagyi, A; Ruseckaite, R; Sheppard, D M; Shourie, S; Steenstra, I; Van Eerd, D; Amick, B C

    2018-03-01

Purpose: The objective of this systematic review was to synthesize evidence on the effectiveness of workplace-based return-to-work (RTW) interventions and work disability management (DM) interventions that assist workers with musculoskeletal (MSK) and pain-related conditions and mental health (MH) conditions with RTW. Methods: We followed a systematic review process developed by the Institute for Work & Health and an adapted best evidence synthesis that ranked evidence as strong, moderate, limited, or insufficient. Results: Seven electronic databases were searched from January 1990 until April 2015, yielding 8898 non-duplicate references. Evidence from 36 medium and high quality studies was synthesized on 12 different intervention categories across three broad domains: health-focused, service coordination, and work modification interventions. There was strong evidence that duration away from work from both MSK or pain-related conditions and MH conditions was significantly reduced by multi-domain interventions encompassing at least two of the three domains. There was moderate evidence that these multi-domain interventions had a positive impact on cost outcomes. There was strong evidence that cognitive behavioural therapy interventions that do not also include workplace modifications or service coordination components are not effective in helping workers with MH conditions in RTW. Evidence for the effectiveness of other single-domain interventions was mixed, with some studies reporting positive effects and others reporting no effects on lost time and work functioning. Conclusions: While there is substantial research literature focused on RTW, there are only a small number of quality workplace-based RTW intervention studies that involve workers with MSK or pain-related conditions and MH conditions. We recommend implementing multi-domain interventions (i.e. 
with healthcare provision, service coordination, and work accommodation components) to help reduce lost time for MSK or pain-related conditions and MH conditions. Practitioners should also consider implementing these programs to help improve work functioning and reduce costs associated with work disability.

  14. A chance-constrained programming model to allocate wildfire initial attack resources for a fire season

    Treesearch

    Yu Wei; Michael Bevers; Erin Belval; Benjamin Bird

    2015-01-01

    This research developed a chance-constrained two-stage stochastic programming model to support wildfire initial attack resource acquisition and location on a planning unit for a fire season. Fire growth constraints account for the interaction between fire perimeter growth and construction to prevent overestimation of resource requirements. We used this model to examine...
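The two-stage shape of such a model, commit to resources before the fire season, then pay scenario-dependent shortfall costs once fires arrive, can be sketched by enumeration over a toy scenario set. This is a hedged illustration of the two-stage logic only, far simpler than the authors' chance-constrained model with fire-growth constraints, and all costs and probabilities are invented.

```python
# Toy two-stage resource acquisition: stage 1 buys crews; stage 2 pays a
# penalty per uncovered fire in each probabilistic demand scenario.
def expected_cost(crews, scenarios, crew_cost, penalty):
    """scenarios: list of (probability, crews_needed) pairs."""
    recourse = sum(p * penalty * max(d - crews, 0) for p, d in scenarios)
    return crew_cost * crews + recourse

scenarios = [(0.5, 2), (0.3, 5), (0.2, 9)]   # (probability, crews needed)
# Enumerate stage-1 decisions and pick the expected-cost minimizer
best = min(range(10),
           key=lambda k: expected_cost(k, scenarios, 10.0, 25.0))
```

The optimum buys more crews than the most likely scenario needs but fewer than the worst case, which is the characteristic hedging behaviour of stochastic programming; a chance constraint would instead cap the probability of any shortfall.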

  15. Shape-Constrained Segmentation Approach for Arctic Multiyear Sea Ice Floe Analysis

    NASA Technical Reports Server (NTRS)

    Tarabalka, Yuliya; Brucker, Ludovic; Ivanoff, Alvaro; Tilton, James C.

    2013-01-01

The melting of sea ice is correlated to increases in sea surface temperature and associated climatic changes. Therefore, it is important to investigate how rapidly sea ice floes melt. For this purpose, a new TempoSeg method for multitemporal segmentation of multiyear ice floes is proposed. The microwave radiometer is used to track the position of an ice floe. Then, a time series of MODIS images is created with the ice floe in the image center. The TempoSeg method is performed to segment these images into two regions: Floe and Background. First, morphological feature extraction is applied. Then, the central image pixel is marked as Floe, and shape-constrained best-merge region growing is performed. The resulting two-region map is post-filtered by applying morphological operators. We have successfully tested our method on a set of MODIS images and estimated the area of a sea ice floe as a function of time.
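The central step, marking the center pixel as Floe and growing the region outward, can be sketched as a seeded flood fill. The toy grid below stands in for a MODIS image, and a fixed intensity tolerance stands in for the method's shape-constrained merge criterion, so this is only an illustration of the growing step.

```python
# Seeded region growing: BFS flood of 4-connected pixels whose value is
# within `tol` of the seed pixel's value.
from collections import deque

def grow(img, seed, tol):
    h, w = len(img), len(img[0])
    seed_val = img[seed[0]][seed[1]]
    region, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w and (nr, nc) not in region
                    and abs(img[nr][nc] - seed_val) <= tol):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

img = [[0, 0, 0, 0, 0],          # toy "image": bright floe on dark water
       [0, 9, 9, 0, 0],
       [0, 9, 9, 9, 0],
       [0, 0, 9, 0, 0],
       [0, 0, 0, 0, 0]]
floe = grow(img, (2, 2), tol=1)  # central pixel seeded as Floe
area = len(floe)                 # floe area in pixels
```

Repeating this per image in the time series, then post-filtering each two-region map, yields exactly the area-versus-time curve the abstract describes.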

  16. Conceptual model of acid attacks based on survivor's experiences: Lessons from a qualitative exploration.

    PubMed

    Sabzi Khoshnami, Mohammad; Mohammadi, Elham; Addelyan Rasi, Hamideh; Khankeh, Hamid Reza; Arshi, Maliheh

    2017-05-01

Acid attack, a worldwide phenomenon, has been increasing in recent years. In addition to severe injuries to the face and body, such violence leads to psychological and social problems that affect the survivors' quality of life. The present study provides a more in-depth understanding of this phenomenon and explores the nature and dimensions of acid attacks based on survivors' experiences. A grounded theory study using semi-structured, recorded interviews and applying purposeful theoretical sampling was conducted with 12 acid attack survivors in Iran. Data were analysed using constant comparison in open, axial and selective coding stages. A conceptual model was developed to explain the relationships among the main categories extracted through the grounded theory study. Physical and psychological wounds emerged as a core category. Traditional context and extreme beauty value in society acted as the context of the physical and psychological wounds experienced. Living with a drug abuser with behavioural disorders and lack of problem-solving skills in interpersonal conflict were found to be causal conditions. Action strategies to deal with this experience were found to be composed of individual, interpersonal and structural levels. Education, percentage and place of burning acted as intervening conditions that influenced survivors' strategies. Finally, adverse consequences of social deprivation and feeling helpless and hindered were found to have an important impact. Acid attacks lead to physical and psychological wounds in survivors. This is a multi-dimensional phenomenon involving illness, disability, and victimization, and requires a wide range of strategies at different levels. The conceptual model derived through this study can serve as a good basis for intervention programs. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.

  17. 42 CFR § 414.1335 - Data submission criteria for the quality performance category.

    Code of Federal Regulations, 2010 CFR

    2017-10-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICARE PROGRAM (CONTINUED) PAYMENT FOR PART B MEDICAL AND OTHER HEALTH SERVICES Merit-Based Incentive Payment System and Alternative Payment Model Incentive...

  18. Implementation multi representation and oral communication skills in Department of Physics Education on Elementary Physics II

    NASA Astrophysics Data System (ADS)

    Kusumawati, Intan; Marwoto, Putut; Linuwih, Suharto

    2015-09-01

    Multi-representation ability has been widely studied, but it had not previously been implemented through a learning model. This study aimed to determine students' multi-representation ability, the relationship between multi-representation ability and oral communication skills, and the application of that relationship through the Presentatif Based on Multi representation (PBM) learning model in solving geometric optics problems (Elementary Physics II). A concurrent mixed-methods design with qualitative-quantitative weighting was used. Data were collected with essay-form pre-tests and post-tests, observation sheets for oral communication skills, and an observation sheet for assessing learning with the PBM model; all instruments had high validity, with scores of 3.91, 4.22, 4.13 and 3.88, respectively. Reliability, estimated with the Cronbach's alpha technique, gave a coefficient of 0.494. The research subjects were students of the Department of Physics Education, Unnes. Students' tendency toward the representations ran, from high to low, M, D, G, V, whereas in order of accuracy the ranking was V, D, G, M. Multi-representation ability and oral communication skills were found to be proportional to each other, and implementing their relationship generated grounded theory. The approach should also be applied to other physics material, or at other universities, for comparison.
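    The reliability figure above comes from Cronbach's alpha. As a hedged sketch of how such a coefficient is commonly computed (the item scores below are invented for illustration, not the study's data):

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances)/variance(totals))
# Illustrative only -- the item-score matrix here is made up.
from statistics import variance

def cronbach_alpha(items):
    """items: list of per-item score lists, one score per respondent."""
    k = len(items)
    respondents = list(zip(*items))            # rows: scores per respondent
    totals = [sum(r) for r in respondents]     # each respondent's total score
    item_var = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# 3 items, 5 respondents (hypothetical Likert-style data)
scores = [[3, 4, 5, 2, 4], [2, 4, 4, 3, 3], [3, 5, 5, 2, 4]]
print(round(cronbach_alpha(scores), 3))   # → 0.903
```

    A value near 1 indicates internally consistent items; the 0.494 reported above would count as moderate reliability.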

  19. Design and Analysis of the Measurement Characteristics of a Bidirectional-Decoupling Over-Constrained Six-Dimensional Parallel-Mechanism Force Sensor

    PubMed Central

    Zhao, Tieshi; Zhao, Yanzhi; Hu, Qiangqiang; Ding, Shixing

    2017-01-01

    The measurement of large forces and the presence of errors due to dimensional coupling are significant challenges for multi-dimensional force sensors. To address these challenges, this paper proposes an over-constrained six-dimensional force sensor based on a parallel mechanism of steel ball structures as a measurement module. The steel ball structure can be subject to rolling friction instead of sliding friction, thus reducing the influence of friction. However, because the structure can only withstand unidirectional pressure, the application of steel balls in a six-dimensional force sensor is difficult. Accordingly, a new sensor measurement structure was designed in this study. The static equilibrium and displacement compatibility equations of the sensor prototype’s over-constrained structure were established to obtain the transformation function, from which the forces in the measurement branches of the proposed sensor were then analytically derived. The sensor’s measurement characteristics were then analysed through numerical examples. Finally, these measurement characteristics were confirmed through calibration and application experiments. The measurement accuracy of the proposed sensor was determined to be 1.28%, with a maximum coupling error of 1.98%, indicating that the proposed sensor successfully overcomes the issues related to steel ball structures and provides sufficient accuracy. PMID:28867812
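    As a heavily simplified, hypothetical sketch of the decoupling step common to over-constrained multi-branch sensors (not the paper's actual transformation function; the numbers are invented and two force components stand in for the full six): redundant branch readings can be collapsed to the wrench components by a least-squares fit through the calibration matrix.

```python
# Over-constrained case: 3 branch readings, 2 unknown force components.
# Solve A x ≈ b via the normal equations (A^T A) x = A^T b for a 2-column A.
def lstsq_2col(A, b):
    a11 = sum(r[0] * r[0] for r in A)
    a12 = sum(r[0] * r[1] for r in A)
    a22 = sum(r[1] * r[1] for r in A)
    c1 = sum(r[0] * y for r, y in zip(A, b))
    c2 = sum(r[1] * y for r, y in zip(A, b))
    det = a11 * a22 - a12 * a12
    return [(a22 * c1 - a12 * c2) / det, (a11 * c2 - a12 * c1) / det]

A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # invented branch directions
true = [2.0, -1.0]                          # "true" force components
b = [sum(r[k] * true[k] for k in range(2)) for r in A]  # ideal readings
print(lstsq_2col(A, b))   # → [2.0, -1.0]
```

    With noisy readings the same fit gives the minimum-squared-error estimate, which is why the redundancy of an over-constrained structure can improve accuracy.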

  20. 14 CFR 29.1505 - Never-exceed speed.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... variation of VNE. (c) For helicopters, a stabilized power-off VNE denoted as VNE (power-off) may be... speed used in meeting the requirements of— (i) § 29.67(a)(3) for Category A helicopters; (ii) § 29.65(a) for Category B helicopters, except multi-engine helicopters meeting the requirements of § 29.67(b...

  1. 14 CFR 29.1505 - Never-exceed speed.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... variation of VNE. (c) For helicopters, a stabilized power-off VNE denoted as VNE (power-off) may be... speed used in meeting the requirements of— (i) § 29.67(a)(3) for Category A helicopters; (ii) § 29.65(a) for Category B helicopters, except multi-engine helicopters meeting the requirements of § 29.67(b...

  2. Object-based attentional modulation of biological motion processing: spatiotemporal dynamics using functional magnetic resonance imaging and electroencephalography.

    PubMed

    Safford, Ashley S; Hussey, Elizabeth A; Parasuraman, Raja; Thompson, James C

    2010-07-07

    Although it is well documented that the ability to perceive biological motion is mediated by the lateral temporal cortex, whether and when neural activity in this brain region is modulated by attention is unknown. In particular, it is unclear whether the processing of biological motion requires attention or whether such stimuli are processed preattentively. Here, we used functional magnetic resonance imaging, high-density electroencephalography, and cortically constrained source estimation methods to investigate the spatiotemporal effects of attention on the processing of biological motion. Directing attention to tool motion in overlapping movies of biological motion and tool motion suppressed the blood oxygenation level-dependent (BOLD) response of the right superior temporal sulcus (STS)/middle temporal gyrus (MTG), while directing attention to biological motion suppressed the BOLD response of the left inferior temporal sulcus (ITS)/MTG. Similarly, category-based modulation of the cortical current source density estimates from the right STS/MTG and left ITS was observed beginning at approximately 450 ms following stimulus onset. Our results indicate that the cortical processing of biological motion is strongly modulated by attention. These findings argue against preattentive processing of biological motion in the presence of stimuli that compete for attention. Our findings also suggest that the attention-based segregation of motion category-specific responses only emerges relatively late (several hundred milliseconds) in processing.

  3. Genre Complexes in Popular Music

    PubMed Central

    Childress, C. Clayton

    2016-01-01

    Recent work in the sociology of music suggests a declining importance of genre categories. Yet other work in this research stream and in the sociology of classification argues for the continued prevalence of genres as a meaningful tool through which creators, critics and consumers focus their attention in the topology of available works. Building from work in the study of categories and categorization we examine how boundary strength and internal differentiation structure the genre pairings of some 3 million musicians and groups. Using a range of network-based and statistical techniques, we uncover three musical “complexes,” which are collectively constituted by 16 smaller genre communities. Our analysis shows that the musical universe is not monolithically organized but rather composed of multiple worlds that are differently structured—i.e., uncentered, single-centered, and multi-centered. PMID:27203852

  4. Genre Complexes in Popular Music.

    PubMed

    Silver, Daniel; Lee, Monica; Childress, C Clayton

    2016-01-01

    Recent work in the sociology of music suggests a declining importance of genre categories. Yet other work in this research stream and in the sociology of classification argues for the continued prevalence of genres as a meaningful tool through which creators, critics and consumers focus their attention in the topology of available works. Building from work in the study of categories and categorization we examine how boundary strength and internal differentiation structure the genre pairings of some 3 million musicians and groups. Using a range of network-based and statistical techniques, we uncover three musical "complexes," which are collectively constituted by 16 smaller genre communities. Our analysis shows that the musical universe is not monolithically organized but rather composed of multiple worlds that are differently structured-i.e., uncentered, single-centered, and multi-centered.

  5. Energygrams: Brief descriptions of energy technology

    NASA Astrophysics Data System (ADS)

    Simpson, W. F., Jr.

    This compilation of technical notes (called Energygrams) is published by the Technical Information Center. Energygrams are usually one-page illustrated bulletins describing DOE technology or data and telling how to obtain the technical reports or other material on which they are based. Frequently a personal contact is given who can provide program information in addition to the data found in the reports. The compilation is organized by subject categories, and, within each category, Energygrams are presented alphabetically by Energygram title.

  6. Design Issues for Traffic Management for the ATM UBR + Service for TCP Over Satellite Networks

    NASA Technical Reports Server (NTRS)

    Jain, Raj

    1999-01-01

    This project was a comprehensive research program for developing techniques for improving the performance of Internet protocols over Asynchronous Transfer Mode (ATM) based satellite networks. Among the service categories provided by ATM networks, the most commonly used category for data traffic is the unspecified bit rate (UBR) service. UBR allows sources to send data into the network without any feedback control. The project resulted in numerous ATM Forum contributions and papers.

  7. Stepping Stones Triple P: The Theoretical Basis and Development of an Evidence-Based Positive Parenting Program for Families with a Child Who Has a Disability

    ERIC Educational Resources Information Center

    Sanders, Matthew; Mazzucchelli, Trevor; Studman, Lisa

    2004-01-01

    Stepping Stones Triple P is the first in a series of programs based on the Triple P--Positive Parenting Program that has been specifically designed for families who have a child with a disability. This paper presents the rationale, theoretical foundations, historical development and distinguishing features of the program. The multi-level…

  8. Scattering amplitudes from multivariate polynomial division

    NASA Astrophysics Data System (ADS)

    Mastrolia, Pierpaolo; Mirabella, Edoardo; Ossola, Giovanni; Peraro, Tiziano

    2012-11-01

    We show that the evaluation of scattering amplitudes can be formulated as a problem of multivariate polynomial division, with the components of the integration-momenta as indeterminates. We present a recurrence relation which, independently of the number of loops, leads to the multi-particle pole decomposition of the integrands of the scattering amplitudes. The recursive algorithm is based on the weak Nullstellensatz theorem and on the division modulo the Gröbner basis associated to all possible multi-particle cuts. We apply it to dimensionally regulated one-loop amplitudes, recovering the well-known integrand-decomposition formula. Finally, we focus on the maximum-cut, defined as a system of on-shell conditions constraining the components of all the integration-momenta. By means of the Finiteness Theorem and of the Shape Lemma, we prove that the residue at the maximum-cut is parametrized by a number of coefficients equal to the number of solutions of the cut itself.
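    A hedged, one-variable toy of the pole-decomposition idea (far simpler than the paper's multivariate Gröbner-basis machinery): a rational integrand splits into single-pole terms whose coefficients are the residues obtained by "cutting" each pole in turn.

```python
# Residue of 1/((x-a1)(x-a2)...) at x=a is the product of 1/(a-b) over the
# other poles; summing residue/(x-a) reproduces the integrand exactly.
def residues(poles):
    res = {}
    for a in poles:
        r = 1.0
        for b in poles:
            if b != a:
                r /= (a - b)
        res[a] = r
    return res

poles = [1.0, 3.0]
res = residues(poles)                      # {1.0: -0.5, 3.0: 0.5}
x = 7.0
direct = 1.0 / ((x - 1.0) * (x - 3.0))
decomposed = sum(r / (x - a) for a, r in res.items())
print(abs(direct - decomposed) < 1e-12)    # → True
```

    The multivariate analogue replaces "evaluate at the pole" with polynomial division modulo the ideal generated by the cut conditions, which is where the Gröbner basis enters.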

  9. Virtual optical network mapping and core allocation in elastic optical networks using multi-core fibers

    NASA Astrophysics Data System (ADS)

    Xuan, Hejun; Wang, Yuping; Xu, Zhanqi; Hao, Shanshan; Wang, Xiaoli

    2017-11-01

    Virtualization technology can greatly improve the efficiency of networks by allowing virtual optical networks to share the resources of the physical networks. However, it faces challenges such as finding efficient strategies for virtual node mapping, virtual link mapping and spectrum assignment. It is even more complex and challenging when the physical elastic optical networks use multi-core fibers. To tackle these challenges, we establish a constrained optimization model to determine the optimal schemes of optical network mapping, core allocation and spectrum assignment. To solve the model efficiently, a tailor-made encoding scheme and crossover and mutation operators are designed. Based on these, an efficient genetic algorithm is proposed to obtain the optimal schemes for virtual node mapping, virtual link mapping, and core allocation. The simulation experiments are conducted on three widely used networks, and the experimental results show the effectiveness of the proposed model and algorithm.
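    A minimal, hypothetical sketch of the kind of genetic algorithm described (permutation encoding with elitist selection and swap mutation; the cost matrix is invented and far smaller than a real network instance):

```python
import random

random.seed(0)
COST = [[4, 2, 7], [3, 9, 1], [6, 5, 8]]   # invented COST[v][p]: cost of
N = len(COST)                               # placing virtual node v on p

def fitness(perm):
    # total cost of mapping virtual node v onto physical node perm[v]
    return sum(COST[v][p] for v, p in enumerate(perm))

def mutate(perm):
    # swap two assignments, preserving the one-to-one mapping
    i, j = random.sample(range(N), 2)
    child = perm[:]
    child[i], child[j] = child[j], child[i]
    return child

# permutation-encoded population; keep the 10 fittest, mutate to refill
pop = [random.sample(range(N), N) for _ in range(20)]
for _ in range(50):
    pop.sort(key=fitness)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(10)]
best = min(pop, key=fitness)
print(best, fitness(best))
```

    A real implementation would also encode core and spectrum assignments in the chromosome and penalize constraint violations in the fitness function.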

  10. Distinguishing remobilized ash from erupted volcanic plumes using space-borne multi-angle imaging.

    PubMed

    Flower, Verity J B; Kahn, Ralph A

    2017-10-28

    Volcanic systems comprise a complex combination of ongoing eruptive activity and secondary hazards, such as remobilized ash plumes. Similarities in the visual characteristics of remobilized and erupted plumes, as imaged by satellite-based remote sensing, complicate the accurate classification of these events. The stereo imaging capabilities of the Multi-angle Imaging SpectroRadiometer (MISR) were used to determine the altitude and distribution of suspended particles. Remobilized ash shows distinct dispersion, with particles distributed within ~1.5 km of the surface. Particle transport is consistently constrained by local topography, limiting dispersion pathways downwind. The MISR Research Aerosol (RA) retrieval algorithm was used to assess plume particle microphysical properties. Remobilized ash plumes displayed a dominance of large particles with consistent absorption and angularity properties, distinct from emitted plumes. The combination of vertical distribution, topographic control, and particle microphysical properties makes it possible to distinguish remobilized ash flows from eruptive plumes, globally.

  11. MCSCF wave functions for excited states of polar molecules - Application to BeO. [Multi-Configuration Self-Consistent Field

    NASA Technical Reports Server (NTRS)

    Bauschlicher, C. W., Jr.; Yarkony, D. R.

    1980-01-01

    A previously reported multi-configuration self-consistent field (MCSCF) algorithm based on the generalized Brillouin theorem is extended in order to treat the excited states of polar molecules. In particular, the algorithm takes into account the proper treatment of nonorthogonality in the space of single excitations and invokes, when necessary, a constrained optimization procedure to prevent the variational collapse of excited states. In addition, a configuration selection scheme (suitable for use in conjunction with extended configuration interaction methods) is proposed for the MCSCF procedure. The algorithm is used to study the low-lying singlet states of BeO, a system which has not previously been studied using an MCSCF procedure. MCSCF wave functions are obtained for three ¹Σ⁺ and two ¹Π states. The ¹Σ⁺ results are juxtaposed with comparable results for MgO in order to assess the generality of the description presented here.

  12. 78 FR 20411 - Supplemental Nutrition Assistance Program: Nutrition Education and Obesity Prevention Grant Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-05

    ..., multi- level interventions; and community and public health approaches. To improve program design... prevention services and an evidence-based approach are provided for States to use in their SNAP-Ed programming. These definitions provide States with greater flexibility to include environmental approaches and...

  13. Multi-objects recognition for distributed intelligent sensor networks

    NASA Astrophysics Data System (ADS)

    He, Haibo; Chen, Sheng; Cao, Yuan; Desai, Sachi; Hohil, Myron E.

    2008-04-01

    This paper proposes an innovative approach to multi-object recognition for homeland security and defense based intelligent sensor networks. Unlike the conventional way of information analysis, data mining in such networks is typically characterized by high information ambiguity/uncertainty, data redundancy, high dimensionality and real-time constraints. Furthermore, since a typical military based network normally includes multiple mobile sensor platforms, ground forces, fortified tanks, combat flights, and other resources, it is critical to develop intelligent data mining approaches to fuse different information resources to understand dynamic environments, to support decision making processes, and finally to achieve the goals. This paper aims to address these issues with a focus on multi-object recognition. Instead of classifying a single object as in traditional image classification problems, the proposed method can automatically learn multiple objects simultaneously. Image segmentation techniques are used to identify the regions of interest in the field, which correspond to multiple objects such as soldiers or tanks. Since different objects come with different feature sizes, we propose a feature scaling method to represent each object in the same number of dimensions. This is achieved by linear/nonlinear scaling and sampling techniques. Finally, support vector machine (SVM) based learning algorithms are developed to learn and build the associations for different objects, and such knowledge is adaptively accumulated for object recognition in the testing stage. We test the effectiveness of the proposed method in different simulated military environments.
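    The feature-scaling step can be sketched as linear resampling to a common dimension, so that objects with differently sized feature vectors feed one classifier. This is a hedged illustration of the general idea, not the authors' exact method:

```python
# Linearly resample a feature vector of any length to a fixed target length,
# so every segmented object is represented in the same number of dimensions.
def resample(features, target_len):
    n = len(features)
    if n == 1 or target_len == 1:
        return [features[0]] * target_len
    out = []
    for k in range(target_len):
        pos = k * (n - 1) / (target_len - 1)   # fractional index in the input
        i = int(pos)
        frac = pos - i
        j = min(i + 1, n - 1)
        out.append(features[i] * (1 - frac) + features[j] * frac)
    return out

print(resample([0.0, 10.0], 5))   # → [0.0, 2.5, 5.0, 7.5, 10.0]
```

    The fixed-length vectors can then be passed to any standard SVM implementation for training.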

  14. Determining flexor-tendon repair techniques via soft computing

    NASA Technical Reports Server (NTRS)

    Johnson, M.; Firoozbakhsh, K.; Moniem, M.; Jamshidi, M.

    2001-01-01

    An SC-based multi-objective decision-making method for determining the optimal flexor-tendon repair technique from experimental and clinical survey data, and with variable circumstances, was presented. Results were compared with those from the Taguchi method. Using the Taguchi method results in the need to perform ad-hoc decisions when the outcomes for individual objectives are contradictory to a particular preference or circumstance, whereas the SC-based multi-objective technique provides a rigorous straightforward computational process in which changing preferences and importance of differing objectives are easily accommodated. Also, adding more objectives is straightforward and easily accomplished. The use of fuzzy-set representations of information categories provides insight into their performance throughout the range of their universe of discourse. The ability of the technique to provide a "best" medical decision given a particular physician, hospital, patient, situation, and other criteria was also demonstrated.

  15. Determining flexor-tendon repair techniques via soft computing.

    PubMed

    Johnson, M; Firoozbakhsh, K; Moniem, M; Jamshidi, M

    2001-01-01

    An SC-based multi-objective decision-making method for determining the optimal flexor-tendon repair technique from experimental and clinical survey data, and with variable circumstances, was presented. Results were compared with those from the Taguchi method. Using the Taguchi method results in the need to perform ad-hoc decisions when the outcomes for individual objectives are contradictory to a particular preference or circumstance, whereas the SC-based multi-objective technique provides a rigorous straightforward computational process in which changing preferences and importance of differing objectives are easily accommodated. Also, adding more objectives is straightforward and easily accomplished. The use of fuzzy-set representations of information categories provides insight into their performance throughout the range of their universe of discourse. The ability of the technique to provide a "best" medical decision given a particular physician, hospital, patient, situation, and other criteria was also demonstrated.

  16. Aid Professional Growth. Module CG D-2 of Category D--Operating. Competency-Based Career Guidance Modules.

    ERIC Educational Resources Information Center

    Ruff, Eldon E.

    This learning module, one in a series of competency-based guidance program training packages focusing upon professional and paraprofessional competencies of guidance personnel, deals with aiding professional growth. Addressed in the module are the following topics: assessing competencies; determining certification, licensure, and registration…

  17. An OpenACC-Based Unified Programming Model for Multi-accelerator Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jungwon; Lee, Seyong; Vetter, Jeffrey S

    2015-01-01

    This paper proposes a novel SPMD programming model of OpenACC. Our model integrates the different granularities of parallelism from vector-level parallelism to node-level parallelism into a single, unified model based on OpenACC. It allows programmers to write programs for multiple accelerators using a uniform programming model whether they are in shared or distributed memory systems. We implement a prototype of our model and evaluate its performance with a GPU-based supercomputer using three benchmark applications.

  18. Block sparsity-based joint compressed sensing recovery of multi-channel ECG signals.

    PubMed

    Singh, Anurag; Dandapat, Samarendra

    2017-04-01

    In recent years, compressed sensing (CS) has emerged as an effective alternative to conventional wavelet-based data compression techniques. This is due to its simple and energy-efficient data reduction procedure, which makes it suitable for resource-constrained wireless body area network (WBAN)-enabled electrocardiogram (ECG) telemonitoring applications. Both spatial and temporal correlations exist simultaneously in multi-channel ECG (MECG) signals. Exploitation of both types of correlations is very important in CS-based ECG telemonitoring systems for better performance. However, most of the existing CS-based works exploit either of the correlations, which results in a suboptimal performance. In this work, within a CS framework, the authors propose to exploit both types of correlations simultaneously using a sparse Bayesian learning-based approach. A spatiotemporal sparse model is employed for joint compression/reconstruction of MECG signals. Discrete wavelet transform-domain block sparsity of MECG signals is exploited for simultaneous reconstruction of all the channels. Performance evaluations using the Physikalisch-Technische Bundesanstalt MECG diagnostic database show a significant gain in the diagnostic reconstruction quality of the MECG signals compared with state-of-the-art techniques at a reduced number of measurements. The low measurement requirement may lead to significant savings in the energy cost of existing CS-based WBAN systems.
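    As a hedged toy of the compressed-sensing pipeline (a single matching-pursuit step on a synthetic 1-sparse signal, much simpler than the paper's block-sparse Bayesian solver): compression is a random projection y = Φx, and recovery exploits sparsity in a transform domain.

```python
import random

random.seed(1)
m, n = 4, 8                      # 4 random measurements of an 8-sample signal
Phi = [[random.gauss(0, 1) for _ in range(n)] for _ in range(m)]

x = [0.0] * n
x[5] = 3.0                       # a 1-sparse "transform-domain" signal
y = [sum(Phi[i][j] * x[j] for j in range(n)) for i in range(m)]   # y = Phi x

def col(j):
    return [Phi[i][j] for i in range(m)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# one matching-pursuit step: pick the column best correlated with y, then
# least-squares fit its coefficient -- exact for a 1-sparse, noiseless x
j_best = max(range(n),
             key=lambda j: abs(dot(col(j), y)) / dot(col(j), col(j)) ** 0.5)
coef = dot(col(j_best), y) / dot(col(j_best), col(j_best))
print(j_best, round(coef, 6))
```

    Real ECG coefficients are only approximately sparse and arrive in correlated blocks across channels, which is why the block-sparse, multi-channel solvers discussed above outperform greedy single-channel recovery.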

  19. An attribute-based approach to contingent valuation of forest protection programs

    Treesearch

    Christopher C. Moore; Thomas P. Holmes; Kathleen P. Bell

    2011-01-01

    The hemlock woolly adelgid is an invasive insect that is damaging hemlock forests in the eastern United States. Several control methods are available but forest managers are constrained by cost, availability, and environmental concerns. As a result forest managers must decide how to allocate limited conservation resources over heterogeneous landscapes. We develop an...

  20. Connectivity inference from neural recording data: Challenges, mathematical bases and research directions.

    PubMed

    Magrans de Abril, Ildefons; Yoshimoto, Junichiro; Doya, Kenji

    2018-06-01

    This article presents a review of computational methods for connectivity inference from neural activity data derived from multi-electrode recordings or fluorescence imaging. We first identify biophysical and technical challenges in connectivity inference along the data processing pipeline. We then review connectivity inference methods based on two major mathematical foundations, namely, descriptive model-free approaches and generative model-based approaches. We investigate representative studies in both categories and clarify which challenges have been addressed by which method. We further identify critical open issues and possible research directions. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
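    One of the simplest descriptive, model-free approaches in the first category is pairwise correlation between activity traces, read as a connectivity estimate. A hedged sketch with synthetic data (real pipelines add lags, significance testing, and corrections for common input):

```python
# Pearson correlation between two activity traces as a descriptive,
# model-free "connectivity" score. The traces below are synthetic.
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

a = [1.0, 2.0, 3.0, 4.0, 5.0]
b = [2.1, 3.9, 6.2, 8.0, 9.9]      # roughly 2*a: strongly "connected"
c = [5.0, 1.0, 4.0, 2.0, 3.0]      # unrelated trace
print(round(pearson(a, b), 2), round(pearson(a, c), 2))
```

    Generative model-based approaches instead fit an explicit dynamical model and infer coupling parameters, trading computational cost for interpretability.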
