Sample records for objective function defined

  1. Recursive search method for the image elements of functionally defined surfaces

    NASA Astrophysics Data System (ADS)

    Vyatkin, S. I.

    2017-05-01

    This paper touches upon the synthesis of high-quality images in real time and the technique for specifying three-dimensional objects on the basis of perturbation functions. The recursive search method for the image elements of functionally defined objects with the use of graphics processing units is proposed. The advantages of such an approach over the frame-buffer visualization method are shown.
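
    A minimal 2D sketch of the recursive-search idea, assuming a Lipschitz-bounded implicit function f(x, y) >= 0 stands in for the paper's perturbation-function objects and omitting the GPU pipeline; the cell test, the example function and the Lipschitz constant are illustrative assumptions rather than the authors' algorithm.

```python
import numpy as np

def classify_cell(f, cx, cy, half, lipschitz):
    """Conservative test of a square cell against an implicit object f(x, y) >= 0.
    Returns 'inside', 'outside', or 'unknown' using a Lipschitz bound on f."""
    v = f(cx, cy)
    radius = half * np.sqrt(2.0)          # farthest point of the cell from its centre
    if v - lipschitz * radius >= 0.0:
        return "inside"
    if v + lipschitz * radius < 0.0:
        return "outside"
    return "unknown"

def render_region(f, image, x0, y0, size, lipschitz):
    """Recursively search image cells; only ambiguous cells are subdivided."""
    cx, cy = x0 + size / 2.0, y0 + size / 2.0
    state = classify_cell(f, cx, cy, size / 2.0, lipschitz)
    if state == "inside":
        image[y0:y0 + size, x0:x0 + size] = 1.0      # whole cell covered by the object
        return
    if state == "outside":
        return                                        # whole cell empty, prune this branch
    if size == 1:
        image[y0, x0] = 1.0 if f(cx, cy) >= 0.0 else 0.0
        return
    half = size // 2
    for dx in (0, half):
        for dy in (0, half):
            render_region(f, image, x0 + dx, y0 + dy, half, lipschitz)

# Example: a disc perturbed by a sinusoid, rendered on a 256x256 grid.
def f(x, y):
    return 80.0 - np.hypot(x - 128.0, y - 128.0) + 8.0 * np.sin(0.15 * x)

img = np.zeros((256, 256))
render_region(f, img, 0, 0, 256, lipschitz=2.5)   # 2.5 is a safe upper bound on |grad f|
```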

  2. Generalized local emission tomography

    DOEpatents

    Katsevich, Alexander J.

    1998-01-01

    Emission tomography enables locations and values of internal isotope density distributions to be determined from radiation emitted from the whole object. In the method for locating the values of discontinuities, the intensities of radiation emitted from either the whole object or a region of the object containing the discontinuities are inputted to a local tomography function f_Λ^(Φ) to define the location S of the isotope density discontinuity. The asymptotic behavior of f_Λ^(Φ) is determined in a neighborhood of S, and the value for the discontinuity is estimated from the asymptotic behavior of f_Λ^(Φ) knowing pointwise values of the attenuation coefficient within the object. In the method for determining the location of the discontinuity, the intensities of radiation emitted from an object are inputted to a local tomography function f_Λ^(Φ) to define the location S of the density discontinuity and the location Γ of the attenuation coefficient discontinuity. Pointwise values of the attenuation coefficient within the object need not be known in this case.

  3. An object oriented extension to CLIPS

    NASA Technical Reports Server (NTRS)

    Sobkowicz, Clifford

    1990-01-01

    A presentation of a software subsystem developed to augment the C Language Integrated Production System (CLIPS) with facilities for object-oriented knowledge representation. Functions are provided to define classes, instantiate objects, access attributes, and assert object-related facts. This extension is implemented via the CLIPS user-function interface and does not require modification of any CLIPS code. It does rely on internal CLIPS functions for memory management and symbol representation.

  4. Discordance between Psychometric Testing and Questionnaire-Based Definitions of Executive Function Deficits in Individuals with ADHD

    ERIC Educational Resources Information Center

    Biederman, Joseph; Petty, Carter R.; Fried, Ronna; Black, Sarah; Faneuil, Alicia; Doyle, Alysa E.; Seidman, Larry J.; Faraone, Stephen V.

    2008-01-01

    Objective: One suspected source of negative outcomes associated with ADHD has been deficits in executive functions. Although both psychometrically defined and self-reported executive function deficits (EFDs) have been shown to be associated with poor academic and occupational outcomes, whether these two approaches define the same individuals…

  5. Comparison of particle swarm optimization and simulated annealing for locating additional boreholes considering combined variance minimization

    NASA Astrophysics Data System (ADS)

    Soltani-Mohammadi, Saeed; Safa, Mohammad; Mokhtari, Hadi

    2016-10-01

    One of the most important stages in complementary exploration is optimally designing the additional drilling pattern, i.e., defining the optimum number and location of additional boreholes. A great deal of research has been carried out in this regard; in most of the proposed algorithms, kriging variance minimization, as a criterion for uncertainty assessment, is defined as the objective function and the problem is solved through optimization methods. Although the kriging variance has many advantages for defining the objective function, it is not sensitive to local variability. As a result, the only factors evaluated for locating the additional boreholes are the initial data configuration and the variogram model parameters, and the effects of local variability are omitted. In this paper, with the goal of accounting for local variability in the assessment of boundary uncertainty, the use of the combined variance to define the objective function is investigated. To verify the applicability of the proposed objective function, it is used to locate additional boreholes in the Esfordi phosphate mine through metaheuristic optimization methods, namely simulated annealing and particle swarm optimization. Comparison of the results from the proposed objective function with those of conventional methods indicates that the changes imposed on the objective function make the algorithm output sensitive to variations in grade, domain boundaries and the thickness of the mineralization domain. Comparison of the results of the different optimization algorithms shows that, for the presented case, particle swarm optimization is more appropriate than simulated annealing.
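
    A bare-bones particle swarm optimizer of the kind compared here, written against a generic objective function; the quadratic toy objective stands in for the combined-variance criterion (which the paper computes from kriging and local variability), and all swarm parameters are illustrative assumptions.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `objective` over the box `bounds` with a standard global-best PSO."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))        # positions
    v = np.zeros_like(x)                                        # velocities
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    g = pbest[np.argmin(pbest_val)]                             # global best
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)]
    return g, pbest_val.min()

# Toy stand-in for the combined-variance objective: distance of a proposed
# borehole location from an (unknown) high-uncertainty location.
objective = lambda p: np.sum((p - np.array([3.0, -1.5])) ** 2)
best, best_val = pso(objective, bounds=np.array([[-10.0, 10.0], [-10.0, 10.0]]))
```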

  6. Methods and apparatus for extraction and tracking of objects from multi-dimensional sequence data

    NASA Technical Reports Server (NTRS)

    Hill, Matthew L. (Inventor); Chang, Yuan-Chi (Inventor); Li, Chung-Sheng (Inventor); Castelli, Vittorio (Inventor); Bergman, Lawrence David (Inventor)

    2008-01-01

    An object tracking technique is provided which, given (i) a potentially large data set, (ii) a set of dimensions along which the data has been ordered, and (iii) a set of functions for measuring the similarity between data elements, produces a set of objects. Each of these objects is defined by a list of data elements. Each of the data elements on this list contains the probability that the data element is part of the object. The method produces these lists via an adaptive, knowledge-based search function which directs the search for high-probability data elements. This serves to reduce the number of data element combinations evaluated while preserving the most flexibility in defining the associations of data elements which comprise an object.

  7. Methods and apparatus for extraction and tracking of objects from multi-dimensional sequence data

    NASA Technical Reports Server (NTRS)

    Hill, Matthew L. (Inventor); Chang, Yuan-Chi (Inventor); Li, Chung-Sheng (Inventor); Castelli, Vittorio (Inventor); Bergman, Lawrence David (Inventor)

    2005-01-01

    An object tracking technique is provided which, given (i) a potentially large data set, (ii) a set of dimensions along which the data has been ordered, and (iii) a set of functions for measuring the similarity between data elements, produces a set of objects. Each of these objects is defined by a list of data elements. Each of the data elements on this list contains the probability that the data element is part of the object. The method produces these lists via an adaptive, knowledge-based search function which directs the search for high-probability data elements. This serves to reduce the number of data element combinations evaluated while preserving the most flexibility in defining the associations of data elements which comprise an object.

  8. Desired Precision in Multi-Objective Optimization: Epsilon Archiving or Rounding Objectives?

    NASA Astrophysics Data System (ADS)

    Asadzadeh, M.; Sahraei, S.

    2016-12-01

    Multi-objective optimization (MO) aids in supporting the decision making process in water resources engineering and design problems. One of the main goals of solving a MO problem is to archive a set of solutions that is well-distributed across a wide range of all the design objectives. Modern MO algorithms use the epsilon dominance concept to define a mesh with pre-defined grid-cell size (often called epsilon) in the objective space and archive at most one solution at each grid-cell. Epsilon can be set to the desired precision level of each objective function to make sure that the difference between each pair of archived solutions is meaningful. This epsilon archiving process is computationally expensive in problems that have quick-to-evaluate objective functions. This research explores the applicability of a similar but computationally more efficient approach to respect the desired precision level of all objectives in the solution archiving process. In this alternative approach each objective function is rounded to the desired precision level before comparing any new solution to the set of archived solutions that already have rounded objective function values. This alternative solution archiving approach is compared to the epsilon archiving approach in terms of efficiency and quality of archived solutions for solving mathematical test problems and hydrologic model calibration problems.
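
    A small sketch contrasting the two archiving rules discussed above, assuming minimization of all objectives; the dictionary-keyed epsilon grid and the list-based archive of rounded objective vectors are illustrative simplifications, not the authors' implementation.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def epsilon_archive(archive, candidate, eps):
    """Epsilon archiving: keep at most one solution per epsilon-sized grid cell."""
    cell = tuple(int(f // e) for f, e in zip(candidate, eps))
    incumbent = archive.get(cell)
    if incumbent is None or dominates(candidate, incumbent):
        archive[cell] = candidate

def rounded_archive(archive, candidate, eps):
    """Alternative: round each objective to its desired precision first, then do
    plain non-dominated archiving on the rounded values."""
    rounded = tuple(round(f / e) * e for f, e in zip(candidate, eps))
    if any(kept == rounded or dominates(kept, rounded) for kept in archive):
        return
    archive[:] = [kept for kept in archive if not dominates(rounded, kept)]
    archive.append(rounded)

# eps encodes the desired precision level of each objective function.
eps = (0.1, 0.5)
grid_archive, rounded_values = {}, []
for f in [(0.23, 3.1), (0.21, 3.4), (0.87, 1.2)]:
    epsilon_archive(grid_archive, f, eps)
    rounded_archive(rounded_values, f, eps)
```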

  9. Variance approach for multi-objective linear programming with fuzzy random of objective function coefficients

    NASA Astrophysics Data System (ADS)

    Indarsih, Indrati, Ch. Rini

    2016-02-01

    In this paper, we define the variance of fuzzy random variables through alpha levels. We prove a theorem showing that the variance of a fuzzy random variable is a fuzzy number. We consider a multi-objective linear programming (MOLP) problem whose objective function coefficients are fuzzy random variables, and solve it by a variance approach. The approach transforms the MOLP problem with fuzzy random objective function coefficients into an MOLP problem with fuzzy objective function coefficients. By weighting methods, we obtain a linear program with fuzzy coefficients, which we solve by the simplex method for fuzzy linear programming.
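
    A sketch of the final weighting step, assuming the fuzzy random coefficients have already been reduced to crisp representative values by the variance approach; scipy.optimize.linprog stands in for the fuzzy simplex method used in the paper, and the data are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Two-objective LP: maximize c1.x and c2.x subject to A_ub x <= b_ub, x >= 0.
c1 = np.array([2.0, 3.0])          # representative (defuzzified) coefficients, objective 1
c2 = np.array([4.0, 1.0])          # representative (defuzzified) coefficients, objective 2
A_ub = np.array([[1.0, 1.0],
                 [3.0, 1.0]])
b_ub = np.array([10.0, 15.0])

def weighted_molp(weights):
    """Scalarize the MOLP with a weighted sum of the objectives and solve the
    resulting single-objective LP (linprog minimizes, so the sign is flipped)."""
    c = weights[0] * c1 + weights[1] * c2
    res = linprog(-c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    return res.x, -res.fun

x_star, f_star = weighted_molp(weights=(0.6, 0.4))
```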

  10. Case studies on optimization problems in MATLAB and COMSOL multiphysics by means of the livelink

    NASA Astrophysics Data System (ADS)

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    LiveLink for COMSOL is a tool that integrates COMSOL Multiphysics with MATLAB to extend one's modeling with scripting programming in the MATLAB environment. It allows the user to utilize the full power of MATLAB and its toolboxes in preprocessing, model manipulation, and postprocessing. First, the head script launches COMSOL with MATLAB, defines the initial values of all parameters, refers to the objective function J, and creates and runs the defined optimization task. Once the task is launched, the COMSOL model is called in the iteration loop (from the MATLAB environment via the API interface), with the defined optimization parameters changed so that the objective function is minimized; the fmincon function is used to find a local or global minimum of a constrained linear or nonlinear multivariable function. Once the minimum is found, the optimizer returns an exit flag, terminates the optimization and returns the optimized values of the parameters. The cooperation with MATLAB via LiveLink enhances a powerful computational environment with complex multiphysics simulations. The paper introduces the use of LiveLink for COMSOL for chosen case studies in the field of technical cybernetics and bioengineering.
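
    The same loop pattern sketched in Python rather than MATLAB: scipy.optimize.minimize plays the role of fmincon, and run_comsol_model is a hypothetical stub for the COMSOL call made through the LiveLink API; the parameter bounds and starting point are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def run_comsol_model(params):
    """Hypothetical stub: in the real workflow this would update the COMSOL model
    parameters via the LiveLink API, re-solve the model, and return the quantity
    of interest. A cheap analytic surrogate keeps the sketch runnable."""
    x, y = params
    return (x - 1.2) ** 2 + 0.5 * (y + 0.7) ** 2

def objective_J(params):
    """Objective function J evaluated from the simulation output."""
    return run_comsol_model(params)

x0 = np.array([0.0, 0.0])                       # initial parameter values set by the head script
bounds = [(-2.0, 2.0), (-2.0, 2.0)]             # assumed design-parameter bounds
result = minimize(objective_J, x0, method="SLSQP", bounds=bounds)
# result.x holds the optimized parameters; result.status and result.message
# correspond to the exit flag reported when the optimizer terminates.
```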

  11. Defining functional groups based on running kinematics using Self-Organizing Maps and Support Vector Machines.

    PubMed

    Hoerzer, Stefan; von Tscharner, Vinzenz; Jacob, Christian; Nigg, Benno M

    2015-07-16

    A functional group is a collection of individuals who react in a similar way to a specific intervention/product such as a sport shoe. Matching footwear features to a functional group can possibly enhance footwear-related comfort, improve running performance, and decrease the risk of movement-related injuries. To match footwear features to a functional group, one has to first define the different groups using their distinctive movement patterns. Therefore, the main objective of this study was to propose and apply a methodological approach to define functional groups with different movement patterns using Self-Organizing Maps and Support Vector Machines. Further study objectives were to identify differences in age, gender and footwear-related comfort preferences between the functional groups. Kinematic data and subjective comfort preferences of 88 subjects (16-76 years; 45 m/43 f) were analysed. Eight functional groups with distinctive movement patterns were defined. The findings revealed that most of the groups differed in age or gender. Certain functional groups differed in their comfort preferences and, therefore, had group-specific footwear requirements to enhance footwear-related comfort. Some of the groups, which had group-specific footwear requirements, did not show any differences in age or gender. This is important because when defining functional groups simply using common grouping criteria like age or gender, certain functional groups with group-specific movement patterns and footwear requirements might not be detected. This emphasises the power of the proposed pattern recognition approach to automatically define groups by their distinctive movement patterns in order to be able to address their group-specific product requirements.

  12. Solving a class of generalized fractional programming problems using the feasibility of linear programs.

    PubMed

    Shen, Peiping; Zhang, Tongli; Wang, Chunfeng

    2017-01-01

    This article presents a new approximation algorithm for globally solving a class of generalized fractional programming problems (P) whose objective functions are defined as an appropriate composition of ratios of affine functions. To solve this problem, the algorithm solves an equivalent optimization problem (Q) via an exploration of a suitably defined nonuniform grid. The main work of the algorithm consists of checking the feasibility of linear programs associated with the grid points of interest. Based on the computational complexity result, it is proved that the proposed algorithm is a fully polynomial time approximation scheme when the number of ratio terms in the objective function of problem (P) is fixed. In contrast to existing results in the literature, the algorithm does not require assumptions of quasi-concavity or low rank of the objective function of problem (P). Numerical results are given to illustrate the feasibility and effectiveness of the proposed algorithm.
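
    An illustrative sketch of the core subroutine, assuming a single ratio of affine functions maximized over a polytope with positive denominator; the grid of candidate levels, the reformulation of "ratio >= t" as a linear constraint, and the use of scipy.optimize.linprog as the feasibility oracle are assumptions for illustration, not the paper's exact scheme.

```python
import numpy as np
from scipy.optimize import linprog

# Polytope: A x <= b, x >= 0.  Ratio objective: (p.x + p0) / (q.x + q0), with q.x + q0 > 0.
A = np.array([[1.0, 2.0], [3.0, 1.0]])
b = np.array([8.0, 9.0])
p, p0 = np.array([2.0, 1.0]), 0.0
q, q0 = np.array([1.0, 1.0]), 1.0

def level_is_feasible(t):
    """LP feasibility check: does some x in the polytope achieve ratio >= t?
    With a positive denominator this is the extra linear constraint
    (t*q - p).x <= p0 - t*q0."""
    A_t = np.vstack([A, t * q - p])
    b_t = np.append(b, p0 - t * q0)
    res = linprog(np.zeros(2), A_ub=A_t, b_ub=b_t, bounds=[(0, None)] * 2)
    return res.status == 0            # status 0 means a feasible point was found

# Search a grid of candidate objective levels and keep the best feasible one.
grid = np.linspace(0.0, 4.0, 81)
best = max(t for t in grid if level_is_feasible(t))
```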

  13. DEVELOPMENTS IN GRworkbench

    NASA Astrophysics Data System (ADS)

    Moylan, Andrew; Scott, Susan M.; Searle, Anthony C.

    2006-02-01

    The software tool GRworkbench is an ongoing project in visual, numerical General Relativity at The Australian National University. Recently, GRworkbench has been significantly extended to facilitate numerical experimentation in analytically-defined space-times. The numerical differential geometric engine has been rewritten using functional programming techniques, enabling objects which are normally defined as functions in the formalism of differential geometry and General Relativity to be directly represented as function variables in the C++ code of GRworkbench. The new functional differential geometric engine allows for more accurate and efficient visualisation of objects in space-times and makes new, efficient computational techniques available. Motivated by the desire to investigate a recent scientific claim using GRworkbench, new tools for numerical experimentation have been implemented, allowing for the simulation of complex physical situations.

  14. Using Approximations to Accelerate Engineering Design Optimization

    NASA Technical Reports Server (NTRS)

    Torczon, Virginia; Trosset, Michael W.

    1998-01-01

    Optimization problems that arise in engineering design are often characterized by several features that hinder the use of standard nonlinear optimization techniques. Foremost among these features is that the functions used to define the engineering optimization problem often are computationally intensive. Within a standard nonlinear optimization algorithm, the computational expense of evaluating the functions that define the problem would necessarily be incurred for each iteration of the optimization algorithm. Faced with such prohibitive computational costs, an attractive alternative is to make use of surrogates within an optimization context since surrogates can be chosen or constructed so that they are typically much less expensive to compute. For the purposes of this paper, we will focus on the use of algebraic approximations as surrogates for the objective. In this paper we introduce the use of so-called merit functions that explicitly recognize the desirability of improving the current approximation to the objective during the course of the optimization. We define and experiment with the use of merit functions chosen to simultaneously improve both the solution to the optimization problem (the objective) and the quality of the approximation. Our goal is to further improve the effectiveness of our general approach without sacrificing any of its rigor.
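
    A minimal sketch of one way such a merit function can be written, assuming a radial-basis surrogate and using the distance to the nearest previously evaluated design as the proxy for approximation quality; the form a(x) - rho*d(x) and all numerical values are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)

def expensive_objective(x):
    """Stand-in for a computationally intensive engineering analysis."""
    return np.sum((x - 0.3) ** 2, axis=-1) + 0.1 * np.sin(10.0 * x[..., 0])

# Surrogate (approximation) built from the designs evaluated so far.
X = rng.uniform(-1.0, 1.0, size=(12, 2))
y = expensive_objective(X)
surrogate = RBFInterpolator(X, y)

def merit(x, rho):
    """Merit function: surrogate prediction minus a reward for moving away from
    existing samples, so minimizing it balances exploiting the approximation
    (small a(x)) against improving it where it is least trusted (large d(x))."""
    a = surrogate(x.reshape(1, -1))[0]
    d = np.min(np.linalg.norm(X - x, axis=1))
    return a - rho * d

# Larger rho pushes the next expensive evaluation toward unexplored regions.
candidate = np.array([0.25, 0.35])
print(merit(candidate, rho=0.0), merit(candidate, rho=0.5))
```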

  15. Designing Class Methods from Dataflow Diagrams

    NASA Astrophysics Data System (ADS)

    Shoval, Peretz; Kabeli-Shani, Judith

    A method for designing the class methods of an information system is described. The method is part of FOOM - Functional and Object-Oriented Methodology. In the analysis phase of FOOM, two models defining the users' requirements are created: a conceptual data model - an initial class diagram; and a functional model - hierarchical OO-DFDs (object-oriented dataflow diagrams). Based on these models, a well-defined process of methods design is applied. First, the OO-DFDs are converted into transactions, i.e., system processes that support user tasks. The components and the process logic of each transaction are described in detail, using pseudocode. Then, each transaction is decomposed, according to well-defined rules, into class methods of various types: basic methods, application-specific methods and main transaction (control) methods. Each method is attached to a proper class; messages between methods express the process logic of each transaction. The methods are defined using pseudocode or message charts.

  16. Flexible Method for Inter-object Communication in C++

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.; Gould, Jack J.

    1994-01-01

    A method has been developed for organizing and sharing large amounts of information between objects in C++ code. This method uses a set of object classes to define variables and group them into tables. The variable tables presented here provide a convenient way of defining and cataloging data, as well as a user-friendly input/output system, a standardized set of access functions, mechanisms for ensuring data integrity, methods for interprocessor data transfer, and an interpretive language for programming relationships between parameters. The object-oriented nature of these variable tables enables the use of multiple data types, each with unique attributes and behavior. Because each variable provides its own access methods, redundant table lookup functions can be bypassed, thus decreasing access times while maintaining data integrity. In addition, a method for automatic reference counting was developed to manage memory safely.

  17. The VIS-AD data model: Integrating metadata and polymorphic display with a scientific programming language

    NASA Technical Reports Server (NTRS)

    Hibbard, William L.; Dyer, Charles R.; Paul, Brian E.

    1994-01-01

    The VIS-AD data model integrates metadata about the precision of values, including missing data indicators and the way that arrays sample continuous functions, with the data objects of a scientific programming language. The data objects of this data model form a lattice, ordered by the precision with which they approximate mathematical objects. We define a similar lattice of displays and study visualization processes as functions from data lattices to display lattices. Such functions can be applied to visualize data objects of all data types and are thus polymorphic.

  18. Cortical Thickness in Fusiform Face Area Predicts Face and Object Recognition Performance

    PubMed Central

    McGugin, Rankin W.; Van Gulick, Ana E.; Gauthier, Isabel

    2016-01-01

    The fusiform face area (FFA) is defined by its selectivity for faces. Several studies have shown that the response of FFA to non-face objects can predict behavioral performance for these objects. However, one possible account is that experts pay more attention to objects in their domain of expertise, driving signals up. Here we show an effect of expertise with non-face objects in FFA that cannot be explained by differential attention to objects of expertise. We explore the relationship between cortical thickness of FFA and face and object recognition using the Cambridge Face Memory Test and Vanderbilt Expertise Test, respectively. We measured cortical thickness in functionally-defined regions in a group of men who evidenced functional expertise effects for cars in FFA. Performance with faces and objects together accounted for approximately 40% of the variance in cortical thickness of several FFA patches. While subjects with a thicker FFA cortex performed better with vehicles, those with a thinner FFA cortex performed better with faces and living objects. The results point to a domain-general role of FFA in object perception and reveal an interesting double dissociation that does not contrast faces and objects, but rather living and non-living objects. PMID:26439272

  19. Calculation of the twilight visibility function of near-sun objects

    NASA Technical Reports Server (NTRS)

    Kastner, S. O.

    1976-01-01

    The visibility function, defined here as the magnitude difference between the excess brightness of a given object and that of the background sky, of near-sun objects during twilight is obtained from a general calculation which considers the twilight sky background, atmospheric extinction, and night glow. Visibility curves are computed for a number of cases in which observations have been recorded, particularly that of comet Kohoutek. For this object, the computed visibility maxima agree well in time with the reported times of observation.

  20. One Hand, Two Objects: Emergence of Affordance in Contexts

    ERIC Educational Resources Information Center

    Borghi, Anna M.; Flumini, Andrea; Natraj, Nikhilesh; Wheaton, Lewis A.

    2012-01-01

    Studies on affordances typically focus on single objects. We investigated whether affordances are modulated by the context, defined by the relation between two objects and a hand. Participants were presented with pictures displaying two manipulable objects linked by a functional (knife-butter), a spatial (knife-coffee mug), or by no relation. They…

  1. Defining Child Neglect Based on Child Protective Services Data

    ERIC Educational Resources Information Center

    Dubowitz, H.; Pitts, S.C.; Litrownik, A.J.; Cox, C.E.; Runyan, D.; Black, M.M.

    2005-01-01

    Objectives: To compare neglect defined by Child Protective Services official codes with neglect defined by a review of CPS narrative data, and to examine the validity of the different neglect measures using children's functioning at age 8 years. Methods: Data are from 740 children participating in a consortium of longitudinal studies on child…

  2. Individual differences in cortical face selectivity predict behavioral performance in face recognition

    PubMed Central

    Huang, Lijie; Song, Yiying; Li, Jingguang; Zhen, Zonglei; Yang, Zetian; Liu, Jia

    2014-01-01

    In functional magnetic resonance imaging studies, object selectivity is defined as a higher neural response to an object category than other object categories. Importantly, object selectivity is widely considered as a neural signature of a functionally-specialized area in processing its preferred object category in the human brain. However, the behavioral significance of the object selectivity remains unclear. In the present study, we used the individual differences approach to correlate participants' face selectivity in the face-selective regions with their behavioral performance in face recognition measured outside the scanner in a large sample of healthy adults. Face selectivity was defined as the z score of activation with the contrast of faces vs. non-face objects, and the face recognition ability was indexed as the normalized residual of the accuracy in recognizing previously-learned faces after regressing out that for non-face objects in an old/new memory task. We found that the participants with higher face selectivity in the fusiform face area (FFA) and the occipital face area (OFA), but not in the posterior part of the superior temporal sulcus (pSTS), possessed higher face recognition ability. Importantly, the association of face selectivity in the FFA and face recognition ability cannot be accounted for by FFA response to objects or behavioral performance in object recognition, suggesting that the association is domain-specific. Finally, the association is reliable, confirmed by the replication from another independent participant group. In sum, our finding provides empirical evidence on the validity of using object selectivity as a neural signature in defining object-selective regions in the human brain. PMID:25071513
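
    A small sketch of the normalization described above (the residual of face-recognition accuracy after regressing out object-recognition accuracy), using ordinary least squares in numpy; the synthetic accuracy arrays are placeholders for the behavioral data.

```python
import numpy as np

rng = np.random.default_rng(0)
object_acc = rng.uniform(0.5, 0.95, size=40)                         # old/new accuracy, non-face objects
face_acc = 0.4 * object_acc + rng.normal(0.0, 0.05, size=40) + 0.4   # old/new accuracy, faces

# Regress face accuracy on object accuracy and keep the standardized residual,
# i.e. face-recognition ability with domain-general performance removed.
X = np.column_stack([np.ones_like(object_acc), object_acc])
beta, *_ = np.linalg.lstsq(X, face_acc, rcond=None)
residual = face_acc - X @ beta
face_ability = (residual - residual.mean()) / residual.std()

# face_ability can then be correlated with each subject's face selectivity
# (z-scored faces-vs-objects contrast) in FFA, OFA, and pSTS.
```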

  3. ITS system specification. Appendix B, requirements by service/function/subfunction

    DOT National Transportation Integrated Search

    1997-01-01

    The objective of the Polaris Project is to define an Intelligent Transportation Systems (ITS) architecture for the state of Minnesota. An architecture is a framework that defines how multiple ITS Components interrelate and contribute to the overall I...

  4. IDENTIFICATION AND CHARACTERIZATION OF DISEASE USING PULMONARY FUNCTION TESTS

    EPA Science Inventory

    Abstract
    Pulmonary function testing is used routinely in human medicine to objectively define functional deficits in individuals with respiratory disease. Despite the fact that respiratory disease is a common problem in veterinary medicine, evaluation of the small animal pa...

  5. Segmentation precedes face categorization under suboptimal conditions.

    PubMed

    Van Den Boomen, Carlijn; Fahrenfort, Johannes J; Snijders, Tineke M; Kemner, Chantal

    2015-01-01

    Both categorization and segmentation processes play a crucial role in face perception. However, the functional relation between these subprocesses is currently unclear. The present study investigates the temporal relation between segmentation-related and category-selective responses in the brain, using electroencephalography (EEG). Surface segmentation and category content were both manipulated using texture-defined objects, including faces. This allowed us to study brain activity related to segmentation and to categorization. In the main experiment, participants viewed texture-defined objects for a duration of 800 ms. EEG results revealed that segmentation-related responses precede category-selective responses. Three additional experiments revealed that the presence and timing of categorization depends on stimulus properties and presentation duration. Photographic objects were presented for a long and short (92 ms) duration and evoked fast category-selective responses in both cases. On the other hand, presentation of texture-defined objects for a short duration only evoked segmentation-related but no category-selective responses. Category-selective responses were much slower when evoked by texture-defined than by photographic objects. We suggest that in case of categorization of objects under suboptimal conditions, such as when low-level stimulus properties are not sufficient for fast object categorization, segmentation facilitates the slower categorization process.

  6. Segmentation precedes face categorization under suboptimal conditions

    PubMed Central

    Van Den Boomen, Carlijn; Fahrenfort, Johannes J.; Snijders, Tineke M.; Kemner, Chantal

    2015-01-01

    Both categorization and segmentation processes play a crucial role in face perception. However, the functional relation between these subprocesses is currently unclear. The present study investigates the temporal relation between segmentation-related and category-selective responses in the brain, using electroencephalography (EEG). Surface segmentation and category content were both manipulated using texture-defined objects, including faces. This allowed us to study brain activity related to segmentation and to categorization. In the main experiment, participants viewed texture-defined objects for a duration of 800 ms. EEG results revealed that segmentation-related responses precede category-selective responses. Three additional experiments revealed that the presence and timing of categorization depends on stimulus properties and presentation duration. Photographic objects were presented for a long and short (92 ms) duration and evoked fast category-selective responses in both cases. On the other hand, presentation of texture-defined objects for a short duration only evoked segmentation-related but no category-selective responses. Category-selective responses were much slower when evoked by texture-defined than by photographic objects. We suggest that in case of categorization of objects under suboptimal conditions, such as when low-level stimulus properties are not sufficient for fast object categorization, segmentation facilitates the slower categorization process. PMID:26074838

  7. ITS system specification. Appendix C, data flows by function for ITS services

    DOT National Transportation Integrated Search

    1997-01-01

    The objective of the Polaris Project is to define an Intelligent Transportation Systems (ITS) architecture for the state of Minnesota. An architecture is a framework that defines how multiple ITS Components interrelate and contribute to the overall I...

  8. Design optimization of axial flow hydraulic turbine runner: Part II - multi-objective constrained optimization method

    NASA Astrophysics Data System (ADS)

    Peng, Guoyi; Cao, Shuliang; Ishizuka, Masaru; Hayama, Shinji

    2002-06-01

    This paper is concerned with the design optimization of axial flow hydraulic turbine runner blade geometry. In order to obtain a better design plan with good performance, a new comprehensive performance optimization procedure has been presented by combining a multi-variable multi-objective constrained optimization model with a Q3D inverse computation and a performance prediction procedure. With careful analysis of the inverse design of the axial hydraulic turbine runner, the total hydraulic loss and the cavitation coefficient are taken as optimization objectives and a comprehensive objective function is defined using weight factors. Parameters of a newly proposed blade bound circulation distribution function and parameters describing the positions of the blade leading and trailing edges in the meridional flow passage are taken as optimization variables. The optimization procedure has been applied to the design optimization of a Kaplan runner with specific speed of 440 kW. Numerical results show that the performance of the designed runner is successfully improved through optimization computation. The optimization model is found to be validated and it has the feature of good convergence. With the multi-objective optimization model, it is possible to control the performance of the designed runner by adjusting the value of the weight factors defining the comprehensive objective function.

  9. The Representation of Object-Directed Action and Function Knowledge in the Human Brain

    PubMed Central

    Chen, Quanjing; Garcea, Frank E.; Mahon, Bradford Z.

    2016-01-01

    The appropriate use of everyday objects requires the integration of action and function knowledge. Previous research suggests that action knowledge is represented in frontoparietal areas while function knowledge is represented in temporal lobe regions. Here we used multivoxel pattern analysis to investigate the representation of object-directed action and function knowledge while participants executed pantomimes of familiar tool actions. A novel approach for decoding object knowledge was used in which classifiers were trained on one pair of objects and then tested on a distinct pair; this permitted a measurement of classification accuracy over and above object-specific information. Region of interest (ROI) analyses showed that object-directed actions could be decoded in tool-preferring regions of both parietal and temporal cortex, while no independently defined tool-preferring ROI showed successful decoding of object function. However, a whole-brain searchlight analysis revealed that while frontoparietal motor and peri-motor regions are engaged in the representation of object-directed actions, medial temporal lobe areas in the left hemisphere are involved in the representation of function knowledge. These results indicate that both action and function knowledge are represented in a topographically coherent manner that is amenable to study with multivariate approaches, and that the left medial temporal cortex represents knowledge of object function. PMID:25595179

  10. An adjoint view on flux consistency and strong wall boundary conditions to the Navier–Stokes equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stück, Arthur, E-mail: arthur.stueck@dlr.de

    2015-11-15

    Inconsistent discrete expressions in the boundary treatment of Navier–Stokes solvers and in the definition of force objective functionals can lead to discrete-adjoint boundary treatments that are not a valid representation of the boundary conditions to the corresponding adjoint partial differential equations. The underlying problem is studied for an elementary 1D advection–diffusion problem first using a node-centred finite-volume discretisation. The defect of the boundary operators in the inconsistently defined discrete-adjoint problem leads to oscillations and becomes evident with the additional insight of the continuous-adjoint approach. A homogenisation of the discretisations for the primal boundary treatment and the force objective functional yields second-order functional accuracy and eliminates the defect in the discrete-adjoint boundary treatment. Subsequently, the issue is studied for aerodynamic Reynolds-averaged Navier–Stokes problems in conjunction with a standard finite-volume discretisation on median-dual grids and a strong implementation of no-slip walls, found in many unstructured general-purpose flow solvers. Starting from a baseline discretisation of force objective functionals which is independent of the boundary treatment in the flow solver, two improved flux-consistent schemes are presented; based on either body-wall-defined or farfield-defined control volumes, they resolve the dual inconsistency. The behaviour of the schemes is investigated on a sequence of grids in 2D and 3D.

  11. Free-form geometric modeling by integrating parametric and implicit PDEs.

    PubMed

    Du, Haixia; Qin, Hong

    2007-01-01

    Parametric PDE techniques, which use partial differential equations (PDEs) defined over a 2D or 3D parametric domain to model graphical objects and processes, can unify geometric attributes and functional constraints of the models. PDEs can also model implicit shapes defined by level sets of scalar intensity fields. In this paper, we present an approach that integrates parametric and implicit trivariate PDEs to define geometric solid models containing both geometric information and intensity distribution subject to flexible boundary conditions. The integrated formulation of second-order or fourth-order elliptic PDEs permits designers to manipulate PDE objects of complex geometry and/or arbitrary topology through direct sculpting and free-form modeling. We developed a PDE-based geometric modeling system for shape design and manipulation of PDE objects. The integration of implicit PDEs with parametric geometry offers more general and arbitrary shape blending and free-form modeling for objects with intensity attributes than pure geometric models.

  12. Spinal-Exercise Prescription in Sport: Classifying Physical Training and Rehabilitation by Intention and Outcome

    PubMed Central

    Spencer, Simon; Wolf, Alex; Rushton, Alison

    2016-01-01

    Context: Identification of strategies to prevent spinal injury, optimize rehabilitation, and enhance performance is a priority for practitioners. Different exercises produce different effects on neuromuscular performance. Clarity of the purpose of a prescribed exercise is central to a successful outcome. Spinal exercises need to be classified according to the objective of the exercise and planned physical outcome. Objective: To define the modifiable spinal abilities that underpin optimal function during skilled athletic performance, clarify the effect of spinal pain and pathologic conditions, and classify spinal exercises according to the objective of the exercise and intended physical outcomes to inform training and rehabilitation. Design: Qualitative study. Data Collection and Analysis: We conducted a qualitative consensus method of 4 iterative phases. An exploratory panel carried out an extended review of the English-language literature using CINAHL, EMBASE, MEDLINE, and PubMed to identify key themes and subthemes to inform the definitions of exercise categories, physical abilities, and physical outcomes. An expert project group reviewed panel findings. A draft classification was discussed with physiotherapists (n = 49) and international experts. Lead physiotherapy and strength and conditioning teams (n = 17) reviewed a revised classification. Consensus was defined as unanimous agreement. Results: After the literature review and subsequent analysis, we defined spinal abilities in 4 categories: mobility, motor control, work capacity, and strength. Exercises were subclassified by functionality as nonfunctional or functional and by spinal displacement as either static (neutral spinal posture with no segmental displacement) or dynamic (dynamic segmental movement). The proposed terminology and classification support commonality of language for practitioners. Conclusions: The spinal-exercise classification will support clinical reasoning through a framework of spinal-exercise objectives that clearly define the nature of the exercise prescription required to deliver intended physical outcomes. PMID:27661792

  13. Application of Quasi-Linearization Techniques to Rail Vehicle Dynamic Analyses

    DOT National Transportation Integrated Search

    1978-11-01

    The objective of the work reported here was to define methods for applying the describing function technique to realistic models of nonlinear rail cars. The describing function method offers a compromise between the accuracy of nonlinear digital simu...

  14. Fuzzy Logic Controller Design for A Robot Grasping System with Different Membership Functions

    NASA Astrophysics Data System (ADS)

    Ahmad, Hamzah; Razali, Saifudin; Rusllim Mohamed, Mohd

    2013-12-01

    This paper investigates the effect of the membership function on object grasping for a three-fingered gripper system. The performance of three commonly used membership functions is compared to identify their behavior in lifting a defined object shape. MATLAB Simulink and SimMechanics toolboxes are used to examine the performance. Our preliminary results show that the Gaussian membership function surpassed the other two membership functions, triangular and trapezoidal, especially in terms of firmer grasping and lower time consumption during operation. Therefore, the Gaussian membership function could be the best solution when time consumption and a firmer grasp are considered.
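
    For reference, the three membership-function shapes compared in the study, written as plain numpy functions; the breakpoints, the Gaussian width, and the grip-error interpretation of the input are arbitrary illustrative choices.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership: rises from a to a peak at b, falls back to zero at c."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def trapezoidal(x, a, b, c, d):
    """Trapezoidal membership: flat top between b and c."""
    return np.clip(np.minimum.reduce([(x - a) / (b - a),
                                      np.ones_like(x),
                                      (d - x) / (d - c)]), 0.0, 1.0)

def gaussian(x, mean, sigma):
    """Gaussian membership: smooth bell curve centred on `mean`."""
    return np.exp(-0.5 * ((x - mean) / sigma) ** 2)

# Example: grade of membership for a normalized grip-error signal.
x = np.linspace(0.0, 1.0, 101)
mu_tri = triangular(x, 0.2, 0.5, 0.8)
mu_trap = trapezoidal(x, 0.1, 0.4, 0.6, 0.9)
mu_gauss = gaussian(x, 0.5, 0.15)
```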

  15. The Representation of Object-Directed Action and Function Knowledge in the Human Brain.

    PubMed

    Chen, Quanjing; Garcea, Frank E; Mahon, Bradford Z

    2016-04-01

    The appropriate use of everyday objects requires the integration of action and function knowledge. Previous research suggests that action knowledge is represented in frontoparietal areas while function knowledge is represented in temporal lobe regions. Here we used multivoxel pattern analysis to investigate the representation of object-directed action and function knowledge while participants executed pantomimes of familiar tool actions. A novel approach for decoding object knowledge was used in which classifiers were trained on one pair of objects and then tested on a distinct pair; this permitted a measurement of classification accuracy over and above object-specific information. Region of interest (ROI) analyses showed that object-directed actions could be decoded in tool-preferring regions of both parietal and temporal cortex, while no independently defined tool-preferring ROI showed successful decoding of object function. However, a whole-brain searchlight analysis revealed that while frontoparietal motor and peri-motor regions are engaged in the representation of object-directed actions, medial temporal lobe areas in the left hemisphere are involved in the representation of function knowledge. These results indicate that both action and function knowledge are represented in a topographically coherent manner that is amenable to study with multivariate approaches, and that the left medial temporal cortex represents knowledge of object function.

  16. Object-oriented analysis and design of a health care management information system.

    PubMed

    Krol, M; Reich, D L

    1999-04-01

    We have created a prototype for a universal object-oriented model of a health care system compatible with the object-oriented approach used in version 3.0 of the HL7 standard for communication messages. A set of three models has been developed: (1) the Object Model describes the hierarchical structure of objects in a system--their identity, relationships, attributes, and operations; (2) the Dynamic Model represents the sequence of operations in time as a collection of state diagrams for object classes in the system; and (3) the Functional Diagram represents the transformation of data within a system by means of data flow diagrams. Within these models, we have defined major object classes of health care participants and their subclasses, associations, attributes and operators, states, and behavioral scenarios. We have also defined the major processes and subprocesses. The top-down design approach allows use, reuse, and cloning of standard components.

  17. Analysis of the Effects of the Commander’s Battle Positioning on Unit Combat Performance

    DTIC Science & Technology

    1991-03-01

    [Table-of-contents fragment: Logistic Regression Analysis; Canonical Correlation Analysis; Discriminant Analysis.] Discriminant analysis entails classifying objects into two or more distinct groups, or responses. Dillon defines discriminant analysis as "deriving linear combinations of the…object given its predictor variables. The second objective is, through analysis of the parameters of the discriminant functions, to determine those…

  18. Functional Outcomes in the Treatment of Adults with ADHD

    ERIC Educational Resources Information Center

    Adler, Lenard A.; Spencer, Thomas J.; Levine, Louise R.; Ramsey, Janet L.; Tamura, Roy; Kelsey, Douglas; Ball, Susan G.; Allen, Albert J.; Biederman, Joseph

    2008-01-01

    Objective: ADHD is associated with significant functional impairment in adults. The present study examined functional outcomes following 6-month double-blind treatment with either atomoxetine or placebo. Method: Patients were 410 adults (58.5% male) with "DSM-IV"--defined ADHD. They were randomly assigned to receive either atomoxetine 40 mg/day to…

  19. Uncovering Mental Representations with Markov Chain Monte Carlo

    ERIC Educational Resources Information Center

    Sanborn, Adam N.; Griffiths, Thomas L.; Shiffrin, Richard M.

    2010-01-01

    A key challenge for cognitive psychology is the investigation of mental representations, such as object categories, subjective probabilities, choice utilities, and memory traces. In many cases, these representations can be expressed as a non-negative function defined over a set of objects. We present a behavioral method for estimating these…

  20. A depictive neural model for the representation of motion verbs.

    PubMed

    Rao, Sunil; Aleksander, Igor

    2011-11-01

    In this paper, we present a depictive neural model for the representation of motion verb semantics in neural models of visual awareness. The problem of modelling motion verb representation is shown to be one of function application, mapping a set of given input variables defining the moving object and the path of motion to a defined output outcome in the motion recognition context. The particular function-applicative implementation and consequent recognition model design presented are seen as arising from a noun-adjective recognition model enabling the recognition of colour adjectives as applied to a set of shapes representing objects to be recognised. The presence of such a function application scheme and a separately implemented position identification and path labelling scheme are accordingly shown to be the primitives required to enable the design and construction of a composite depictive motion verb recognition scheme. Extensions to the presented design to enable the representation of transitive verbs are also discussed.

  1. An improved level set method for brain MR images segmentation and bias correction.

    PubMed

    Chen, Yunjie; Zhang, Jianwei; Macione, Jim

    2009-10-01

    Intensity inhomogeneities cause considerable difficulty in the quantitative analysis of magnetic resonance (MR) images. Thus, bias field estimation is a necessary step before quantitative analysis of MR data can be undertaken. This paper presents a variational level set approach to bias correction and segmentation for images with intensity inhomogeneities. Our method is based on the observation that intensities in a relatively small local region are separable, despite the inseparability of the intensities in the whole image caused by the overall intensity inhomogeneity. We first define a localized K-means-type clustering objective function for image intensities in a neighborhood around each point. The cluster centers in this objective function have a multiplicative factor that estimates the bias within the neighborhood. The objective function is then integrated over the entire domain to define the data term in the level set framework. Our method is able to capture bias of quite general profiles. Moreover, it is robust to initialization, and thereby allows fully automated applications. The proposed method has been used for images of various modalities with promising results.
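
    A 1D illustration of the localized clustering objective described above, assuming a Gaussian window, two tissue classes, and hard cluster assignment; the toy signal, window width, and bias model are stand-ins for the MR image and are not the paper's level-set implementation.

```python
import numpy as np

def local_clustering_energy(I, b, centers, sigma=5.0):
    """Sum over points y of the window-weighted squared error between intensities
    I(x) in a neighborhood of y and the bias-scaled cluster centers b(y)*c_i,
    taking the best-fitting cluster at each x (hard assignment)."""
    n = len(I)
    x = np.arange(n)
    energy = 0.0
    for y in range(n):
        K = np.exp(-0.5 * ((x - y) / sigma) ** 2)                  # local window around y
        errs = np.array([(I - b[y] * c) ** 2 for c in centers])    # one row per cluster
        energy += np.sum(K * errs.min(axis=0))
    return energy

# Toy 1D "image": two tissue intensities multiplied by a smooth bias field plus noise.
rng = np.random.default_rng(0)
n = 200
truth = np.where(np.arange(n) < 100, 1.0, 2.0)
bias = 1.0 + 0.3 * np.sin(np.linspace(0.0, np.pi, n))
I = truth * bias + 0.05 * rng.standard_normal(n)

# A good bias estimate yields a lower objective value than ignoring the bias.
print(local_clustering_energy(I, bias, centers=(1.0, 2.0)),
      local_clustering_energy(I, np.ones(n), centers=(1.0, 2.0)))
```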

  2. New Basis Functions for the Electromagnetic Solution of Arbitrarily-shaped, Three Dimensional Conducting Bodies using Method of Moments

    NASA Technical Reports Server (NTRS)

    Mackenzie, Anne I.; Baginski, Michael E.; Rao, Sadasiva M.

    2008-01-01

    In this work, we present a new set of basis functions, defined over a pair of planar triangular patches, for the solution of electromagnetic scattering and radiation problems associated with arbitrarily-shaped surfaces using the method of moments solution procedure. The basis functions are constant over the function subdomain and resemble pulse functions for one and two dimensional problems. Further, another set of basis functions, point-wise orthogonal to the first set, is also defined over the same function space. The primary objective of developing these basis functions is to utilize them for the electromagnetic solution involving conducting, dielectric, and composite bodies. However, in the present work, only the conducting body solution is presented and compared with other data.

  3. A framework for joint image-and-shape analysis

    NASA Astrophysics Data System (ADS)

    Gao, Yi; Tannenbaum, Allen; Bouix, Sylvain

    2014-03-01

    Techniques in medical image analysis are often used for comparison or regression on the intensities of images. In general, the domain of an image is a given Cartesian grid. Shape analysis, on the other hand, studies the similarities and differences among spatial objects of arbitrary geometry and topology. Usually, there is no function defined on the domain of the shapes. Recently, there has been a growing need for defining and analyzing functions defined on the shape space, and for a coupled analysis of both the shapes and the functions defined on them. Following this direction, in this work we present a coupled analysis of both images and shapes. As a result, statistically significant discrepancies in the image intensities as well as in the underlying shapes are detected. The method is applied to both brain images for schizophrenia and heart images for atrial fibrillation patients.

  4. Management and Operations Auditing: A Business Oriented Management Structure For a Unified School District.

    ERIC Educational Resources Information Center

    Conway, Ernest J.; And Others

    An operations audit was conducted for a school district. The purpose of the audit was to determine the organization of the central office and reorganize its structure and staff as appropriate to clearly define goals and objectives, specify roles and responsibilities, eliminate wasted or duplicated efforts, and functionally define operational work…

  5. Defining the cortical visual systems: "what", "where", and "how"

    NASA Technical Reports Server (NTRS)

    Creem, S. H.; Proffitt, D. R.; Kaiser, M. K. (Principal Investigator)

    2001-01-01

    The visual system historically has been defined as consisting of at least two broad subsystems subserving object and spatial vision. These visual processing streams have been organized both structurally as two distinct pathways in the brain, and functionally for the types of tasks that they mediate. The classic definition by Ungerleider and Mishkin labeled a ventral "what" stream to process object information and a dorsal "where" stream to process spatial information. More recently, Goodale and Milner redefined the two visual systems with a focus on the different ways in which visual information is transformed for different goals. They relabeled the dorsal stream as a "how" system for transforming visual information using an egocentric frame of reference in preparation for direct action. This paper reviews recent research from psychophysics, neurophysiology, neuropsychology and neuroimaging to define the roles of the ventral and dorsal visual processing streams. We discuss a possible solution that allows for both "where" and "how" systems that are functionally and structurally organized within the posterior parietal lobe.

  6. Space station automation study. Volume 2: Technical report. Autonomous systems and assembly

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The application of automation to space station functions is discussed. A summary is given of the evolutionary functions associated with long range missions and objectives. Mission tasks and requirements are defined. Space station sub-systems, mission models, assembly, and construction are discussed.

  7. Multiscale moment-based technique for object matching and recognition

    NASA Astrophysics Data System (ADS)

    Thio, HweeLi; Chen, Liya; Teoh, Eam-Khwang

    2000-03-01

    A new method is proposed to extract features from an object for matching and recognition. The proposed features combine local and global characteristics: local characteristics from the 1-D signature function defined at each pixel on the object boundary, and global characteristics from the moments generated from that signature function. The boundary of the object is first extracted; the signature function is then generated by computing, at every point on the boundary, the angle between two lines drawn from that point, as a function of position along the boundary. This signature function is position, scale and rotation invariant (PSRI). The shape of the signature function is then described quantitatively using moments. The moments of the signature function are global characteristics of a local feature set. Using the moments as the eventual features, instead of the signature function itself, reduces the time and complexity of an object matching application. Multiscale moments are implemented to produce several sets of moments that generate more accurate matching. The multiscale technique is essentially a coarse-to-fine procedure and makes the proposed method more robust to noise. The method is proposed for matching and recognizing objects under simple transformations such as translation, scale change, rotation and skewing. A simple logo indexing system is implemented to illustrate the performance of the proposed method.
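
    A compact sketch of the signature-and-moments pipeline described above, for a closed boundary given as an ordered list of points; the neighbour offset k, the choice of moments, and the toy boundary are illustrative, and the multiscale loop is omitted.

```python
import numpy as np

def signature(boundary, k=5):
    """Angle at each boundary point between the lines to its k-th neighbours on
    either side, as a function of position along the boundary (PSRI)."""
    prev_pts = np.roll(boundary, k, axis=0)
    next_pts = np.roll(boundary, -k, axis=0)
    u = prev_pts - boundary
    v = next_pts - boundary
    cos_ang = np.sum(u * v, axis=1) / (np.linalg.norm(u, axis=1) * np.linalg.norm(v, axis=1))
    return np.arccos(np.clip(cos_ang, -1.0, 1.0))

def signature_moments(sig, orders=(2, 3, 4)):
    """Mean plus central moments of the signature values: global descriptors of a
    local feature set, used as the matching features."""
    mean = sig.mean()
    central = [np.mean((sig - mean) ** n) for n in orders]
    return np.array([mean] + central)

# Example: a wavy, roughly circular boundary sampled at 200 ordered points.
t = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
radius = 10.0 + 0.3 * np.sin(5.0 * t)
boundary = np.column_stack([np.cos(t), np.sin(t)]) * radius[:, None]
features = signature_moments(signature(boundary))
```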

  8. Effluent treatment for nuclear thermal propulsion ground testing

    NASA Technical Reports Server (NTRS)

    Shipers, Larry R.

    1993-01-01

    The objectives are to define treatment functions, review concept options, discuss the PIPET effluent treatment system (ETS), and outline future activities. The topics covered include the following: reactor exhaust; effluent treatment functions; effluent treatment categories; effluent treatment options; concept evaluation; PIPET ETS envelope; PIPET effluent treatment concept; and future activities.

  9. Anti-food allergic activity of sulfated polysaccharide from gracilaria lemaneiformis is dependent on immunosuppression and inhibition of p38 mapk

    USDA-ARS?s Scientific Manuscript database

    Polysaccharides from marine sources offer diverse therapeutic functions due to their multifarious biological nature. Polysaccharide from Gracilaria lemaneiformis possesses various bioactive functions, but its anti-allergic activity remains incompletely defined. Objective: This study aimed to extract...

  10. Astromag data system concept

    NASA Technical Reports Server (NTRS)

    Roos, Darrell; Cheng, Chieh-San; Newsome, Penny; Nath, Nitya

    1989-01-01

    A feasible, top-level data system is defined that could accomplish and support the Astromag Data System functions and interfaces necessary to support the scientific objectives of Astromag. This data system must also be able to function in the environment of the Space Station Freedom Manned Base (SSFMB) and other anticipated NASA elements.

  11. Professional Competencies of Cuban Specialists in Intensive Care and Emergency Medicine.

    PubMed

    Véliz-Martínez, Pedro L; Jorna-Calixto, Ana R; Oramas-González, René

    2016-10-01

    INTRODUCTION The quality of medical training and practice reflects the competency level of the professionals involved. The intensive care and emergency medicine specialty in Cuba has not defined its competencies. OBJECTIVE Identify the competencies required for specialty practice in intensive care and emergency medicine. METHODS The study was conducted from January 2014 to December 2015, using qualitative techniques; 48 professionals participated. We undertook functional occupational analysis, based on functions defined in a previous study. Three expert groups were utilized: the first used various group techniques; the second, the Delphi method; and the third, the Delphi method and a Likert questionnaire. RESULTS A total of 73 specific competencies were defined, grouped in 11 units: 44 in the patient care function, 16 in management, 7 in teaching and 6 in research. A competency map is provided. CONCLUSIONS The intensive care and emergency medicine specialty competencies identified will help improve professional standards, ensure health workforce quality, improve patient care and academic performance, and enable objective evaluation of specialists' competence and performance. KEYWORDS Clinical competency, competency-based education, professional education, intensive care, emergency medicine, urgent care, continuing medical education, curriculum, medical residency, Cuba.

  12. Free-standing supramolecular hydrogel objects by reaction-diffusion

    PubMed Central

    Lovrak, Matija; Hendriksen, Wouter E. J.; Maity, Chandan; Mytnyk, Serhii; van Steijn, Volkert; Eelkema, Rienk; van Esch, Jan H.

    2017-01-01

    Self-assembly provides access to a variety of molecular materials, yet spatial control over structure formation remains difficult to achieve. Here we show how reaction–diffusion (RD) can be coupled to a molecular self-assembly process to generate macroscopic free-standing objects with control over shape, size, and functionality. In RD, two or more reactants diffuse from different positions to give rise to spatially defined structures on reaction. We demonstrate that RD can be used to locally control formation and self-assembly of hydrazone molecular gelators from their non-assembling precursors, leading to soft, free-standing hydrogel objects with sizes ranging from several hundred micrometres up to centimeters. Different chemical functionalities and gradients can easily be integrated in the hydrogel objects by using different reactants. Our methodology, together with the vast range of organic reactions and self-assembling building blocks, provides a general approach towards the programmed fabrication of soft microscale objects with controlled functionality and shape. PMID:28580948

  13. Software Techniques for Non-Von Neumann Architectures

    DTIC Science & Technology

    1990-01-01

    [Excerpt from a machine-characteristics table: communication topology, programmable Benes network / hypercubic lattice for QCD; control, centralized; assignment, static; memory, shared; synchronization, universal; maximum 566 CPUs; processor boards (each with 4 floating-point units and 2 multipliers); 32-bit floating-point CPU chips; performance, 11.4 Gflops; market, quantum chromodynamics (QCD).] ...functions there should exist a capability to define hierarchies and lattices of complex objects. A complex object can be made up of a set of simple objects

  14. Provisional-Ideal-Point-Based Multi-objective Optimization Method for Drone Delivery Problem

    NASA Astrophysics Data System (ADS)

    Omagari, Hiroki; Higashino, Shin-Ichiro

    2018-04-01

    In this paper, we propose a new evolutionary multi-objective optimization method for solving drone delivery problems (DDP), which can be formulated as constrained multi-objective optimization problems. In our previous research, we proposed the "aspiration-point-based method" to solve multi-objective optimization problems. However, that method needs the optimal value of each objective function to be calculated in advance, and it does not consider constraint conditions other than the objective functions; it therefore cannot be applied to DDP, which has many constraint conditions. To address these issues, we propose the "provisional-ideal-point-based method." The proposed method defines a "penalty value" to search for feasible solutions. It also defines a new reference solution, named the "provisional ideal point," to search for the solution preferred by a decision maker. In this way, the preliminary calculations and the limited scope of application are eliminated. The results on benchmark test problems show that the proposed method can generate the preferred solution efficiently. The usefulness of the proposed method is also demonstrated by applying it to a DDP; the delivery path obtained when combining one drone and one truck drastically reduces the traveling distance and the delivery time compared with the case of using only one truck.
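    A loose sketch of the two ingredients described above, a penalty value for feasibility and a provisional ideal point updated from the current population (the scalarization and ranking below are illustrative assumptions, not the authors' algorithm):

        import numpy as np

        def penalty(constraint_values):
            """Sum of constraint violations, assuming g_i(x) <= 0 means feasible."""
            return np.sum(np.maximum(constraint_values, 0.0))

        def provisional_ideal_point(objective_matrix):
            """Component-wise best (minimum) objective values over the current
            population; updated each generation instead of being precomputed."""
            return objective_matrix.min(axis=0)

        def rank_solutions(objective_matrix, penalties):
            """Rank by distance to the provisional ideal point, with infeasible
            solutions pushed back by their penalty (illustrative scalarization)."""
            ideal = provisional_ideal_point(objective_matrix)
            dist = np.linalg.norm(objective_matrix - ideal, axis=1)
            return np.argsort(dist + penalties)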

  15. Matrix management for aerospace 2000

    NASA Technical Reports Server (NTRS)

    Mccarthy, J. F., Jr.

    1980-01-01

    The matrix management approach to program management is an organized effort for attaining program objectives by defining and structuring all elements so as to form a single system whose parts are united by interaction. The objective of the systems approach is uncompromisingly complete coverage of the program management endeavor. Starting with an analysis of the functions necessary to carry out a given program, a model must be defined; a matrix of responsibility assignment must be prepared; and each operational process must be examined to establish how it is to be carried out and how it relates to all other processes.

  16. Vector dissimilarity and clustering.

    PubMed

    Lefkovitch, L P

    1991-04-01

    Based on the description of objects by m attributes, an m-element vector dissimilarity function is defined that, unlike scalar functions, retains the distinction among attributes. This function, which satisfies the conditions for a metric, allows the definition of betweenness, which can then be used for clustering. Applications to the subset-generation phase of conditional clustering and to nearest-neighbor-type algorithms are described.
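    One plausible instance of such an m-element dissimilarity and the betweenness it induces (a generic sketch, not necessarily Lefkovitch's exact definition):

        import numpy as np

        def vector_dissimilarity(x, y):
            """Per-attribute absolute differences: an m-element dissimilarity that
            keeps attributes separate instead of collapsing them to one scalar."""
            return np.abs(np.asarray(x, float) - np.asarray(y, float))

        def is_between(x, z, y, tol=1e-9):
            """z lies 'between' x and y if the triangle inequality holds with
            equality in every attribute (componentwise betweenness)."""
            d = vector_dissimilarity
            return bool(np.all(np.abs(d(x, z) + d(z, y) - d(x, y)) <= tol))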

  17. A Functional Framework for Database Management Systems.

    DTIC Science & Technology

    1980-02-01

    [Excerpt from the table of contents: Functional Approach; 7.2. Objects in a DBMS; 7.2.1. External Objects; 7.2.2. Conceptual Objects; 7.2.3. Internal Objects; 7.2.4. External...] ...standpoint of their definitional and conceptual goals. 2. To make it possible to define and specify the needs as the first phase of the design process... methods. This aim is analogous to the one in which programming language technology has been captured and supported through the conceptual language

  18. Closed-loop, pilot/vehicle analysis of the approach and landing task

    NASA Technical Reports Server (NTRS)

    Anderson, M. R.; Schmidt, D. K.

    1986-01-01

    In the case of approach and landing, it is universally accepted that the pilot uses more than one vehicle response, or output, to close his control loops. Therefore, to model this task, a multi-loop analysis technique is required. The analysis problem has been in obtaining reasonable analytic estimates of the describing functions representing the pilot's loop compensation. Once these pilot describing functions are obtained, appropriate performance and workload metrics must then be developed for the landing task. The optimal control approach provides a powerful technique for obtaining the necessary describing functions, once the appropriate task objective is defined in terms of a quadratic objective function. An approach is presented through the use of a simple, reasonable objective function and model-based metrics to evaluate loop performance and pilot workload. The results of an analysis of the LAHOS (Landing and Approach of Higher Order Systems) study performed by R.E. Smith are also presented.
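    For context, a quadratic task objective of the kind referred to here typically takes the following form, where y collects the tracked vehicle responses, u the pilot's control inputs, and Q and R are weighting matrices (these symbols are generic placeholders, not the values used in the cited analysis):

        $J = \mathbb{E}\left\{ \lim_{T \to \infty} \frac{1}{T} \int_0^T \left[ y^{\top}(t)\, Q\, y(t) + u^{\top}(t)\, R\, u(t) \right] dt \right\}$

    Minimizing such a cost over an assumed pilot/vehicle model yields the describing functions and the model-based performance and workload metrics mentioned above.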

  19. In Which Ways and to What Extent Do English and Shanghai Students Understand Linear Function?

    ERIC Educational Resources Information Center

    Wang, Yuqian; Barmby, Patrick; Bolden, David

    2017-01-01

    This study investigates how students in England and Shanghai understand linear function. Understanding is defined theoretically in terms of five hierarchical levels: Dependent Relationship; Connecting Representations; Property Noticing; Object Analysis; and Inventising. A pilot study instrument presented a set of problems to both cohorts, showing…

  20. Some classes of analytic functions involving Noor integral operator

    NASA Astrophysics Data System (ADS)

    Patel, J.; Cho, N. E.

    2005-12-01

    The object of the present paper is to investigate some inclusion properties of certain subclasses of analytic functions defined by using the Noor integral operator. The integral preserving properties in connection with the operator are also considered. Relevant connections of the results presented here with those obtained in earlier works are pointed out.

  1. Collaborative real-time scheduling of multiple PTZ cameras for multiple object tracking in video surveillance

    NASA Astrophysics Data System (ADS)

    Liu, Yu-Che; Huang, Chung-Lin

    2013-03-01

    This paper proposes a multi-PTZ-camera control mechanism to acquire close-up imagery of human objects in a surveillance system. The control algorithm is based on the output of multi-camera, multi-target tracking. The three main concerns of the algorithm are (1) capturing imagery of each human object's face for biometric purposes, (2) maintaining optimal video quality of the human objects, and (3) minimizing hand-off time. Here, we define an objective function based on the expected capture conditions, such as the camera-subject distance, pan and tilt angles of capture, face visibility, and others. This objective function serves to effectively balance the number of captures per subject and the quality of captures. In the experiments, we demonstrate the performance of the system, which operates in real time under real-world conditions on three PTZ cameras.
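    A minimal sketch of such a capture-condition objective and a greedy camera-to-subject assignment (the individual terms, weights, and reference distance are assumptions for illustration, not the paper's calibration):

        import numpy as np

        def capture_score(distance, pan_deg, tilt_deg, face_visibility,
                          weights=(0.4, 0.2, 0.2, 0.2), d_ref=10.0):
            """Illustrative capture-condition objective for one camera/subject pair."""
            w_d, w_p, w_t, w_f = weights
            return (w_d * np.exp(-distance / d_ref)        # closer is better
                    + w_p * np.cos(np.radians(pan_deg))    # small pan angle preferred
                    + w_t * np.cos(np.radians(tilt_deg))   # small tilt angle preferred
                    + w_f * face_visibility)               # fraction of face visible

        def assign_cameras(score_matrix):
            """Greedy camera-to-subject assignment maximizing the total score;
            score_matrix[c, s] is the capture_score of camera c for subject s."""
            assignment, taken = {}, set()
            for c in np.argsort(-score_matrix.max(axis=1)):
                s = int(np.argmax(score_matrix[c]))
                if s not in taken:
                    assignment[int(c)] = s
                    taken.add(s)
            return assignment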

  2. Visual search for arbitrary objects in real scenes

    PubMed Central

    Alvarez, George A.; Rosenholtz, Ruth; Kuzmova, Yoana I.; Sherman, Ashley M.

    2011-01-01

    How efficient is visual search in real scenes? In searches for targets among arrays of randomly placed distractors, efficiency is often indexed by the slope of the reaction time (RT) × Set Size function. However, it may be impossible to define set size for real scenes. As an approximation, we hand-labeled 100 indoor scenes and used the number of labeled regions as a surrogate for set size. In Experiment 1, observers searched for named objects (a chair, bowl, etc.). With set size defined as the number of labeled regions, search was very efficient (~5 ms/item). When we controlled for a possible guessing strategy in Experiment 2, slopes increased somewhat (~15 ms/item), but they were much shallower than search for a random object among other distinctive objects outside of a scene setting (Exp. 3: ~40 ms/item). In Experiments 4–6, observers searched repeatedly through the same scene for different objects. Increased familiarity with scenes had modest effects on RTs, while repetition of target items had large effects (>500 ms). We propose that visual search in scenes is efficient because scene-specific forms of attentional guidance can eliminate most regions from the “functional set size” of items that could possibly be the target. PMID:21671156

  3. Visual search for arbitrary objects in real scenes.

    PubMed

    Wolfe, Jeremy M; Alvarez, George A; Rosenholtz, Ruth; Kuzmova, Yoana I; Sherman, Ashley M

    2011-08-01

    How efficient is visual search in real scenes? In searches for targets among arrays of randomly placed distractors, efficiency is often indexed by the slope of the reaction time (RT) × Set Size function. However, it may be impossible to define set size for real scenes. As an approximation, we hand-labeled 100 indoor scenes and used the number of labeled regions as a surrogate for set size. In Experiment 1, observers searched for named objects (a chair, bowl, etc.). With set size defined as the number of labeled regions, search was very efficient (~5 ms/item). When we controlled for a possible guessing strategy in Experiment 2, slopes increased somewhat (~15 ms/item), but they were much shallower than search for a random object among other distinctive objects outside of a scene setting (Exp. 3: ~40 ms/item). In Experiments 4-6, observers searched repeatedly through the same scene for different objects. Increased familiarity with scenes had modest effects on RTs, while repetition of target items had large effects (>500 ms). We propose that visual search in scenes is efficient because scene-specific forms of attentional guidance can eliminate most regions from the "functional set size" of items that could possibly be the target.

  4. Development of an Optimization Methodology for the Aluminum Alloy Wheel Casting Process

    NASA Astrophysics Data System (ADS)

    Duan, Jianglan; Reilly, Carl; Maijer, Daan M.; Cockcroft, Steve L.; Phillion, Andre B.

    2015-08-01

    An optimization methodology has been developed for the aluminum alloy wheel casting process. The methodology is focused on improving the timing of cooling processes in a die to achieve improved casting quality. This methodology utilizes (1) a casting process model, which was developed within the commercial finite element package ABAQUS™ (ABAQUS is a trademark of Dassault Systèmes); (2) a Python-based results extraction procedure; and (3) a numerical optimization module from the open-source Python library SciPy. To achieve optimal casting quality, a set of constraints has been defined to ensure directional solidification, and an objective function, based on the solidification cooling rates, has been defined to either maximize, or target a specific, cooling rate. The methodology has been applied to a series of casting and die geometries with different cooling system configurations, including a 2-D axisymmetric wheel and die assembly generated from a full-scale prototype wheel. The results show that, with properly defined constraint and objective functions, solidification conditions can be improved and optimal cooling conditions can be achieved, leading to process productivity and product quality improvements.
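    A minimal sketch of this kind of workflow (the cooling-timing variables, placeholder process model, target cooling rate, and the choice of SciPy's SLSQP solver are all assumptions for illustration, not details from the study):

        import numpy as np
        from scipy.optimize import minimize

        def run_casting_model(cooling_on_times):
            """Placeholder for the ABAQUS process model plus results extraction;
            returns solidification cooling rates and a directionality measure."""
            cooling_rates = 2.0 + 0.1 * np.asarray(cooling_on_times)   # dummy values
            directionality = np.min(np.diff(cooling_on_times))         # dummy measure
            return cooling_rates, directionality

        TARGET_RATE = 3.0  # target cooling rate in K/s (illustrative)

        def objective(x):
            rates, _ = run_casting_model(x)
            return np.sum((rates - TARGET_RATE) ** 2)   # hit the target cooling rate

        constraints = [{"type": "ineq",
                        "fun": lambda x: run_casting_model(x)[1]}]  # directional solidification

        x0 = np.array([10.0, 20.0, 30.0])               # initial cooling start times, s
        result = minimize(objective, x0, method="SLSQP",
                          constraints=constraints,
                          bounds=[(0.0, 120.0)] * len(x0))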

  5. Four Common Simplifications of Multi-Criteria Decision Analysis do not hold for River Rehabilitation

    PubMed Central

    2016-01-01

    River rehabilitation aims at alleviating negative effects of human impacts such as loss of biodiversity and reduction of ecosystem services. Such interventions entail difficult trade-offs between different ecological and often socio-economic objectives. Multi-Criteria Decision Analysis (MCDA) is a very suitable approach that helps assess the current ecological state and prioritize river rehabilitation measures in a standardized way, based on stakeholder or expert preferences. Applications of MCDA in river rehabilitation projects are often simplified, i.e., using a limited number of objectives and indicators, assuming linear value functions, aggregating individual indicator assessments additively, and/or assuming risk neutrality of experts. Here, we demonstrate an implementation of MCDA expert preference assessments for river rehabilitation and provide ample material for other applications. To test whether the above simplifications reflect common expert opinion, we carried out very detailed interviews with five river ecologists and a hydraulic engineer. We defined essential objectives and measurable quality indicators (attributes), elicited the experts' preferences for objectives on a standardized scale (value functions) and their risk attitude, and identified suitable aggregation methods. The experts recommended an extensive objectives hierarchy including between 54 and 93 essential objectives and between 37 and 61 essential attributes. For 81% of these, they defined non-linear value functions, and in 76% they recommended multiplicative aggregation. The experts were risk averse or risk prone (but never risk neutral), depending on the current ecological state of the river and the experts' personal importance of objectives. We conclude that the four commonly applied simplifications clearly do not reflect the opinion of river rehabilitation experts. The optimal level of model complexity, however, remains highly case-study specific, depending on data and resource availability, the context, and the complexity of the decision problem. PMID:26954353

  6. Shaped-Based Recognition of 3D Objects From 2D Projections

    DTIC Science & Technology

    2006-12-01

    functions for a typical minimization by the graduated assignment algorithm. (The solid line is E, which uses the Euclidean distances to the nearest...of E and E0 generally decrease during the optimization process, but they can also rise because of changes in the assignment variables M_jk...(m+1) × (n+1) match matrix M that minimizes the objective function E = Σ_{j=1}^{m} Σ_{k=1}^{n} M_jk (d(T(l_j), l'_k)^2 − δ^2) (Eq. 7). M defines the

  7. Optimization techniques applied to spectrum management for communications satellites

    NASA Astrophysics Data System (ADS)

    Ottey, H. R.; Sullivan, T. M.; Zusman, F. S.

    This paper describes user requirements, algorithms and software design features for the application of optimization techniques to the management of the geostationary orbit/spectrum resource. Relevant problems include parameter sensitivity analyses, frequency and orbit position assignment coordination, and orbit position allotment planning. It is shown how integer and nonlinear programming as well as heuristic search techniques can be used to solve these problems. Formalized mathematical objective functions that define the problems are presented. Constraint functions that impart the necessary solution bounds are described. A versatile program structure is outlined, which would allow problems to be solved in stages while varying the problem space, solution resolution, objective function and constraints.

  8. Anti-Idling Battery for Truck Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keith Kelly

    2011-09-30

    In accordance with Assistance Agreement DE-EE0001036, the objective of this project was to develop an advanced high-voltage lithium-ion battery for use in an all-electric HVAC system for Class 7-8 heavy-duty trucks. This system will help heavy-duty truck drivers meet the tough new anti-idling laws being implemented by over 23 states. Quallion will be partnering with a major OEM supplier of HVAC systems to develop this system. The major OEM supplier will provide Quallion the necessary interface requirements and HVAC hardware to ensure successful testing of the all-electric system. At the end of the program, Quallion will deliver test data on three (3) batteries as well as test data for the prototype HVAC system. The objectives of the program are: (1) Battery Development - Objective 1 - Define battery and electronics specifications in preparation for building the prototype module (Completed - summary included in report) and Objective 2 - Establish a functional prototype battery and characterize three batteries in-house (Completed - photos and data included in report); (2) HVAC Development - Objective 1 - Collaborate with manufacturers to define HVAC components, layout, and electronics in preparation for establishing the prototype system (Completed - photos and data included in report) and Objective 2 - Acquire components for three functional prototypes for use by Quallion (Completed - photos and data included in report).

  9. MODOPTIM: A general optimization program for ground-water flow model calibration and ground-water management with MODFLOW

    USGS Publications Warehouse

    Halford, Keith J.

    2006-01-01

    MODOPTIM is a non-linear ground-water model calibration and management tool that simulates flow with MODFLOW-96 as a subroutine. A weighted sum-of-squares objective function defines optimal solutions for calibration and management problems. Water levels, discharges, water quality, subsidence, and pumping-lift costs are the five direct observation types that can be compared in MODOPTIM. Differences between direct observations of the same type can be compared to fit temporal changes and spatial gradients. Water levels in pumping wells, wellbore storage in the observation wells, and rotational translation of observation wells also can be compared. Negative and positive residuals can be weighted unequally so inequality constraints such as maximum chloride concentrations or minimum water levels can be incorporated in the objective function. Optimization parameters are defined with zones and parameter-weight matrices. Parameter change is estimated iteratively with a quasi-Newton algorithm and is constrained to a user-defined maximum parameter change per iteration. Parameters that are less sensitive than a user-defined threshold are not estimated. MODOPTIM facilitates testing more conceptual models by expediting calibration of each conceptual model. Examples of applying MODOPTIM to aquifer-test analysis, ground-water management, and parameter estimation problems are presented.
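    A small sketch of a weighted sum-of-squares objective with asymmetric residual weighting of the kind described here (illustrative only, not MODOPTIM's actual code):

        import numpy as np

        def weighted_ssq(simulated, observed, w_pos=1.0, w_neg=1.0):
            """Weighted sum-of-squares objective with separate weights for positive
            and negative residuals, so one-sided constraints (e.g. a minimum water
            level) can be folded into the objective."""
            r = np.asarray(simulated, float) - np.asarray(observed, float)
            w = np.where(r >= 0.0, w_pos, w_neg)
            return float(np.sum(w * r ** 2))

        # Example: heavily penalize simulated heads that fall below a minimum
        # water level while ignoring heads that stay above it.
        # weighted_ssq(sim_heads, min_levels, w_pos=0.0, w_neg=100.0)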

  10. Evaluating the accuracy of self-report for the diagnosis of HIV-associated neurocognitive disorder (HAND): defining “symptomatic” versus “asymptomatic” HAND

    PubMed Central

    Obermeit, Lisa C.; Beltran, Jessica; Casaletto, Kaitlin B.; Franklin, Donald R.; Letendre, Scott; Ellis, Ronald; Fennema-Notestine, Christine; Vaida, Florin; Collier, Ann C.; Marra, Christina M.; Clifford, David; Gelman, Benjamin; Sacktor, Ned; Morgello, Susan; Simpson, David; McCutchan, J. Allen; Grant, Igor

    2016-01-01

    The criteria for differentiating symptomatic from asymptomatic HIV-associated neurocognitive disorder require evaluation of (1) cognitive impairment, (2) daily functioning declines, and (3) whether the functional declines are attributable to cognitive versus physical problems. Many providers rely only on self-report to evaluate these latter criteria. However, the accuracy of patient-provided information may be limited. This study evaluated the validity of self-assessment for HIV-associated neurocognitive disorder (HAND) diagnoses by comparing objective findings with self-report of criteria 2 and 3 above. Self-reports were used to stratify 277 cognitively impaired HIV+ individuals into functionally dependent (n = 159) and independent (n = 118) groups, followed by group comparisons of objective functional problems. The dependent group was then divided into those who self-attributed their functional dependence to only cognitive (n = 80) versus only physical (n = 79) causes, for further comparisons on objective findings. The functionally dependent group was significantly worse than the independent group on all objective disability characteristics except severity of cognitive impairment, while those who attributed their dependence to physical (versus cognitive) factors were similar on all objective physical, cognitive, and functioning variables. Of note, 28 % of physical attributors showed no physical abnormalities on neuromedical examinations. Results suggest that patient report is consistently associated with objective measures of functional loss; in contrast, patient identification of physical versus cognitive causes is poorly associated with objective criteria. These findings caution against relying solely on patient self-report to determine whether functional disability in cognitively impaired HIV+ individuals can be attributed to strictly physical causes. PMID:27557777

  11. Interaction between the Learners' Initial Grasp of the Object of Learning and the Learning Resource Afforded

    ERIC Educational Resources Information Center

    Pang, Ming Fai; Marton, Ference

    2013-01-01

    Two studies are reported in this paper. The object of learning in both is the economic principle of changes in price as a function of changes in the relative magnitude of changes in demand and supply. The patterns of variation and invariance, defining the conditions compared were built into pedagogical tools (text, graphs, and worksheets). The…

  12. Fast periodic stimulation (FPS): a highly effective approach in fMRI brain mapping.

    PubMed

    Gao, Xiaoqing; Gentile, Francesco; Rossion, Bruno

    2018-06-01

    Defining the neural basis of perceptual categorization in a rapidly changing natural environment with low-temporal resolution methods such as functional magnetic resonance imaging (fMRI) is challenging. Here, we present a novel fast periodic stimulation (FPS)-fMRI approach to define face-selective brain regions with natural images. Human observers are presented with a dynamic stream of widely variable natural object images alternating at a fast rate (6 images/s). Every 9 s, a short burst of variable face images contrasting with object images in pairs induces an objective face-selective neural response at 0.111 Hz. A model-free Fourier analysis achieves a twofold increase in signal-to-noise ratio compared to a conventional block-design approach with identical stimuli and scanning duration, allowing to derive a comprehensive map of face-selective areas in the ventral occipito-temporal cortex, including the anterior temporal lobe (ATL), in all individual brains. Critically, periodicity of the desired category contrast and random variability among widely diverse images effectively eliminates the contribution of low-level visual cues, and lead to the highest values (80-90%) of test-retest reliability in the spatial activation map yet reported in imaging higher level visual functions. FPS-fMRI opens a new avenue for understanding brain function with low-temporal resolution methods.

  13. Models and algorithm of optimization launch and deployment of virtual network functions in the virtual data center

    NASA Astrophysics Data System (ADS)

    Bolodurina, I. P.; Parfenov, D. I.

    2017-10-01

    The goal of our investigation is the optimization of network operation in a virtual data center. The advantage of modern infrastructure virtualization lies in the possibility of using software-defined networks. However, existing algorithmic optimization solutions do not take into account the specific features of working with multiple classes of virtual network functions. The current paper describes models characterizing the basic structures of objects in a virtual data center, including a level-distribution model of the software-defined infrastructure of the virtual data center, a generalized model of a virtual network function, and a neural network model for the identification of virtual network functions. We also developed an efficient algorithm for optimizing the containerization of virtual network functions in the virtual data center. We propose an efficient algorithm for placing virtual network functions. In our investigation we also generalize the well-known heuristic and deterministic Karmarkar-Karp algorithms.

  14. A Distributed Trajectory-Oriented Approach to Managing Traffic Complexity

    NASA Technical Reports Server (NTRS)

    Idris, Husni; Wing, David J.; Vivona, Robert; Garcia-Chico, Jose-Luis

    2007-01-01

    In order to handle the expected increase in air traffic volume, the next generation air transportation system is moving towards a distributed control architecture, in which ground-based service providers such as controllers and traffic managers and air-based users such as pilots share responsibility for aircraft trajectory generation and management. While its architecture becomes more distributed, the goal of the Air Traffic Management (ATM) system remains to achieve objectives such as maintaining safety and efficiency. It is, therefore, critical to design appropriate control elements to ensure that aircraft and ground-based actions result in achieving these objectives without unduly restricting user-preferred trajectories. This paper presents a trajectory-oriented approach containing two such elements. One is a trajectory flexibility preservation function, by which aircraft plan their trajectories to preserve flexibility to accommodate unforeseen events. The other is a trajectory constraint minimization function, by which ground-based agents, in collaboration with air-based agents, impose just-enough restrictions on trajectories to achieve ATM objectives, such as separation assurance and flow management. The underlying hypothesis is that preserving the trajectory flexibility of each individual aircraft naturally achieves the aggregate objective of avoiding excessive traffic complexity, and that trajectory flexibility is increased by minimizing constraints without jeopardizing the intended ATM objectives. The paper presents conceptually how the two functions operate in a distributed control architecture that includes self-separation. The paper illustrates the concept through hypothetical scenarios involving conflict resolution and flow management. It presents a functional analysis of the interaction and information flow between the functions. It also presents an analytical framework for defining metrics and developing methods to preserve trajectory flexibility and minimize its constraints. In this framework, flexibility is defined in terms of robustness and adaptability to disturbances, and the impact of constraints is illustrated through analysis of a trajectory solution space with limited degrees of freedom and in simple constraint situations involving meeting multiple times of arrival and resolving a conflict.

  15. The origin and function of mirror neurons: the missing link.

    PubMed

    Lingnau, Angelika; Caramazza, Alfonso

    2014-04-01

    We argue, by analogy to the neural organization of the object recognition system, that demonstration of modulation of mirror neurons by associative learning does not imply absence of genetic adaptation. Innate connectivity defines the types of processes mirror neurons can participate in while allowing for extensive local plasticity. However, the proper function of these neurons remains to be worked out.

  16. Reduction of Subjective and Objective System Complexity

    NASA Technical Reports Server (NTRS)

    Watson, Michael D.

    2015-01-01

    Occam's razor is often used in science to define the minimum criteria to establish a physical or philosophical idea or relationship. Albert Einstein is attributed the saying "everything should be made as simple as possible, but not simpler". These heuristic ideas are based on a belief that there is a minimum state or set of states for a given system or phenomenon. In looking at system complexity, these heuristics point us to an idea that complexity can be reduced to a minimum. How, then, do we approach a reduction in complexity? Complexity has been described as both a subjective concept and an objective measure of a system. Subjective complexity is based on human cognitive comprehension of the functions and interrelationships of a system. Subjective complexity is defined by the ability to fully comprehend the system. Simplifying complexity, in a subjective sense, is thus gaining a deeper understanding of the system. As Apple's Jonathan Ive has stated, "It's not just minimalism or the absence of clutter. It involves digging through the depth of complexity. To be truly simple, you have to go really deep." Simplicity is not the absence of complexity but a deeper understanding of complexity. Subjective complexity, based on this human comprehension, cannot then be discerned from the sociological concept of ignorance. The inability to comprehend a system can be either a lack of knowledge, an inability to understand the intricacies of a system, or both. Reduction in this sense is based purely on a cognitive ability to understand the system, and no system then may be truly complex. From this view, education and experience seem to be the keys to reducing or eliminating complexity. Objective complexity is the measure of the system's functions and interrelationships, which exist independent of human comprehension. Ive's statement does not say that complexity is removed, only that the complexity is understood. From this standpoint, reduction of complexity can be approached by finding the optimal or 'best balance' of the system functions and interrelationships. This is achievable following von Bertalanffy's approach of describing systems as a set of equations representing both the system functions and the system interrelationships. Reduction is found based on an objective function defining the system output given variations in the system inputs and the system operating environment. By minimizing the objective function with respect to these inputs and environments, a reduced system can be found. Thus, a reduction of the system complexity is feasible.

  17. Enabling complex queries to drug information sources through functional composition.

    PubMed

    Peters, Lee; Mortensen, Jonathan; Nguyen, Thang; Bodenreider, Olivier

    2013-01-01

    Our objective was to enable an end-user to create complex queries to drug information sources through functional composition, by creating sequences of functions from application program interfaces (API) to drug terminologies. The development of a functional composition model seeks to link functions from two distinct APIs. An ontology was developed using Protégé to model the functions of the RxNorm and NDF-RT APIs by describing the semantics of their input and output. A set of rules were developed to define the interoperable conditions for functional composition. The operational definition of interoperability between function pairs is established by executing the rules on the ontology. We illustrate that the functional composition model supports common use cases, including checking interactions for RxNorm drugs and deploying allergy lists defined in reference to drug properties in NDF-RT. This model supports the RxMix application (http://mor.nlm.nih.gov/RxMix/), an application we developed for enabling complex queries to the RxNorm and NDF-RT APIs.

  18. Iterative optimizing quantization method for reconstructing three-dimensional images from a limited number of views

    DOEpatents

    Lee, Heung-Rae

    1997-01-01

    A three-dimensional image reconstruction method comprises treating the object of interest as a group of elements with a size that is determined by the resolution of the projection data, e.g., as determined by the size of each pixel. One of the projections is used as a reference projection. A fictitious object is arbitrarily defined that is constrained by such reference projection. The method modifies the known structure of the fictitious object by comparing and optimizing its four projections to those of the unknown structure of the real object and continues to iterate until the optimization is limited by the residual sum of background noise. The method is composed of several sub-processes that acquire four projections from the real data and the fictitious object: generate an arbitrary distribution to define the fictitious object, optimize the four projections, generate a new distribution for the fictitious object, and enhance the reconstructed image. The sub-process for the acquisition of the four projections from the input real data is simply the function of acquiring the four projections from the data of the transmitted intensity. The transmitted intensity represents the density distribution, that is, the distribution of absorption coefficients through the object.

  19. Life sciences payload definition and integration study. Volume 1: Management summary

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The objectives of a study program to determine the life sciences payloads required for conducting biomedical experiments during space missions are presented. The objectives are defined as: (1) to identify the research functions which must be performed aboard life sciences spacecraft laboratories and the equipment needed to support these functions and (2) to develop layouts and preliminary conceptual designs of several potential baseline payloads for the accomplishment of life research in space. Payload configurations and subsystems are described and illustrated. Tables of data are included to identify the material requirements for the space missions.

  20. C++ Programming Language

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali

    2007-01-01

    C++ Programming Language: The C++ seminar covers the fundamentals of the C++ programming language. The C++ fundamentals are grouped into three parts, where each part includes both concepts and programming examples aimed at hands-on practice. The first part covers the functional aspect of the C++ programming language, with emphasis on function parameters and efficient memory utilization. The second part covers the essential framework of the C++ programming language, the object-oriented aspects. Information necessary to evaluate various features of object-oriented programming, including encapsulation, polymorphism, and inheritance, will be discussed. The last part of the seminar covers templates and generic programming. Examples include both user-defined and standard templates.

  1. Some single-machine scheduling problems with learning effects and two competing agents.

    PubMed

    Li, Hongjie; Li, Zeyuan; Yin, Yunqiang

    2014-01-01

    This study considers a scheduling environment in which there are two agents and a set of jobs, each of which belongs to one of the two agents and its actual processing time is defined as a decreasing linear function of its starting time. Each of the two agents competes to process its respective jobs on a single machine and has its own scheduling objective to optimize. The objective is to assign the jobs so that the resulting schedule performs well with respect to the objectives of both agents. The objective functions addressed in this study include the maximum cost, the total weighted completion time, and the discounted total weighted completion time. We investigate three problems arising from different combinations of the objectives of the two agents. The computational complexity of the problems is discussed and solution algorithms where possible are presented.
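    As a simple illustration of the setting (the linear decrease model and the total-weighted-completion-time objective below are assumptions chosen for the sketch), a given job sequence can be evaluated for both agents as follows:

        def evaluate_sequence(sequence, base_time, decay, agent, weight):
            """Evaluate one job sequence on a single machine when a job's actual
            processing time shrinks linearly with its start time:
            p_j(t) = base_time[j] - decay[j] * t. Returns the total weighted
            completion time accumulated by each of the two agents."""
            t = 0.0
            totals = {0: 0.0, 1: 0.0}
            for j in sequence:
                p = max(base_time[j] - decay[j] * t, 0.0)   # actual processing time
                t += p                                      # completion time of job j
                totals[agent[j]] += weight[j] * t
            return totals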

  2. Experimentally-Based Ocean Acoustic Propagation and Coherence Studies

    DTIC Science & Technology

    2013-09-30

    degradation and/or exploitation of available sonic information. OBJECTIVES An objective is to quantify and explain underwater sound fluctuation...focusing did not entirely match the data because of terrain uncertainty. The paper further notes that a rapid drop-off of received sound level from the...covariance functions of complex demodulated signals. The array gain is defined as the signal to noise ratio for coherently added (beam steered) acoustic

  3. Transition Room Program, 1967 Report.

    ERIC Educational Resources Information Center

    Glassner, Leonard E.

    The Transition Room Program of the Pittsburgh Schools was defined and evaluated by the staff, the administration, and a program evaluator from the Office of Research. The definition included general objectives, anticipated outcomes, student criteria and characteristics, staff qualifications and functions, media, student activities, and staff…

  4. Application of Probabilistic Methods for the Determination of an Economically Robust HSCT Configuration

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.; Bandte, Oliver; Schrage, Daniel P.

    1996-01-01

    This paper outlines an approach for the determination of economically viable robust design solutions using the High Speed Civil Transport (HSCT) as a case study. Furthermore, the paper states the advantages of probability-based aircraft design over the traditional point design approach. It also proposes a new methodology called Robust Design Simulation (RDS), which treats customer satisfaction as the ultimate design objective. RDS is based on a probabilistic approach to aerospace systems design, which views the chosen objective as a distribution function introduced by so-called noise or uncertainty variables. Since the designer has no control over these variables, a variability distribution is defined for each one of them. The cumulative effect of all these distributions causes the overall variability of the objective function. For cases where the selected objective function depends heavily on these noise variables, it may be desirable to obtain a design solution that minimizes this dependence. The paper outlines a step-by-step approach for achieving such a solution for the HSCT case study and introduces an evaluation criterion that guarantees the highest customer satisfaction. This customer satisfaction is expressed by the probability of achieving objective function values less than a desired target value.
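    A minimal sketch of the probabilistic evaluation described here (the placeholder objective, the assumed noise distributions, and the sample size are illustrative, not the HSCT study's models):

        import numpy as np

        rng = np.random.default_rng(0)

        def objective(design, noise):
            """Placeholder for the economic objective (e.g. average ticket price);
            depends on controllable design variables and uncontrollable noise."""
            return design @ np.array([1.0, 0.5]) + noise @ np.array([0.3, 0.8])

        def prob_meeting_target(design, target, n_samples=10_000):
            """Monte Carlo estimate of P(objective <= target) under assumed
            distributions for the noise variables."""
            noise = rng.normal(loc=0.0, scale=1.0, size=(n_samples, 2))
            values = np.array([objective(design, z) for z in noise])
            return float(np.mean(values <= target))

        # A "robust" design maximizes this probability, i.e. customer satisfaction
        # expressed as the chance of beating the target value.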

  5. The Effect of Nasal Functions on the Integrity of Grafts after Myringoplasty

    PubMed Central

    Eser, Başak Çaypınar; Yılmaz, Aslı Şahin; Toros, Sema Zer; Oysu, Çağatay

    2017-01-01

    Objective We aimed to evaluate the effects of nasal functions on the integrity of grafts after myringoplasty. Methods Our study included 78 patients who underwent myringoplasty between 2011 and 2013. Group I was defined as the group with an intact tympanic membrane following surgery. Group II was defined as the group with a tympanic membrane perforation following surgery. Group I consisted of 44 patients and Group II of 34 patients. Subjective and objective measurements of nasal functions, Eustachian tube function (ETF), and allergic status were performed using the nasal obstruction symptom evaluation (NOSE) scale, visual analog scale (VAS), and score for allergic rhinitis (SFAR) questionnaires, as well as acoustic rhinometry and the saccharin test. We investigated whether there was any difference between the two groups in terms of these parameters. Results There was no statistically significant difference between the groups according to age, sex, or the presence of tubal dysfunction and allergic rhinitis (p>0.05). In the group with intact tympanic membranes, the likelihood of the right ear being the operated one was significantly higher compared to the group of myringoplasty failures (p=0.037). The VAS and NOSE scales did not show any significant difference between groups in terms of successful outcome of myringoplasty (p>0.05). The nasal congestion index (NCI) and mucociliary clearance (MCC) did not show any significant difference between groups in terms of successful outcome of myringoplasty (p>0.05). Conclusion This study has shown that nasal functions measured by objective and subjective methods had no effect on the success of myringoplasty. PMID:29515926

  6. Enhanced local tomography

    DOEpatents

    Katsevich, Alexander J.; Ramm, Alexander G.

    1996-01-01

    Local tomography is enhanced to determine the location and value of a discontinuity between a first internal density of an object and a second density of a region within the object. A beam of radiation is directed in a predetermined pattern through the region of the object containing the discontinuity. Relative attenuation data of the beam is determined within the predetermined pattern having a first data component that includes attenuation data through the region. In a first method for evaluating the value of the discontinuity, the relative attenuation data is inputted to a local tomography function f_Λ to define the location S of the density discontinuity. The asymptotic behavior of f_Λ is determined in a neighborhood of S, and the value for the discontinuity is estimated from the asymptotic behavior of f_Λ. In a second method for evaluating the value of the discontinuity, a gradient value for a mollified local tomography function ∇f_Λε(x_ij) is determined along the discontinuity; and the value of the jump of the density across the discontinuity curve (or surface) S is estimated from the gradient values.

  7. Solving quantum optimal control problems using Clebsch variables and Lin constraints

    NASA Astrophysics Data System (ADS)

    Delgado-Téllez, M.; Ibort, A.; Rodríguez de la Peña, T.

    2018-01-01

    Clebsch variables (and Lin constraints) are applied to the study of a class of optimal control problems for affine-controlled quantum systems. The optimal control problem will be modelled with controls defined on an auxiliary space where the dynamical group of the system acts freely. The reciprocity between the two theories, the classical theory defined by the objective functional and the quantum system, is established by using a suitable version of Lagrange's multipliers theorem and a geometrical interpretation of the constraints of the system as defining a subspace of horizontal curves in an associated bundle. It is shown how the solutions of the variational problem defined by the objective functional determine solutions of the quantum problem. A new way of obtaining explicit solutions for a family of optimal control problems for affine-controlled quantum systems (finite or infinite dimensional) is thus obtained. One of its main advantages is that the use of Clebsch variables allows such solutions to be computed from solutions of invariant problems that can often be computed explicitly. This procedure can be presented as an algorithm that can be applied to a large class of systems. Finally, some simple examples illustrating the main features of the theory are discussed: spin control, a simple quantum Hamiltonian with an 'Elroy beanie' type classical model, and a controlled one-dimensional quantum harmonic oscillator.

  8. Minus-Lens–Stimulated Accommodative Amplitude Decreases Sigmoidally with Age: A Study of Objectively Measured Accommodative Amplitudes from Age 3

    PubMed Central

    Anderson, Heather A.; Hentz, Gloria; Glasser, Adrian; Stuebing, Karla K.; Manny, Ruth E.

    2009-01-01

    Purpose Guidelines for predicting accommodative amplitude by age are often based on subjective push-up test data that overestimate the accommodative response. Studies in which objective measurements were used have defined expected amplitudes for adults, but expected amplitudes for children remain unknown. In this study, objective methods were used to measure accommodative amplitude in a wide age range of individuals, to define the relationship of amplitude and age from age 3. Methods Accommodative responses were measured in 140 subjects aged 3 to 40 years. Measurements were taken with the Grand Seiko autorefractor (RyuSyo Industrial Co., Ltd., Kagawa, Japan) as the subjects viewed a high-contrast target at 33 cm through minus lenses of increasing power until the responses showed no further increase in accommodation. Results The maximum accommodative amplitude of each subject was plotted by age, and a curvilinear function fit to the data: y = 7.33 − 0.0035(age − 3)^2 (P < 0.001). Tangent analysis of the fit indicated that the accommodative amplitude remained relatively stable until age 20. Data from this study were then pooled with objective amplitudes from previous studies of adults up to age 70. A sigmoidal function was fit to the data: y = 7.083/(1 + e^[0.2031(age − 36.2) − 0.6109]) (P < 0.001). The sigmoidal function indicated relatively stable amplitudes below age 20 years, a rapid linear decline between 20 and 50 years, and a taper to 0 beyond 50 years. Conclusions These data indicate that accommodative amplitude decreases in a curvilinear manner from 3 to 40 years. When combined with data from previous studies, a sigmoidal function describes the overall trend throughout life with the biggest decrease occurring between 20 and 50 years. PMID:18326693

  9. Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation

    PubMed Central

    Scellier, Benjamin; Bengio, Yoshua

    2017-01-01

    We introduce Equilibrium Propagation, a learning framework for energy-based models. It involves only one kind of neural computation, performed in both the first phase (when the prediction is made) and the second phase of training (after the target or prediction error is revealed). Although this algorithm computes the gradient of an objective function just like Backpropagation, it does not need a special computation or circuit for the second phase, where errors are implicitly propagated. Equilibrium Propagation shares similarities with Contrastive Hebbian Learning and Contrastive Divergence while solving the theoretical issues of both algorithms: our algorithm computes the gradient of a well-defined objective function. Because the objective function is defined in terms of local perturbations, the second phase of Equilibrium Propagation corresponds to only nudging the prediction (fixed point or stationary distribution) toward a configuration that reduces prediction error. In the case of a recurrent multi-layer supervised network, the output units are slightly nudged toward their target in the second phase, and the perturbation introduced at the output layer propagates backward in the hidden layers. We show that the signal “back-propagated” during this second phase corresponds to the propagation of error derivatives and encodes the gradient of the objective function, when the synaptic update corresponds to a standard form of spike-timing dependent plasticity. This work makes it more plausible that a mechanism similar to Backpropagation could be implemented by brains, since leaky integrator neural computation performs both inference and error back-propagation in our model. The only local difference between the two phases is whether synaptic changes are allowed or not. We also show experimentally that multi-layer recurrently connected networks with 1, 2, and 3 hidden layers can be trained by Equilibrium Propagation on the permutation-invariant MNIST task. PMID:28522969
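    A heavily simplified sketch of the contrastive weight update in Equilibrium Propagation for a Hopfield-style network (the hard-sigmoid nonlinearity is from the paper's setting, but the relaxation to the two fixed points is omitted here; consult the paper for the full algorithm):

        import numpy as np

        def rho(s):
            """Hard-sigmoid activation used in the Hopfield-style energy."""
            return np.clip(s, 0.0, 1.0)

        def ep_weight_update(s_free, s_nudged, beta, lr):
            """Contrastive update for a symmetric weight matrix: compare neuron
            co-activations at the free fixed point (beta = 0) and at the weakly
            nudged fixed point (small beta > 0)."""
            r0 = rho(s_free)
            rb = rho(s_nudged)
            return lr * (np.outer(rb, rb) - np.outer(r0, r0)) / beta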

  10. On global optimization using an estimate of Lipschitz constant and simplicial partition

    NASA Astrophysics Data System (ADS)

    Gimbutas, Albertas; Žilinskas, Antanas

    2016-10-01

    A new algorithm is proposed for finding the global minimum of a multi-variate black-box Lipschitz function with an unknown Lipschitz constant. The feasible region is initially partitioned into simplices; in each subsequent iteration, the most suitable simplices are selected and bisected via the middle point of the longest edge. The suitability of a simplex for bisection is evaluated by minimizing a surrogate function which mimics the lower bound of the considered objective function over that simplex. The surrogate function is defined using an estimate of the Lipschitz constant and the objective function values at the vertices of the simplex. The novelty of the algorithm is the sophisticated method of estimating the Lipschitz constant and the appropriate method of minimizing the surrogate function. The proposed algorithm was tested on 600 random test problems of different complexity, showing competitive results against two popular advanced algorithms based on similar assumptions.
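    A small sketch of the kind of vertex-based lower-bound surrogate described here, together with a sampling-based estimate of its minimum over a simplex (the Dirichlet sampling and the exact surrogate form are assumptions for illustration):

        import numpy as np

        def surrogate_lower_bound(x, vertices, f_vertices, lipschitz_estimate):
            """Lower-bound surrogate at point x built from the vertex values of a
            simplex and an estimated Lipschitz constant: for a Lipschitz f,
            f(x) >= f(v_i) - L * ||x - v_i|| for every vertex v_i."""
            dists = np.linalg.norm(np.asarray(vertices) - np.asarray(x), axis=1)
            return np.max(np.asarray(f_vertices) - lipschitz_estimate * dists)

        def simplex_suitability(vertices, f_vertices, lipschitz_estimate, n_samples=200):
            """Approximate the minimum of the surrogate over the simplex by sampling
            random barycentric points; the simplex with the smallest value is the
            most promising candidate for bisection."""
            rng = np.random.default_rng(0)
            vertices = np.asarray(vertices, float)
            w = rng.dirichlet(np.ones(len(vertices)), size=n_samples)
            points = w @ vertices
            return min(surrogate_lower_bound(p, vertices, f_vertices, lipschitz_estimate)
                       for p in points)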

  11. Penalized likelihood and multi-objective spatial scans for the detection and inference of irregular clusters

    PubMed Central

    2010-01-01

    Background Irregularly shaped spatial clusters are difficult to delineate. A cluster found by an algorithm often spreads through large portions of the map, impacting its geographical meaning. Penalized likelihood methods for Kulldorff's spatial scan statistics have been used to control the excessive freedom of the shape of clusters. Penalty functions based on cluster geometry and non-connectivity have been proposed recently. Another approach involves the use of a multi-objective algorithm to maximize two objectives: the spatial scan statistics and the geometric penalty function. Results & Discussion We present a novel scan statistic algorithm employing a function based on the graph topology to penalize the presence of under-populated disconnection nodes in candidate clusters, the disconnection nodes cohesion function. A disconnection node is defined as a region within a cluster, such that its removal disconnects the cluster. By applying this function, the most geographically meaningful clusters are sifted through the immense set of possible irregularly shaped candidate cluster solutions. To evaluate the statistical significance of solutions for multi-objective scans, a statistical approach based on the concept of attainment function is used. In this paper we compared different penalized likelihoods employing the geometric and non-connectivity regularity functions and the novel disconnection nodes cohesion function. We also build multi-objective scans using those three functions and compare them with the previous penalized likelihood scans. An application is presented using comprehensive state-wide data for Chagas' disease in puerperal women in Minas Gerais state, Brazil. Conclusions We show that, compared to the other single-objective algorithms, multi-objective scans present better performance, regarding power, sensitivity and positive predicted value. The multi-objective non-connectivity scan is faster and better suited for the detection of moderately irregularly shaped clusters. The multi-objective cohesion scan is most effective for the detection of highly irregularly shaped clusters. PMID:21034451

  12. Performance index and meta-optimization of a direct search optimization method

    NASA Astrophysics Data System (ADS)

    Krus, P.; Ölvander, J.

    2013-10-01

    Design optimization is becoming an increasingly important tool for design, often using simulation as part of the evaluation of the objective function. A measure of the efficiency of an optimization algorithm is of great importance when comparing methods. The main contribution of this article is the introduction of a singular performance criterion, the entropy rate index based on Shannon's information theory, taking both reliability and rate of convergence into account. It can also be used to characterize the difficulty of different optimization problems. Such a performance criterion can also be used to optimize the optimization algorithms themselves. In this article the Complex-RF optimization method is described and its performance evaluated and optimized using the established performance criterion. Finally, in order to be able to predict the resources needed for an optimization, an objective function temperament factor is defined that indicates the degree of difficulty of the objective function.

  13. Multidisciplinary conceptual design optimization of aircraft using a sound-matching-based objective function

    NASA Astrophysics Data System (ADS)

    Diez, Matteo; Iemma, Umberto

    2012-05-01

    The article presents a novel approach to include community noise considerations based on sound quality in the Multidisciplinary Conceptual Design Optimization (MCDO) of civil transportation aircraft. The novelty stems from the use of an unconventional objective function, defined as a measure of the difference between the noise emission of the aircraft under analysis and a reference 'weakly annoying' noise, the target sound. The minimization of such a merit factor yields an aircraft concept with a noise signature as close as possible to the given target. The reference sound is one of the outcomes of the European Research Project SEFA (Sound Engineering For Aircraft, VI Framework Programme, 2004-2007), and used here as an external input. The aim of the present work is to address the definition and the inclusion of the sound-matching-based objective function in the MCDO of aircraft.
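    A minimal sketch of a sound-matching merit factor of this kind (the band-wise spectral distance and weights are assumptions for illustration, not the SEFA target-sound metric):

        import numpy as np

        def sound_matching_objective(predicted_spectrum, target_spectrum, weights=None):
            """Weighted distance between the predicted noise spectrum of a candidate
            design and the 'weakly annoying' target spectrum; minimizing this drives
            the concept's noise signature toward the target sound."""
            p = np.asarray(predicted_spectrum, float)   # e.g. SPL per 1/3-octave band, dB
            t = np.asarray(target_spectrum, float)
            w = np.ones_like(p) if weights is None else np.asarray(weights, float)
            return float(np.sqrt(np.sum(w * (p - t) ** 2) / np.sum(w)))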

  14. Ecosystem health: I. Measuring ecosystem health

    NASA Astrophysics Data System (ADS)

    Schaeffer, David J.; Herricks, Edwin E.; Kerster, Harold W.

    1988-07-01

    Ecosystem analysis has been advanced by an improved understanding of how ecosystems are structured and how they function. Ecology has advanced from an emphasis on natural history to consideration of energetics, the relationships and connections between species, hierarchies, and systems theory. Still, we consider ecosystems as entities with a distinctive character and individual characteristics. Ecosystem maintenance and preservation form the objective of impact analysis, hazard evaluation, and other management or regulation activities. In this article we explore an approach to ecosystem analysis which identifies and quantifies factors which define the condition or state of an ecosystem in terms of health criteria. We relate ecosystem health to human/nonhuman animal health and explore the difficulties of defining ecosystem health and suggest criteria which provide a functional definition of state and condition. We suggest that, as has been found in human/nonhuman animal health studies, disease states can be recognized before disease is of clinical magnitude. Example disease states for ecosystems are functionally defined and discussed, together with test systems for their early detection.

  15. Are functional foods redefining nutritional requirements?

    PubMed

    Jones, Peter J; Varady, Krista A

    2008-02-01

    Functional foods are increasing in popularity owing to their ability to confer health and physiological benefits. Nevertheless, the notion that functional foods improve health when providing nutrients at levels above and beyond existing recommended intakes is inconsistent with the definition of requirement. This disparity highlights the need for an alternative definition of nutrient requirement. The present objective is to examine distinctions between optimization of health, as defined by what we currently deem as required intakes, versus adding physiological benefit using bioactive agents found in functional foods. Presently, requirement is defined as the lowest amount of intake of a nutrient that will maintain a defined level of nourishment for a specific indicator of adequacy. In contrast, functional foods are described as ingredients that are not necessary for body function, yet provide added physiological benefit that confer better overall health. Plant sterols are one example of such an ingredient. Plant sterols lower plasma cholesterol concentrations, and may thus be considered essential nutrients in physiological situations where circulating cholesterol concentrations are high. Similarly, intakes of omega-3 fats beyond existing requirement may confer additional health benefits such as hypolipidemic and anti-diabetic effects. These examples underscore the inconsistencies between what is defined as a nutrient requirement versus what is identified as a health benefit of a functional food. Such discrepancies emphasize the need for a more all-encompassing definition of a nutrient requirement; that is, one that moves beyond the prevention of overt deficiency to encompass improved health and disease risk reduction.

  16. A Guide to MERLIN.

    ERIC Educational Resources Information Center

    Pennsylvania State Dept. of Education, Harrisburg.

    The guide describes the mission, objective, and function of the Migrant Education Resources List and Information Network (MERLIN) and defines the scopes of interest currently identified as national priorities in migrant education. MERLIN is a federally funded project designed to improve interstate and intrastate coordination of migrant education…

  17. Structural Tailoring of Advanced Turboprops (STAT). Theoretical manual

    NASA Technical Reports Server (NTRS)

    Brown, K. W.

    1992-01-01

    This manual describes the theories in the Structural Tailoring of Advanced Turboprops (STAT) computer program, which was developed to perform numerical optimizations on highly swept propfan blades. The optimization procedure seeks to minimize an objective function, defined as either direct operating cost or aeroelastic differences between a blade and its scaled model, by tuning internal and external geometry variables that must satisfy realistic blade design constraints. The STAT analyses include an aerodynamic efficiency evaluation, a finite element stress and vibration analysis, an acoustic analysis, a flutter analysis, and a once-per-revolution (1-p) forced response life prediction capability. The STAT constraints include blade stresses, blade resonances, flutter, tip displacements, and a 1-P forced response life fraction. The STAT variables include all blade internal and external geometry parameters needed to define a composite material blade. The STAT objective function is dependent upon a blade baseline definition which the user supplies to describe a current blade design for cost optimization or for the tailoring of an aeroelastic scale model.

  18. Structural Tailoring of Advanced Turboprops (STAT). Theoretical manual

    NASA Astrophysics Data System (ADS)

    Brown, K. W.

    1992-10-01

    This manual describes the theories in the Structural Tailoring of Advanced Turboprops (STAT) computer program, which was developed to perform numerical optimizations on highly swept propfan blades. The optimization procedure seeks to minimize an objective function, defined as either direct operating cost or aeroelastic differences between a blade and its scaled model, by tuning internal and external geometry variables that must satisfy realistic blade design constraints. The STAT analyses include an aerodynamic efficiency evaluation, a finite element stress and vibration analysis, an acoustic analysis, a flutter analysis, and a once-per-revolution (1-p) forced response life prediction capability. The STAT constraints include blade stresses, blade resonances, flutter, tip displacements, and a 1-P forced response life fraction. The STAT variables include all blade internal and external geometry parameters needed to define a composite material blade. The STAT objective function is dependent upon a blade baseline definition which the user supplies to describe a current blade design for cost optimization or for the tailoring of an aeroelastic scale model.
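
    The STAT records above describe the generic structure of such a design problem: a scalar objective (direct operating cost or an aeroelastic-difference measure) minimized over geometry variables subject to stress, resonance, and displacement constraints. The sketch below illustrates only that structure with fictitious surrogate models in place of STAT's aerodynamic, structural, and acoustic analyses; the variable names, bounds, and constraint limits are assumptions, not values from the manual.

```python
# Hedged sketch of a constrained design-optimization loop of the kind described
# in the STAT records. The "analysis" routines are fictitious surrogates, NOT
# STAT's aerodynamic/structural/acoustic models.
import numpy as np
from scipy.optimize import minimize

def operating_cost(x):
    thickness, sweep = x
    # fictitious surrogate: cost grows away from a nominal thickness and sweep
    return (thickness - 0.04) ** 2 + 0.5 * ((sweep - 30.0) / 30.0) ** 2

def max_blade_stress(x):
    thickness, sweep = x
    return 400.0 / (thickness * 1.0e3) + 2.0 * sweep      # fictitious stress proxy

def resonance_margin(x):
    thickness, sweep = x
    return 0.1 + 5.0 * thickness - 0.001 * sweep          # fictitious margin proxy

x0 = np.array([0.05, 35.0])                               # [thickness, sweep (deg)]
constraints = [
    {"type": "ineq", "fun": lambda x: 120.0 - max_blade_stress(x)},  # stress limit
    {"type": "ineq", "fun": lambda x: resonance_margin(x) - 0.05},   # resonance margin
]
bounds = [(0.02, 0.10), (10.0, 45.0)]

result = minimize(operating_cost, x0, method="SLSQP",
                  bounds=bounds, constraints=constraints)
print("optimized variables:", result.x, "objective:", result.fun)
```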

  19. Applying a Participatory Design Approach to Define Objectives and Properties of a “Data Profiling” Tool for Electronic Health Data

    PubMed Central

    Estiri, Hossein; Lovins, Terri; Afzalan, Nader; Stephens, Kari A.

    2016-01-01

    We applied a participatory design approach to define the objectives, characteristics, and features of a “data profiling” tool for primary care Electronic Health Data (EHD). Through three participatory design workshops, we collected input from potential tool users who had experience working with EHD. We present 15 recommended features and characteristics for the data profiling tool. From these recommendations we derived three overarching objectives and five properties for the tool. A data profiling tool, in Biomedical Informatics, is a visual, clear, usable, interactive, and smart tool that is designed to inform clinical and biomedical researchers of data utility and let them explore the data, while conveniently orienting the users to the tool’s functionalities. We suggest that developing scalable data profiling tools will provide new capacities to disseminate knowledge about clinical data that will foster translational research and accelerate new discoveries. PMID:27570651

  20. [Upper Age Limit in Outpatient Anesthesia: Opportunities and Risks].

    PubMed

    Hüppe, Tobias; Kneller, Nicole; Raddatz, Alexander

    2018-05-01

    Ambulatory surgery in elderly patients continues to increase, with avoidance of hospitalization, and thus of postoperative cognitive dysfunction, among its major objectives. An upper age limit in outpatient anesthesia does not exist to date. However, functional rather than chronological age is crucial in patient selection. In consensus discussion, baseline functional status should be evaluated regularly, defined as the everyday behaviors necessary to maintain daily life and encompassing areas of physical, cognitive, and social functioning. Moreover, frailty in elderly patients can be quantified objectively and is associated with increased perioperative morbidity in ambulatory general surgery. The decision for or against outpatient anesthesia therefore remains a case-by-case decision which should be discussed within a team.

  1. Wave drag as the objective function in transonic fighter wing optimization

    NASA Technical Reports Server (NTRS)

    Phillips, P. S.

    1984-01-01

    The original computational method for determining wave drag in a three dimensional transonic analysis method was replaced by a wave drag formula based on the loss in momentum across an isentropic shock. This formula was used as the objective function in a numerical optimization procedure to reduce the wave drag of a fighter wing at transonic maneuver conditions. The optimization procedure minimized wave drag through modifications to the wing section contours defined by a wing profile shape function. A significant reduction in wave drag was achieved while maintaining a high lift coefficient. Comparisons of the pressure distributions for the initial and optimized wing geometries showed significant reductions in the leading-edge peaks and shock strength across the span.

  2. The International VEGA "Venus-Halley" (1984-1986) Experiment: Description and Scientific Objectives

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Venus-Halley (Vega) project will provide a unique opportunity to combine a mission over Venus with a transfer flight to Halley's comet. This project is based on three research goals: (1) to study the surface of Venus; (2) to study the air circulation on Venus and its meteorological parameters; and (3) to study Halley's comet. The objective of the study of Halley's comet is to: determine the physical characteristics of its nucleus; define the structure and dynamics of the coma around the nucleus; define the gas composition near the nucleus; investigate the dust particle distribution as a function of mass at various distances from the nucleus; and investigate the solar wind interaction with the atmosphere and ionosphere of the comet.

  3. The reinvigoration of public health nursing: methods and innovations.

    PubMed

    Avila, Margaret; Smith, Kathleen

    2003-01-01

    Los Angeles County (LAC) restructured and reinvigorated public health in response to nationwide concern over the adequacy of all public health infrastructures and functions. LAC's reorganization into geographically defined service planning areas (SPAs) has facilitated the integration of core public health functions into local practice. Public health nurses practicing as generalists within their SPA identified three initial objectives to address in population-based care: (1) expanding practice beyond disease control to a more holistic approach, (2) providing consultation using the Ask-the-Nurse innovation, and (3) developing a community assessment database for interdisciplinary SPA health planning. Additional innovative objectives are planned for the future.

  4. Teleman localization of Hochschild homology in a singular setting

    NASA Astrophysics Data System (ADS)

    Brasselet, J.-P.; Legrand, A.

    2009-09-01

    The aim of this paper is to generalize the Hochschild-Kostant-Rosenberg theorem to the case of singular varieties, more precisely, to manifolds with boundary and to varieties with isolated singularities. In these situations, we define suitable algebras of functions and study the localization of the corresponding Hochschild homology. The tool we use is the Teleman localization process. In the case of isolated singularities, the closed Hochschild homology corresponds to the intersection complex which relates the objects defined here to intersection homology.

  5. Image Segmentation for Improvised Explosive Devices

    DTIC Science & Technology

    2012-12-01

    …us to generate color models for IEDs without user input that labels parts of the IED. … All graph cut algorithms we analyze define the undirected network G(V, E) as a set of nodes V, edges E, and capacities C: E → R. … algorithms we study, this objective function is the sum of the two functions U and V, where the function U is a region property which evaluates the…
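
    The recovered fragments describe a two-term graph-cut objective: a region (unary) term U plus a boundary (pairwise) term V over a network of pixels. The sketch below evaluates such an energy for a binary labeling; the specific forms chosen for U and V and the 4-neighbor connectivity are illustrative assumptions, not the report's definitions.

```python
# Minimal sketch of a two-term graph-cut style objective:
# E(labels) = region (unary) term U + boundary (pairwise) term V over
# 4-connected pixel neighbors. The forms of U and V are illustrative.
import numpy as np

def region_term(image, labels, fg_mean, bg_mean):
    # U: penalize assigning a pixel to the model whose mean it does not match
    cost_fg = (image - fg_mean) ** 2
    cost_bg = (image - bg_mean) ** 2
    return np.where(labels == 1, cost_fg, cost_bg).sum()

def boundary_term(labels, weight=1.0):
    # V: pay `weight` for every 4-neighbor pair with different labels
    horiz = (labels[:, 1:] != labels[:, :-1]).sum()
    vert = (labels[1:, :] != labels[:-1, :]).sum()
    return weight * (horiz + vert)

def energy(image, labels, fg_mean, bg_mean, weight=1.0):
    return region_term(image, labels, fg_mean, bg_mean) + boundary_term(labels, weight)

rng = np.random.default_rng(0)
img = rng.random((8, 8))
lab = (img > 0.5).astype(int)
print(energy(img, lab, fg_mean=0.8, bg_mean=0.2, weight=0.1))
```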

  6. Markov and semi-Markov processes as a failure rate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabski, Franciszek

    2016-06-08

    In this paper the reliability function is defined by a stochastic failure rate process with non-negative and right-continuous trajectories. Equations for the conditional reliability functions of an object are derived under the assumption that the failure rate is a semi-Markov process with an at most countable state space. An appropriate theorem is presented. The linear systems of equations for the corresponding Laplace transforms make it possible to find the reliability functions for the alternating, the Poisson and the Furry-Yule failure rate processes.
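
    For a concrete reading of a reliability function driven by a stochastic failure rate, R(t) = E[exp(-∫₀ᵗ λ(X_s) ds)] can be estimated by Monte Carlo when X_s is an alternating two-state Markov process, one of the cases mentioned above. The rates and switching intensities below are illustrative, and the simulation is a sketch rather than the paper's analytical Laplace-transform solution.

```python
# Monte Carlo sketch of a reliability function defined through a stochastic
# failure rate: R(t) = E[exp(-integral_0^t lambda(X_s) ds)], where X_s is an
# alternating two-state Markov process. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
LAMBDA = np.array([0.02, 0.20])   # failure rate in state 0 ("good") and 1 ("degraded")
SWITCH = np.array([0.10, 0.50])   # exponential rate of leaving state 0 / state 1

def integrated_rate(t_end):
    """Simulate one trajectory and return the integral of lambda(X_s) on [0, t_end]."""
    t, state, acc = 0.0, 0, 0.0
    while t < t_end:
        dwell = rng.exponential(1.0 / SWITCH[state])
        dt = min(dwell, t_end - t)
        acc += LAMBDA[state] * dt
        t += dt
        state = 1 - state
    return acc

def reliability(t_end, n_paths=20000):
    samples = np.array([integrated_rate(t_end) for _ in range(n_paths)])
    return np.exp(-samples).mean()

for t in (1.0, 5.0, 10.0):
    print(t, reliability(t))
```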

  7. The Capability And Enhancement Of VDANL And TWOPAS For Analyzing Vehicle Performance On Upgrades And Downgrades Within IHSDM

    DOT National Transportation Integrated Search

    2000-01-21

    This report documents the results and recommendations for defining and analyzing "Vehicle Performance on Upgrades and Downgrades" on two lane rural roads. The contract objective was to develop functional requirements (and identify gaps) to enhance th...

  8. Communication and Cultural Interpretation.

    ERIC Educational Resources Information Center

    Carbaugh, Donal

    1991-01-01

    Argues with John Fiske's position on the nature and function of cultural interpretation. Defines and defends cultural interpretation as an investigative mode the main objective of which is to render participants' communication practices coherent and intelligible, through an explication of a system of symbols, symbolic forms, and meanings which is…

  9. Iterative optimizing quantization method for reconstructing three-dimensional images from a limited number of views

    DOEpatents

    Lee, H.R.

    1997-11-18

    A three-dimensional image reconstruction method comprises treating the object of interest as a group of elements with a size that is determined by the resolution of the projection data, e.g., as determined by the size of each pixel. One of the projections is used as a reference projection. A fictitious object is arbitrarily defined that is constrained by such reference projection. The method modifies the known structure of the fictitious object by comparing and optimizing its four projections to those of the unknown structure of the real object and continues to iterate until the optimization is limited by the residual sum of background noise. The method is composed of several sub-processes that acquire four projections from the real data and the fictitious object: generate an arbitrary distribution to define the fictitious object, optimize the four projections, generate a new distribution for the fictitious object, and enhance the reconstructed image. The sub-process for the acquisition of the four projections from the input real data is simply the function of acquiring the four projections from the data of the transmitted intensity. The transmitted intensity represents the density distribution, that is, the distribution of absorption coefficients through the object. 5 figs.
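
    A minimal two-dimensional sketch of the iterative projection-matching idea in this record is given below: start from an arbitrary fictitious object, compare its projections with those measured from the real object, and apply corrections until the residual stops improving. Only row and column projections are used here, whereas the patent works with four projections and additional enhancement steps, so this is an illustration of the loop, not the patented procedure.

```python
# Sketch of projection-matching reconstruction: iteratively adjust a
# "fictitious object" so that its projections agree with the measured ones.
import numpy as np

rng = np.random.default_rng(2)
true_obj = np.zeros((16, 16))
true_obj[4:10, 6:13] = 1.0                      # unknown structure (for demo only)

row_proj = true_obj.sum(axis=1)                 # "measured" projections
col_proj = true_obj.sum(axis=0)

recon = rng.random(true_obj.shape)              # arbitrary fictitious object
n = true_obj.shape[0]
for _ in range(200):
    # distribute the row-projection error evenly along each row, then columns
    recon += (row_proj - recon.sum(axis=1))[:, None] / n
    recon += (col_proj - recon.sum(axis=0))[None, :] / n
    recon = np.clip(recon, 0.0, None)           # densities cannot be negative

residual = np.abs(recon.sum(axis=1) - row_proj).sum() + \
           np.abs(recon.sum(axis=0) - col_proj).sum()
print("projection residual:", residual)
```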

  10. Color object detection using spatial-color joint probability functions.

    PubMed

    Luo, Jiebo; Crandall, David

    2006-06-01

    Object detection in unconstrained images is an important image understanding problem with many potential applications. There has been little success in creating a single algorithm that can detect arbitrary objects in unconstrained images; instead, algorithms typically must be customized for each specific object. Consequently, it typically requires a large number of exemplars (for rigid objects) or a large amount of human intuition (for nonrigid objects) to develop a robust algorithm. We present a robust algorithm designed to detect a class of compound color objects given a single model image. A compound color object is defined as having a set of multiple, particular colors arranged spatially in a particular way, including flags, logos, cartoon characters, people in uniforms, etc. Our approach is based on a particular type of spatial-color joint probability function called the color edge co-occurrence histogram. In addition, our algorithm employs perceptual color naming to handle color variation, and prescreening to limit the search scope (i.e., size and location) for the object. Experimental results demonstrated that the proposed algorithm is insensitive to object rotation, scaling, partial occlusion, and folding, outperforming a closely related algorithm based on color co-occurrence histograms by a decisive margin.
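
    The sketch below illustrates one way to build a spatial-color joint probability function in the spirit of the color edge co-occurrence histogram: quantize colors, detect edge pixels, and count co-occurring color pairs at a fixed offset. The quantization scheme, gradient-based edge test, and single offset are simplifications assumed for the example; the published algorithm also uses perceptual color naming and prescreening.

```python
# Illustrative sketch of a color edge co-occurrence histogram: count pairs of
# quantized colors that co-occur at a fixed spatial offset, restricted to edge
# pixels. The quantization, edge test, and offset are simplifying assumptions.
import numpy as np

def quantize(rgb, levels=4):
    """Map an HxWx3 float image in [0,1] to a single color index per pixel."""
    q = np.clip((rgb * levels).astype(int), 0, levels - 1)
    return q[..., 0] * levels * levels + q[..., 1] * levels + q[..., 2]

def edge_mask(gray, thresh=0.1):
    gx = np.abs(np.diff(gray, axis=1, prepend=gray[:, :1]))
    gy = np.abs(np.diff(gray, axis=0, prepend=gray[:1, :]))
    return (gx + gy) > thresh

def edge_cooccurrence(rgb, offset=(0, 3), levels=4):
    colors = quantize(rgb, levels)
    mask = edge_mask(rgb.mean(axis=2))
    dy, dx = offset
    n_colors = levels ** 3
    hist = np.zeros((n_colors, n_colors))
    h, w = colors.shape
    for y in range(h - dy):
        for x in range(w - dx):
            if mask[y, x] and mask[y + dy, x + dx]:
                hist[colors[y, x], colors[y + dy, x + dx]] += 1
    total = hist.sum()
    return hist / total if total else hist

rng = np.random.default_rng(3)
img = rng.random((32, 32, 3))
print(edge_cooccurrence(img).shape)   # normalized (64, 64) joint histogram
```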

  11. It's all connected: Pathways in visual object recognition and early noun learning.

    PubMed

    Smith, Linda B

    2013-11-01

    A developmental pathway may be defined as the route, or chain of events, through which a new structure or function forms. For many human behaviors, including object name learning and visual object recognition, these pathways are often complex and multicausal and include unexpected dependencies. This article presents three principles of development that suggest the value of a developmental psychology that explicitly seeks to trace these pathways and uses empirical evidence on developmental dependencies among motor development, action on objects, visual object recognition, and object name learning in 12- to 24-month-old infants to make the case. The article concludes with a consideration of the theoretical implications of this approach. (PsycINFO Database Record (c) 2013 APA, all rights reserved).

  12. The importance of chemistry in creating well-defined nanoscopic embedded therapeutics: devices capable of the dual functions of imaging and therapy.

    PubMed

    Nyström, Andreas M; Wooley, Karen L

    2011-10-18

    Nanomedicine is a rapidly evolving field, for which polymer building blocks are proving useful for the construction of sophisticated devices that provide enhanced diagnostic imaging and treatment of disease, known as theranostics. These well-defined nanoscopic objects have high loading capacities, can protect embedded therapeutic cargo, and offer control over the conditions and rates of release. Theranostics also offer external surface area for the conjugation of ligands to impart stealth characteristics and/or direct their interactions with biological receptors and provide a framework for conjugation of imaging agents to track delivery to diseased site(s). The nanoscopic dimensions allow for extensive biological circulation. The incorporation of such multiple functions is complicated, requiring exquisite chemical control during production and rigorous characterization studies to confirm the compositions, structures, properties, and performance. We are particularly interested in the study of nanoscopic objects designed for treatment of lung infections and acute lung injury, urinary tract infections, and cancer. This Account highlights our work over several years to tune the assembly of unique nanostructures. We provide examples of how the composition, structure, dimensions, and morphology of theranostic devices can tune their performance as drug delivery agents for the treatment of infectious diseases and cancer. The evolution of nanostructured materials from relatively simple overall shapes and internal morphologies to those of increasing complexity is driving the development of synthetic methodologies for the preparation of increasingly complex nanomedicine devices. Our nanomedicine devices are derived from macromolecules that have well-defined compositions, structures, and topologies, which provide a framework for their programmed assembly into nanostructures with controlled sizes, shapes, and morphologies. The inclusion of functional units within selective compartments/domains allows us to create (multi)functional materials. We employ combinations of controlled radical and ring-opening polymerizations, chemical transformations, and supramolecular assembly to construct such materials as functional entities. The use of multifunctional monomers with selective polymerization chemistries affords regiochemically functionalized polymers. Further supramolecular assembly processes in water with further chemical transformations provide discrete nanoscopic objects within aqueous solutions. This approach echoes processes in nature, whereby small molecules (amino acids, nucleic acids, saccharides) are linked into polymers (proteins, DNA/RNA, polysaccharides, respectively) and then those polymers fold into three-dimensional conformations that can lead to nanoscopic functional entities.

  13. REFLEX MODIFICATION: AN APPROACH TO INCORPORATE INTO A DEVELOPMENTAL NEUROTOXICITY (DNT) STUDY DESIGN WITH COMMERCIAL EQUIPMENT.

    EPA Science Inventory

    Reflex modification (RM) of the startle response is a very useful tool for testing sensory function and the integrity of a well-defined complement of neural circuits. Advantages of this procedure include the ability to rapidly acquire objective measurements and differentiate sen...

  14. Health Occupations Module. The Skeletal System--I.

    ERIC Educational Resources Information Center

    Temple Univ., Philadelphia, PA. Div. of Vocational Education.

    This module on the skeletal system is one of eight modules designed for individualized instruction in health occupations education programs at both the secondary and postsecondary levels. This module contains an introduction to the module topic, three objectives (e.g., define the skeletal system and list its functions), and three learning…

  15. Fast global image smoothing based on weighted least squares.

    PubMed

    Min, Dongbo; Choi, Sunghwan; Lu, Jiangbo; Ham, Bumsub; Sohn, Kwanghoon; Do, Minh N

    2014-12-01

    This paper presents an efficient technique for performing a spatially inhomogeneous edge-preserving image smoothing, called fast global smoother. Focusing on sparse Laplacian matrices consisting of a data term and a prior term (typically defined using four or eight neighbors for a 2D image), our approach efficiently solves such global objective functions. In particular, we approximate the solution of the memory- and computation-intensive large linear system, defined over a d-dimensional spatial domain, by solving a sequence of 1D subsystems. Our separable implementation enables applying a linear-time tridiagonal matrix algorithm to solve d three-point Laplacian matrices iteratively. Our approach combines the best of two paradigms, i.e., efficient edge-preserving filters and optimization-based smoothing. Our method has a comparable runtime to the fast edge-preserving filters, but its global optimization formulation overcomes many limitations of the local filtering approaches. Our method also achieves high-quality results comparable to the state-of-the-art optimization-based techniques, but runs ∼10-30 times faster. Besides, considering the flexibility in defining an objective function, we further propose generalized fast algorithms that perform Lγ norm smoothing (0 < γ < 2) and support an aggregated (robust) data term for handling imprecise data constraints. We demonstrate the effectiveness and efficiency of our techniques in a range of image processing and computer graphics applications.
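
    The core computational step described above, solving a three-point (tridiagonal) Laplacian system per 1D pass, can be sketched directly. The example below minimizes Σᵢ(uᵢ - fᵢ)² + λ Σᵢ wᵢ(uᵢ₊₁ - uᵢ)² along one row using the linear-time Thomas algorithm; the exponential guidance weights are an assumed affinity, not necessarily the paper's exact choice.

```python
# Sketch of one 1D pass of an edge-preserving weighted-least-squares smoother:
# minimize sum_i (u_i - f_i)^2 + lam * sum_i w_i (u_{i+1} - u_i)^2.
# The normal equations (I + lam*L) u = f are tridiagonal and solved in O(n)
# with the Thomas algorithm. The weight function is an illustrative choice.
import numpy as np

def thomas(lower, diag, upper, rhs):
    """Solve a tridiagonal system in O(n); operates on copies of its inputs."""
    n = len(diag)
    c = upper.astype(float)
    d = rhs.astype(float).copy()
    b = diag.astype(float).copy()
    for i in range(1, n):
        m = lower[i - 1] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    x = np.empty(n)
    x[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (d[i] - c[i] * x[i + 1]) / b[i]
    return x

def wls_smooth_1d(f, guide, lam=5.0, sigma=0.1):
    w = np.exp(-np.abs(np.diff(guide)) / sigma)     # affinity between neighbors
    diag = 1.0 + lam * np.concatenate(([w[0]], w[:-1] + w[1:], [w[-1]]))
    off = -lam * w
    return thomas(off, diag, off, f)

signal = np.concatenate([np.zeros(50), np.ones(50)]) + \
         0.1 * np.random.default_rng(4).standard_normal(100)
print(wls_smooth_1d(signal, signal)[:5])
```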

  16. A method for automatically optimizing medical devices for treating heart failure: designing polymeric injection patterns.

    PubMed

    Wenk, Jonathan F; Wall, Samuel T; Peterson, Robert C; Helgerson, Sam L; Sabbah, Hani N; Burger, Mike; Stander, Nielen; Ratcliffe, Mark B; Guccione, Julius M

    2009-12-01

    Heart failure continues to present a significant medical and economic burden throughout the developed world. Novel treatments involving the injection of polymeric materials into the myocardium of the failing left ventricle (LV) are currently being developed, which may reduce elevated myofiber stresses during the cardiac cycle and act to retard the progression of heart failure. A finite element (FE) simulation-based method was developed in this study that can automatically optimize the injection pattern of the polymeric "inclusions" according to a specific objective function, using commercially available software tools. The FE preprocessor TRUEGRID® was used to create a parametric axisymmetric LV mesh matched to experimentally measured end-diastole and end-systole metrics from dogs with coronary microembolization-induced heart failure. Passive and active myocardial material properties were defined by a pseudo-elastic-strain energy function and a time-varying elastance model of active contraction, respectively, that were implemented in the FE software LS-DYNA. The companion optimization software LS-OPT was used to communicate directly with TRUEGRID® to determine FE model parameters, such as defining the injection pattern and inclusion characteristics. The optimization resulted in an intuitive optimal injection pattern (i.e., the one with the greatest number of inclusions) when the objective function was weighted to minimize mean end-diastolic and end-systolic myofiber stress and ignore LV stroke volume. In contrast, the optimization resulted in a nonintuitive optimal pattern (i.e., 3 inclusions longitudinally × 6 inclusions circumferentially) when both myofiber stress and stroke volume were incorporated into the objective function with different weights.

  17. A development framework for distributed artificial intelligence

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Cottman, Bruce H.

    1989-01-01

    The authors describe distributed artificial intelligence (DAI) applications in which multiple organizations of agents solve multiple domain problems. They then describe work in progress on a DAI system development environment, called SOCIAL, which consists of three primary language-based components. The Knowledge Object Language defines models of knowledge representation and reasoning. The metaCourier language supplies the underlying functionality for interprocess communication and control access across heterogeneous computing environments. The metaAgents language defines models for agent organization coordination, control, and resource management. Application agents and agent organizations will be constructed by combining metaAgents and metaCourier building blocks with task-specific functionality such as diagnostic or planning reasoning. This architecture hides implementation details of communications, control, and integration in distributed processing environments, enabling application developers to concentrate on the design and functionality of the intelligent agents and agent networks themselves.

  18. Angular trapping of anisometric nano-objects in a fluid.

    PubMed

    Celebrano, Michele; Rosman, Christina; Sönnichsen, Carsten; Krishnan, Madhavi

    2012-11-14

    We demonstrate the ability to trap, levitate, and orient single anisometric nanoscale objects with high angular precision in a fluid. An electrostatic fluidic trap confines a spherical object at a spatial location defined by the minimum of the electrostatic system free energy. For an anisometric object and a potential well lacking angular symmetry, the system free energy can further strongly depend on the object's orientation in the trap. Engineering the morphology of the trap thus enables precise spatial and angular confinement of a single levitating nano-object, and the process can be massively parallelized. Since the physics of the trap depends strongly on the surface charge of the object, the method is insensitive to the object's dielectric function. Furthermore, levitation of the assembled objects renders them amenable to individual manipulation using externally applied optical, electrical, or hydrodynamic fields, raising prospects for reconfigurable chip-based nano-object assemblies.

  19. Improved Holistic Analysis of Rayleigh Waves for Single- and Multi-Offset Data: Joint Inversion of Rayleigh-Wave Particle Motion and Vertical- and Radial-Component Velocity Spectra

    NASA Astrophysics Data System (ADS)

    Dal Moro, Giancarlo; Moustafa, Sayed S. R.; Al-Arifi, Nassir S.

    2018-01-01

    Rayleigh waves often propagate according to complex mode excitation so that the proper identification and separation of specific modes can be quite difficult or, in some cases, just impossible. Furthermore, the analysis of a single component (i.e., an inversion procedure based on just one objective function) necessarily prevents solving the problems related to the non-uniqueness of the solution. To overcome these issues and define a holistic analysis of Rayleigh waves, we implemented a procedure to acquire data that are useful to define and efficiently invert the three objective functions defined from the three following "objects": the velocity spectra of the vertical- and radial-components and the Rayleigh-wave particle motion (RPM) frequency-offset data. Two possible implementations are presented. In the first case we consider classical multi-offset (and multi-component) data, while in a second possible approach we exploit the data recorded by a single three-component geophone at a fixed offset from the source. Given the simple field procedures, the method could be particularly useful for the unambiguous geotechnical exploration of large areas, where more complex acquisition procedures, based on the joint acquisition of Rayleigh and Love waves, would not be economically viable. After illustrating the different kinds of data acquisition and the data processing, the results of the proposed methodology are illustrated in a case study. Finally, a series of theoretical and practical aspects are discussed to clarify some issues involved in the overall procedure (data acquisition and processing).

  20. Interaction rules for symbol-oriented graphical user interfaces

    NASA Astrophysics Data System (ADS)

    Brinkschulte, Uwe; Vogelsang, Holger; Wolf, Luc

    1999-03-01

    This work describes a way of interactive manipulation of structured objects by interaction rules. Symbols are used as graphical representation of object states. State changes lead to different visual symbol instances. The manipulation of symbols using interactive devices leads to an automatic state change of the corresponding structured object without any intervention of the application. Therefore, interaction rules are introduced. These rules describe the way a symbol may be manipulated and the effects this manipulation has on the corresponding structured object. The rules are interpreted by the visualization and interaction service. For each symbol used, a set of interaction rules can be defined. In order to be as general as possible, all the interactions on a symbol are defined as a triple, which specifies the preconditions of all the manipulations of this symbol, the manipulations themselves, and the postconditions of all the manipulations of this symbol. A manipulation is a quintuplet, which describes the possible initial events of the manipulation, the possible places of these events, the preconditions of this manipulation, the results of this manipulation, and the postconditions of this manipulation. Finally, reflection functions map the results of a manipulation to the new state of a structured object.
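
    One possible reading of the rule structure described above is sketched below: an interaction rule is a triple (preconditions, manipulations, postconditions) and each manipulation a quintuplet (initial events, places, preconditions, results, postconditions). The field names and the evaluation logic are interpretive assumptions, not the authors' implementation.

```python
# Interpretive sketch of the interaction-rule structure: a rule is a triple and
# a manipulation is a quintuplet. Field names and logic are assumptions.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Manipulation:
    initial_events: List[str]              # e.g. ["mouse_down", "key_return"]
    places: List[str]                      # where on the symbol the event may occur
    preconditions: List[Callable[[dict], bool]]
    results: List[Callable[[dict], None]]  # state changes applied to the object
    postconditions: List[Callable[[dict], bool]]

@dataclass
class InteractionRule:
    preconditions: List[Callable[[dict], bool]]
    manipulations: List[Manipulation]
    postconditions: List[Callable[[dict], bool]]

    def apply(self, state: dict, event: str, place: str) -> bool:
        if not all(p(state) for p in self.preconditions):
            return False
        for m in self.manipulations:
            if (event in m.initial_events and place in m.places
                    and all(p(state) for p in m.preconditions)):
                for effect in m.results:
                    effect(state)
                return (all(p(state) for p in m.postconditions)
                        and all(p(state) for p in self.postconditions))
        return False

# toggle a hypothetical valve symbol when it is clicked on its body
toggle = Manipulation(["mouse_down"], ["body"], [lambda s: not s["locked"]],
                      [lambda s: s.update(open=not s["open"])], [lambda s: True])
rule = InteractionRule([lambda s: True], [toggle], [lambda s: True])
valve = {"open": False, "locked": False}
print(rule.apply(valve, "mouse_down", "body"), valve)
```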

  1. Content-based fused off-axis object illumination direct-to-digital holography

    DOEpatents

    Price, Jeffery R.

    2006-05-02

    Systems and methods are described for content-based fused off-axis illumination direct-to-digital holography. A method includes calculating an illumination angle with respect to an optical axis defined by a focusing lens as a function of data representing a Fourier analyzed spatially heterodyne hologram; reflecting a reference beam from a reference mirror at a non-normal angle; reflecting an object beam from an object, the object beam being incident upon the object at the illumination angle; focusing the reference beam and the object beam at a focal plane of a digital recorder to form the content-based off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis; and digitally recording the content-based off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis.

  2. Functional Requirements for Onboard Management of Space Shuttle Consumables. Volume 2

    NASA Technical Reports Server (NTRS)

    Graf, P. J.; Herwig, H. A.; Neel, L. W.

    1973-01-01

    This report documents the results of the study "Functional Requirements for Onboard Management of Space Shuttle Consumables." The study was conducted for the Mission Planning and Analysis Division of the NASA Lyndon B. Johnson Space Center, Houston, Texas, between 3 July 1972 and 16 November 1973. The overall study program objective was two-fold. The first objective was to define a generalized consumable management concept which is applicable to advanced spacecraft. The second objective was to develop a specific consumables management concept for the Space Shuttle vehicle and to generate the functional requirements for the onboard portion of that concept. Consumables management is the process of controlling or influencing the usage of expendable materials involved in vehicle subsystem operation. The report consists of two volumes. Volume I presents a description of the study activities related to general approaches for developing consumable management, concepts for advanced spacecraft applications, and functional requirements for a Shuttle consumables management concept. Volume II presents a detailed description of the onboard consumables management concept proposed for use on the Space Shuttle.

  3. Role of exponential type random invexities for asymptotically sufficient efficiency conditions in semi-infinite multi-objective fractional programming.

    PubMed

    Verma, Ram U; Seol, Youngsoo

    2016-01-01

    First, a new notion of the random exponential Hanson-Antczak type [Formula: see text]-V-invexity is introduced, which generalizes most of the existing notions in the literature; second, a random function [Formula: see text] of the second order is defined; and finally, a class of asymptotically sufficient efficiency conditions in semi-infinite multi-objective fractional programming is established. Furthermore, several sets of asymptotic sufficiency results are examined and proved in which various generalized exponential type [Formula: see text]-V-invexity assumptions are imposed on certain vector functions whose components are the individual problem functions as well as some combinations of them. To the best of our knowledge, all the established results on the semi-infinite aspects of multi-objective fractional programming are new, representing a significantly new and emerging field of interdisciplinary research. We also observed that the investigated results can be modified and applied to several special classes of nonlinear programming problems.

  4. Engineering intelligent tutoring systems

    NASA Technical Reports Server (NTRS)

    Warren, Kimberly C.; Goodman, Bradley A.

    1993-01-01

    We have defined an object-oriented software architecture for Intelligent Tutoring Systems (ITS's) to facilitate the rapid development, testing, and fielding of ITS's. This software architecture partitions the functionality of the ITS into a collection of software components with well-defined interfaces and execution concept. The architecture was designed to isolate advanced technology components, partition domain dependencies, take advantage of the increased availability of commercial software packages, and reduce the risks involved in acquiring ITS's. A key component of the architecture, the Executive, is a publish and subscribe message handling component that coordinates all communication between ITS components.

  5. Reusable Agena study. Volume 2: Technical

    NASA Technical Reports Server (NTRS)

    Carter, W. K.; Piper, J. E.; Douglass, D. A.; Waller, E. W.; Hopkins, C. V.; Fitzgerald, E. T.; Sagawa, S. S.; Carter, S. A.; Jensen, H. L.

    1974-01-01

    The application of the existing Agena vehicle as a reusable upper stage for the space shuttle is discussed. The primary objective of the study is to define those changes to the Agena required for it to function in the reusable mode in the 100 percent capture of the NASA-DOD mission model. This 100 percent capture is achieved without use of kick motors or stages by simply increasing the Agena propellant load by using optional strap-on-tanks. The required shuttle support equipment, launch and flight operations techniques, development program, and cost package are also defined.

  6. Critical Uses of College Resources. Part I: Personnel Utilization System.

    ERIC Educational Resources Information Center

    Vlahos, Mantha

    A Personnel Utilization System has been designed at Broward Community College, which combines payroll, personnel, course, and function information in order to determine the actual duties performed by personnel for the amount of remuneration received. Objectives of the system are (1) to define the tasks being performed by faculty, staff, and…

  7. Discovering the Laplace Transform in Undergraduate Differential Equations

    ERIC Educational Resources Information Center

    Quinn, Terrance J.; Rai, Sanjay

    2008-01-01

    The Laplace Transform is an object of fundamental importance in pure and applied mathematics. In addition, it has special pedagogical value in that it can provide a natural and concrete setting for a student to begin thinking about the modern concepts of "operator" and "functional". Most undergraduate textbooks, however, merely define the…

  8. The Images and Realities of Career Education.

    ERIC Educational Resources Information Center

    Parnell, Dale

    It is important that career education goals and objectives be clarified and any present inaccurate images be changed. Career education can be defined as that delivery system which helps students develop the necessary competencies to function in the real-life role of producer or wage earner. Thinking in terms of competencies required to function…

  9. StrateGene: object-oriented programming in molecular biology.

    PubMed

    Carhart, R E; Cash, H D; Moore, J F

    1988-03-01

    This paper describes some of the ways that object-oriented programming methodologies have been used to represent and manipulate biological information in a working application. When running on a Xerox 1100 series computer, StrateGene functions as a genetic engineering workstation for the management of information about cloning experiments. It represents biological molecules, enzymes, fragments, and methods as classes, subclasses, and members in a hierarchy of objects. These objects may have various attributes, which themselves can be defined and classified. The attributes and their values can be passed from the classes of objects down to the subclasses and members. The user can modify the objects and their attributes while using them. New knowledge and changes to the system can be incorporated relatively easily. The operations on the biological objects are associated with the objects themselves. This makes it easier to invoke them correctly and allows generic operations to be customized for the particular object.

  10. Accurate diagnosis of myalgic encephalomyelitis and chronic fatigue syndrome based upon objective test methods for characteristic symptoms

    PubMed Central

    Twisk, Frank NM

    2015-01-01

    Although myalgic encephalomyelitis (ME) and chronic fatigue syndrome (CFS) are considered to be synonymous, the definitional criteria for ME and CFS define two distinct, partially overlapping, clinical entities. ME, whether defined by the original criteria or by the recently proposed criteria, is not equivalent to CFS, let alone a severe variant of incapacitating chronic fatigue. Distinctive features of ME are: muscle weakness and easy muscle fatigability, cognitive impairment, circulatory deficits, a marked variability of the symptoms in presence and severity, but above all, post-exertional “malaise”: a (delayed) prolonged aggravation of symptoms after a minor exertion. In contrast, CFS is primarily defined by (unexplained) chronic fatigue, which should be accompanied by four out of a list of 8 symptoms, e.g., headaches. Due to the subjective nature of several symptoms of ME and CFS, researchers and clinicians have questioned the physiological origin of these symptoms and qualified ME and CFS as functional somatic syndromes. However, various characteristic symptoms, e.g., post-exertional “malaise” and muscle weakness, can be assessed objectively using well-accepted methods, e.g., cardiopulmonary exercise tests and cognitive tests. The objective measures acquired by these methods should be used to accurately diagnose patients, to evaluate the severity and impact of the illness objectively and to assess the positive and negative effects of proposed therapies impartially. PMID:26140274

  11. Defining Bladder Health in Women and Girls: Implications for Research, Clinical Practice, and Public Health Promotion.

    PubMed

    Lukacz, Emily S; Bavendam, Tamara G; Berry, Amanda; Fok, Cynthia S; Gahagan, Sheila; Goode, Patricia S; Hardacker, Cecilia T; Hebert-Beirne, Jeni; Lewis, Cora E; Lewis, Jessica; Low, Lisa Kane; Lowder, Jerry L; Palmer, Mary H; Smith, Ariana L; Brady, Sonya S

    2018-05-24

    Bladder health in women and girls is poorly understood, in part, due to absence of a definition for clinical or research purposes. This article describes the process used by a National Institutes of Health funded transdisciplinary research team (The Prevention of Lower Urinary Tract Symptoms [PLUS] Consortium) to develop a definition of bladder health. The PLUS Consortium identified currently accepted lower urinary tract symptoms (LUTS) and outlined elements of storage and emptying functions of the bladder. Consistent with the World Health Organization's definition of health, PLUS concluded that absence of LUTS was insufficient and emphasizes the bladder's ability to adapt to short-term physical, psychosocial, and environmental challenges for the final definition. Definitions for subjective experiences and objective measures of bladder dysfunction and health were drafted. An additional bioregulatory function to protect against infection, neoplasia, chemical, or biologic threats was proposed. PLUS proposes that bladder health be defined as: "A complete state of physical, mental, and social well-being related to bladder function and not merely the absence of LUTS. Healthy bladder function permits daily activities, adapts to short-term physical or environmental stressors, and allows optimal well-being (e.g., travel, exercise, social, occupational, or other activities)." Definitions for each element of bladder function are reported with suggested subjective and objective measures. PLUS used a comprehensive transdisciplinary process to develop a bladder health definition. This will inform instrument development for evaluation of bladder health promotion and prevention of LUTS in research, practice, and public health initiatives.

  12. In-Space Radiator Shape Optimization using Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Hull, Patrick V.; Kittredge, Ken; Tinker, Michael; SanSoucie, Michael

    2006-01-01

    Future space exploration missions will require the development of more advanced in-space radiators. These radiators should be highly efficient and lightweight, deployable heat rejection systems. Typical radiators for in-space heat mitigation commonly comprise a substantial portion of the total vehicle mass. A small mass savings of even 5-10% can greatly improve vehicle performance. The objective of this paper is to present the development of detailed tools for the analysis and design of in-space radiators using evolutionary computation techniques. The optimality criterion is defined as a two-dimensional radiator with a shape demonstrating the smallest mass for the greatest overall heat transfer, thus the end result is a set of highly functional radiator designs. This cross-disciplinary work combines topology optimization and thermal analysis design by means of a genetic algorithm. The proposed design tool consists of the following steps: design parameterization based on the exterior boundary of the radiator, objective function definition (mass minimization and heat loss maximization), objective function evaluation via finite element analysis (thermal radiation analysis) and optimization based on evolutionary algorithms. The radiator design problem is defined as follows: the input force is a driving temperature and the output reaction is heat loss. Appropriate modeling of the space environment is added to capture its effect on the radiator. The design parameters chosen for this radiator shape optimization problem fall into two classes: variable height along the width of the radiator and a spline curve defining the material boundary of the radiator. The implementation of multiple design parameter schemes allows the user to have more confidence in the radiator optimization tool upon demonstration of convergence between the two design parameter schemes. This tool easily allows the user to manipulate the driving temperature regions, thus permitting detailed design of in-space radiators for unique situations. Preliminary results indicate an optimized shape following that of the temperature distribution regions in the "cooler" portions of the radiator. The results closely follow the expected radiator shape.
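
    The loop sketched below is a toy version of the evolutionary step described in this record: a radiator parameterized by a height profile along its width, with a fitness that trades low mass against high heat rejection. The mass and heat-loss expressions, weights, and GA settings are fictitious stand-ins for the finite element thermal radiation analysis.

```python
# Toy genetic-algorithm sketch of radiator shape optimization: minimize mass
# while maximizing heat loss. The physical models below are fictitious.
import numpy as np

rng = np.random.default_rng(5)
N_SEG, POP, GENS = 10, 40, 60
AREAL_DENSITY = 3.0          # assumed kg per unit area
EMISSIVE_POWER = 0.4         # assumed heat rejected per unit area

def mass(heights):
    return AREAL_DENSITY * heights.sum()

def heat_loss(heights):
    # fictitious: area radiates, but taller segments are less effective
    return EMISSIVE_POWER * np.sum(heights - 0.15 * heights ** 2)

def fitness(heights, w_mass=1.0, w_heat=2.0):
    return w_heat * heat_loss(heights) - w_mass * mass(heights)

pop = rng.uniform(0.1, 2.0, size=(POP, N_SEG))
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]          # keep the better half
    children = []
    while len(children) < POP - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, N_SEG)                        # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        child += rng.normal(0.0, 0.05, N_SEG)               # mutation
        children.append(np.clip(child, 0.05, 2.0))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best height profile:", np.round(best, 2))
```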

  13. A Regularizer Approach for RBF Networks Under the Concurrent Weight Failure Situation.

    PubMed

    Leung, Chi-Sing; Wan, Wai Yan; Feng, Ruibin

    2017-06-01

    Many existing results on fault-tolerant algorithms focus on the single fault source situation, where a trained network is affected by one kind of weight failure. In fact, a trained network may be affected by multiple kinds of weight failure. This paper first studies how the open weight fault and the multiplicative weight noise degrade the performance of radial basis function (RBF) networks. Afterward, we define the objective function for training fault-tolerant RBF networks. Based on the objective function, we then develop two learning algorithms, one batch mode and one online mode. Besides, the convergent conditions of our online algorithm are investigated. Finally, we develop a formula to estimate the test set error of faulty networks trained from our approach. This formula helps us to optimize some tuning parameters, such as RBF width.
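
    A common way to realize the kind of objective described above is regularized least squares, where a diagonal regularizer grows with the weight-noise variance and fault rate. The sketch below uses one such assumed regularizer; the paper's own objective and its batch/online algorithms are derived from its specific fault model and may differ.

```python
# Hedged sketch of fault-tolerant RBF training as regularized least squares:
# minimize ||y - Phi w||^2 + w^T R w, with a diagonal R that grows with the
# weight-noise variance and fault rate. This form of R is an assumption.
import numpy as np

rng = np.random.default_rng(6)

def rbf_design(x, centers, width=0.3):
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

# training data: noisy sine
x = np.linspace(0, 1, 80)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(80)

centers = np.linspace(0, 1, 15)
Phi = rbf_design(x, centers)

noise_var, fault_rate = 0.04, 0.05
R = (noise_var + fault_rate) * np.diag(np.sum(Phi ** 2, axis=0))  # assumed regularizer

w = np.linalg.solve(Phi.T @ Phi + R, Phi.T @ y)
print("train RMSE:", np.sqrt(np.mean((Phi @ w - y) ** 2)))
```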

  14. Integration of the Gene Ontology into an object-oriented architecture.

    PubMed

    Shegogue, Daniel; Zheng, W Jim

    2005-05-10

    To standardize gene product descriptions, a formal vocabulary defined as the Gene Ontology (GO) has been developed. GO terms have been categorized into biological processes, molecular functions, and cellular components. However, there is no single representation that integrates all the terms into one cohesive model. Furthermore, GO definitions have little information explaining the underlying architecture that forms these terms, such as the dynamic and static events occurring in a process. In contrast, object-oriented models have been developed to show dynamic and static events. A portion of the TGF-beta signaling pathway, which is involved in numerous cellular events including cancer, differentiation and development, was used to demonstrate the feasibility of integrating the Gene Ontology into an object-oriented model. Using object-oriented models we have captured the static and dynamic events that occur during a representative GO process, "transforming growth factor-beta (TGF-beta) receptor complex assembly" (GO:0007181). We demonstrate that the utility of GO terms can be enhanced by object-oriented technology, and that the GO terms can be integrated into an object-oriented model by serving as a basis for the generation of object functions and attributes.

  15. Integration of the Gene Ontology into an object-oriented architecture

    PubMed Central

    Shegogue, Daniel; Zheng, W Jim

    2005-01-01

    Background To standardize gene product descriptions, a formal vocabulary defined as the Gene Ontology (GO) has been developed. GO terms have been categorized into biological processes, molecular functions, and cellular components. However, there is no single representation that integrates all the terms into one cohesive model. Furthermore, GO definitions have little information explaining the underlying architecture that forms these terms, such as the dynamic and static events occurring in a process. In contrast, object-oriented models have been developed to show dynamic and static events. A portion of the TGF-beta signaling pathway, which is involved in numerous cellular events including cancer, differentiation and development, was used to demonstrate the feasibility of integrating the Gene Ontology into an object-oriented model. Results Using object-oriented models we have captured the static and dynamic events that occur during a representative GO process, "transforming growth factor-beta (TGF-beta) receptor complex assembly" (GO:0007181). Conclusion We demonstrate that the utility of GO terms can be enhanced by object-oriented technology, and that the GO terms can be integrated into an object-oriented model by serving as a basis for the generation of object functions and attributes. PMID:15885145

  16. Perception of faces in schizophrenia: Subjective (self-report) vs. objective (psychophysics) assessments

    PubMed Central

    Chen, Yue; Ekstrom, Tor

    2016-01-01

    Objectives Face perception impairment in schizophrenia has been demonstrated, mostly through experimental studies. How this laboratory-defined behavioral impairment is associated with patients’ perceptual experience of various faces in everyday life is however unclear. This question is important because a first-person account of face perception has direct consequences on social functioning of patients. In this study, we adapted and administered a self-reported questionnaire on narrative perceptual experience of faces along with psychophysical assessments of face perception in schizophrenia. Methods The self-reported questionnaire includes six rating items of face-related functioning in everyday life, providing a subjective measure of face perception. The psychophysical assessment determines perceptual threshold for discriminating different facial identities, providing an objective measure of face perception. Results Compared to controls (n=25), patients (n=35) showed significantly lower scores (worse performance) in the subjective assessment and significantly higher thresholds (worse performance) in the objective assessment. The subjective and objective face perception assessments were moderately correlated in controls but not in patients. The subjective face perception assessments were significantly correlated with measurements of a social cognitive ability (Theory of Mind), again in controls but not in patients. Conclusion These results suggest that in schizophrenia the quality of face-related functioning in everyday life is degraded and the role that basic face discrimination capacity plays in face-related everyday functioning is disrupted. PMID:26938027

  17. New Insights on the White Dwarf Luminosity and Mass Functions from the LSS-GAC Survey

    NASA Astrophysics Data System (ADS)

    Rebassa-Mansergas, Alberto; Liu, Xiaowei; Cojocaru, Ruxandra; Torres, Santiago; García–Berro, Enrique; Yuan, Haibo; Huang, Yang; Xiang, Maosheng

    2015-06-01

    The white dwarf (WD) population observed in magnitude-limited surveys can be used to derive the luminosity function (LF) and mass function (MF), once the corresponding volume corrections are employed. However, the WD samples from which the observational LFs and MFs are built are the result of complicated target selection algorithms. Thus, it is difficult to quantify the effects of the observational biases on the observed functions. The LAMOST (Large sky Area Multi-Object fiber Spectroscopic Telescope) spectroscopic survey of the Galactic anti-center (LSS-GAC) has well-defined selection criteria. This is a noticeable advantage over previous surveys. Here we derive the WD LF and MF of the LSS-GAC, and use a Monte Carlo code to simulate the WD population in the Galactic anti-center. We apply the well-defined LSS-GAC selection criteria to the simulated populations, taking into account all observational biases, and perform the first meaningful comparison between the simulated WD LFs and MFs and the observed ones.

  18. Distributions and motions of nearby stars defined by objective prism surveys and Hipparcos data

    NASA Technical Reports Server (NTRS)

    Hemenway, P. D.; Lee, J. T.; Upgren, A. R.

    1997-01-01

    Material and objective prism spectral classification work is used to determine the space density distribution of nearby common stars to the limits of objective prism spectral surveys. The aim is to extend the knowledge of the local densities of specific spectral types from a radius of 25 pc from the sun, as limited in the Gliese catalog of nearby stars, to 50 pc or more. Future plans for the application of these results to studies of the kinematic and dynamical properties of stars in the solar neighborhood as a function of their physical properties and ages are described.

  19. NASA TSRV essential flight control system requirements via object oriented analysis

    NASA Technical Reports Server (NTRS)

    Duffy, Keith S.; Hoza, Bradley J.

    1992-01-01

    The objective was to analyze the baseline flight control system of the Transport Systems Research Vehicle (TSRV) and to develop a system specification that offers high visibility of the essential system requirements in order to facilitate the future development of alternate, more advanced software architectures. The flight control system is defined to be the baseline software for the TSRV research flight deck, including all navigation, guidance, and control functions, and primary pilot displays. The Object Oriented Analysis (OOA) methodology developed is used to develop a system requirement definition. The scope of the requirements definition contained herein is limited to a portion of the Flight Management/Flight Control computer functionality. The development of a partial system requirements definition is documented, and includes a discussion of the tasks required to increase the scope of the requirements definition and recommendations for follow-on research.

  20. Computer memory management system

    DOEpatents

    Kirk, III, Whitson John

    2002-01-01

    A computer memory management system utilizing a memory structure system of "intelligent" pointers in which information related to the use status of the memory structure is designed into the pointer. Through this pointer system, the present invention provides essentially automatic memory management (often referred to as garbage collection) by allowing relationships between objects to have definite memory management behavior by use of coding protocol which describes when relationships should be maintained and when the relationships should be broken. In one aspect, the present invention system allows automatic breaking of strong links to facilitate object garbage collection, coupled with relationship adjectives which define deletion of associated objects. In another aspect, the present invention includes simple-to-use infinite undo/redo functionality in that it has the capability, through a simple function call, to undo all of the changes made to a data model since the previous `valid state` was noted.

  1. Biopathways representation and simulation on hybrid functional petri net.

    PubMed

    Matsuno, Hiroshi; Tanaka, Yukiko; Aoshima, Hitoshi; Doi, Atsushi; Matsui, Mika; Miyano, Satoru

    2011-01-01

    The following two matters should be resolved in order for biosimulation tools to be accepted by users in biology/medicine: (1) remove issues which are irrelevant to biological importance, and (2) allow users to represent biopathways intuitively and easily understand and manage the details of the representation and simulation mechanism. From these criteria, we first define a novel notion of Petri net called Hybrid Functional Petri Net (HFPN). Then, we introduce a software tool, Genomic Object Net, for representing and simulating biopathways, which we have developed by employing the architecture of HFPN. In order to show the usefulness of Genomic Object Net for representing and simulating biopathways, we present two HFPN representations of gene regulation mechanisms of Drosophila melanogaster (fruit fly) circadian rhythm and apoptosis induced by Fas ligand. The simulation results of these biopathways are also correlated with biological observations. The software is available to academic users from http://www.GenomicObject.Net/.

  2. OB3D, a new set of 3D objects available for research: a web-based study

    PubMed Central

    Buffat, Stéphane; Chastres, Véronique; Bichot, Alain; Rider, Delphine; Benmussa, Frédéric; Lorenceau, Jean

    2014-01-01

    Studying object recognition is central to fundamental and clinical research on cognitive functions but suffers from the limitations of the available sets that cannot always be modified and adapted to meet the specific goals of each study. We here present a new set of 3D scans of real objects available on-line as ASCII files, OB3D. These files are lists of dots, each defined by a triplet of spatial coordinates and their normal that allow simple and highly versatile transformations and adaptations. We performed a web-based experiment to evaluate the minimal number of dots required for the denomination and categorization of these objects, thus providing a reference threshold. We further analyze several other variables derived from this data set, such as the correlations with object complexity. This new stimulus set, which was found to activate the Lower Occipital Complex (LOC) in another study, may be of interest for studies of cognitive functions in healthy participants and patients with cognitive impairments, including visual perception, language, memory, etc. PMID:25339920

  3. Impulsive Control for Continuous-Time Markov Decision Processes: A Linear Programming Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dufour, F., E-mail: dufour@math.u-bordeaux1.fr; Piunovskiy, A. B., E-mail: piunov@liv.ac.uk

    2016-08-15

    In this paper, we investigate an optimization problem for continuous-time Markov decision processes with both impulsive and continuous controls. We consider the so-called constrained problem where the objective of the controller is to minimize a total expected discounted optimality criterion associated with a cost rate function while keeping other performance criteria of the same form, but associated with different cost rate functions, below some given bounds. Our model allows multiple impulses at the same time moment. The main objective of this work is to study the associated linear program defined on a space of measures including the occupation measures of the controlled process and to provide sufficient conditions to ensure the existence of an optimal control.
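
    The occupation-measure linear program has a compact discrete-time discounted analogue that conveys the structure studied here (the paper itself treats continuous time with impulses, which this sketch does not model): nonnegative variables x(s,a), one flow constraint per state, a linear main objective, and a linear bound on a secondary cost. All numerical data below are randomly generated for illustration.

```python
# Discrete-time discounted analogue of a constrained occupation-measure LP:
#   min  sum c(s,a) x(s,a)
#   s.t. sum_a x(s',a) - gamma * sum_{s,a} P(s'|s,a) x(s,a) = mu0(s')  for all s'
#        sum d(s,a) x(s,a) <= bound,   x >= 0
import numpy as np
from scipy.optimize import linprog

nS, nA, gamma = 3, 2, 0.9
rng = np.random.default_rng(7)
P = rng.dirichlet(np.ones(nS), size=(nS, nA))   # P[s, a, s'] transition probabilities
c = rng.random((nS, nA))                        # main cost rate
d = rng.random((nS, nA))                        # secondary (constrained) cost rate
mu0 = np.full(nS, 1.0 / nS)                     # initial distribution

# flow (equality) constraints, one per state s'
A_eq = np.zeros((nS, nS * nA))
for sp in range(nS):
    for s in range(nS):
        for a in range(nA):
            A_eq[sp, s * nA + a] = (1.0 if s == sp else 0.0) - gamma * P[s, a, sp]

bnds = [(0, None)] * (nS * nA)

# pick a feasible bound: 10% above the smallest attainable secondary cost
probe = linprog(d.ravel(), A_eq=A_eq, b_eq=mu0, bounds=bnds, method="highs")
bound = 1.1 * probe.fun

res = linprog(c.ravel(), A_ub=d.ravel()[None, :], b_ub=[bound],
              A_eq=A_eq, b_eq=mu0, bounds=bnds, method="highs")
print("optimal discounted cost under the constraint:", res.fun)
```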

  4. Clinical Application of Esophageal High-resolution Manometry in the Diagnosis of Esophageal Motility Disorders.

    PubMed

    van Hoeij, Froukje B; Bredenoord, Albert J

    2016-01-31

    Esophageal high-resolution manometry (HRM) is replacing conventional manometry in the clinical evaluation of patients with esophageal symptoms, especially dysphagia. The introduction of HRM gave rise to new objective metrics and recognizable patterns of esophageal motor function, requiring a new classification scheme: the Chicago classification. HRM measurements are more detailed and more easily performed compared to conventional manometry. The visual presentation of acquired data improved the analysis and interpretation of esophageal motor function. This led to a more sensitive, accurate, and objective analysis of esophageal motility. In this review we discuss how HRM changed the way we define and categorize esophageal motility disorders. Moreover, we discuss the clinical applications of HRM for each esophageal motility disorder separately.

  5. Clinical Application of Esophageal High-resolution Manometry in the Diagnosis of Esophageal Motility Disorders

    PubMed Central

    van Hoeij, Froukje B; Bredenoord, Albert J

    2016-01-01

    Esophageal high-resolution manometry (HRM) is replacing conventional manometry in the clinical evaluation of patients with esophageal symptoms, especially dysphagia. The introduction of HRM gave rise to new objective metrics and recognizable patterns of esophageal motor function, requiring a new classification scheme: the Chicago classification. HRM measurements are more detailed and more easily performed compared to conventional manometry. The visual presentation of acquired data improved the analysis and interpretation of esophageal motor function. This led to a more sensitive, accurate, and objective analysis of esophageal motility. In this review we discuss how HRM changed the way we define and categorize esophageal motility disorders. Moreover, we discuss the clinical applications of HRM for each esophageal motility disorder separately. PMID:26631942

  6. Objective assessment of image quality. IV. Application to adaptive optics

    PubMed Central

    Barrett, Harrison H.; Myers, Kyle J.; Devaney, Nicholas; Dainty, Christopher

    2008-01-01

    The methodology of objective assessment, which defines image quality in terms of the performance of specific observers on specific tasks of interest, is extended to temporal sequences of images with random point spread functions and applied to adaptive imaging in astronomy. The tasks considered include both detection and estimation, and the observers are the optimal linear discriminant (Hotelling observer) and the optimal linear estimator (Wiener). A general theory of first- and second-order spatiotemporal statistics in adaptive optics is developed. It is shown that the covariance matrix can be rigorously decomposed into three terms representing the effect of measurement noise, random point spread function, and random nature of the astronomical scene. Figures of merit are developed, and computational methods are discussed. PMID:17106464
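
    As a concrete illustration of the Hotelling observer mentioned above, the sketch below estimates the optimal linear template and the detection SNR from two ensembles of sample images (signal absent and signal present). It is a generic textbook construction, not code from the paper; the covariance regularization and the variable names are assumptions.

      import numpy as np

      def hotelling_observer(images_absent, images_present, eps=1e-6):
          """Hotelling template w and detection SNR from (n_samples, n_pixels) ensembles."""
          mean_diff = images_present.mean(axis=0) - images_absent.mean(axis=0)
          # Pooled covariance lumps together measurement noise, random PSF and scene variability.
          cov = 0.5 * (np.cov(images_absent, rowvar=False) +
                       np.cov(images_present, rowvar=False))
          cov += eps * np.eye(cov.shape[0])       # regularize before solving
          w = np.linalg.solve(cov, mean_diff)     # w = K^-1 (mean_present - mean_absent)
          snr2 = float(mean_diff @ w)             # squared Hotelling SNR, a scalar figure of merit
          return w, np.sqrt(snr2)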

  7. Reference earth orbital research and applications investigations (blue book). Volume 8: Life sciences

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The functional program element for the life sciences facilities to operate aboard manned space stations is presented. The life sciences investigations will consist of the following subjects: (1) medical research, (2) vertebrate research, (3) plant research, (4) cells and tissue research, (5) invertebrate research, (6) life support and protection, and (7) man-system integration. The equipment required to provide the desired functional capability for the research facilities is defined. The goals and objectives of each research facility are described.

  8. Overview of Marketing and Distribution. The Wisconsin Guide to Local Curriculum Improvement in Industrial Education, K-12.

    ERIC Educational Resources Information Center

    Ritz, John M.

    The intent of this field tested instructional package is to familiarize the student with the marketing and distribution element of industry and its function in the production of goods and services. Defining behavioral objectives, the course description offers a media guide, suggested classroom activities, and sample student evaluation forms as…

  9. In Search of the Big Bubble

    ERIC Educational Resources Information Center

    Simoson, Andrew; Wentzky, Bethany

    2011-01-01

    Freely rising air bubbles in water sometimes assume the shape of a spherical cap, a shape also known as the "big bubble". Is it possible to find some objective function involving a combination of a bubble's attributes for which the big bubble is the optimal shape? Following the basic idea of the definite integral, we define a bubble's surface as…

  10. Maintenance and Services. The Wisconsin Guide to Local Curriculum Improvement in Industrial Education, K-12.

    ERIC Educational Resources Information Center

    Burt, Thomas

    The intent of this field tested instructional package is to familiarize the student with the maintenance and services elements of industry and their function in the production of goods and services. Defining behavioral objectives, the course description includes a media section, suggested classroom activities, and sample student evaluation forms,…

  11. Perspectives on Simulation and Miniaturization. Professional Paper No. 1472.

    ERIC Educational Resources Information Center

    McCluskey, Michael R.

    Simulation--here defined as a physical, procedural, or symbolic representation of certain aspects of a functioning system, or as a working model or representation of a real world system--has at least four areas of application: (1) training where the objective of simulation is to provide the trainee with a learning environment that will facilitate…

  12. Disturbance history and stand dynamics in secondary and old-growth forests of the Southern Appalachain Mountains, USA

    Treesearch

    Sarah M. Butler; Alan S. White; Katherine J. Elliott; Robert S. Seymour

    2014-01-01

    Understanding the patterns of past disturbance allows further insight into the complex composition, structure, and function of current and future forests, which is increasingly important in a world where disturbance characteristics are changing. Our objectives were to define disturbance causes, rates (percent disturbance per decade), magnitudes and frequency (time...

  13. What Is Communications. The Wisconsin Guide to Local Curriculum Improvement in Industrial Education, K-12.

    ERIC Educational Resources Information Center

    Ritz, John M.

    The intent of this field tested instructional package is to acquaint the student with the elements of communications and how they function in the production of goods and services. Defining behavioral objectives, the course description includes a media guide, suggested classroom activities, and sample student evaluation forms, as well as the basic…

  14. Effect of Bilateral Stimulation of the Subthalamic Nucleus on Different Speech Subsystems in Patients with Parkinson's Disease

    ERIC Educational Resources Information Center

    Putzer, Manfred; Barry, William J.; Moringlane, Jean Richard

    2008-01-01

    The effect of deep brain stimulation on the two speech-production subsystems, articulation and phonation, of nine Parkinsonian patients is examined. Production parameters (stop closure voicing; stop closure, VOT, vowel) in fast syllable-repetitions were defined and measured and quantitative, objective metrics of vocal fold function were obtained…

  15. A Variational Approach to Simultaneous Image Segmentation and Bias Correction.

    PubMed

    Zhang, Kaihua; Liu, Qingshan; Song, Huihui; Li, Xuelong

    2015-08-01

    This paper presents a novel variational approach for simultaneous estimation of bias field and segmentation of images with intensity inhomogeneity. We model intensity of inhomogeneous objects to be Gaussian distributed with different means and variances, and then introduce a sliding window to map the original image intensity onto another domain, where the intensity distribution of each object is still Gaussian but can be better separated. The means of the Gaussian distributions in the transformed domain can be adaptively estimated by multiplying the bias field with a piecewise constant signal within the sliding window. A maximum likelihood energy functional is then defined on each local region, which combines the bias field, the membership function of the object region, and the constant approximating the true signal from its corresponding object. The energy functional is then extended to the whole image domain by the Bayesian learning approach. An efficient iterative algorithm is proposed for energy minimization, via which the image segmentation and bias field correction are simultaneously achieved. Furthermore, the smoothness of the obtained optimal bias field is ensured by the normalized convolutions without extra cost. Experiments on real images demonstrated the superiority of the proposed algorithm to other state-of-the-art representative methods.
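
    A small sketch of one ingredient of such locally defined Gaussian models (not the authors' algorithm): per-pixel mean and variance of the intensity inside a sliding window, the kind of statistic a local maximum-likelihood energy is built from. The window size is an assumed parameter.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def local_gaussian_stats(image, window=15):
          """Per-pixel mean and variance over a (window x window) sliding neighborhood."""
          image = np.asarray(image, dtype=float)
          local_mean = uniform_filter(image, size=window)
          local_sq_mean = uniform_filter(image * image, size=window)
          local_var = np.maximum(local_sq_mean - local_mean ** 2, 1e-12)
          return local_mean, local_var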

  16. Dynamic and structural control utilizing smart materials and structures

    NASA Technical Reports Server (NTRS)

    Rogers, C. A.; Robertshaw, H. H.

    1989-01-01

    An account is given of several novel 'smart material' structural control concepts that are currently under development. The thrust of these investigations is the evolution of intelligent materials and structures superseding the recently defined variable-geometry trusses and shape memory alloy-reinforced composites; the substances envisioned will be able to autonomously evaluate emergent environmental conditions and adapt to them, and even change their operational objectives. While until now the primary objective of the developmental efforts presently discussed has been materials that mimic biological functions, entirely novel concepts may be formulated in due course.

  17. Interpreting fMRI data: maps, modules and dimensions

    PubMed Central

    Op de Beeck, Hans P.; Haushofer, Johannes; Kanwisher, Nancy G.

    2009-01-01

    Neuroimaging research over the past decade has revealed a detailed picture of the functional organization of the human brain. Here we focus on two fundamental questions that are raised by the detailed mapping of sensory and cognitive functions and illustrate these questions with findings from the object-vision pathway. First, are functionally specific regions that are located close together best understood as distinct cortical modules or as parts of a larger-scale cortical map? Second, what functional properties define each cortical map or module? We propose a model in which overlapping continuous maps of simple features give rise to discrete modules that are selective for complex stimuli. PMID:18200027

  18. Can Functional Magnetic Resonance Imaging Improve Success Rates in CNS Drug Discovery?

    PubMed Central

    Borsook, David; Hargreaves, Richard; Becerra, Lino

    2011-01-01

    Introduction The bar for developing new treatments for CNS disease is getting progressively higher and fewer novel mechanisms are being discovered, validated and developed. The high costs of drug discovery necessitate early decisions to ensure the best molecules and hypotheses are tested in expensive late stage clinical trials. The discovery of brain imaging biomarkers that can bridge preclinical to clinical CNS drug discovery and provide a ‘language of translation’ affords the opportunity to improve the objectivity of decision-making. Areas Covered This review discusses the benefits, challenges and potential issues of using a science based biomarker strategy to change the paradigm of CNS drug development and increase success rates in the discovery of new medicines. The authors have summarized PubMed and Google Scholar based publication searches to identify recent advances in functional, structural and chemical brain imaging and have discussed how these techniques may be useful in defining CNS disease state and drug effects during drug development. Expert opinion The use of novel brain imaging biomarkers holds the bold promise of making neuroscience drug discovery smarter by increasing the objectivity of decision making thereby improving the probability of success of identifying useful drugs to treat CNS diseases. Functional imaging holds the promise to: (1) define pharmacodynamic markers as an index of target engagement (2) improve translational medicine paradigms to predict efficacy; (3) evaluate CNS efficacy and safety based on brain activation; (4) determine brain activity drug dose-response relationships and (5) provide an objective evaluation of symptom response and disease modification. PMID:21765857

  19. Structural Optimization for Reliability Using Nonlinear Goal Programming

    NASA Technical Reports Server (NTRS)

    El-Sayed, Mohamed E.

    1999-01-01

    This report details the development of a reliability-based multi-objective design tool for solving structural optimization problems. Based on two different optimization techniques, namely sequential unconstrained minimization and nonlinear goal programming, the developed design method has the capability to take into account the effects of variability on the proposed design through a user-specified reliability design criterion. In its sequential unconstrained minimization mode, the developed design tool uses a composite objective function, in conjunction with weight-ordered design objectives, in order to take into account conflicting and multiple design criteria. Design criteria of interest include structural weight, load-induced stress and deflection, and mechanical reliability. The nonlinear goal programming mode, on the other hand, provides a design method that eliminates the difficulty of having to define an objective function and constraints, while at the same time having the capability of handling rank-ordered design objectives or goals. For simulation purposes the design of a pressure vessel cover plate was undertaken as a test bed for the newly developed design tool. The formulation of this structural optimization problem into sequential unconstrained minimization and goal programming form is presented. The resulting optimization problem was solved using: (i) the linear extended interior penalty function method algorithm; and (ii) Powell's conjugate directions method. Both single and multi-objective numerical test cases are included demonstrating the design tool's capabilities as it applies to this design problem.
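
    The sketch below illustrates the sequential-unconstrained-minimization idea in miniature and is not the report's formulation: a toy cover-plate weight is minimized with a quadratic exterior penalty on an assumed stress surrogate. The geometry, load, stress limit and penalty weight are invented, and the report itself uses a linear extended interior penalty function and Powell's conjugate directions method.

      import numpy as np
      from scipy.optimize import minimize_scalar

      R, P = 10.0, 2.0                    # fixed plate radius and pressure (toy values)
      STRESS_LIMIT = 50.0

      def weight(t):
          return np.pi * R**2 * t         # mass is proportional to thickness t

      def stress(t):
          return P * (R / t) ** 2         # crude plate-bending stress surrogate

      def composite_objective(t, penalty=1e4):
          violation = max(0.0, stress(t) - STRESS_LIMIT)
          return weight(t) + penalty * violation**2    # weight goal plus constraint penalty

      res = minimize_scalar(composite_objective, bounds=(0.01, 5.0), method="bounded")
      print(f"optimal thickness ~ {res.x:.3f}, analytic value {R * np.sqrt(P / STRESS_LIMIT):.3f}")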

  20. A reference architecture for telemonitoring.

    PubMed

    Clarke, Malcolm

    2004-01-01

    The Telecare Interactive Continuous Monitoring System exploits GPRS to provide an ambulatory device that monitors selected vital signs on a continuous basis. Alarms are sent when parameters fall outside preset limits, and accompanying physiological data may also be transmitted. The always-connected property of GPRS allows continuous interactive control of the device and its sensors, permitting changes to monitoring parameters or even enabling continuous monitoring of a sensor in an emergency. A new personal area network (PAN) has been developed to support short-range wireless connection to sensors worn on the body, including ECG and a finger-worn SpO2 sensor. Most notable is the use of an ultra-low radio frequency to reduce power to a minimum. The system has been designed to use a hierarchical architecture for sensors and "derived" signals, such as HR from ECG, so that each can be independently controlled and managed. Sensors are treated as objects, and functions are defined to control aspects of behaviour. These are refined in order to define a generic set of abstract functions that handle the majority of functions, leaving a minimum of sensor-specific commands. The intention is to define a reference architecture in order to research the functionality and system architecture of a telemonitoring system. The Telecare project is funded through a grant from the European Commission (IST programme).
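
    A hypothetical sketch of the "sensors as objects controlled through a generic set of abstract functions" idea; the class and method names below are invented for illustration and are not the Telecare project's actual interface.

      from abc import ABC, abstractmethod

      class Sensor(ABC):
          """Generic sensor: common behaviour lives here, acquisition is sensor-specific."""

          def __init__(self, name, sample_rate_hz):
              self.name = name
              self.sample_rate_hz = sample_rate_hz
              self.alarm_limits = (None, None)

          def set_limits(self, low, high):          # generic: preset alarm limits
              self.alarm_limits = (low, high)

          def in_alarm(self, value):                # generic: alarm test
              low, high = self.alarm_limits
              return (low is not None and value < low) or (high is not None and value > high)

          @abstractmethod
          def read(self):                           # sensor-specific acquisition
              ...

      class SpO2Sensor(Sensor):
          def read(self):
              return 97.0                           # placeholder for a real finger-probe reading

      class DerivedHeartRate(Sensor):
          """A 'derived' signal (e.g. HR from ECG) managed through the same interface."""
          def __init__(self, ecg_sensor):
              super().__init__("HR", sample_rate_hz=1)
              self.ecg = ecg_sensor
          def read(self):
              return 72.0                           # placeholder for an HR estimator on the ECG

      spo2 = SpO2Sensor("SpO2", sample_rate_hz=1)
      spo2.set_limits(90.0, None)
      print(spo2.in_alarm(spo2.read()))             # False for the placeholder reading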

  1. Mathematical solution of multilevel fractional programming problem with fuzzy goal programming approach

    NASA Astrophysics Data System (ADS)

    Lachhwani, Kailash; Poonia, Mahaveer Prasad

    2012-08-01

    In this paper, we present a procedure for solving multilevel fractional programming problems in a large hierarchical decentralized organization using a fuzzy goal programming approach. In the proposed method, the tolerance membership functions for the fuzzily described numerator and denominator parts of the objective functions of all levels, as well as for the control vectors of the higher-level decision makers, are respectively defined by determining the individual optimal solutions of each of the level decision makers. A possible relaxation of the higher-level decision is considered to avoid decision deadlock due to the conflicting nature of the objective functions. Then, the fuzzy goal programming approach is used to achieve the highest degree of each membership goal by minimizing the negative deviational variables. We also provide a sensitivity analysis with variation of the tolerance values on the decision vectors to show how the solution is sensitive to the change of tolerance values, with the help of a numerical example.
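
    A minimal sketch of the tolerance membership function idea (illustrative only; the bounds, names and numeric example are assumptions, not values from the paper): the degree mu equals 1 at the individually optimal value of a goal and falls linearly to 0 at the tolerated worst value.

      def membership(value, g_best, g_worst):
          """Degree (0..1) to which value achieves the fuzzy goal bounded by g_best/g_worst."""
          if g_best == g_worst:
              return 1.0
          mu = (value - g_worst) / (g_best - g_worst)   # linear between worst and best
          return max(0.0, min(1.0, mu))

      # Example for a maximization-type goal: individual optimum 120, tolerated worst 80.
      print(membership(110, g_best=120, g_worst=80))    # 0.75
      # Fuzzy goal programming then drives every such degree towards 1, e.g. by minimizing
      # the negative deviational variables d_k >= 1 - mu_k over all membership goals.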

  2. Quality of vision in refractive and cataract surgery, indirect measurers: review article.

    PubMed

    Parede, Taís Renata Ribeira; Torricelli, André Augusto Miranda; Mukai, Adriana; Vieira Netto, Marcelo; Bechara, Samir Jacob

    2013-01-01

    Visual acuity is the measurement of an individual's ability to recognize details of an object in space. Visual function measurements in clinical ophthalmology are limited by factors such as maximum contrast, and so they might not adequately reflect real viewing conditions at that moment or the subjective aspects of the patient's perception of the world. The objective of a successful vision-restoring surgery lies not only in gaining visual acuity lines, but also in vision quality. Therefore, refractive and cataract surgeries have the responsibility of achieving quality results. It is difficult to define quality of vision by a single parameter, and the main functional-vision tests are: contrast sensitivity, disability glare, intraocular stray light and aberrometry. In the current review, the different components of visual function are explained and the available methods to assess vision quality are described.

  3. SASS wind ambiguity removal by direct minimization. [Seasat-A satellite scatterometer

    NASA Technical Reports Server (NTRS)

    Hoffman, R. N.

    1982-01-01

    An objective analysis procedure is presented which combines Seasat-A satellite scatterometer (SASS) data with other available data on wind speeds by minimizing an objective function of gridded wind speed values. The functions are defined as the loss functions for the SASS velocity data, the forecast, the SASS velocity magnitude data, and conventional wind speed data. Only aliases closest to the analysis were included, and a method for improving the first guess while using a minimization technique and slowly changing the parameters of the problem is introduced. The model is employed to predict the wind field for the North Atlantic on Sept. 10, 1978. Dealiased SASS data is compared with available ship readings, showing good agreement between the SASS dealiased winds and the winds measured at the surface. Expansion of the model to take in low-level cloud measurements, pressure data, and convergence and cloud level data correlations is discussed.

  4. Scripting Module for the Satellite Orbit Analysis Program (SOAP)

    NASA Technical Reports Server (NTRS)

    Carnright, Robert; Paget, Jim; Coggi, John; Stodden, David

    2008-01-01

    This add-on module to the SOAP software can perform changes to simulation objects based on the occurrence of specific conditions. This allows the software to encompass the simulation's response to scheduled or physical events. Users can manipulate objects in the simulation environment under programmatic control. Inputs to the scripting module are Actions, Conditions, and the Script. Actions are arbitrary modifications to constructs such as Platform Objects (i.e. satellites), Sensor Objects (representing instruments or communication links), or Analysis Objects (user-defined logical or numeric variables). Examples of actions include changes to a satellite orbit (Δv), changing a sensor-pointing direction, and the manipulation of a numerical expression. Conditions represent the circumstances under which Actions are performed and can be couched in If-Then-Else logic, like performing a Δv at specific times or adding to the spacecraft power only when it is being illuminated by the Sun. The SOAP script represents the entire set of conditions being considered over a specific time interval. The output of the scripting module is a series of events, which are changes to objects at specific times. As the SOAP simulation clock runs forward, the scheduled events are performed. If the user sets the clock back in time, the events within that interval are automatically undone. The module offers an interface for defining scripts where the user does not have to remember the vocabulary of various keywords. Actions can be captured by employing the same user interface that is used to define the objects themselves. Conditions can be set to invoke Actions by selecting them from pull-down lists. Users define the script by selecting from the pool of defined conditions. Many space systems have to react to arbitrary events that can occur from scheduling or from the environment. For example, an instrument may cease to draw power when the area that it is tasked to observe is not in view. The contingency of the planetary body blocking the line of sight is a condition upon which the power being drawn is set to zero. It remains at zero until the observation objective is again in view. Computing the total power drawn by the instrument over a period of days or weeks can now take such factors into consideration. What makes the architecture especially powerful is that the scripting module can look ahead and behind in simulation time, and this temporal versatility can be leveraged in displays such as x-y plots. For example, a plot of a satellite's altitude as a function of time can take changes to the orbit into account.
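
    The condition/action/script pattern described above can be pictured with the hypothetical sketch below; it is not SOAP's API, just a minimal event-scripting illustration with made-up names.

      from dataclasses import dataclass, field
      from typing import Callable, List, Tuple

      @dataclass
      class Script:
          # Each rule pairs a condition on the simulation state with an action on that state.
          rules: List[Tuple[Callable[[dict], bool], Callable[[dict], None]]] = field(default_factory=list)
          events: List[Tuple[float, str]] = field(default_factory=list)

          def add_rule(self, condition, action):
              self.rules.append((condition, action))

          def step(self, state):
              for condition, action in self.rules:
                  if condition(state):
                      action(state)                                       # apply the change
                      self.events.append((state["t"], action.__name__))   # log it as an event

      # Example: the instrument draws no power while its target is occulted.
      def occulted(state):    return not state["target_visible"]
      def zero_power(state):  state["instrument_power"] = 0.0

      script = Script()
      script.add_rule(occulted, zero_power)
      state = {"t": 0.0, "target_visible": False, "instrument_power": 15.0}
      script.step(state)
      print(state["instrument_power"], script.events)        # 0.0 [(0.0, 'zero_power')]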

  5. Modeling Functional Neuroanatomy for an Anatomy Information System

    PubMed Central

    Niggemann, Jörg M.; Gebert, Andreas; Schulz, Stefan

    2008-01-01

    Objective Existing neuroanatomical ontologies, databases and information systems, such as the Foundational Model of Anatomy (FMA), represent outgoing connections from brain structures, but cannot represent the “internal wiring” of structures and as such, cannot distinguish between different independent connections from the same structure. Thus, a fundamental aspect of Neuroanatomy, the functional pathways and functional systems of the brain such as the pupillary light reflex system, is not adequately represented. This article identifies underlying anatomical objects which are the source of independent connections (collections of neurons) and uses these as basic building blocks to construct a model of functional neuroanatomy and its functional pathways. Design The basic representational elements of the model are unnamed groups of neurons or groups of neuron segments. These groups, their relations to each other, and the relations to the objects of macroscopic anatomy are defined. The resulting model can be incorporated into the FMA. Measurements The capabilities of the presented model are compared to the FMA and the Brain Architecture Management System (BAMS). Results Internal wiring as well as functional pathways can correctly be represented and tracked. Conclusion This model bridges the gap between representations of single neurons and their parts on the one hand and representations of spatial brain structures and areas on the other hand. It is capable of drawing correct inferences on pathways in a nervous system. The object and relation definitions are related to the Open Biomedical Ontology effort and its relation ontology, so that this model can be further developed into an ontology of neuronal functional systems. PMID:18579841

  6. The Effect of Nasal Functions on the Integrity of Grafts after Myringoplasty.

    PubMed

    Eser, Başak Çaypınar; Yılmaz, Aslı Şahin; Önder, Serap Şahin; Toros, Sema Zer; Oysu, Çağatay

    2017-12-01

    We aimed to evaluate the effects of nasal functions on the integrity of grafts after myringoplasty. Our study included 78 patients who underwent myringoplasty between 2011 and 2013. Group I was defined as the group with an intact tympanic membrane following surgery. Group II was defined as the group with a tympanic membrane perforation following surgery. Group I consisted of 44 patients and Group II of 34. Subjective and objective measurements of nasal functions, Eustachian tube function (ETF), and allergic status were performed using the nasal obstruction symptom evaluation (NOSE) scale, the visual analog scale (VAS), and the score for allergic rhinitis (SFAR) questionnaires, as well as acoustic rhinometry and the saccharin test. It was investigated whether there was any difference between these two groups in terms of these parameters. There was no statistically significant difference between groups in age, sex, or the presence of tubal dysfunction and allergic rhinitis (p>0.05). In the group of intact tympanic membranes, the likelihood of the right ear being the operated one was significantly higher compared to the group of myringoplasty failures (p=0.037). The VAS and NOSE scales did not show any significant difference between groups in terms of successful outcome of myringoplasty (p>0.05). The nasal congestion index (NCI) and the mucociliary clearance (MCC) did not show any significant difference between groups in terms of successful outcome of myringoplasty (p>0.05). This study has shown that nasal functions measured by objective and subjective methods had no effect on the success of myringoplasty.

  7. A novel approach based on preference-based index for interval bilevel linear programming problem.

    PubMed

    Ren, Aihong; Wang, Yuping; Xue, Xingsi

    2017-01-01

    This paper proposes a new methodology for solving the interval bilevel linear programming problem in which all coefficients of both objective functions and constraints are considered as interval numbers. In order to retain as much of the uncertainty of the original constraint region as possible, the original problem is first converted into an interval bilevel programming problem with interval coefficients in the objective functions only, through the normal variation of interval numbers and chance-constrained programming. To account for the different preferences of different decision makers, the preference level at which the interval objective function is preferred to a target interval is defined based on the preference-based index. Then a preference-based deterministic bilevel programming problem is constructed in terms of the preference level and the order relation [Formula: see text]. Furthermore, the concept of a preference δ-optimal solution is given. Subsequently, the constructed deterministic nonlinear bilevel problem is solved with the help of an estimation of distribution algorithm. Finally, several numerical examples are provided to demonstrate the effectiveness of the proposed approach.

  8. Cradle and pressure grippers

    DOEpatents

    Muniak, John E.

    2001-01-01

    A gripper that is designed to incorporate the functions of gripping, supporting and pressure tongs into one device. The gripper has two opposing finger sections with interlocking fingers that incline and taper to form a wedge. The interlocking fingers are vertically off-set so that the opposing finger sections may close together allowing the inclined, tapered tips of the fingers to extend beyond the plane defined by the opposing finger section's engagement surface. The range of motion defined by the interlocking relationship of the finger sections allows the gripper to grab, lift and support objects of varying size and shape. The gripper has one stationary and one moveable finger section. Power is provided to the moveable finger section by an actuating device enabling the gripper to close around an object to be lifted. A lifting bail is attached to the gripper and is supported by a crane that provides vertical lift.

  9. Student generated learning objectives: extent of congruence with faculty set objectives and factors influencing their generation.

    PubMed

    Abdul Ghaffar Al-Shaibani, Tarik A; Sachs-Robertson, Annette; Al Shazali, Hafiz O; Sequeira, Reginald P; Hamdy, Hosam; Al-Roomi, Khaldoon

    2003-07-01

    A problem-based learning strategy is used for curriculum planning and implementation at the Arabian Gulf University, Bahrain. Problems are constructed in a way that faculty-set objectives are expected to be identified by students during tutorials. Students in small groups, along with a tutor functioning as a facilitator, identify learning issues and define their learning objectives. We compared objectives identified by student groups with faculty-set objectives to determine the extent of congruence, and identified factors that influenced students' ability to identify faculty-set objectives. Male and female students were segregated and randomly grouped. A faculty tutor was allocated for each group. This study was based on 13 problems given to entry-level medical students. Pooled objectives of these problems were classified into four categories: structural, functional, clinical and psychosocial. Univariate analysis of variance was used for comparison, and p < 0.05 was considered significant. On average, students identified 54.2% of the faculty-set objectives per problem. Students identified psychosocial learning objectives more readily than structural ones. Female students identified more psychosocial objectives, whereas male students identified more structural objectives. Tutor characteristics, such as medical/non-medical background and years of teaching, were correlated with the categories of learning issues identified. Students identify part of the faculty-set learning objectives during tutorials with a faculty tutor acting as a facilitator. Students' gender influences the types of learning issues identified. Content expertise of tutors does not influence identification of learning needs by students.

  10. Large-area landslide susceptibility with optimized slope-units

    NASA Astrophysics Data System (ADS)

    Alvioli, Massimiliano; Marchesini, Ivan; Reichenbach, Paola; Rossi, Mauro; Ardizzone, Francesca; Fiorucci, Federica; Guzzetti, Fausto

    2017-04-01

    A Slope-Unit (SU) is a type of morphological terrain unit bounded by drainage and divide lines that maximize the within-unit homogeneity and the between-unit heterogeneity across distinct physical and geographical boundaries [1]. Compared to other terrain subdivisions, SU are morphological terrain units well related to the natural (i.e., geological, geomorphological, hydrological) processes that shape and characterize natural slopes. This makes SU easily recognizable in the field or in topographic base maps, and well suited for environmental and geomorphological analysis, in particular for landslide susceptibility (LS) modelling. An optimal subdivision of an area into a set of SU depends on multiple factors: size and complexity of the study area, quality and resolution of the available terrain elevation data, purpose of the terrain subdivision, and scale and resolution of the phenomena for which SU are delineated. We use the recently developed r.slopeunits software [2,3] for the automatic, parametric delineation of SU within the open source GRASS GIS, based on terrain elevation data and a small number of user-defined parameters. The software provides subdivisions consisting of SU with different shapes and sizes, as a function of the input parameters. In this work, we describe a procedure for the optimal selection of the user parameters through the production of a large number of realizations of the LS model. We tested the software and the optimization procedure in a 2,000 km² area in Umbria, Central Italy. For LS zonation we adopt a logistic regression model (LRM) implemented in a well-known software package [4,5], using about 50 independent variables. To select the optimal SU partition for LS zonation, we want to define a metric able to quantify simultaneously: (i) slope-unit internal homogeneity, (ii) slope-unit external heterogeneity, and (iii) landslide susceptibility model performance. To this end, we define a comprehensive objective function S as the product of three normalized objective functions dealing with points (i)-(iii) independently. We use an intra-segment variance function V, Moran's autocorrelation index I, and the AUC-ROC function R arising from the application of the LRM. Maximization of the objective function S = f(I,V,R) as a function of the r.slopeunits input parameters provides an objective and reproducible way to select the optimal parameter combination for a proper SU subdivision for LS modelling. We further perform an analysis of the statistical significance of the LS models as a function of the r.slopeunits input parameters, focusing on the degree of coarseness of each subdivision. We find that the LRM, when applied to subdivisions with a large average SU size, has very poor statistical significance, resulting in only a few (5%, typically lithological) variables being used in the regression due to the large heterogeneity of all variables within each unit, while up to 35% of the variables are used when SU are very small. This behavior was largely expected and provides further evidence that an objective method to select SU size is highly desirable. [1] Guzzetti, F. et al., Geomorphology 31 (1999) 181-216; [2] Alvioli, M. et al., Geoscientific Model Development 9 (2016), 3975-3991; [3] http://geomorphology.irpi.cnr.it/tools/slope-units; [4] Rossi, M. et al., Geomorphology 114 (2010) 129-142; [5] Rossi, M. and Reichenbach, P., Geoscientific Model Development 9 (2016), 3533-3543
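
    The parameter-selection step can be pictured with the small sketch below (not the r.slopeunits code): for each candidate parameter combination we assume precomputed values of V, I and R, normalize each to [0, 1] so that larger is always better, and keep the combination that maximizes their product S. Treating lower V and lower I as better is an assumption consistent with the homogeneity/heterogeneity goals stated above.

      import numpy as np

      def normalize(values, higher_is_better=True):
          v = np.asarray(values, dtype=float)
          v = (v - v.min()) / (v.max() - v.min() + 1e-12)
          return v if higher_is_better else 1.0 - v

      def select_parameters(params, V, I, R):
          """params: list of parameter combinations; V, I, R: one metric value per combination."""
          S = (normalize(V, higher_is_better=False)    # low within-unit variance
               * normalize(I, higher_is_better=False)  # low spatial autocorrelation between units
               * normalize(R, higher_is_better=True))  # high AUC of the susceptibility model
          best = int(np.argmax(S))
          return params[best], S[best]

      # Made-up candidate combinations and metric values, for illustration only.
      params = [{"c": 1e5, "a": 2e5}, {"c": 5e5, "a": 1e6}, {"c": 1e6, "a": 5e6}]
      best, score = select_parameters(params, V=[0.8, 0.5, 0.9], I=[0.3, 0.2, 0.6], R=[0.70, 0.78, 0.74])
      print(best, round(score, 3))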

  11. Space station systems analysis study. Part 2, Volume 2. [technical report

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Specific system options are defined and identified for a cost effective space station capable of orderly growth with regard to both function and orbit location. Selected program options are analyzed and configuration concepts are developed to meet objectives for the satellite power system, earth servicing, space processing, and supporting activities. Transportation systems are analyzed for both LEO and GEO orbits.

  12. Developing and Validation a Usability Evaluation Tools for Distance Education Websites: Persian Version

    ERIC Educational Resources Information Center

    Hafezi, Soheila; Farahi, Ahmad; Mehri, Soheil Najafi; Mahmoodi, Hosein

    2010-01-01

    The web is playing a central role in distance education. The word "usability" is usually synonymous with functionality of the system for the user. Also, usability of a website is defined as something that can be used by a specific group of people to carry out specific objectives in an effective way, with efficiency and satisfaction.…

  13. Spacelab Level 4 Programmatic Implementation Assessment Study. Volume 4: Executive summary

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The study objectives of the Spacelab level 4 analysis were defined, along with the most significant results. The approach used in the synthesis and selection of alternate level 4 integration is described; the options included distributed site, lead center, and launch site. Principal characteristics, as well as the functional flow diagrams for each option, are presented and explained.

  14. The perception of geometrical structure from congruence

    NASA Technical Reports Server (NTRS)

    Lappin, Joseph S.; Wason, Thomas D.

    1989-01-01

    The principal function of vision is to measure the environment. As demonstrated by the coordination of motor actions with the positions and trajectories of moving objects in cluttered environments and by rapid recognition of solid objects in varying contexts from changing perspectives, vision provides real-time information about the geometrical structure and location of environmental objects and events. The geometric information provided by 2-D spatial displays is examined. It is proposed that the geometry of this information is best understood not within the traditional framework of perspective trigonometry, but in terms of the structure of qualitative relations defined by congruences among intrinsic geometric relations in images of surfaces. The basic concepts of this geometrical theory are outlined.

  15. GRAPES-Grounding representations in action, perception, and emotion systems: How object properties and categories are represented in the human brain.

    PubMed

    Martin, Alex

    2016-08-01

    In this article, I discuss some of the latest functional neuroimaging findings on the organization of object concepts in the human brain. I argue that these data provide strong support for viewing concepts as the products of highly interactive neural circuits grounded in the action, perception, and emotion systems. The nodes of these circuits are defined by regions representing specific object properties (e.g., form, color, and motion) and thus are property-specific, rather than strictly modality-specific. How these circuits are modified by external and internal environmental demands, the distinction between representational content and format, and the grounding of abstract social concepts are also discussed.

  16. The effect of face inversion for neurons inside and outside fMRI-defined face-selective cortical regions

    PubMed Central

    Van Belle, Goedele; Vanduffel, Wim; Rossion, Bruno; Vogels, Rufin

    2014-01-01

    It is widely believed that face processing in the primate brain occurs in a network of category-selective cortical regions. Combined functional MRI (fMRI)-single-cell recording studies in macaques have identified high concentrations of neurons that respond more to faces than objects within face-selective patches. However, cells with a preference for faces over objects are also found scattered throughout inferior temporal (IT) cortex, raising the question whether face-selective cells inside and outside of the face patches differ functionally. Here, we compare the properties of face-selective cells inside and outside of face-selective patches in the IT cortex by means of an image manipulation that reliably disrupts behavior toward face processing: inversion. We recorded IT neurons from two fMRI-defined face-patches (ML and AL) and a region outside of the face patches (herein labeled OUT) during upright and inverted face stimulation. Overall, turning faces upside down reduced the firing rate of face-selective cells. However, there were differences among the recording regions. First, the reduced neuronal response for inverted faces was independent of stimulus position, relative to fixation, in the face-selective patches (ML and AL) only. Additionally, the effect of inversion for face-selective cells in ML, but not those in AL or OUT, was impervious to whether the neurons were initially searched for using upright or inverted stimuli. Collectively, these results show that face-selective cells differ in their functional characteristics depending on their anatomicofunctional location, suggesting that upright faces are preferably coded by face-selective cells inside but not outside of the fMRI-defined face-selective regions of the posterior IT cortex. PMID:25520434

  17. The nature of epistemic virtues in the practice of medicine.

    PubMed

    Ahmadi Nasab Emran, Shahram

    2015-02-01

    There is an assumption in virtue epistemology that epistemic virtues are the same in different times and places. In this paper, however, I examine this assumption in the practice of medicine as a paradigm example. I identify two different paradigms of medical practice, one before and the other after the rise of bioethics in 1960s. I discuss the socially defined role and function of physicians and the epistemic goals of medical practice in these two periods to see how these elements affect the necessary epistemic virtues for physicians. I conclude that epistemic virtues of medical practice differ in these two periods according to the differing epistemic goals and the socially defined function of physicians. In the end, I respond to the possible objections to my thesis based on the distinction between skill and virtue.

  18. Cross-domain latent space projection for person re-identification

    NASA Astrophysics Data System (ADS)

    Pu, Nan; Wu, Song; Qian, Li; Xiao, Guoqiang

    2018-04-01

    In this paper, we study the problem of person re-identification and propose a cross-domain latent space projection (CDLSP) method to address the problem of absent or insufficient labeled data in the target domain. Under the assumption that the visual features in the source domain and target domain share a similar geometric structure, we transform the visual features from the source and target domains to a common latent space by optimizing the objective function defined in the manifold alignment method. Moreover, the proposed objective function takes re-id-specific knowledge into account with the aim of improving re-id performance under complex situations. Extensive experiments conducted on four benchmark datasets show that the proposed CDLSP outperforms or is competitive with state-of-the-art methods for person re-identification.

  19. The optimal design of UAV wing structure

    NASA Astrophysics Data System (ADS)

    Długosz, Adam; Klimek, Wiktor

    2018-01-01

    The paper presents an optimal design of a UAV wing made of composite materials. The aim of the optimization is to improve strength and stiffness together with a reduction of the weight of the structure. Three different types of functionals, which depend on stress, stiffness and the total mass, are defined. The paper presents an application of an in-house implementation of an evolutionary multi-objective algorithm to the optimization of the UAV wing structure. Values of the functionals are calculated on the basis of results obtained from numerical simulations. A numerical FEM model consisting of different composite materials is created. The adequacy of the numerical model is verified against results obtained from an experiment performed on a tensile testing machine. Examples of multi-objective optimization by means of a Pareto-optimal set of solutions are presented.

  20. Getting the most out of additional guidance information in deformable image registration by leveraging multi-objective optimization

    NASA Astrophysics Data System (ADS)

    Alderliesten, Tanja; Bosman, Peter A. N.; Bel, Arjan

    2015-03-01

    Incorporating additional guidance information, e.g., landmark/contour correspondence, in deformable image registration is often desirable and is typically done by adding constraints or cost terms to the optimization function. Commonly, deciding between a "hard" constraint and a "soft" additional cost term, as well as the weighting of cost terms in the optimization function, is done on a trial-and-error basis. The aim of this study is to investigate the advantages of exploiting guidance information by taking a multi-objective optimization perspective. To this end, in addition to objectives related to match quality and the amount of deformation, we define a third objective related to guidance information. Multi-objective optimization eliminates the need to tune a weighting of objectives a priori in a single optimization function, or the strict requirement of fulfilling hard guidance constraints. Instead, Pareto-efficient trade-offs between all objectives are found, effectively making the introduction of guidance information straightforward, independent of its type or scale. Further, since complete Pareto fronts also contain less interesting parts (i.e., solutions with near-zero deformation effort), we study how adaptive steering mechanisms can be incorporated to automatically focus more on solutions of interest. We performed experiments on artificial and real clinical data with large differences, including disappearing structures. Results show the substantial benefit of using additional guidance information. Moreover, compared to the 2-objective case, the additional computational cost is negligible. Finally, with the same computational budget, use of the adaptive steering mechanism provides superior solutions in the area of interest.
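
    A small sketch of the multi-objective core (the scores themselves are assumed to come from the registration model): from a set of candidate registrations scored on match-quality error, deformation effort and guidance mismatch, all to be minimized, keep only the Pareto-efficient trade-offs.

      import numpy as np

      def pareto_front(costs):
          """costs: array (n_solutions, n_objectives), all objectives minimized.
          Returns a boolean mask of non-dominated (Pareto-efficient) solutions."""
          costs = np.asarray(costs, dtype=float)
          efficient = np.ones(len(costs), dtype=bool)
          for i in range(len(costs)):
              if not efficient[i]:
                  continue
              # i dominates j if it is <= in every objective and < in at least one
              dominated = np.all(costs[i] <= costs, axis=1) & np.any(costs[i] < costs, axis=1)
              efficient[dominated] = False
          return efficient

      candidates = np.array([[0.10, 2.0, 0.5],
                             [0.12, 1.0, 0.4],
                             [0.20, 3.0, 0.9]])   # the last row is dominated by the first
      print(pareto_front(candidates))             # [ True  True False]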

  1. Dynamic acoustic radiation force acting on cylindrical shells: theory and simulations.

    PubMed

    Mitri, F G; Fatemi, M

    2005-05-01

    An object placed in an acoustic field is known to experience a force due to the transfer of momentum from the wave to the object itself. This force is known to be steady when the incident field is considered to be continuous with constant amplitude. One may define the dynamic (oscillatory) radiation force for a continuous wave-field whose intensity varies slowly with time. This paper extends the theory of the dynamic acoustic radiation force resulting from an amplitude-modulated progressive plane wave-field incident on solid cylinders to the case of solid cylindrical shells with particular emphasis on their thickness and contents of their hollow regions. A new factor corresponding to the dynamic radiation force is defined as Y(d) and stands for the dynamic radiation force per unit energy density and unit cross sectional surface. The results of numerical calculations are presented, indicating the ways in which the form of the dynamic radiation force function curves are affected by variations in the material mechanical parameters and by changes in the interior fluid inside the shell's hollow region. It was shown that the dynamic radiation force function Y(d) deviates from the static radiation force function for progressive waves Y(p) when the modulation frequency increases. These results indicate that the theory presented here is broader than the existing theory on cylinders.

  2. The Ultimate Big Data Enterprise Initiative: Defining Functional Capabilities for an International Information System (IIS) for Orbital Space Data (OSD)

    NASA Astrophysics Data System (ADS)

    Raygan, R.

    Global collaboration in support of an International Information System (IIS) for Orbital Space Data (OSD) literally requires a global enterprise. As with many information technology enterprise initiatives attempting to corral the desires of business with the budgets and limitations of technology, Space Situational Awareness (SSA) includes many of the same challenges: 1) Adaptive/intuitive dashboard that facilitates User Experience Design for a variety of users. 2) Asset Management of hundreds of thousands of objects moving at thousands of miles per hour hundreds of miles in space. 3) Normalization and integration of diverse data in various languages, possibly hidden or protected from easy access. 4) Expectations of near real-time information availability coupled with predictive analysis to affect decisions before critical points of no return, such as Space Object Conjunction Assessment (CA). 5) Data Ownership, management, taxonomy, and accuracy. 6) Integrated metrics and easily modified algorithms for "what if" analysis. This paper proposes an approach to define the functional capabilities for an IIS for OSD. These functional capabilities not only address previously identified gaps in current systems but incorporate lessons learned from other big data, enterprise, and agile information technology initiatives that correlate to the space domain. Viewing the IIS as the "data service provider" allows adoption of existing information technology processes which strengthen governance and ensure service consumers certain levels of service dependability and accuracy.

  3. Co-optimization of CO2-EOR and Storage Processes under Geological Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ampomah, William; Balch, Robert; Will, Robert

    This paper presents an integrated numerical framework to co-optimize EOR and CO2 storage performance in the Farnsworth field unit (FWU), Ochiltree County, Texas. The framework includes a field-scale compositional reservoir flow model, an uncertainty quantification model and a neural network optimization process. The reservoir flow model has been constructed based on the field geophysical, geological, and engineering data. A laboratory fluid analysis was tuned to an equation of state and subsequently used to predict the thermodynamic minimum miscible pressure (MMP). A history match of primary and secondary recovery processes was conducted to estimate the reservoir and multiphase flow parameters as the baseline case for analyzing the effect of recycling produced gas, infill drilling and water alternating gas (WAG) cycles on oil recovery and CO2 storage. A multi-objective optimization model was defined for maximizing both oil recovery and CO2 storage. The uncertainty quantification model comprising the Latin Hypercube sampling, Monte Carlo simulation, and sensitivity analysis, was used to study the effects of uncertain variables on the defined objective functions. Uncertain variables such as bottom hole injection pressure, WAG cycle, injection and production group rates, and gas-oil ratio among others were selected. The most significant variables were selected as control variables to be used for the optimization process. A neural network optimization algorithm was utilized to optimize the objective function both with and without geological uncertainty. The vertical permeability anisotropy (Kv/Kh) was selected as one of the uncertain parameters in the optimization process. The simulation results were compared to a scenario baseline case that predicted CO2 storage of 74%. The results showed an improved approach for optimizing oil recovery and CO2 storage in the FWU. The optimization process predicted more than 94% of CO2 storage and most importantly about 28% of incremental oil recovery. The sensitivity analysis reduced the number of control variables to decrease computational time. A risk aversion factor was used to represent results at various confidence levels to assist management in the decision-making process. The defined objective functions were proved to be a robust approach to co-optimize oil recovery and CO2 storage. The Farnsworth CO2 project will serve as a benchmark for future CO2-EOR or CCUS projects in the Anadarko basin or geologically similar basins throughout the world.

  4. Co-optimization of CO2-EOR and Storage Processes under Geological Uncertainty

    DOE PAGES

    Ampomah, William; Balch, Robert; Will, Robert; ...

    2017-07-01

    This paper presents an integrated numerical framework to co-optimize EOR and CO2 storage performance in the Farnsworth field unit (FWU), Ochiltree County, Texas. The framework includes a field-scale compositional reservoir flow model, an uncertainty quantification model and a neural network optimization process. The reservoir flow model has been constructed based on the field geophysical, geological, and engineering data. A laboratory fluid analysis was tuned to an equation of state and subsequently used to predict the thermodynamic minimum miscible pressure (MMP). A history match of primary and secondary recovery processes was conducted to estimate the reservoir and multiphase flow parameters as the baseline case for analyzing the effect of recycling produced gas, infill drilling and water alternating gas (WAG) cycles on oil recovery and CO2 storage. A multi-objective optimization model was defined for maximizing both oil recovery and CO2 storage. The uncertainty quantification model comprising the Latin Hypercube sampling, Monte Carlo simulation, and sensitivity analysis, was used to study the effects of uncertain variables on the defined objective functions. Uncertain variables such as bottom hole injection pressure, WAG cycle, injection and production group rates, and gas-oil ratio among others were selected. The most significant variables were selected as control variables to be used for the optimization process. A neural network optimization algorithm was utilized to optimize the objective function both with and without geological uncertainty. The vertical permeability anisotropy (Kv/Kh) was selected as one of the uncertain parameters in the optimization process. The simulation results were compared to a scenario baseline case that predicted CO2 storage of 74%. The results showed an improved approach for optimizing oil recovery and CO2 storage in the FWU. The optimization process predicted more than 94% of CO2 storage and most importantly about 28% of incremental oil recovery. The sensitivity analysis reduced the number of control variables to decrease computational time. A risk aversion factor was used to represent results at various confidence levels to assist management in the decision-making process. The defined objective functions were proved to be a robust approach to co-optimize oil recovery and CO2 storage. The Farnsworth CO2 project will serve as a benchmark for future CO2-EOR or CCUS projects in the Anadarko basin or geologically similar basins throughout the world.
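
    The Latin Hypercube step mentioned above can be sketched as follows (illustrative only, requires SciPy 1.7 or newer): a space-filling sample of a few uncertain variables is drawn and scaled to assumed ranges, and each row would then be run through the reservoir flow model to evaluate the oil-recovery and CO2-storage objective functions. The variable choices and ranges are invented for the example, not values from the FWU study.

      from scipy.stats import qmc

      sampler = qmc.LatinHypercube(d=3, seed=42)
      unit_sample = sampler.random(n=100)             # 100 samples in the unit cube [0, 1]^3

      # Assumed ranges: BHP (psi), WAG cycle length (months), vertical anisotropy Kv/Kh.
      lower = [3000.0, 3.0, 0.01]
      upper = [5000.0, 12.0, 0.50]
      designs = qmc.scale(unit_sample, lower, upper)  # shape (100, 3)
      print(designs[:3])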

  5. Perception of faces in schizophrenia: Subjective (self-report) vs. objective (psychophysics) assessments.

    PubMed

    Chen, Yue; Ekstrom, Tor

    2016-05-01

    Face perception impairment in schizophrenia has been demonstrated, mostly through experimental studies. How this laboratory-defined behavioral impairment is associated with patients' perceptual experience of various faces in everyday life is however unclear. This question is important because a first-person account of face perception has direct consequences on social functioning of patients. In this study, we adapted and administered a self-reported questionnaire on narrative perceptual experience of faces along with psychophysical assessments of face perception in schizophrenia. The self-reported questionnaire includes six rating items of face-related functioning in everyday life, providing a subjective measure of face perception. The psychophysical assessment determines perceptual threshold for discriminating different facial identities, providing an objective measure of face perception. Compared to controls (n = 25), patients (n = 35) showed significantly lower scores (worse performance) in the subjective assessment and significantly higher thresholds (worse performance) in the objective assessment. The subjective and objective face perception assessments were moderately correlated in controls but not in patients. The subjective face perception assessments were significantly correlated with measurements of a social cognitive ability (Theory of Mind), again in controls but not in patients. These results suggest that in schizophrenia the quality of face-related functioning in everyday life is degraded and the role that basic face discrimination capacity plays in face-related everyday functioning is disrupted.

  6. A mathematical function to evaluate surgical complexity of cleft lip and palate.

    PubMed

    Ortiz-Posadas, M R; Vega-Alvarado, L; Toni, B

    2009-06-01

    The objective of this work is to present the modeling of a similarity function adapted to the medical environment using the logical-combinatorial approach of pattern recognition theory, and its application to comparing the condition of patients with congenital malformations in the lip and/or palate, which are called cleft-primary palate and/or cleft-secondary palate, respectively. The similarity function is defined by the comparison criteria determined for each variable, taking into account their type (qualitative or quantitative), their domain and their initial space representation. In all, we defined 18 variables, with their domains and six different comparison criteria (fuzzy and absolute-difference type). The model further includes the importance of every variable as well as a weight which reflects the surgical complexity of the cleft. The usefulness of this function is shown by calculating the similarity among three patients. This work was developed jointly with the Cleft Palate Team at the Reconstructive Surgery Service of the Pediatric Hospital of Tacubaya, which belongs to the Health Institute of the Federal District in Mexico City.
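
    A hypothetical sketch of a partial-similarity function in this logical-combinatorial style: per-variable comparison criteria (an absolute-difference criterion for a quantitative variable, an exact-match criterion for qualitative ones) are combined with importance weights. The three example variables and all weights below are invented; the actual model uses 18 variables and six comparison criteria.

      def abs_diff_criterion(a, b, max_range):
          return 1.0 - min(abs(a - b) / max_range, 1.0)    # fuzzy closeness of two numbers

      def exact_criterion(a, b):
          return 1.0 if a == b else 0.0                    # qualitative match / mismatch

      VARIABLES = [
          # (name, comparison function, importance weight)
          ("cleft_width_mm", lambda a, b: abs_diff_criterion(a, b, max_range=20.0), 0.5),
          ("palate_side",    exact_criterion,                                        0.2),
          ("lip_involved",   exact_criterion,                                        0.3),
      ]

      def similarity(patient_a, patient_b):
          num = sum(w * cmp(patient_a[name], patient_b[name]) for name, cmp, w in VARIABLES)
          return num / sum(w for _, _, w in VARIABLES)

      p1 = {"cleft_width_mm": 8.0,  "palate_side": "left", "lip_involved": True}
      p2 = {"cleft_width_mm": 12.0, "palate_side": "left", "lip_involved": False}
      print(round(similarity(p1, p2), 3))                  # 0.6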

  7. Forecasting Electricity Prices in an Optimization Hydrothermal Problem

    NASA Astrophysics Data System (ADS)

    Matías, J. M.; Bayón, L.; Suárez, P.; Argüelles, A.; Taboada, J.

    2007-12-01

    This paper presents an economic dispatch algorithm in a hydrothermal system within the framework of a competitive and deregulated electricity market. The optimization problem of one firm, whose objective function can be defined as its profit maximization, is described. Since next-day price forecasting is a crucial aspect, this paper proposes an efficient yet highly accurate new method for next-day price forecasting, using a functional time series approach that exploits the daily seasonal structure of the price series. For the optimization problem, an optimal control technique is applied and Pontryagin's theorem is employed.
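
    For orientation, a schematic form of the firm's profit-maximization objective over the scheduling horizon (illustrative notation, not necessarily the paper's exact formulation): π(t) is the forecast hourly price, P_h and P_s the hydro and thermal outputs, and C(·) the thermal generation cost.

    ```latex
    \max_{P_h(\cdot),\,P_s(\cdot)} \;
    \int_{0}^{T} \Big[\, \pi(t)\,\big(P_h(t) + P_s(t)\big) \;-\; C\big(P_s(t)\big) \Big]\, dt
    ```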

  8. The Mass Function of Cosmic Structures

    NASA Astrophysics Data System (ADS)

    Audit, E.; Teyssier, R.; Alimi, J.-M.

    We investigate some modifications to the Press and Schechter (1974) (PS) prescription resulting from shear and tidal effects. These modifications rely on more realistic treatments of the collapse process than the standard approach based on the spherical model. First, we show that the mass function resulting from a new approximate Lagrangian dynamics (Audit and Alimi, A&A 1996) contains more objects at high mass than the classical PS mass function and is well fitted by a PS-like function with a threshold density of δ_c ≈ 1.4. However, such a Lagrangian description can underestimate the epoch of structure formation since it defines it as the collapse of the first principal axis. We therefore suggest some analytical prescriptions for computing the collapse time along the second and third principal axes, and we deduce the corresponding mass functions. The collapse along the third axis is delayed by the shear and the number of objects of high mass then decreases. Finally, we show that the shear also strongly affects the formation of low-mass halos. This dynamical effect implies a modification of the low-mass slope of the mass function and allows the reproduction of the observed luminosity function of field galaxies.
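
    For reference, the classical PS mass function against which these modified prescriptions are compared can be written in standard notation (with ρ̄ the mean density, σ(M) the rms linear fluctuation on mass scale M, and δ_c the threshold overdensity) as:

    ```latex
    n(M)\,dM \;=\; \sqrt{\frac{2}{\pi}}\;\frac{\bar{\rho}}{M^{2}}\;
    \frac{\delta_{c}}{\sigma(M)}\,
    \left|\frac{d\ln\sigma}{d\ln M}\right|
    \exp\!\left(-\frac{\delta_{c}^{2}}{2\sigma^{2}(M)}\right)\, dM
    ```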

  9. Cellular automata with object-oriented features for parallel molecular network modeling.

    PubMed

    Zhu, Hao; Wu, Yinghui; Huang, Sui; Sun, Yan; Dhar, Pawan

    2005-06-01

    Cellular automata are an important modeling paradigm for studying the dynamics of large, parallel systems composed of multiple, interacting components. However, to model biological systems, cellular automata need to be extended beyond the large-scale parallelism and intensive communication in order to capture two fundamental properties characteristic of complex biological systems: hierarchy and heterogeneity. This paper proposes extensions to a cellular automata language, Cellang, to meet this purpose. The extended language, with object-oriented features, can be used to describe the structure and activity of parallel molecular networks within cells. Capabilities of this new programming language include object structure to define molecular programs within a cell, floating-point data type and mathematical functions to perform quantitative computation, message passing capability to describe molecular interactions, as well as new operators, statements, and built-in functions. We discuss relevant programming issues of these features, including the object-oriented description of molecular interactions with molecule encapsulation, message passing, and the description of heterogeneity and anisotropy at the cell and molecule levels. By enabling the integration of modeling at the molecular level with system behavior at cell, tissue, organ, or even organism levels, the program will help improve our understanding of how complex and dynamic biological activities are generated and controlled by parallel functioning of molecular networks. Index Terms-Cellular automata, modeling, molecular network, object-oriented.

  10. Risk-Based Object Oriented Testing

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Stapko, Ruth; Gallo, Albert

    2000-01-01

    Software testing is a well-defined phase of the software development life cycle. Functional ("black box") testing and structural ("white box") testing are two methods of test case design commonly used by software developers. A lesser known testing method is risk-based testing, which takes into account the probability of failure of a portion of code as determined by its complexity. For object oriented programs, a methodology is proposed for identification of risk-prone classes. Risk-based testing is a highly effective testing technique that can be used to find and fix the most important problems as quickly as possible.

  11. Research principles and the construction of mnemonic diagrams

    NASA Technical Reports Server (NTRS)

    Venda, V. F.; Mitkin, A. A.

    1973-01-01

    Mnemonic diagrams are defined as a variety of information display devices, the essential element of which is conventional graphical presentation of technological or functional-operational links in a controlled system or object. Graphically displaying the operational structure of an object, the interdependence between different parameters, and the interdependence between indicators and controls, the mnemonic diagram reduces the load on the operator's memory and facilitates the perception and processing of information and decision making, while at the same time serving as visual support for the operator's information activity. The types of mnemonic diagrams are listed.

  12. Multi-objective optimization for model predictive control.

    PubMed

    Wojsznis, Willy; Mehta, Ashish; Wojsznis, Peter; Thiele, Dirk; Blevins, Terry

    2007-06-01

    This paper presents a technique of multi-objective optimization for Model Predictive Control (MPC) where the optimization has three levels of the objective function, in order of priority: handling constraints, maximizing economics, and maintaining control. The greatest weights are assigned dynamically to control or constraint variables that are predicted to be out of their limits. The weights assigned for economics have to outweigh those assigned for control objectives. Control variables (CV) can be controlled at fixed targets or within one- or two-sided ranges around the targets. Manipulated Variables (MV) can have assigned targets too, which may be predefined values or current actual values. This MV functionality is extremely useful when economic objectives are not defined for some or all of the MVs. To achieve this complex operation, handle process outputs predicted to go out of limits, and have a guaranteed solution for any condition, the technique makes use of the priority structure, penalties on slack variables, and redefinition of the constraint and control model. An engineering implementation of this approach is shown in the MPC embedded in an industrial control system. The optimization and control of a distillation column, the standard Shell heavy oil fractionator (HOF) problem, is adequately achieved with this MPC.
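
    A minimal sketch of a three-level weighted objective of this kind, under the assumption (made here for illustration, not taken from the vendor implementation) that the priority structure is expressed through widely separated weights and quadratic penalties on slack variables:

    ```python
    # Illustrative sketch: assumed priority weights and quadratic penalty form.
    import numpy as np

    W_CONSTRAINT, W_ECONOMICS, W_CONTROL = 1e6, 1e3, 1.0  # assumed priority weights

    def mpc_objective(cv_pred, cv_target, cv_lo, cv_hi, mv, mv_target, econ_cost):
        """All arguments are numpy arrays over the prediction horizon."""
        # Slack variables: amount by which predicted CVs violate their limits.
        slack = np.maximum(cv_lo - cv_pred, 0.0) + np.maximum(cv_pred - cv_hi, 0.0)
        constraint_term = W_CONSTRAINT * np.sum(slack ** 2)
        economic_term = W_ECONOMICS * np.sum(econ_cost)        # e.g. energy or feed cost
        control_term = W_CONTROL * (np.sum((cv_pred - cv_target) ** 2)
                                    + np.sum((mv - mv_target) ** 2))
        return constraint_term + economic_term + control_term
    ```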

  13. SASS wind ambiguity removal by direct minimization. II - Use of smoothness and dynamical constraints

    NASA Technical Reports Server (NTRS)

    Hoffman, R. N.

    1984-01-01

    A variational analysis method (VAM) is used to remove the ambiguity of the Seasat-A Satellite Scatterometer (SASS) winds. The VAM yields the best fit to the data by minimizing an objective function S, which is a measure of the lack of fit. The SASS data are described and the function S and the analysis procedure are defined. Analyses of a single ship report, which are analogous to Green's functions, are presented. The analysis procedure is tuned and its sensitivity is described using the QE II storm. The procedure is then applied to a case study of September 6, 1978, south of Japan.
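
    Schematically, such a variational objective combines a data-misfit term over the wind observations with smoothness and dynamical penalties; the weights λ and the residual term R_dyn below are illustrative placeholders, not Hoffman's exact definition.

    ```latex
    S(\mathbf{u}) \;=\; \lambda_{o}\sum_{i}\big\|\mathbf{u}(\mathbf{x}_{i})-\mathbf{u}^{\mathrm{obs}}_{i}\big\|^{2}
    \;+\; \lambda_{s}\!\int \big|\nabla^{2}\mathbf{u}\big|^{2}\,dA
    \;+\; \lambda_{d}\!\int R_{\mathrm{dyn}}(\mathbf{u})^{2}\,dA
    ```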

  14. Data inversion immune to cycle-skipping using AWI

    NASA Astrophysics Data System (ADS)

    Guasch, L.; Warner, M.; Umpleby, A.; Yao, G.; Morgan, J. V.

    2014-12-01

    Over the last decade, 3D Full Waveform Inversion (FWI) has become a standard model-building tool in exploration seismology, especially in oil and gas applications, thanks to the high-quality datasets (dense spatial coverage of sources and receivers) acquired by the industry. FWI provides superior quantitative images compared with its travel-time counterparts (travel-time based inversion methods) because it aims to match all the information in the observations instead of a severely restricted subset of them, namely picked arrivals. The downside is that the solution space explored by FWI has a high number of local minima, and since the solution is restricted to local optimization methods (due to the objective function evaluation cost), the success of the inversion is subject to starting within the basin of attraction of the global minimum. Local minima can exist for a wide variety of reasons, and it seems unlikely that a formulation of the problem that eliminates all of them (by defining the optimization problem in a form that yields a monotonic objective function) exists. However, a significant number of local minima are created by the definition of data misfit. In its standard formulation FWI compares observed data (field data) with predicted data (generated with a synthetic model) by subtracting one from the other, and the objective function is defined as some norm of this difference. The combination of this criterion and the fact that seismic data are oscillatory produces the well-known phenomenon of cycle-skipping, where model updates try to match nearest cycles from one dataset to the other. In order to avoid cycle-skipping we propose a different comparison between observed and predicted data, based on Wiener filters, which exploits the fact that the "identity" Wiener filter is a spike at zero lag. This gives rise to a new objective function without cycle-skipping-related local minima, and therefore suppresses the need for accurate starting models or low frequencies in the data. This new technique, called Adaptive Waveform Inversion (AWI), appears consistently superior to conventional FWI.
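
    A single-trace sketch of how a Wiener-filter-based misfit of this kind can be formed (an illustration under simplifying assumptions, not the authors' production code): estimate the filter that maps the predicted trace onto the observed trace, then penalize filter energy away from zero lag, so that a perfect match, a zero-lag spike, gives zero misfit.

    ```python
    # Illustrative single-trace sketch; filter length, penalty shape, and the small
    # regularization constant are assumptions for demonstration only.
    import numpy as np

    def wiener_misfit(predicted, observed, half_length=50, eps=1e-12):
        n = len(predicted)
        lags = np.arange(-half_length, half_length + 1)
        padded = np.pad(predicted, (half_length, half_length))
        # Convolution matrix: column j holds the predicted trace shifted by lags[j].
        cols = np.stack([padded[half_length - lag : half_length - lag + n] for lag in lags], axis=1)
        # Least-squares Wiener filter w such that cols @ w ≈ observed.
        w, *_ = np.linalg.lstsq(cols, observed, rcond=None)
        # Normalized penalty on filter energy away from zero lag.
        return float(np.sum((np.abs(lags) * w) ** 2) / (np.sum(w ** 2) + eps))
    ```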

  15. Does human perception of wetland aesthetics and healthiness relate to ecological functioning?

    PubMed

    Cottet, Marylise; Piégay, Hervé; Bornette, Gudrun

    2013-10-15

    Wetland management usually aims at preserving or restoring desirable ecological characteristics or functions. It is now well-recognized that some social criteria should also be included. Involving lay-people in wetland preservation or restoration projects may mean broadening project objectives to fit various and potentially competing requirements that relate to ecology, aesthetics, recreation, etc. In addition, perceived value depends both upon expertise and objectives, both of which vary from one stakeholder population to another. Perceived value and ecological functioning have to be reconciled in order to make a project successful. Understanding the perceptions of lay-people as well as their opinions about ecological value is a critical part of the development of sustainable management plans. Characterizing the environment in a way that adequately describes ecological function while also being consistent with lay perception may help reach such objectives. This goal has been addressed in a case study relating to wetlands of the Ain River (France). A photo-questionnaire presenting a sample of photographs of riverine wetlands distributed along the Ain River was submitted to 403 lay-people and self-identified experts. Two objectives were defined: (1) to identify the different parameters, whether visual or ecological, influencing the perception regarding the value of these ecosystems; (2) to compare the perceptions of self-identified experts and lay-people. Four criteria appear to strongly influence peoples' perceptions of ecological and aesthetical values: water transparency and colour, the presence and appearance of aquatic vegetation, the presence of sediments, and finally, trophic status. In our study, we observed only a few differences in perception. The differences primarily related to the value assigned to oligotrophic wetlands but even here, the differences between lay and expert populations were minimal. These results support the idea that it is possible to implement an integrated and participative management program for ecosystems. Our approach can provide a shared view of environmental value facilitating the work of managers in defining comprehensive goals for wetland preservation or restoration projects. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Memory Age Identity as a predictor of cognitive function in the elderly: A 2-year follow-up study.

    PubMed

    Chang, Ki Jung; Hong, Chang Hyung; Lee, Yun Hwan; Chung, Young Ki; Lim, Ki Young; Noh, Jai Sung; Kim, Jin-Ju; Kim, Haena; Kim, Hyun-Chung; Son, Sang Joon

    2018-01-01

    There is a growing interest in finding psychosocial predictors related to cognitive function. In our previous research, we conducted a cross-sectional study on memory age identity (MAI) and found that MAI might be associated with objective cognitive performance in non-cognitively impaired elderly. A longitudinal study was conducted to better understand the importance of MAI as a psychosocial predictor related to objective cognitive function. Data obtained from 1345 Korean subjects aged 60 years and above were analyzed. During the two-year follow-up, subjective memory age was assessed on three occasions using the following question: How old do you feel based on your memory? Discrepancy between subjective memory age and chronological age was then calculated. We defined this value as 'memory age identity (MAI)'. A generalized estimating equation (GEE) was then obtained to demonstrate the relationship between MAI and Korean version-Mini Mental State Examination (K-MMSE) score over the 2 years of study. MAI was found to significantly (β=-0.03, p< 0.0001) predict objective cognitive performance in the non-cognitively impaired elderly. MAI may be a potential psychosocial predictor related to objective cognitive performance in the non-cognitively impaired elderly. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Full Scenes Produce More Activation than Close-Up Scenes and Scene-Diagnostic Objects in Parahippocampal and Retrosplenial Cortex: An fMRI Study

    ERIC Educational Resources Information Center

    Henderson, John M.; Larson, Christine L.; Zhu, David C.

    2008-01-01

    We used fMRI to directly compare activation in two cortical regions previously identified as relevant to real-world scene processing: retrosplenial cortex and a region of posterior parahippocampal cortex functionally defined as the parahippocampal place area (PPA). We compared activation in these regions to full views of scenes from a global…

  18. The assessment of language and the emergence from disorders of consciousness.

    PubMed

    Pundole, Amy; Crawford, Sarah

    2017-04-06

    In order to demonstrate emergence from a disorder of consciousness (DoC) an individual is currently required to demonstrate functional object use of two objects, or functional communication defined as accurately answering six yes/no questions on two consecutive occasions (Giacino et al., 2002 ). In practice, experienced speech and language therapists (SLTs) working with this group often focus on facilitating object use or employ other language tasks, since achieving a 100% accurate yes/no response can be difficult for patients following an extensive brain injury due to language and/or cognitive impairments. There is an increasing awareness of this issue in the literature and in practice and there is discussion about reviewing the current definition of emergence. This paper outlines the traditional definition of emergence and recent updates, discusses some of the problems and implications associated with current assessment, highlights the importance of getting it right, explores potential other ways to determine emergence, and suggests further areas for research.

  19. Interpretations of Probability in Quantum Mechanics: A Case of "Experimental Metaphysics"

    NASA Astrophysics Data System (ADS)

    Hellman, Geoffrey

    After reviewing paradigmatic cases of "experimental metaphysics" basing inferences against local realism and determinism on experimental tests of Bell's theorem (and successors), we concentrate on clarifying the meaning and status of "objective probability" in quantum mechanics. The terms "objective" and "subjective" are found ambiguous and inadequate, masking crucial differences turning on the question of what the numerical values of probability functions measure vs. the question of the nature of the "events" on which such functions are defined. This leads naturally to a 2×2 matrix of types of interpretations, which are then illustrated with salient examples. (Of independent interest are the splitting of "Copenhagen interpretation" into "objective" and "subjective" varieties in one of the dimensions and the splitting of Bohmian hidden variables from (other) modal interpretations along that same dimension.) It is then explained why Everett interpretations are difficult to categorize in these terms. Finally, we argue that Bohmian mechanics does not seriously threaten the experimental-metaphysical case for ultimate randomness and purely physical probabilities.

  20. A process for quantifying aesthetic and functional breast surgery: I. Quantifying optimal nipple position and vertical and horizontal skin excess for mastopexy and breast reduction.

    PubMed

    Tebbetts, John B

    2013-07-01

    This article defines a comprehensive process using quantified parameters for objective decision making, operative planning, technique selection, and outcomes analysis in mastopexy and breast reduction, and defines quantified parameters for nipple position and vertical and horizontal skin excess. Future submissions will detail application of the processes for skin envelope design and address composite, three-dimensional parenchyma modification options. Breast base width was used to define a proportional, desired nipple-to-inframammary fold distance for optimal aesthetics. Vertical and horizontal skin excess were measured, documented, and used for technique selection and skin envelope design in mastopexy and breast reduction. This method was applied in 124 consecutive mastopexy and 122 consecutive breast reduction cases. Average follow-up was 4.6 years (range, 6 to 14 years). No changes were made to the basic algorithm of the defined process during the study period. No patient required nipple repositioning. Complications included excessive lower pole restretch (4 percent), periareolar scar hypertrophy (0.8 percent), hematoma (1.2 percent), and areola shape irregularities (1.6 percent). Delayed healing at the junction of vertical and horizontal scars occurred in two of 124 reduction patients (1.6 percent), neither of whom required revision. The overall reoperation rate was 6.5 percent (16 of 246). This study defines the first steps of a comprehensive process for using objectively defined parameters that surgeons can apply to skin envelope design for mastopexy and breast reduction. The method can be used in conjunction with, or in lieu of, other described methods to determine nipple position.

  1. Figure-ground segregation by motion contrast and by luminance contrast.

    PubMed

    Regan, D; Beverley, K I

    1984-05-01

    Some naturally camouflaged objects are invisible unless they move; their boundaries are then defined by motion contrast between object and background. We compared the visual detection of such camouflaged objects with the detection of objects whose boundaries were defined by luminance contrast. The summation field area is 0.16 deg², and the summation time constant is 750 msec for parafoveally viewed objects whose boundaries are defined by motion contrast; these values are, respectively, about 5 and 12 times larger than the corresponding values for objects defined by luminance contrast. The log detection threshold is proportional to the eccentricity for a camouflaged object of constant area. The effect of eccentricity on threshold is less for large objects than for small objects. The log summation field diameter for detecting camouflaged objects is roughly proportional to the eccentricity, increasing to about 20 deg at 32-deg eccentricity. In contrast to the 100:1 increase of summation area for detecting camouflaged objects, the temporal summation time constant changes by only 40% between eccentricities of 0 and 16 deg.

  2. Skeletonization and Partitioning of Digital Images Using Discrete Morse Theory.

    PubMed

    Delgado-Friedrichs, Olaf; Robins, Vanessa; Sheppard, Adrian

    2015-03-01

    We show how discrete Morse theory provides a rigorous and unifying foundation for defining skeletons and partitions of grayscale digital images. We model a grayscale image as a cubical complex with a real-valued function defined on its vertices (the voxel values). This function is extended to a discrete gradient vector field using the algorithm presented in Robins, Wood, Sheppard TPAMI 33:1646 (2011). In the current paper we define basins (the building blocks of a partition) and segments of the skeleton using the stable and unstable sets associated with critical cells. The natural connection between Morse theory and homology allows us to prove the topological validity of these constructions; for example, that the skeleton is homotopic to the initial object. We simplify the basins and skeletons via Morse-theoretic cancellation of critical cells in the discrete gradient vector field using a strategy informed by persistent homology. Simple working Python code for our algorithms for efficient vector field traversal is included. Example data are taken from micro-CT images of porous materials, an application area where accurate topological models of pore connectivity are vital for fluid-flow modelling.

  3. Empiricism, ethics and orthodox economic theory: what is the appropriate basis for decision-making in the health sector?

    PubMed

    Richardson, Jeff; McKie, John

    2005-01-01

    Economics is commonly defined in terms of the relationship between people's unlimited wants and society's scarce resources. The definition implies a central role for an understanding of what people want, i.e. their objectives. This, in turn, suggests an important role for both empirical research into people's objectives and debate about the acceptability of the objectives. In contrast with this expectation, economics has avoided these issues by the adoption of an orthodoxy that imposes objectives. However evidence suggests, at least in the health sector, that people do not have the simple objectives assumed by economic theory. Amartya Sen has advocated a shift from a focus on "utility" to a focus on "capabilities" and "functionings" as a way of overcoming the shortcomings of welfarism. However, the practicality of Sen's account is threatened by the range of possible "functionings", by the lack of guidance about how they should be weighted, and by suspicions that they do not capture the full range of objectives people appear to value. We argue that "empirical ethics", an emerging approach in the health sector, provides important lessons on overcoming these problems. Moreover, it is an ethically defensible methodology, and yields practical results that can assist policy makers in the allocation of resources.

  4. Modeling functional neuroanatomy for an anatomy information system.

    PubMed

    Niggemann, Jörg M; Gebert, Andreas; Schulz, Stefan

    2008-01-01

    Existing neuroanatomical ontologies, databases and information systems, such as the Foundational Model of Anatomy (FMA), represent outgoing connections from brain structures, but cannot represent the "internal wiring" of structures and as such, cannot distinguish between different independent connections from the same structure. Thus, a fundamental aspect of Neuroanatomy, the functional pathways and functional systems of the brain such as the pupillary light reflex system, is not adequately represented. This article identifies underlying anatomical objects which are the source of independent connections (collections of neurons) and uses these as basic building blocks to construct a model of functional neuroanatomy and its functional pathways. The basic representational elements of the model are unnamed groups of neurons or groups of neuron segments. These groups, their relations to each other, and the relations to the objects of macroscopic anatomy are defined. The resulting model can be incorporated into the FMA. The capabilities of the presented model are compared to the FMA and the Brain Architecture Management System (BAMS). Internal wiring as well as functional pathways can correctly be represented and tracked. This model bridges the gap between representations of single neurons and their parts on the one hand and representations of spatial brain structures and areas on the other hand. It is capable of drawing correct inferences on pathways in a nervous system. The object and relation definitions are related to the Open Biomedical Ontology effort and its relation ontology, so that this model can be further developed into an ontology of neuronal functional systems.

  5. Learning of Rule Ensembles for Multiple Attribute Ranking Problems

    NASA Astrophysics Data System (ADS)

    Dembczyński, Krzysztof; Kotłowski, Wojciech; Słowiński, Roman; Szeląg, Marcin

    In this paper, we consider the multiple attribute ranking problem from a Machine Learning perspective. We propose two approaches to statistical learning of an ensemble of decision rules from decision examples provided by the Decision Maker in terms of pairwise comparisons of some objects. The first approach consists in learning a preference function defining a binary preference relation for a pair of objects. The result of application of this function on all pairs of objects to be ranked is then exploited using the Net Flow Score procedure, giving a linear ranking of objects. The second approach consists in learning a utility function for single objects. The utility function also gives a linear ranking of objects. In both approaches, the learning is based on the boosting technique. The presented approaches to Preference Learning share good properties of the decision rule preference model and have good performance in the massive-data learning problems. As Preference Learning and Multiple Attribute Decision Aiding share many concepts and methodological issues, in the introduction, we review some aspects bridging these two fields. To illustrate the two approaches proposed in this paper, we solve with them a toy example concerning the ranking of a set of cars evaluated by multiple attributes. Then, we perform a large data experiment on real data sets. The first data set concerns credit rating. Since recent research in the field of Preference Learning is motivated by the increasing role of modeling preferences in recommender systems and information retrieval, we chose two other massive data sets from this area - one comes from movie recommender system MovieLens, and the other concerns ranking of text documents from 20 Newsgroups data set.
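
    A toy sketch of the Net Flow Score step in the first approach (assumed, simplified code, not the rule-ensemble learner itself): given a learned pairwise preference function, each object is scored by the strength of its wins minus its losses over all other objects, and the scores induce a linear ranking.

    ```python
    # Illustrative Net Flow Score ranking; the preference function here is a toy
    # stand-in for the learned ensemble of decision rules.
    def net_flow_ranking(objects, pref):
        """pref(a, b) returns the learned degree to which a is preferred to b."""
        scores = {
            id(a): sum(pref(a, b) - pref(b, a) for b in objects if b is not a)
            for a in objects
        }
        return sorted(objects, key=lambda a: scores[id(a)], reverse=True)

    # Toy usage: rank cars by an assumed preference function built from attribute sums.
    cars = [{"name": "A", "speed": 0.9, "economy": 0.3},
            {"name": "B", "speed": 0.5, "economy": 0.8}]
    pref = lambda a, b: 1.0 if (a["speed"] + a["economy"]) > (b["speed"] + b["economy"]) else 0.0
    ranked = net_flow_ranking(cars, pref)
    ```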

  6. GRAPES—Grounding representations in action, perception, and emotion systems: How object properties and categories are represented in the human brain

    PubMed Central

    Martin, Alex

    2016-01-01

    In this article, I discuss some of the latest functional neuroimaging findings on the organization of object concepts in the human brain. I argue that these data provide strong support for viewing concepts as the products of highly interactive neural circuits grounded in the action, perception, and emotion systems. The nodes of these circuits are defined by regions representing specific object properties (e.g., form, color, and motion) and thus are property-specific, rather than strictly modality-specific. How these circuits are modified by external and internal environmental demands, the distinction between representational content and format, and the grounding of abstract social concepts are also discussed. PMID:25968087

  7. Memory-Efficient Analysis of Dense Functional Connectomes.

    PubMed

    Loewe, Kristian; Donohue, Sarah E; Schoenfeld, Mircea A; Kruse, Rudolf; Borgelt, Christian

    2016-01-01

    The functioning of the human brain relies on the interplay and integration of numerous individual units within a complex network. To identify network configurations characteristic of specific cognitive tasks or mental illnesses, functional connectomes can be constructed based on the assessment of synchronous fMRI activity at separate brain sites, and then analyzed using graph-theoretical concepts. In most previous studies, relatively coarse parcellations of the brain were used to define regions as graphical nodes. Such parcellated connectomes are highly dependent on parcellation quality because regional and functional boundaries need to be relatively consistent for the results to be interpretable. In contrast, dense connectomes are not subject to this limitation, since the parcellation inherent to the data is used to define graphical nodes, also allowing for a more detailed spatial mapping of connectivity patterns. However, dense connectomes are associated with considerable computational demands in terms of both time and memory requirements. The memory required to explicitly store dense connectomes in main memory can render their analysis infeasible, especially when considering high-resolution data or analyses across multiple subjects or conditions. Here, we present an object-based matrix representation that achieves a very low memory footprint by computing matrix elements on demand instead of explicitly storing them. In doing so, memory required for a dense connectome is reduced to the amount needed to store the underlying time series data. Based on theoretical considerations and benchmarks, different matrix object implementations and additional programs (based on available Matlab functions and Matlab-based third-party software) are compared with regard to their computational efficiency. The matrix implementation based on on-demand computations has very low memory requirements, thus enabling analyses that would be otherwise infeasible to conduct due to insufficient memory. An open source software package containing the created programs is available for download.
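
    A minimal sketch of the on-demand idea (an assumed illustration, not the released software package): connectome entries are Pearson correlations computed from the stored time series only when an element is requested, so memory stays at the size of the time series rather than growing with the number of node pairs.

    ```python
    # Illustrative on-demand "matrix": elements are computed, not stored.
    import numpy as np

    class OnDemandCorrelation:
        def __init__(self, timeseries):
            # timeseries: (n_nodes, n_timepoints); z-scoring once reduces each
            # requested element to a single dot product.
            ts = np.asarray(timeseries, dtype=float)
            self._z = (ts - ts.mean(axis=1, keepdims=True)) / ts.std(axis=1, keepdims=True)
            self._n_t = ts.shape[1]
            self.shape = (ts.shape[0], ts.shape[0])

        def __getitem__(self, idx):
            i, j = idx
            # Pearson correlation between nodes i and j, computed on demand.
            return float(self._z[i] @ self._z[j]) / self._n_t

    # Usage: memory footprint stays at the size of the time series data.
    conn = OnDemandCorrelation(np.random.default_rng(0).standard_normal((1000, 200)))
    r_ij = conn[3, 42]
    ```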

  9. Association of exceptional parental longevity and physical function in aging.

    PubMed

    Ayers, Emmeline; Barzilai, Nir; Crandall, Jill P; Milman, Sofiya; Verghese, Joe

    2014-01-01

    Offspring of parents with exceptional longevity (OPEL), who are more likely to carry longevity-associated genotypes, may age more successfully than offspring of parents with usual survival (OPUS). Maintenance of physical function is a key attribute of successful aging. While many genetic and non-genetic factors interact to determine physical phenotype in aging, examination of the contribution of exceptional parental longevity to physical function in aging is limited. The LonGenity study recruited a relatively genetically homogenous cohort of Ashkenazi Jewish (AJ) adults age 65 and older, who were defined as either OPEL (having at least one parent who lived to age 95 or older) or OPUS (neither parent survived to age 95). Subjective and objective measures of physical function were compared between the two groups, accounting for potential confounders. Of the 893 LonGenity subjects, 365 were OPEL and 528 were OPUS. OPEL had better objective and subjective measures of physical function than OPUS, especially on unipedal stance (p = 0.009) and gait speed (p = 0.002). Results support the protective role of exceptional parental longevity in preventing decline in physical function, possibly via genetic mechanisms that should be further explored.

  10. Regular Mechanical Transformation of Rotations Into Translations: Part 1. Kinematic Analysis and Definition of the Basic Characteristics

    NASA Astrophysics Data System (ADS)

    Abadjieva, Emilia; Abadjiev, Valentin

    2017-06-01

    The science that studies the transformation of motions, according to a previously defined law, between non-coplanar axes (in the general case axes of rotation, or an axis of rotation and a direction of rectilinear translation) by three-link mechanisms equipped with high kinematic joints can be treated as an independent branch of Applied Mechanics. It deals with the mechanical behaviour of these multibody systems in relation to the kinematic and geometric characteristics of the elements of the high kinematic joints that form them. The object of study here is the process of regular transformation of rotation into translation. The developed mathematical model is applied to the task of studying the sliding velocity vector function at the contact point of the surface elements of arbitrary high kinematic joints. The main kinematic characteristics of the studied type of motion transformation (kinematic cylinders on level, kinematic relative helices (helical conoids), and kinematic pitch configurations) are defined on the basis of this analysis. These features expand the theoretical knowledge that is the object of gearing theory. They also complement the system of kinematic and geometric primitives that form the mathematical model for the synthesis of spatial rack mechanisms.

  11. Behavioral and biological interactions with small groups in confined microsocieties

    NASA Technical Reports Server (NTRS)

    Brady, J. V.; Emurian, H. H.

    1982-01-01

    Requirements for high levels of human performance in the unfamiliar and stressful environments associated with space missions necessitate the development of research-based technological procedures for maximizing the probability of effective functioning at all levels of personnel participation. Where the successful accomplishment of such missions requires the coordinated contributions of several individuals collectively identified with the achievement of a common objective, the conditions for characterizing a team, crew, or functional group are operationally defined. For the most part, studies of group performances under operational conditions which emphasize relatively long exposure to extended mission environments have been limited by the constraints imposed on experimental manipulations to identify critical effectiveness factors. On the other hand, laboratory studies involving relatively brief exposures to contrived task situations have been considered of questionable generality to operational settings requiring realistic group objectives.

  12. From learning to forgetting: behavioral, circuitry, and molecular properties define the different functional states of the recognition memory trace.

    PubMed

    Romero-Granados, Rocío; Fontán-Lozano, Angela; Delgado-García, José María; Carrión, Angel M

    2010-05-01

    Neuropsychological analyses of amnesic patients, as well as lesion experiments, indicate that the temporal lobe is essential for the encoding, storage, and expression of object recognition memory (ORM). However, temporal lobe structures directly involved in the consolidation and reconsolidation of these memories are not yet well-defined. We report here that systemic administration of a protein synthesis inhibitor before or up to 4 h after training or reactivation sessions impairs consolidation and reconsolidation of ORM, without affecting short-term memory. We have also observed that ORM reconsolidation is sensitive to protein synthesis inhibition, independently of the ORM trace age. Using bdnf and egr-1 gene expression analysis, we defined temporal lobe areas related to consolidation and reconsolidation of ORM. Training and reactivation 21 days after ORM acquisition sessions provoked changes in bdnf mRNA in somatosensory, perirhinal, and hippocampal cortices. Reactivation 2 days after the training session elicited changes in bdnf and egr-1 mRNA in entorhinal and prefrontal cortices, while reactivation 9 days post-training provoked an increase in egr-1 transcription in somatosensory and entorhinal cortices. The differences in activated circuits and in the capacity to recall the memory trace after 9 or 21 days post-training suggest that memory trace suffers functional changes in this period of time. All these results indicate that the functional state of the recognition memory trace, from acquisition to forgetting, can be specifically defined by behavioral, circuitry, and molecular properties. 2009 Wiley-Liss, Inc.

  13. The role of diffusion tensor imaging tractography for Gamma Knife thalamotomy planning.

    PubMed

    Gomes, João Gabriel Ribeiro; Gorgulho, Alessandra Augusta; de Oliveira López, Amanda; Saraiva, Crystian Wilian Chagas; Damiani, Lucas Petri; Pássaro, Anderson Martins; Salvajoli, João Victor; de Oliveira Siqueira, Ludmila; Salvajoli, Bernardo Peres; De Salles, Antônio Afonso Ferreira

    2016-12-01

    OBJECTIVE The role of tractography in Gamma Knife thalamotomy (GK-T) planning is still unclear. Pyramidal tractography might reduce the risk of radiation injury to the pyramidal tract and reduce motor complications. METHODS In this study, the ventralis intermedius nucleus (VIM) targets of 20 patients were bilaterally defined using Iplannet Stereotaxy Software, according to the anterior commissure-posterior commissure (AC-PC) line and considering the localization of the pyramidal tract. The 40 targets and tractography were transferred as objects to the GammaPlan Treatment Planning System (GP-TPS). New targets were defined, according to the AC-PC line in the functional targets section of the GP-TPS. The target offsets required to maintain the internal capsule (IC) constraint of < 15 Gy were evaluated. In addition, the strategies available in GP-TPS to maintain the minimum conventional VIM target dose at > 100 Gy were determined. RESULTS A difference was observed between the positions of both targets and the doses to the IC. The lateral (x) and the vertical (z) coordinates were adjusted 1.9 mm medially and 1.3 mm cranially, respectively. The targets defined considering the position of the pyramidal tract were more medial and superior, based on the constraint of 15 Gy touching the object representing the IC in the GP-TPS. The best strategy to meet the set constraints was 90° Gamma angle (GA) with automatic shaping of dose distribution; this was followed by 110° GA. The worst GA was 70°. Treatment time was substantially increased by the shaping strategy, approximately doubling delivery time. CONCLUSIONS Routine use of DTI pyramidal tractography might be important to fine-tune GK-T planning. DTI tractography, as well as anisotropy showing the VIM, promises to improve Gamma Knife functional procedures. They allow for a more objective definition of dose constraints to the IC and targeting. DTI pyramidal tractography introduced into the treatment planning may reduce the incidence of motor complications and improve efficacy. This needs to be validated in a large clinical series.

  14. Binocular Perception of 2D Lateral Motion and Guidance of Coordinated Motor Behavior.

    PubMed

    Fath, Aaron J; Snapp-Childs, Winona; Kountouriotis, Georgios K; Bingham, Geoffrey P

    2016-04-01

    Zannoli, Cass, Alais, and Mamassian (2012) found greater audiovisual lag between a tone and disparity-defined stimuli moving laterally (90-170 ms) than for disparity-defined stimuli moving in depth or luminance-defined stimuli moving laterally or in depth (50-60 ms). We tested if this increased lag presents an impediment to visually guided coordination with laterally moving objects. Participants used a joystick to move a virtual object in several constant relative phases with a laterally oscillating stimulus. Both the participant-controlled object and the target object were presented using a disparity-defined display that yielded information through changes in disparity over time (CDOT) or using a luminance-defined display that additionally provided information through monocular motion and interocular velocity differences (IOVD). Performance was comparable for both disparity-defined and luminance-defined displays in all relative phases. This suggests that, despite lag, perception of lateral motion through CDOT is generally sufficient to guide coordinated motor behavior.

  15. Metabolite-balancing techniques vs. 13C tracer experiments to determine metabolic fluxes in hybridoma cells.

    PubMed

    Bonarius, H P; Timmerarends, B; de Gooijer, C D; Tramper, J

    The estimation of intracellular fluxes of mammalian cells using only mass balances of the relevant metabolites is not possible because the set of linear equations defined by these mass balances is underdetermined. In order to quantify fluxes in cyclic pathways the mass balance equations can be complemented with several constraints: (1) the mass balances of co-metabolites, such as ATP or NAD(P)H, (2) linear objective functions, (3) flux data obtained by isotopic-tracer experiments. Here, these three methods are compared for the analysis of fluxes in the primary metabolism of continuously cultured hybridoma cells. The significance of different theoretical constraints and different objective functions is discussed after comparing their resulting flux distributions to the fluxes determined using 13CO2 and 13C-lactate measurements of 1-13C-glucose-fed hybridoma cells. Metabolic fluxes estimated using the objective functions "maximize ATP" and "maximize NADH" are relatively similar to the experimentally determined fluxes. This is consistent with the observation that cancer cells, such as hybridomas, are metabolically hyperactive, and produce ATP and NADH regardless of the need for these cofactors. Copyright 1998 John Wiley & Sons, Inc.
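
    A toy flux-balance sketch of the "maximize ATP" objective mentioned above (the stoichiometry below is invented for illustration and is not the hybridoma network): fluxes v are chosen to satisfy the metabolite mass balances S·v = 0 and to maximize the ATP-consuming exchange flux.

    ```python
    # Illustrative toy network: columns are glucose uptake, glycolysis-like step,
    # oxidation-like step, and an ATP drain (exchange) flux.
    import numpy as np
    from scipy.optimize import linprog

    S = np.array([
        [ 1, -1,  0,  0],   # intracellular glucose balance
        [ 0,  2, -1,  0],   # pyruvate-like intermediate balance
        [ 0,  2, 10, -1],   # ATP balance (yields here are made up)
    ])
    c = np.zeros(4)
    c[3] = -1.0              # linprog minimizes, so maximize the ATP drain via -v_ATP
    bounds = [(0, 10), (0, None), (0, None), (0, None)]  # glucose uptake capped at 10

    res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds, method="highs")
    optimal_fluxes = res.x   # flux distribution under the "maximize ATP" objective
    ```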

  16. Object-based attention in Chinese readers of Chinese words: beyond Gestalt principles.

    PubMed

    Li, Xingshan; Logan, Gordon D

    2008-10-01

    Most object-based attention studies use objects defined bottom-up by Gestalt principles. In the present study, we defined objects top-down, using Chinese words that were seen as objects by skilled readers of Chinese. Using a spatial cuing paradigm, we found that a target character was detected faster if it was in the same word as the cued character than if it was in a different word. Because there were no bottom-up factors that distinguished the words, these results showed that objects defined by subjects' knowledge--in this case, lexical information--can also constrain the deployment of attention.

  17. Hearing the shape of the Ising model with a programmable superconducting-flux annealer.

    PubMed

    Vinci, Walter; Markström, Klas; Boixo, Sergio; Roy, Aidan; Spedalieri, Federico M; Warburton, Paul A; Severini, Simone

    2014-07-16

    Two objects can be distinguished if they have different measurable properties. Thus, distinguishability depends on the Physics of the objects. In considering graphs, we revisit the Ising model as a framework to define physically meaningful spectral invariants. In this context, we introduce a family of refinements of the classical spectrum and consider the quantum partition function. We demonstrate that the energy spectrum of the quantum Ising Hamiltonian is a stronger invariant than the classical one without refinements. For the purpose of implementing the related physical systems, we perform experiments on a programmable annealer with superconducting flux technology. Departing from the paradigm of adiabatic computation, we take advantage of a noisy evolution of the device to generate statistics of low energy states. The graphs considered in the experiments have the same classical partition functions, but different quantum spectra. The data obtained from the annealer distinguish non-isomorphic graphs via information contained in the classical refinements of the functions but not via the differences in the quantum spectra.

  18. Dissociation and Convergence of the Dorsal and Ventral Visual Streams in the Human Prefrontal Cortex

    PubMed Central

    Takahashi, Emi; Ohki, Kenichi; Kim, Dae-Shik

    2012-01-01

    Visual information is largely processed through two pathways in the primate brain: an object pathway from the primary visual cortex to the temporal cortex (ventral stream) and a spatial pathway to the parietal cortex (dorsal stream). Whether and to what extent dissociation exists in the human prefrontal cortex (PFC) has long been debated. We examined anatomical connections from functionally defined areas in the temporal and parietal cortices to the PFC, using noninvasive functional and diffusion-weighted magnetic resonance imaging. The right inferior frontal gyrus (IFG) received converging input from both streams, while the right superior frontal gyrus received input only from the dorsal stream. Interstream functional connectivity to the IFG was dynamically recruited only when both object and spatial information were processed. These results suggest that the human PFC receives dissociated and converging visual pathways, and that the right IFG region serves as an integrator of the two types of information. PMID:23063444

  19. Level set segmentation of medical images based on local region statistics and maximum a posteriori probability.

    PubMed

    Cui, Wenchao; Wang, Yi; Lei, Tao; Fan, Yangyu; Feng, Yan

    2013-01-01

    This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. Then this local objective function is integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show desirable performances of our method.
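
    Schematically, a local objective of this type models the intensities I(y) in a kernel-weighted neighbourhood of each pixel x as Gaussian with mean b(x)c_i and variance σ_i² inside region Ω_i, and the global criterion integrates the local negative log-likelihood over the image domain (the notation here is a generic rendering, not necessarily the paper's exact symbols):

    ```latex
    E \;=\; \int_{\Omega}\;\sum_{i=1}^{N}\;\int_{\Omega_i} K(\mathbf{x}-\mathbf{y})
    \left[\log\sigma_i \;+\; \frac{\big(I(\mathbf{y})-b(\mathbf{x})\,c_i\big)^{2}}{2\sigma_i^{2}}\right]
    d\mathbf{y}\; d\mathbf{x}
    ```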

  20. Pain and major depressive disorder: Associations with cognitive impairment as measured by the THINC-integrated tool (THINC-it).

    PubMed

    Cha, Danielle S; Carmona, Nicole E; Mansur, Rodrigo B; Lee, Yena; Park, Hyun Jung; Rodrigues, Nelson B; Subramaniapillai, Mehala; Rosenblat, Joshua D; Pan, Zihang; Lee, Jae Hon; Lee, JungGoo; Almatham, Fahad; Alageel, Asem; Shekotikhina, Margarita; Zhou, Aileen J; Rong, Carola; Harrison, John; McIntyre, Roger S

    2017-04-01

    To examine the role of pain in cognitive function in adults with major depressive disorder (MDD). Adults (18-65) with a Diagnostic and Statistical Manual - Fifth Edition (DSM-5)-defined diagnosis of MDD experiencing a current major depressive episode (MDE) were enrolled (n_MDD = 100). All subjects with MDD were matched in age, sex, and years of education to healthy controls (HC) (n_HC = 100) for comparison. Cognitive function was assessed using the recently validated THINC-integrated tool (THINC-it), which comprises variants of the choice reaction time (i.e., THINC-it: Spotter), One-Back (i.e., THINC-it: Symbol Check), Digit Symbol Substitution Test (i.e., THINC-it: Codebreaker), Trail Making Test - Part B (i.e., THINC-it: Trails), as well as the Perceived Deficits Questionnaire for Depression - 5-item (i.e., THINC-it: PDQ-5-D). A global index of objective cognitive function was computed using objective measures from the THINC-it, while self-rated cognitive deficits were measured using the PDQ-5-D. Pain was measured using a Visual Analogue Scale (VAS). Regression analyses evaluated the role of pain in predicting objective and subjective cognitive function. A significant between-group difference on the VAS was observed (p<0.001), with individuals with MDD reporting higher pain severity as evidenced by higher scores on the VAS than HC. Significant interaction effects were observed between self-rated cognitive deficits and pain ratings (p<0.001) on objective cognitive performance (after adjusting for MADRS total score), suggesting that pain moderates the association between self-rated and objective cognitive function. Results indicated that pain is associated with increased self-rated and objective cognitive deficits in adults with MDD. The study herein provides preliminary evidence demonstrating that, in adults with MDD, reported pain symptomatology together with poorer subjective cognitive function is predictive of poorer objective cognitive performance. THINC-it is capable of detecting cognitive dysfunction amongst adults with MDD and pain. Copyright © 2017 Scandinavian Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  1. The influence of lifestyle on health behavior and preference for functional foods.

    PubMed

    Szakály, Zoltán; Szente, Viktória; Kövér, György; Polereczki, Zsolt; Szigeti, Orsolya

    2012-02-01

    The main objective of this survey is to reveal the relationship between lifestyle, health behavior, and the consumption of functional foods on the basis of Grunert's food-related lifestyle model. In order to achieve this objective, a nationwide representative questionnaire-based survey was launched with 1000 participants in Hungary. The results indicate that Hungarian consumers make rational decisions, seek bargains, and want to know whether or not they get good value for their money. Furthermore, various lifestyle segments are defined by the authors: the rational, uninvolved, conservative, careless, and adventurous consumer segments. Among these, consumers with a rational approach provide the primary target group for the functional food market, where health consciousness and moderate price sensitivity can be observed together. Adventurous food consumers stand out because they search for novelty; this makes them an equally important target group. Conservative consumers form another important group, characterized by positive health behavior. According to the findings of the research, there is a significant relationship between lifestyle, health behavior, and the preference for functional food products. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Pressure-Flow Analysis for the Assessment of Pediatric Oropharyngeal Dysphagia.

    PubMed

    Ferris, Lara; Rommel, Nathalie; Doeltgen, Sebastian; Scholten, Ingrid; Kritas, Stamatiki; Abu-Assi, Rammy; McCall, Lisa; Seiboth, Grace; Lowe, Katie; Moore, David; Faulks, Jenny; Omari, Taher

    2016-10-01

    To determine which objective pressure-impedance measures of pharyngeal swallowing function correlated with clinically assessed severity of oropharyngeal dysphagia (OPD) symptoms. Forty-five children with OPD and 34 control children without OPD were recruited and up to 5 liquid bolus swallows were recorded with a solid-state high-resolution manometry with impedance catheter. Individual measures of pharyngeal and upper esophageal sphincter (UES) function and a swallow risk index composite score were derived for each swallow, and averaged data for patients with OPD were compared with those of control children without OPD. Clinical severity of OPD symptoms and oral feeding competency was based on the validated Dysphagia Disorders Survey and Functional Oral Intake Scale. Those objective measures that were markers of UES relaxation, UES opening, and pharyngeal flow resistance differentiated patients with and without OPD symptoms. Patients demonstrating abnormally high pharyngeal intrabolus pressures and high UES resistance, markers of outflow obstruction, were most likely to have signs and symptoms of overt Dysphagia Disorders Survey (OR 9.24, P = .05, and 9.7, P = .016, respectively). Pharyngeal motor patterns can be recorded in children by the use of HRIM and pharyngeal function can be defined objectively with the use of pressure-impedance measures. Objective measurements suggest that pharyngeal dysfunction is common in children with clinical signs of OPD. A key finding of this study was evidence of markers of restricted UES opening. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Using multi-species occupancy models in structured decision making on managed lands

    USGS Publications Warehouse

    Sauer, John R.; Blank, Peter J.; Zipkin, Elise F.; Fallon, Jane E.; Fallon, Frederick W.

    2013-01-01

    Land managers must balance the needs of a variety of species when manipulating habitats. Structured decision making provides a systematic means of defining choices and choosing among alternative management options; implementation of a structured decision requires quantitative approaches to predicting consequences of management on the relevant species. Multi-species occupancy models provide a convenient framework for making structured decisions when the management objective is focused on a collection of species. These models use replicate survey data that are often collected on managed lands. Occupancy can be modeled for each species as a function of habitat and other environmental features, and Bayesian methods allow for estimation and prediction of collective responses of groups of species to alternative scenarios of habitat management. We provide an example of this approach using data from breeding bird surveys conducted in 2008 at the Patuxent Research Refuge in Laurel, Maryland, evaluating the effects of eliminating meadow and wetland habitats on scrub-successional and woodland-breeding bird species using summed total occupancy of species as an objective function. Removal of meadows and wetlands decreased value of an objective function based on scrub-successional species by 23.3% (95% CI: 20.3–26.5), but caused only a 2% (0.5, 3.5) increase in value of an objective function based on woodland species, documenting differential effects of elimination of meadows and wetlands on these groups of breeding birds. This approach provides a useful quantitative tool for managers interested in structured decision making.
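
    A schematic sketch of the kind of objective used to compare management alternatives (assumed arrays, not the USGS analysis code): sum predicted occupancy probabilities across the species group under each scenario, and summarize the change with posterior predictive draws.

    ```python
    # Illustrative summed-occupancy objective and scenario comparison.
    import numpy as np

    def summed_occupancy(psi):
        """psi: (n_species,) predicted occupancy probabilities under one scenario."""
        return float(np.sum(psi))

    def scenario_change(psi_draws_current, psi_draws_alternative):
        """Each argument: (n_draws, n_species) posterior predictive occupancy draws.
        Returns the 2.5th, 50th, and 97.5th percentiles of the percent change."""
        diff = psi_draws_alternative.sum(axis=1) - psi_draws_current.sum(axis=1)
        pct = 100.0 * diff / psi_draws_current.sum(axis=1)
        return np.percentile(pct, [2.5, 50, 97.5])
    ```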

  4. Future Roles for Autonomous Vertical Lift in Disaster Relief and Emergency Response

    NASA Technical Reports Server (NTRS)

    Young, Larry A.

    2006-01-01

    System analysis concepts are applied to the assessment of potential collaborative contributions of autonomous system and vertical lift (a.k.a. rotorcraft, VTOL, powered-lift, etc.) technologies to the important, and perhaps underemphasized, application domain of disaster relief and emergency response. In particular, an analytic framework is outlined whereby system design functional requirements for an application domain can be derived from defined societal good goals and objectives.

  5. Control system estimation and design for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Stefani, R. T.; Williams, T. L.; Yakowitz, S. J.

    1972-01-01

    The selection of an estimator which is unbiased when applied to structural parameter estimation is discussed. The mathematical relationships for structural parameter estimation are defined. It is shown that a conventional weighted least squares (CWLS) estimate is biased when applied to structural parameter estimation. Two approaches to bias removal are suggested: (1) change the CWLS estimator, or (2) change the objective function. The advantages of each approach are analyzed.
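
    As a loose numerical aside (not taken from the report), the bias referred to above can be illustrated with a small simulation: when the regressor entering a conventional weighted least squares estimate is itself measured with error, as happens in structural parameter estimation, the estimate is systematically attenuated. The model, weights, and noise levels below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(1)
        theta_true = 2.0                     # structural parameter to be estimated
        n, trials = 200, 2000
        estimates = []
        for _ in range(trials):
            x_true = rng.normal(0.0, 1.0, n)            # true regressor
            x_meas = x_true + rng.normal(0.0, 0.5, n)   # regressor observed with error
            y = theta_true * x_true + rng.normal(0.0, 0.1, n)
            w = np.ones(n)                               # uniform weights for simplicity
            # Conventional weighted least squares using the *measured* regressor:
            theta_hat = (w * x_meas * y).sum() / (w * x_meas**2).sum()
            estimates.append(theta_hat)

        print("mean CWLS estimate:", np.mean(estimates))  # noticeably below the true value 2.0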

  6. Future payload technology requirements study

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Technology advances needed from an overall mission model standpoint, as well as those for individual shuttle payloads, are defined. The technology advances relate to the mission scientific equipment, spacecraft subsystems that functionally support this equipment, and other payload-related equipment, software, and environment necessary to meet broad program objectives. In the interest of obtaining commonality of requirements, the study was structured according to technology categories rather than in terms of individual payloads.

  7. Context recognition for a hyperintensional inference machine

    NASA Astrophysics Data System (ADS)

    Duží, Marie; Fait, Michal; Menšík, Marek

    2017-07-01

    The goal of this paper is to introduce the algorithm of context recognition in the functional programming language TIL-Script, which is a necessary condition for the implementation of the TIL-Script inference machine. The TIL-Script language is an operationally isomorphic syntactic variant of Tichý's Transparent Intensional Logic (TIL). From the formal point of view, TIL is a hyperintensional, partial, typed λ-calculus with procedural semantics. Hyperintensional, because TIL λ-terms denote procedures (defined as TIL constructions) producing set-theoretic functions rather than the functions themselves; partial, because TIL is a logic of partial functions; and typed, because all the entities of TIL ontology, including constructions, receive a type within a ramified hierarchy of types. These features make it possible to distinguish three levels of abstraction at which TIL constructions operate. At the highest hyperintensional level the object to operate on is a construction (though a higher-order construction is needed to present this lower-order construction as an object of predication). At the middle intensional level the object to operate on is the function presented, or constructed, by a construction, while at the lowest extensional level the object to operate on is the value (if any) of the presented function. Thus a necessary condition for the development of an inference machine for the TIL-Script language is recognizing a context in which a construction occurs, namely extensional, intensional and hyperintensional context, in order to determine the type of an argument at which a given inference rule can be properly applied. As a result, our logic does not flout logical rules of extensional logic, which makes it possible to develop a hyperintensional inference machine for the TIL-Script language.

  8. Moving Particles Through a Finite Element Mesh

    PubMed Central

    Peskin, Adele P.; Hardin, Gary R.

    1998-01-01

    We present a new numerical technique for modeling the flow around multiple objects moving in a fluid. The method tracks the dynamic interaction between each particle and the fluid. The movements of the fluid and the object are directly coupled. A background mesh is designed to fit the geometry of the overall domain. The mesh is designed independently of the presence of the particles except in terms of how fine it must be to track particles of a given size. Each particle is represented by a geometric figure that describes its boundary. This figure overlies the mesh. Nodes are added to the mesh where the particle boundaries intersect the background mesh, increasing the number of nodes contained in each element whose boundary is intersected. These additional nodes are then used to describe and track the particle in the numerical scheme. Appropriate element shape functions are defined to approximate the solution on the elements with extra nodes. The particles are moved through the mesh by moving only the overlying nodes defining the particles. The regular finite element grid remains unchanged. In this method, the mesh does not distort as the particles move. Instead, only the placement of particle-defining nodes changes as the particles move. Element shape functions are updated as the nodes move through the elements. This method is especially suited for models of moderate numbers of moderate-size particles, where the details of the fluid-particle coupling are important. Both the complications of creating finite element meshes around appreciable numbers of particles, and extensive remeshing upon movement of the particles are simplified in this method. PMID:28009377
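
    A minimal 2-D sketch of the geometric step described above: finding the points where a circular particle boundary crosses the lines of a fixed background grid, which is where the extra overlay nodes would be added. The uniform grid, the circular particle, and all names are our illustration, not the paper's implementation.

        import numpy as np

        def circle_grid_intersections(cx, cy, r, h, nx, ny):
            """Points where the circle (cx, cy, r) crosses the lines of a uniform
            nx-by-ny background grid with spacing h (the fixed finite element mesh)."""
            pts = []
            for i in range(nx + 1):                       # vertical grid lines x = i*h
                x = i * h
                d2 = r**2 - (x - cx)**2
                if d2 >= 0.0:
                    for s in (1.0, -1.0):
                        y = cy + s * np.sqrt(d2)
                        if 0.0 <= y <= ny * h:
                            pts.append((x, y))
            for j in range(ny + 1):                       # horizontal grid lines y = j*h
                y = j * h
                d2 = r**2 - (y - cy)**2
                if d2 >= 0.0:
                    for s in (1.0, -1.0):
                        x = cx + s * np.sqrt(d2)
                        if 0.0 <= x <= nx * h:
                            pts.append((x, y))
            return pts

        # Moving the particle only changes these overlay points; the grid stays fixed.
        nodes = circle_grid_intersections(1.05, 0.95, 0.3, 1.0, 4, 4)
        print(len(nodes), "overlay nodes along the particle boundary")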

  9. Presentation on Instructional Objectives

    ERIC Educational Resources Information Center

    Naz, Bibi Asia

    2009-01-01

    "Learning can be defined as change in a student's capacity for performance as a result of experience" (Kenneth D. Moore). The intended changes should be specified in instructional objectives. Viewed in this context, an objective can be defined as a clear and unambiguous description of your instructional intent. An objective is not a…

  10. Spatial-Heterodyne Interferometry For Reflection And Transmission (SHIRT) Measurements

    DOEpatents

    Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN; Tobin, Ken W [Harriman, TN

    2006-02-14

    Systems and methods are described for spatial-heterodyne interferometry for reflection and transmission (SHIRT) measurements. A method includes digitally recording a first spatially-heterodyned hologram using a first reference beam and a first object beam; digitally recording a second spatially-heterodyned hologram using a second reference beam and a second object beam; Fourier analyzing the digitally recorded first spatially-heterodyned hologram to define a first analyzed image; Fourier analyzing the digitally recorded second spatially-heterodyned hologram to define a second analyzed image; digitally filtering the first analyzed image to define a first result; and digitally filtering the second analyzed image to define a second result; performing a first inverse Fourier transform on the first result, and performing a second inverse Fourier transform on the second result. The first object beam is transmitted through an object that is at least partially translucent, and the second object beam is reflected from the object.
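
    The claim outlines a pipeline of recording a spatially heterodyned hologram, Fourier analyzing it, digitally filtering, and inverse transforming. A schematic numerical sketch of those steps follows; the synthetic object field, carrier frequencies, and filter radius are our assumptions and stand in for real recorded holograms.

        import numpy as np

        N = 256
        x = np.arange(N)
        X, Y = np.meshgrid(x, x)

        # Synthetic complex object field and a tilted (spatially heterodyned) reference beam.
        obj = np.exp(1j * 2 * np.pi * 0.002 * ((X - N / 2)**2 + (Y - N / 2)**2) / N)
        kx, ky = 0.15, 0.10                               # carrier spatial frequencies (cycles/pixel)
        ref = np.exp(1j * 2 * np.pi * (kx * X + ky * Y))
        hologram = np.abs(obj + ref)**2                   # intensity actually recorded

        # Fourier analyze, digitally filter one sideband, then inverse transform.
        F = np.fft.fftshift(np.fft.fft2(hologram))
        f = np.fft.fftshift(np.fft.fftfreq(N))
        FX, FY = np.meshgrid(f, f)
        mask = (FX + kx)**2 + (FY + ky)**2 < 0.05**2      # keep the obj*conj(ref) sideband
        recovered = np.fft.ifft2(np.fft.ifftshift(F * mask))
        field = recovered * ref                           # remove the residual carrier phase
        print("recovered complex field shape:", field.shape)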

  11. NASA System Safety Handbook. Volume 2: System Safety Concepts, Guidelines, and Implementation Examples

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Benjamin, Allan; Everett, Christopher; Feather, Martin; Rutledge, Peter; Sen, Dev; Youngblood, Robert

    2015-01-01

    This is the second of two volumes that collectively comprise the NASA System Safety Handbook. Volume 1 (NASASP-210-580) was prepared for the purpose of presenting the overall framework for System Safety and for providing the general concepts needed to implement the framework. Volume 2 provides guidance for implementing these concepts as an integral part of systems engineering and risk management. This guidance addresses the following functional areas: 1. The development of objectives that collectively define adequate safety for a system, and the safety requirements derived from these objectives that are levied on the system. 2. The conduct of system safety activities, performed to meet the safety requirements, with specific emphasis on the conduct of integrated safety analysis (ISA) as a fundamental means by which systems engineering and risk management decisions are risk-informed. 3. The development of a risk-informed safety case (RISC) at major milestone reviews to argue that the systems safety objectives are satisfied (and therefore that the system is adequately safe). 4. The evaluation of the RISC (including supporting evidence) using a defined set of evaluation criteria, to assess the veracity of the claims made therein in order to support risk acceptance decisions.

  12. Heterogeneity in ADHD: Neurocognitive predictors of peer, family, and academic functioning.

    PubMed

    Kofler, Michael J; Sarver, Dustin E; Spiegel, Jamie A; Day, Taylor N; Harmon, Sherelle L; Wells, Erica L

    2017-08-01

    Childhood attention-deficit/hyperactivity disorder (ADHD) is associated with impairments in peer, family, and academic functioning. Although impairment is required for diagnosis, children with ADHD vary significantly in the areas in which they demonstrate clinically significant impairment. However, relatively little is known about the mechanisms and processes underlying these individual differences. The current study examined neurocognitive predictors of heterogeneity in peer, family, and academic functioning in a well-defined sample of 44 children with ADHD aged 8-13 years (M = 10.31, SD = 1.42; 31 boys, 13 girls; 81% Caucasian). Reliable change analysis indicated that 98% of the sample demonstrated objectively-defined impairment on at least one assessed outcome measure; 65% were impaired in two or all three areas of functioning. ADHD children with quantifiable deficits in academic success and family functioning performed worse on tests of working memory (d = 0.68 to 1.09), whereas children with impaired parent-reported social functioning demonstrated slower processing speed (d = 0.53). Dimensional analyses identified additional predictors of peer, family, and academic functioning. Working memory abilities were associated with individual differences in all three functional domains, processing speed predicted social functioning, and inhibitory control predicted family functioning. These results add to a growing literature implicating neurocognitive abilities not only in explaining behavioral differences between ADHD and non-ADHD groups, but also in the substantial heterogeneity in ecologically-valid functional outcomes associated with the disorder.

  13. Multiple object tracking with non-unique data-to-object association via generalized hypothesis testing. [tracking several aircraft near each other or ships at sea]

    NASA Technical Reports Server (NTRS)

    Porter, D. W.; Lefler, R. M.

    1979-01-01

    A generalized hypothesis testing approach is applied to the problem of tracking several objects where several different associations of data with objects are possible. Such problems occur, for instance, when attempting to distinctly track several aircraft maneuvering near each other or when tracking ships at sea. Conceptually, the problem is solved by first, associating data with objects in a statistically reasonable fashion and then, tracking with a bank of Kalman filters. The objects are assumed to have motion characterized by a fixed but unknown deterministic portion plus a random process portion modeled by a shaping filter. For example, the object might be assumed to have a mean straight line path about which it maneuvers in a random manner. Several hypothesized associations of data with objects are possible because of ambiguity as to which object the data comes from, false alarm/detection errors, and possible uncertainty in the number of objects being tracked. The statistical likelihood function is computed for each possible hypothesized association of data with objects. Then the generalized likelihood is computed by maximizing the likelihood over parameters that define the deterministic motion of the object.
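
    A toy single-scan sketch of the association step just described: every permutation assigning measurements to objects is scored by a Gaussian innovation likelihood computed from the objects' predicted positions (which a bank of Kalman filters would supply), and the hypothesis with the highest likelihood is kept. The predictions, covariances, and measurements below are invented.

        import itertools
        import numpy as np
        from scipy.stats import multivariate_normal

        # Predicted positions and innovation covariances for two tracked objects.
        pred = [np.array([0.0, 0.0]), np.array([5.0, 5.0])]
        S = [np.eye(2) * 1.0, np.eye(2) * 1.5]

        # Two new measurements whose origin (which object) is ambiguous.
        z = [np.array([0.3, -0.2]), np.array([4.6, 5.4])]

        best_hyp, best_loglik = None, -np.inf
        for assoc in itertools.permutations(range(len(pred))):
            # assoc[k] = index of the object hypothesized to have produced measurement k
            loglik = sum(
                multivariate_normal.logpdf(z[k], mean=pred[assoc[k]], cov=S[assoc[k]])
                for k in range(len(z))
            )
            if loglik > best_loglik:
                best_hyp, best_loglik = assoc, loglik

        print("most likely data-to-object association:", best_hyp,
              "log-likelihood:", round(best_loglik, 2))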

  14. MetaNET--a web-accessible interactive platform for biological metabolic network analysis.

    PubMed

    Narang, Pankaj; Khan, Shawez; Hemrom, Anmol Jaywant; Lynn, Andrew Michael

    2014-01-01

    Metabolic reactions have been extensively studied and compiled over the last century. These have provided a theoretical base to implement models, simulations of which are used to identify drug targets and optimize metabolic throughput at a systemic level. While tools for the perturbation of metabolic networks are available, their applications are limited and restricted as they require varied dependencies and often a commercial platform for full functionality. We have developed MetaNET, an open source, user-friendly, platform-independent and web-accessible resource consisting of several pre-defined workflows for metabolic network analysis. MetaNET is a web-accessible platform that incorporates a range of functions which can be combined to produce different simulations related to metabolic networks. These include (i) optimization of an objective function for the wild-type strain and gene/catalyst/reaction knock-out/knock-down analysis using flux balance analysis, (ii) flux variability analysis, (iii) chemical species participation, (iv) cycle and extreme path identification, and (v) choke point reaction analysis to facilitate identification of potential drug targets. The platform is built using custom scripts along with the open-source Galaxy workflow and Systems Biology Research Tool as components. Pre-defined workflows are available for common processes, and an exhaustive list of over 50 functions is provided for user-defined workflows. MetaNET, available at http://metanet.osdd.net , provides a user-friendly rich interface allowing the analysis of genome-scale metabolic networks under various genetic and environmental conditions. The framework permits the storage of previous results, the ability to repeat analyses and share results with other users over the internet, and the ability to run different tools simultaneously using pre-defined or user-created custom workflows.
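
    A minimal flux balance analysis sketch in the spirit of workflow (i): maximize an objective flux subject to steady-state mass balance S·v = 0 and flux bounds, then simulate a knock-out by closing a reaction's bounds. The three-reaction toy network is invented; MetaNET itself builds on Galaxy and the Systems Biology Research Tool rather than this code.

        import numpy as np
        from scipy.optimize import linprog

        # Toy network: uptake (R1) -> conversion (R2) -> biomass (R3).
        # Rows of S are metabolites, columns are reactions.
        S = np.array([[1.0, -1.0, 0.0],    # metabolite A
                      [0.0, 1.0, -1.0]])   # metabolite B
        bounds = [(0, 10), (0, 10), (0, 10)]
        c = np.array([0.0, 0.0, -1.0])     # linprog minimizes, so negate the biomass flux R3

        wt = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        print("wild-type fluxes:", wt.x, "max biomass flux:", -wt.fun)

        # A gene/catalyst/reaction knock-out is simulated by closing the bounds of R2.
        ko = linprog(c, A_eq=S, b_eq=np.zeros(2),
                     bounds=[(0, 10), (0, 0), (0, 10)], method="highs")
        print("biomass flux after knock-out of R2:", -ko.fun)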

  15. A Bayesian alternative for multi-objective ecohydrological model specification

    NASA Astrophysics Data System (ADS)

    Tang, Yating; Marshall, Lucy; Sharma, Ashish; Ajami, Hoori

    2018-01-01

    Recent studies have identified the importance of vegetation processes in terrestrial hydrologic systems. Process-based ecohydrological models combine hydrological, physical, biochemical and ecological processes of the catchments, and as such are generally more complex and parametric than conceptual hydrological models. Thus, appropriate calibration objectives and model uncertainty analysis are essential for ecohydrological modeling. In recent years, Bayesian inference has become one of the most popular tools for quantifying the uncertainties in hydrological modeling with the development of Markov chain Monte Carlo (MCMC) techniques. The Bayesian approach offers an appealing alternative to traditional multi-objective hydrologic model calibrations by defining proper prior distributions that can be considered analogous to the ad-hoc weighting often prescribed in multi-objective calibration. Our study aims to develop appropriate prior distributions and likelihood functions that minimize the model uncertainties and bias within a Bayesian ecohydrological modeling framework based on a traditional Pareto-based model calibration technique. In our study, a Pareto-based multi-objective optimization and a formal Bayesian framework are implemented in a conceptual ecohydrological model that combines a hydrological model (HYMOD) and a modified Bucket Grassland Model (BGM). Simulations focused on one objective (streamflow/LAI) and multiple objectives (streamflow and LAI) with different emphasis defined via the prior distribution of the model error parameters. Results show more reliable outputs for both predicted streamflow and LAI using Bayesian multi-objective calibration with specified prior distributions for error parameters based on results from the Pareto front in the ecohydrological modeling. The methodology implemented here provides insight into the usefulness of multiobjective Bayesian calibration for ecohydrologic systems and the importance of appropriate prior distributions in such approaches.
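
    A compact sketch of the kind of joint log-posterior implied here: Gaussian error models for streamflow and LAI whose error parameters carry explicit priors, so that the priors play the role of the ad-hoc objective weights in a Pareto-based calibration. The placeholder simulator, priors, and data are our assumptions, not the authors' HYMOD-BGM setup.

        import numpy as np
        from scipy import stats

        def log_posterior(theta, q_obs, lai_obs, simulate):
            """theta = model parameters followed by log error std devs for streamflow and LAI."""
            *model_params, log_sig_q, log_sig_lai = theta
            q_sim, lai_sim = simulate(model_params)       # one ecohydrological model run

            ll_q = stats.norm.logpdf(q_obs, loc=q_sim, scale=np.exp(log_sig_q)).sum()
            ll_lai = stats.norm.logpdf(lai_obs, loc=lai_sim, scale=np.exp(log_sig_lai)).sum()

            # Priors on the error parameters set the relative emphasis of the two objectives.
            lp = (stats.norm.logpdf(log_sig_q, loc=np.log(0.1), scale=1.0)
                  + stats.norm.logpdf(log_sig_lai, loc=np.log(0.05), scale=1.0))
            return ll_q + ll_lai + lp

        # Placeholder "model": a two-parameter toy response, for demonstration only.
        t = np.arange(100)
        def simulate(params):
            a, b = params
            return a * np.sin(0.1 * t) + 1.0, (b + 2.0) * np.ones_like(t, dtype=float)

        rng = np.random.default_rng(2)
        q_obs = 0.8 * np.sin(0.1 * t) + 1.0 + rng.normal(0, 0.05, t.size)
        lai_obs = 2.5 + rng.normal(0, 0.02, t.size)
        print(log_posterior([0.8, 0.5, np.log(0.05), np.log(0.02)], q_obs, lai_obs, simulate))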

  16. DICOM router: an open source toolbox for communication and correction of DICOM objects.

    PubMed

    Hackländer, Thomas; Kleber, Klaus; Martin, Jens; Mertens, Heinrich

    2005-03-01

    Today, the exchange of medical images and clinical information is well defined by the digital imaging and communications in medicine (DICOM) and Health Level Seven (ie, HL7) standards. The interoperability among information systems is specified by the integration profiles of IHE (Integrating the Healthcare Enterprise). However, older imaging modalities frequently do not correctly support these interfaces and integration profiles, and some use cases are not yet specified by IHE. Therefore, corrections of DICOM objects are necessary to establish conformity. The aim of this project was to develop a toolbox that can automatically perform these recurrent corrections of the DICOM objects. The toolbox is composed of three main components: 1) a receiver to receive DICOM objects, 2) a processing pipeline to correct each object, and 3) one or more senders to forward each corrected object to predefined addressees. The toolbox is implemented under Java as an open source project. The processing pipeline is realized by means of plug ins. One of the plug ins can be programmed by the user via an external eXtensible Stylesheet Language (ie, XSL) file. Using this plug in, DICOM objects can also be converted into eXtensible Markup Language (ie, XML) documents or other data formats. DICOM storage services, DICOM CD-ROMs, and the local file system are defined as input and output channel. The toolbox is used clinically for different application areas. These are the automatic correction of DICOM objects from non-IHE-conforming modalities, the import of DICOM CD-ROMs into the picture archiving and communication system and the pseudo naming of DICOM images. The toolbox has been accepted by users in a clinical setting. Because of the open programming interfaces, the functionality can easily be adapted to future applications.
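
    A small illustration of the receive-correct-forward idea in Python with pydicom, rather than the toolbox's Java plug-in mechanism; the specific tag corrections, site rules, and file paths are hypothetical.

        import pydicom

        def correct(ds):
            """One 'plug-in' step: normalize tags that older modalities fill inconsistently."""
            if not getattr(ds, "PatientID", ""):
                ds.PatientID = "UNKNOWN"        # hypothetical site rule
            if getattr(ds, "Modality", "") == "OT":
                ds.Modality = "CR"              # example of a recurring correction
            return ds

        def pseudonymize(ds):
            """Another step: replace identifying fields before export."""
            ds.PatientName = "ANON"
            return ds

        pipeline = [correct, pseudonymize]      # processing pipeline of plug-ins

        def route(in_path, out_path):
            ds = pydicom.dcmread(in_path)       # 'receiver' reads the DICOM object
            for step in pipeline:
                ds = step(ds)
            ds.save_as(out_path)                # 'sender' forwards the corrected object

        # route("incoming/image1.dcm", "outgoing/image1.dcm")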

  17. Simulation of Physical Experiments in Immersive Virtual Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Wasfy, Tamer M.

    2001-01-01

    An object-oriented, event-driven immersive virtual environment is described for the creation of virtual labs (VLs) for simulating physical experiments. Discussion focuses on a number of aspects of the VLs, including interface devices, software objects, and various applications. The VLs interface with output devices, including immersive stereoscopic screen(s) and stereo speakers, and a variety of input devices, including body tracking (head and hands), haptic gloves, wand, joystick, mouse, microphone, and keyboard. The VL incorporates the following types of primitive software objects: interface objects, support objects, geometric entities, and finite elements. Each object encapsulates a set of properties, methods, and events that define its behavior, appearance, and functions. A container object allows grouping of several objects. Applications of the VLs include viewing the results of the physical experiment, viewing a computer simulation of the physical experiment, simulation of the experiment's procedure, computational steering, and remote control of the physical experiment. In addition, the VL can be used as a risk-free (safe) environment for training. The implementation of virtual structures testing machines, virtual wind tunnels, and a virtual acoustic testing facility is described.

  18. Report of the workshop on evidence-based design of national wildlife health programs

    USGS Publications Warehouse

    Nguyen, Natalie T.; Duff, J. Paul; Gavier-Widén, Dolores; Grillo, Tiggy; He, Hongxuan; Lee, Hang; Ratanakorn, Parntep; Rijks, Jolianne M.; Ryser-Degiorgis, Marie-Pierre; Sleeman, Jonathan M.; Stephen, Craig; Tana, Toni; Uhart, Marcela; Zimmer , Patrick

    2017-05-08

    Summary: This report summarizes a Wildlife Disease Association sponsored workshop held in 2016. The overall objective of the workshop was to use available evidence and selected subject matter expertise to define the essential functions of a National Wildlife Health Program and the resources needed to deliver a robust and reliable program, including the basic infrastructure, workforce, data and information systems, governance, organizational capacity, and essential features, such as wildlife disease surveillance, diagnostic services, and epidemiological investigation. This workshop also provided the means to begin the process of defining the essential attributes of a national wildlife health program that could be scalable and adaptable to each nation’s needs.

  19. Possible functions of contextual modulations and receptive field nonlinearities: pop-out and texture segmentation

    PubMed Central

    Schmid, Anita M.; Victor, Jonathan D.

    2014-01-01

    When analyzing a visual image, the brain has to achieve several goals quickly. One crucial goal is to rapidly detect parts of the visual scene that might be behaviorally relevant, while another one is to segment the image into objects, to enable an internal representation of the world. Both of these processes can be driven by local variations in any of several image attributes such as luminance, color, and texture. Here, focusing on texture defined by local orientation, we propose that the two processes are mediated by separate mechanisms that function in parallel. More specifically, differences in orientation can cause an object to “pop out” and attract visual attention, if its orientation differs from that of the surrounding objects. Differences in orientation can also signal a boundary between objects and therefore provide useful information for image segmentation. We propose that contextual response modulations in primary visual cortex (V1) are responsible for orientation pop-out, while a different kind of receptive field nonlinearity in secondary visual cortex (V2) is responsible for orientation-based texture segmentation. We review a recent experiment that led us to put forward this hypothesis along with other research literature relevant to this notion. PMID:25064441

  20. TU-F-12A-04: Differential Radiation Avoidance of Functional Liver Regions Defined by 99mTc-Sulfur Colloid SPECT/CT with Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowen, S; Miyaoka, R; Kinahan, P

    2014-06-15

    Purpose: Radiotherapy for hepatocellular carcinoma patients is conventionally planned without consideration of spatial heterogeneity in hepatic function, which may increase risk of radiation-induced liver disease. Pencil beam scanning (PBS) proton radiotherapy (pRT) plans were generated to differentially decrease dose to functional liver volumes (FLV) defined on [99mTc]sulfur colloid (SC) SPECT/CT images (functional avoidance plans) and compared against conventional pRT plans. Methods: Three HCC patients underwent SC SPECT/CT scans for pRT planning, acquired 15 min post injection over 24 min. Images were reconstructed with OSEM following scatter, collimator, and exhale CT attenuation correction. Functional liver volumes (FLV) were defined by liver:spleen uptake ratio thresholds (43% to 90% of maximum). Planning objectives to FLV were based on mean SC SPECT uptake ratio relative to GTV-subtracted liver and inversely scaled to a mean liver dose of 20 Gy. PTV target coverage (V95) was matched between conventional and functional avoidance plans. PBS pRT plans were optimized in RayStation for single field uniform dose (SFUD) and systematically perturbed to verify robustness to uncertainty in range, setup, and motion. Relative differences in FLV DVH and target dose heterogeneity (D2-D98)/D50 were assessed. Results: For similar liver dose between functional avoidance and conventional PBS pRT plans (Dmean ≤5% difference, V18Gy ≤1% difference), dose to functional liver volumes was lower in avoidance plans but varied in magnitude across patients (FLV70%max Dmean ≤26% difference, V18Gy ≤8% difference). Higher PTV dose heterogeneity in avoidance plans was associated with lower functional liver dose, particularly for the largest lesion [(D2-D98)/D50 = 13%, FLV90%max = 50% difference]. Conclusion: Differential avoidance of functional liver regions defined on sulfur colloid SPECT/CT is feasible with proton therapy. The magnitude of benefit appears to be patient specific and dependent on tumor location, size, and proximity to functional volumes. Further investigation in a larger cohort of patients may validate the clinical utility of functional avoidance planning of HCC radiotherapy.

  1. A new user-assisted segmentation and tracking technique for an object-based video editing system

    NASA Astrophysics Data System (ADS)

    Yu, Hong Y.; Hong, Sung-Hoon; Lee, Mike M.; Choi, Jae-Gark

    2004-03-01

    This paper presents a semi-automatic segmentation method which can be used to generate video object planes (VOPs) for object-based coding schemes and multimedia authoring environments. Semi-automatic segmentation can be considered a user-assisted segmentation technique. A user initially marks objects of interest around the object boundaries, and the user-guided and selected objects are then continuously separated from the unselected areas through time evolution in the image sequence. The proposed segmentation method consists of two processing steps: partially manual intra-frame segmentation and fully automatic inter-frame segmentation. The intra-frame segmentation incorporates user assistance to define the complete visual object of interest to be segmented and to decide the precise object boundary. The inter-frame segmentation involves boundary and region tracking to obtain temporal coherence of the moving object based on the object boundary information of the previous frame. The proposed method shows stable, efficient results that could be suitable for many digital video applications such as multimedia content authoring, content-based coding, and indexing. Based on these results, we have developed an object-based video editing system with several convenient editing functions.

  2. Design Optimization of a Centrifugal Fan with Splitter Blades

    NASA Astrophysics Data System (ADS)

    Heo, Man-Woong; Kim, Jin-Hyuk; Kim, Kwang-Yong

    2015-05-01

    Multi-objective optimization of a centrifugal fan with additionally installed splitter blades was performed to simultaneously maximize the efficiency and pressure rise using three-dimensional Reynolds-averaged Navier-Stokes equations and a hybrid multi-objective evolutionary algorithm. Two design variables, defining the location of the splitter and the height ratio between the inlet and outlet of the impeller, were selected for the optimization. In addition, the aerodynamic characteristics of the centrifugal fan were investigated with the variation of the design variables in the design space. Latin hypercube sampling was used to select the training points, and response surface approximation models were constructed as surrogate models of the objective functions. With the optimization, both the efficiency and pressure rise of the centrifugal fan with splitter blades were improved considerably compared to the reference model.
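
    A generic sketch of the surrogate-building steps mentioned above: Latin hypercube sampling of two design variables followed by a quadratic response surface fitted with least squares. The analytic test function stands in for the expensive RANS evaluations and is not the fan model.

        import numpy as np
        from scipy.stats import qmc

        def cfd_stand_in(x):
            """Placeholder for one expensive CFD evaluation of an objective (e.g., efficiency)."""
            x1, x2 = x
            return 1.0 - (x1 - 0.3)**2 - 0.5 * (x2 - 0.6)**2

        # Latin hypercube sampling of the two design variables in [0, 1]^2.
        X = qmc.LatinHypercube(d=2, seed=4).random(n=20)
        y = np.array([cfd_stand_in(x) for x in X])

        # Quadratic response surface: y ~ 1, x1, x2, x1^2, x2^2, x1*x2.
        def design(X):
            x1, x2 = X[:, 0], X[:, 1]
            return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

        coef, *_ = np.linalg.lstsq(design(X), y, rcond=None)

        # The cheap surrogate is what the multi-objective evolutionary search would query.
        x_test = np.array([[0.3, 0.6]])
        print("surrogate:", (design(x_test) @ coef)[0], "true:", cfd_stand_in(x_test[0]))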

  3. On Integral Invariants for Effective 3-D Motion Trajectory Matching and Recognition.

    PubMed

    Shao, Zhanpeng; Li, Youfu

    2016-02-01

    Motion trajectories tracked from the motions of human, robots, and moving objects can provide an important clue for motion analysis, classification, and recognition. This paper defines some new integral invariants for a 3-D motion trajectory. Based on two typical kernel functions, we design two integral invariants, the distance and area integral invariants. The area integral invariants are estimated based on the blurred segment of noisy discrete curve to avoid the computation of high-order derivatives. Such integral invariants for a motion trajectory enjoy some desirable properties, such as computational locality, uniqueness of representation, and noise insensitivity. Moreover, our formulation allows the analysis of motion trajectories at a range of scales by varying the scale of kernel function. The features of motion trajectories can thus be perceived at multiscale levels in a coarse-to-fine manner. Finally, we define a distance function to measure the trajectory similarity to find similar trajectories. Through the experiments, we examine the robustness and effectiveness of the proposed integral invariants and find that they can capture the motion cues in trajectory matching and sign recognition satisfactorily.
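
    A rough sketch of a distance integral invariant for a discrete 3-D trajectory: at each sample, the distances to the other samples are integrated under a kernel whose width sets the scale of analysis, and the resulting feature is unchanged by rigid motion. The Gaussian kernel and helix trajectory below are our assumptions, not the paper's exact formulation.

        import numpy as np

        def distance_integral_invariant(traj, scale):
            """traj: (N, 3) trajectory points; scale: kernel width in samples."""
            n = len(traj)
            idx = np.arange(n)
            feat = np.empty(n)
            for i in range(n):
                d = np.linalg.norm(traj - traj[i], axis=1)     # distances to point i
                w = np.exp(-0.5 * ((idx - i) / scale) ** 2)    # Gaussian kernel along the curve
                feat[i] = np.sum(w * d) / np.sum(w)            # kernel-weighted distance integral
            return feat

        # A helix; the invariant is unchanged by a rigid rotation plus translation.
        t = np.linspace(0, 4 * np.pi, 200)
        helix = np.column_stack([np.cos(t), np.sin(t), 0.1 * t])
        R, _ = np.linalg.qr(np.random.default_rng(5).normal(size=(3, 3)))
        moved = helix @ R.T + np.array([1.0, -2.0, 3.0])
        f1 = distance_integral_invariant(helix, scale=10)
        f2 = distance_integral_invariant(moved, scale=10)
        print("max difference under rigid motion:", np.abs(f1 - f2).max())   # ~ numerical zero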

  4. Verbal definitions of familiar objects in blind children reflect their peculiar perceptual experience.

    PubMed

    Vinter, A; Fernandes, V; Orlandi, O; Morgan, P

    2013-11-01

    The aim of the present study was to examine to what extent the verbal definitions of familiar objects produced by blind children reflect their peculiar perceptual experience and, in consequence, differ from those produced by sighted children. Ninety-six visually impaired children, aged between 6 and 14 years, and 32 age-matched sighted children had to define 10 words denoting concrete animate or inanimate familiar objects. The blind children evoked the tactile and auditory characteristics of objects and expressed personal perceptual experiences in their definitions. The sighted children relied on visual perception, and produced more visually oriented verbalism. In contrast, no differences were observed between children in their propensity to include functional attributes in their verbal definitions. The results are discussed in line with embodied views of cognition that postulate mandatory perceptuomotor processing of words during access to their meaning. © 2012 John Wiley & Sons Ltd.

  5. Habitat and Biodiversity: One out of five essential soil functions for agricultural soils

    NASA Astrophysics Data System (ADS)

    Trinsoutrot Gattin, Isabelle; Creamer, Rachel; van Leeuwen, Jeroen; Vrebos, Dirk; Gatti, Fabio; Bampa, Francesca; Schulte, Rogier; Rutgers, Michiel

    2017-04-01

    Current agricultural challenges require developing new agricultural systems that can optimize the ecological functioning of soils in order to limit the use of chemical inputs (i.e. disease suppression) and maintain a high organic matter content. This implies our ability to evaluate the effects of management practices on immediate performance objectives (i.e. fertility linked to nutrient cycling) but also on longer-term objectives (i.e. C cycling and storage) in a variety of agro-climatic conditions. These issues demand the development of systemic approaches for understanding the determinants of soil functioning. In ecology, it is generally accepted that there are many positive relationships between soil biodiversity indicators and the functioning of ecosystems. Indeed, soil organisms and their interactions are essential drivers of ecosystem processes and impact the response, resilience and adaptability of ecosystems to environmental pressures. Thus, maintaining soil biodiversity is a condition for the sustainability of cropping systems. In this new context, the European project Landmark considers soil functions as a key to the improvement of agricultural land management towards sustainable development goals; among the five functions is soil biodiversity and habitat provisioning. We present how, within this project, we address this challenging objective at three spatial scales: field, landscape (regional) and European (policy). We aim to define a link between the physical, chemical and biological soil properties and the "habitat & biodiversity" soil function in order to identify key indicators which modulate biodiversity. This will allow us to quantify and assess this soil function, to provide insight into win-wins and trade-offs among soil functions, and to enhance management practices which optimise biodiversity in European agricultural systems.

  6. Identifying multiple influential spreaders based on generalized closeness centrality

    NASA Astrophysics Data System (ADS)

    Liu, Huan-Li; Ma, Chuang; Xiang, Bing-Bing; Tang, Ming; Zhang, Hai-Feng

    2018-02-01

    To maximize the spreading influence of multiple spreaders in complex networks, one important fact cannot be ignored: the multiple spreaders should be dispersively distributed in the network, which can effectively reduce the redundancy of information spreading. For this purpose, we define a generalized closeness centrality (GCC) index by generalizing the closeness centrality index to a set of nodes. The problem then becomes one of identifying multiple spreaders such that an objective function attains its minimum value. By comparing with the K-means clustering algorithm, we find that the optimization problem is very similar to the problem of minimizing the objective function in the K-means method. Therefore, finding multiple nodes with the highest GCC value can be approximately solved by the K-means method. Two typical transmission dynamics, the epidemic spreading process and the rumor spreading process, are implemented in real networks to verify the good performance of our proposed method.
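
    A small sketch of the K-means-style idea described above: choose k spreaders so that the total shortest-path distance from every node to its nearest chosen spreader is small, alternating an assignment step and a medoid-update step on a networkx graph. The alternation scheme, the connected example graph, and all names are our illustration, not the paper's exact algorithm.

        import networkx as nx
        import numpy as np

        def select_spreaders(G, k, iters=20, seed=0):
            """Greedy K-medoids-like selection of k dispersed spreaders on a connected graph."""
            rng = np.random.default_rng(seed)
            nodes = list(G.nodes())
            dist = dict(nx.all_pairs_shortest_path_length(G))     # hop distances
            centers = list(rng.choice(nodes, size=k, replace=False))
            for _ in range(iters):
                # Assignment: every node joins its closest current spreader.
                clusters = {c: [] for c in centers}
                for v in nodes:
                    clusters[min(centers, key=lambda c: dist[c][v])].append(v)
                # Update: within each cluster pick the member minimizing summed distance.
                new_centers = []
                for c, members in clusters.items():
                    if members:
                        c = min(members, key=lambda u: sum(dist[u][v] for v in members))
                    new_centers.append(c)
                if set(new_centers) == set(centers):
                    break
                centers = new_centers
            return centers

        print("selected spreaders:", select_spreaders(nx.karate_club_graph(), k=3))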

  7. Bidirectional Elastic Image Registration Using B-Spline Affine Transformation

    PubMed Central

    Gu, Suicheng; Meng, Xin; Sciurba, Frank C.; Wang, Chen; Kaminski, Naftali; Pu, Jiantao

    2014-01-01

    A registration scheme termed B-spline affine transformation (BSAT) is presented in this study to elastically align two images. We define an affine transformation instead of the traditional translation at each control point. Mathematically, BSAT is a generalized form of the affine transformation and the traditional B-spline transformation (BST). In order to improve the performance of the iterative closest point (ICP) method in registering two homologous shapes with large deformation, a bidirectional objective/cost function is proposed instead of the traditional unidirectional one. In implementation, the objective function is formulated as a sparse linear equation problem, and a subdivision strategy is used to achieve reasonable efficiency in registration. The performance of the developed scheme was assessed using both a two-dimensional (2D) synthesized dataset and three-dimensional (3D) volumetric computed tomography (CT) data. Our experiments showed that the proposed B-spline affine model could obtain reasonable registration accuracy. PMID:24530210
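
    A toy numerical illustration (independent of the BSAT parameterization) of why the bidirectional objective helps: summing nearest-neighbour distances in both directions between two point sets penalizes a registration in which one shape collapses onto a fragment of the other, whereas the one-directional term alone can remain small.

        import numpy as np
        from scipy.spatial import cKDTree

        def bidirectional_cost(A, B):
            """Mean A->B plus mean B->A nearest-neighbour distance between point sets."""
            dAB, _ = cKDTree(B).query(A)
            dBA, _ = cKDTree(A).query(B)
            return dAB.mean() + dBA.mean()

        rng = np.random.default_rng(6)
        A = rng.normal(size=(200, 2))
        B_full = A + 0.05                 # the whole shape, slightly shifted
        B_frag = A[:50] + 0.05            # only a fragment of the shape

        one_way_frag = cKDTree(A).query(B_frag)[0].mean()   # B->A only: stays small
        print("one-directional cost, fragment :", round(one_way_frag, 3))
        print("bidirectional cost, fragment   :", round(bidirectional_cost(A, B_frag), 3))
        print("bidirectional cost, full shape :", round(bidirectional_cost(A, B_full), 3))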

  8. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives.

    PubMed

    Green, Adam W; Bailey, Larissa L

    2015-01-01

    Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools) using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies.
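
    A simplified forward simulation of the quantities discussed above: given per-pool probabilities of persistence, colonization, and successful metamorphosis, repeated projection yields a probability of quasi-extinction and an average proportion of pools producing metamorphs for a given number of pools. The toy dynamics and parameter values are placeholders, not the fitted dynamic occupancy model.

        import numpy as np

        def quasi_extinction(n_pools, years=50, reps=2000, p_persist=0.8,
                             p_colonize=0.3, p_metamorph=0.16, threshold=2, seed=7):
            """Return (P(quasi-extinction), mean proportion of pools producing metamorphs)."""
            rng = np.random.default_rng(seed)
            hits, metamorph_frac = 0, []
            for _ in range(reps):
                occupied = rng.random(n_pools) < 0.5           # initial occupancy
                hit = False
                for _ in range(years):
                    stay = occupied & (rng.random(n_pools) < p_persist)
                    gain = ~occupied & (rng.random(n_pools) < p_colonize * occupied.mean())
                    occupied = stay | gain
                    if occupied.sum() < threshold:
                        hit = True
                        break
                hits += hit
                metamorph_frac.append((occupied & (rng.random(n_pools) < p_metamorph)).mean())
            return hits / reps, float(np.mean(metamorph_frac))

        for n in (10, 50):
            p_qe, frac = quasi_extinction(n)
            print(f"{n:3d} pools: P(quasi-extinction) = {p_qe:.2f}, "
                  f"proportion producing metamorphs = {frac:.2f}")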

  9. Life sciences payload definition and integration study, task C and D. Volume 1: Management summary

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The findings of a study to define the required payloads for conducting life science experiments in space are presented. The primary objectives of the study are: (1) identify research functions to be performed aboard life sciences spacecraft laboratories and necessary equipment, (2) develop conceptual designs of potential payloads, (3) integrate selected laboratory designs with space shuttle configurations, and (4) establish cost analysis of preliminary program planning.

  10. Decoding information about dynamically occluded objects in visual cortex

    PubMed Central

    Erlikhman, Gennady; Caplovitz, Gideon P.

    2016-01-01

    During dynamic occlusion, an object passes behind an occluding surface and then later reappears. Even when completely occluded from view, such objects are experienced as continuing to exist or persist behind the occluder, even though they are no longer visible. The contents and neural basis of this persistent representation remain poorly understood. Questions remain as to whether there is information maintained about the object itself (i.e. its shape or identity) or, non-object-specific information such as its position or velocity as it is tracked behind an occluder as well as which areas of visual cortex represent such information. Recent studies have found that early visual cortex is activated by “invisible” objects during visual imagery and by unstimulated regions along the path of apparent motion, suggesting that some properties of dynamically occluded objects may also be neurally represented in early visual cortex. We applied functional magnetic resonance imaging in human subjects to examine the representation of information within visual cortex during dynamic occlusion. For gradually occluded, but not for instantly disappearing objects, there was an increase in activity in early visual cortex (V1, V2, and V3). This activity was spatially-specific, corresponding to the occluded location in the visual field. However, the activity did not encode enough information about object identity to discriminate between different kinds of occluded objects (circles vs. stars) using MVPA. In contrast, object identity could be decoded in spatially-specific subregions of higher-order, topographically organized areas such as ventral, lateral, and temporal occipital areas (VO, LO, and TO) as well as the functionally defined LOC and hMT+. These results suggest that early visual cortex may represent the dynamically occluded object’s position or motion path, while later visual areas represent object-specific information. PMID:27663987

  11. Application of new type of distributed multimedia databases to networked electronic museum

    NASA Astrophysics Data System (ADS)

    Kuroda, Kazuhide; Komatsu, Naohisa; Komiya, Kazumi; Ikeda, Hiroaki

    1999-01-01

    Recently, various kinds of multimedia application systems have actively been developed based on the achievements of advanced high-speed communication networks, computer processing technologies, and digital content-handling technologies. Against this background, this paper proposes a new distributed multimedia database system which can effectively perform a new function of cooperative retrieval among distributed databases. The proposed system introduces a new concept of 'retrieval manager' which functions as an intelligent controller so that the user can recognize a set of distributed databases as one logical database. The logical database dynamically generates and performs a preferred combination of retrieval parameters on the basis of both directory data and the system environment. Moreover, a concept of 'domain' is defined in the system as a managing unit of retrieval. Retrieval can effectively be performed by cooperation of processing among multiple domains. A communication language and protocols are also defined in the system. These are used in every action for communications in the system. A language interpreter in each machine translates the communication language into an internal language used in that machine. Using the language interpreter, internal modules such as the DBMS and user interface modules can be freely selected. A concept of 'content-set' is also introduced. A content-set is defined as a package of contents. Contents in the content-set are related to each other. The system handles a content-set as one object. The user terminal can effectively control the displaying of retrieved contents, referring to data indicating the relation of the contents in the content-set. In order to verify the function of the proposed system, a networked electronic museum was experimentally built. The results of this experiment indicate that the proposed system can effectively retrieve the target contents under the control of a number of distributed domains. The results also indicate that the system can work effectively even if the system becomes large.

  12. The Neurologic Assessment in Neuro-Oncology (NANO) scale: a tool to assess neurologic function for integration into the Response Assessment in Neuro-Oncology (RANO) criteria

    PubMed Central

    DeAngelis, Lisa M.; Brandes, Alba A.; Peereboom, David M.; Galanis, Evanthia; Lin, Nancy U.; Soffietti, Riccardo; Macdonald, David R.; Chamberlain, Marc; Perry, James; Jaeckle, Kurt; Mehta, Minesh; Stupp, Roger; Muzikansky, Alona; Pentsova, Elena; Cloughesy, Timothy; Iwamoto, Fabio M.; Tonn, Joerg-Christian; Vogelbaum, Michael A.; Wen, Patrick Y.; van den Bent, Martin J.; Reardon, David A.

    2017-01-01

    Background. The Macdonald criteria and the Response Assessment in Neuro-Oncology (RANO) criteria define radiologic parameters to classify therapeutic outcome among patients with malignant glioma and specify that clinical status must be incorporated and prioritized for overall assessment. But neither provides specific parameters to do so. We hypothesized that a standardized metric to measure neurologic function will permit more effective overall response assessment in neuro-oncology. Methods. An international group of physicians including neurologists, medical oncologists, radiation oncologists, and neurosurgeons with expertise in neuro-oncology drafted the Neurologic Assessment in Neuro-Oncology (NANO) scale as an objective and quantifiable metric of neurologic function evaluable during a routine office examination. The scale was subsequently tested in a multicenter study to determine its overall reliability, inter-observer variability, and feasibility. Results. The NANO scale is a quantifiable evaluation of 9 relevant neurologic domains based on direct observation and testing conducted during routine office visits. The score defines overall response criteria. A prospective, multinational study noted a >90% inter-observer agreement rate with kappa statistic ranging from 0.35 to 0.83 (fair to almost perfect agreement), and a median assessment time of 4 minutes (interquartile range, 3–5). Conclusion. The NANO scale provides an objective clinician-reported outcome of neurologic function with high inter-observer agreement. It is designed to combine with radiographic assessment to provide an overall assessment of outcome for neuro-oncology patients in clinical trials and in daily practice. Furthermore, it complements existing patient-reported outcomes and cognition testing to combine for a global clinical outcome assessment of well-being among brain tumor patients. PMID:28453751

  13. A comparison of mean parotid gland dose with measures of parotid gland function after radiotherapy for head-and-neck cancer: Implications for future trials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roesink, Judith M.; Schipper, Maria; Busschers, Wim

    2005-11-15

    Purpose: To determine the most adequate parameter to measure the consequences of reducing the parotid gland dose. Methods and Materials: One hundred eight patients treated with radiotherapy for various malignancies of the head and neck were prospectively evaluated using three methods. Parotid gland function was objectively determined by measuring stimulated parotid flow using Lashley cups and scintigraphy. To assess xerostomia-related quality of life, the head-and-neck cancer module European Organization for Research and Treatment of Cancer QLQ (Quality of Life Questionnaire) H and N35 was used. Measurements took place before radiotherapy and 6 weeks and 12 months after the completion of radiotherapy. Complication was defined for each method using cutoff values. The correlation between these complications and the mean parotid gland dose was investigated to find the best measure for parotid gland function. Results: For both flow and scintigraphy data, the best definition for objective parotid gland toxicity seemed to be reduction of stimulated parotid flow to ≤25% of the preradiotherapy flow. Of all the subjective variables, only the single item dry mouth 6 weeks after radiotherapy was found to be significant. The best correlation with the mean parotid gland dose was found for the stimulated flow measurements. The predictive ability was the highest for the time point 1 year after radiotherapy. Subjective findings did not correlate with the mean parotid dose. Conclusions: Stimulated flow measurements using Lashley cups, with a complication defined as flow ≤25% of the preradiotherapy output, correlated best with the mean parotid gland dose. When reduction of the mean dose to the parotid gland is intended, the stimulated flow measurement is the best method for evaluating parotid gland function.

  14. Human-robot skills transfer interfaces for a flexible surgical robot.

    PubMed

    Calinon, Sylvain; Bruno, Danilo; Malekzadeh, Milad S; Nanayakkara, Thrishantha; Caldwell, Darwin G

    2014-09-01

    In minimally invasive surgery, tools go through narrow openings and manipulate soft organs to perform surgical tasks. There are limitations in current robot-assisted surgical systems due to the rigidity of robot tools. The aim of the STIFF-FLOP European project is to develop a soft robotic arm to perform surgical tasks. The flexibility of the robot allows the surgeon to move within organs to reach remote areas inside the body and perform challenging procedures in laparoscopy. This article addresses the problem of designing learning interfaces enabling the transfer of skills from human demonstration. Robot programming by demonstration encompasses a wide range of learning strategies, from simple mimicking of the demonstrator's actions to the higher level imitation of the underlying intent extracted from the demonstrations. By focusing on this last form, we study the problem of extracting an objective function explaining the demonstrations from an over-specified set of candidate reward functions, and using this information for self-refinement of the skill. In contrast to inverse reinforcement learning strategies that attempt to explain the observations with reward functions defined for the entire task (or a set of pre-defined reward profiles active for different parts of the task), the proposed approach is based on context-dependent reward-weighted learning, where the robot can learn the relevance of candidate objective functions with respect to the current phase of the task or encountered situation. The robot then exploits this information for skills refinement in the policy parameters space. The proposed approach is tested in simulation with a cutting task performed by the STIFF-FLOP flexible robot, using kinesthetic demonstrations from a Barrett WAM manipulator. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  15. Exploring the Role of Space-Defining Objects in Constructing and Maintaining Imagined Scenes

    ERIC Educational Resources Information Center

    Mullally, Sinead L.; Maguire, Eleanor A.

    2013-01-01

    It has recently been observed that certain objects, when viewed or imagined in isolation, evoke a strong sense of three-dimensional local space surrounding them (space-defining (SD) objects), while others do not (space-ambiguous (SA) objects), and this is associated with engagement of the parahippocampal cortex (PHC). But activation of the PHC is…

  16. Sleep-Wake Disturbances in Sedentary Community-Dwelling Elders With Functional Limitations

    PubMed Central

    Vaz Fragoso, Carlos A.; Miller, Michael E.; Fielding, Roger A.; King, Abby C.; Kritchevsky, Stephen B.; McDermott, Mary M.; Myers, Valerie; Newman, Anne B.; Pahor, Marco; Gill, Thomas M.

    2014-01-01

    OBJECTIVES To evaluate sleep-wake disturbances in sedentary community-dwelling elders with functional limitations. DESIGN Cross-sectional. SETTING Lifestyle Interventions and Independence in Elder (LIFE) Study. PARTICIPANTS 1635 community-dwelling persons, mean age 78.9, who spent <20 minutes/week in the past month of regular physical activity and <125 minutes/week of moderate physical activity, and had a Short Physical Performance Battery (SPPB) score <10. MEASUREMENTS Mobility was evaluated by the 400-meter walk time (slow gait speed defined as <0.8 m/s) and SPPB score (≤7 defined moderate-to-severe mobility impairment). Physical inactivity was defined by sedentary time (percent of accelerometry wear time with activity <100 counts/min); the top quartile established high sedentary time. Sleep-wake disturbances were evaluated by the Insomnia Severity Index (ISI) (range 0–28; ≥8 defined insomnia), Epworth Sleepiness Scale (ESS) (range 0–24; ≥10 defined daytime drowsiness), Pittsburgh Sleep Quality Index (PSQI) (range 0–21; >5 defined poor sleep quality), and Berlin Questionnaire (high risk of sleep apnea). RESULTS Prevalence rates were 43.5% for slow gait speed and 44.7% for moderate-to-severe mobility impairment, with 77.0% of accelerometry wear time spent as sedentary time. Prevalence rates were 33.0% for insomnia, 18.1% for daytime drowsiness, 47.8% for poor sleep quality, and 32.9% for high risk of sleep apnea. Participants with insomnia, daytime drowsiness, and poor sleep quality had mean values of 12.1 for ISI, 12.5 for ESS, and 9.2 for PSQI, respectively. In adjusted models, measures of mobility and physical inactivity were generally not associated with sleep-wake disturbances, using continuous or categorical variables. CONCLUSION In a large sample of sedentary community-dwelling elders with functional limitations, sleep-wake disturbances were prevalent but only mildly severe, and were generally not associated with mobility impairment or physical inactivity. PMID:24889836

  17. Evaluative procedures to detect, characterize, and assess the severity of diabetic neuropathy.

    PubMed

    Dyck, P J

    1991-01-01

    Minimal criteria for diabetic neuropathy need to be defined and universally applied. Standardized evaluative procedures need to be agreed and normal ranges determined from healthy volunteers. Types and stages of neuropathy should be established and assessments performed on representative populations of both Type 1 and Type 2 diabetic patients. Potential minimal criteria include absent ankle reflexes and vibratory sensation, and abnormalities of nerve conduction. However, the preferred criterion is the identification of more than two statistically defined abnormalities among symptoms and deficits, nerve conduction, quantitative sensory examination or quantitative autonomic examination. Various evaluative procedures are available. Symptoms should be assessed and scores can be assigned to neurological deficits. However, assessments of nerve conduction provide the most specific, objective, sensitive, and repeatable procedures, although these may be the least meaningful. Many techniques are available for quantitative sensory examination, but are poorly standardized and normal values are not available. For quantitative autonomic examination, tests are available for the adequacy of cardiovascular and peripheral vascular reflexes and increasingly for other autonomic functions. In any assessment of nerve function the conditions should be optimized and standardized, and stimuli defined. Specific instructions should be given and normal ranges established in healthy volunteers.

  18. An archaeal genomic signature

    NASA Technical Reports Server (NTRS)

    Graham, D. E.; Overbeek, R.; Olsen, G. J.; Woese, C. R.

    2000-01-01

    Comparisons of complete genome sequences allow the most objective and comprehensive descriptions possible of a lineage's evolution. This communication uses the completed genomes from four major euryarchaeal taxa to define a genomic signature for the Euryarchaeota and, by extension, the Archaea as a whole. The signature is defined in terms of the set of protein-encoding genes found in at least two diverse members of the euryarchaeal taxa that function uniquely within the Archaea; most signature proteins have no recognizable bacterial or eukaryal homologs. By this definition, 351 clusters of signature proteins have been identified. Functions of most proteins in this signature set are currently unknown. At least 70% of the clusters that contain proteins from all the euryarchaeal genomes also have crenarchaeal homologs. This conservative set, which appears refractory to horizontal gene transfer to the Bacteria or the Eukarya, would seem to reflect the significant innovations that were unique and fundamental to the archaeal "design fabric." Genomic protein signature analysis methods may be extended to characterize the evolution of any phylogenetically defined lineage. The complete set of protein clusters for the archaeal genomic signature is presented as supplementary material (see the PNAS web site, www.pnas.org).

  19. Tracking with occlusions via graph cuts.

    PubMed

    Papadakis, Nicolas; Bugeau, Aurélie

    2011-01-01

    This work presents a new method for tracking and segmenting, over time, interacting objects within an image sequence. One major contribution of the paper is the formalization of the notion of visible and occluded parts. For each object, we aim at tracking these two parts. Assuming that the velocity of each object is driven by a dynamical law, predictions can be used to guide the successive estimations. Separating these predicted areas into good and bad parts with respect to the final segmentation and representing the objects with their visible and occluded parts permit handling partial and complete occlusions. To achieve this tracking, a label is assigned to each object and an energy function representing the multilabel problem is minimized via a graph cuts optimization. This energy contains terms based on image intensities which enable segmenting and regularizing the visible parts of the objects. It also includes terms dedicated to the management of the occluded and disappearing areas, which are defined on the areas of prediction of the objects. The results on several challenging sequences prove the strength of the proposed approach.

  20. A New Approach to Defining Human Touch Temperature Standards

    NASA Technical Reports Server (NTRS)

    Ungar, Eugene; Stroud, Kenneth

    2010-01-01

    Defining touch temperature limits for skin contact with both hot and cold objects is important to prevent pain and skin damage, which may affect task performance or become a safety concern. Pain and skin damage depend on the skin temperature during contact, which depends on the contact thermal conductance, the object's initial temperature, and its material properties. However, previous spacecraft standards have incorrectly defined touch temperature limits in terms of a single object temperature value for all materials, or have provided limited material-specific values which do not cover the gamut of likely designs. A new approach has been developed for updated NASA standards, which defines touch temperature limits in terms of skin temperature at pain onset for bare skin contact with hot and cold objects. The authors have developed an analytical verification method for safe hot and cold object temperatures for contact times from 1 second to infinity.
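
    A back-of-the-envelope check in the spirit of the analytical approach: for two semi-infinite solids brought into perfect contact, the interface quickly settles near an effusivity-weighted average of the two initial temperatures, which can then be compared with a skin-temperature pain-onset limit. The material properties and the 44 degC limit below are illustrative assumptions, not the values in the NASA standard.

        import math

        def effusivity(k, rho, c):
            """Thermal effusivity e = sqrt(k * rho * c)."""
            return math.sqrt(k * rho * c)

        def interface_temperature(T_skin, T_obj, e_skin, e_obj):
            """Contact temperature of two semi-infinite solids in perfect contact."""
            return (e_skin * T_skin + e_obj * T_obj) / (e_skin + e_obj)

        e_skin = effusivity(0.37, 1100.0, 3400.0)     # approximate soft-tissue properties
        materials = {"aluminium": effusivity(205.0, 2700.0, 900.0),
                     "nylon": effusivity(0.25, 1150.0, 1700.0)}

        for name, e_obj in materials.items():
            T_contact = interface_temperature(34.0, 60.0, e_skin, e_obj)   # 60 degC object
            verdict = "above" if T_contact > 44.0 else "below"
            print(f"{name:9s}: contact skin temperature ~ {T_contact:.1f} degC "
                  f"({verdict} an assumed 44 degC pain-onset limit)")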

  1. A New Approach to Defining Human Touch Temperature Standards

    NASA Technical Reports Server (NTRS)

    Ungar, Eugene; Stroud, Kenneth

    2009-01-01

Defining touch temperature limits for skin contact with both hot and cold objects is important to prevent pain and skin damage, which may affect task performance or become a safety concern. Pain and skin damage depend on the resulting skin temperature during contact, which depends on the object's initial temperature, its material properties and its ability to transfer heat. However, previous spacecraft standards have incorrectly defined touch temperature limits in terms of a single object temperature value for all materials, or have provided limited material-specific values which do not cover the gamut of most designs. A new approach is being used in new NASA standards, which defines touch temperature limits in terms of skin temperature at pain onset for bare skin contact with hot and cold objects. The authors have developed an analytical verification method for safe hot and cold object temperatures for contact times from 1 second to infinity.
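    A standard idealization that illustrates why the contact skin temperature depends on material properties (though not necessarily the authors' verification model) treats skin and object as semi-infinite solids brought into sudden, perfect contact; the interface temperature is then set by the thermal effusivities of the two bodies:

```latex
% Contact temperature of two semi-infinite solids in sudden, perfect thermal contact
% (classical idealization, not necessarily the authors' verification model)
\[
T_{\mathrm{contact}}
  = \frac{e_{\mathrm{skin}} T_{\mathrm{skin}} + e_{\mathrm{obj}} T_{\mathrm{obj}}}
         {e_{\mathrm{skin}} + e_{\mathrm{obj}}},
\qquad
e = \sqrt{k \rho c_p}
\]
% e: thermal effusivity; k: thermal conductivity; \rho: density; c_p: specific heat capacity
```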

  2. The evolution of meaning: spatio-temporal dynamics of visual object recognition.

    PubMed

    Clarke, Alex; Taylor, Kirsten I; Tyler, Lorraine K

    2011-08-01

    Research on the spatio-temporal dynamics of visual object recognition suggests a recurrent, interactive model whereby an initial feedforward sweep through the ventral stream to prefrontal cortex is followed by recurrent interactions. However, critical questions remain regarding the factors that mediate the degree of recurrent interactions necessary for meaningful object recognition. The novel prediction we test here is that recurrent interactivity is driven by increasing semantic integration demands as defined by the complexity of semantic information required by the task and driven by the stimuli. To test this prediction, we recorded magnetoencephalography data while participants named living and nonliving objects during two naming tasks. We found that the spatio-temporal dynamics of neural activity were modulated by the level of semantic integration required. Specifically, source reconstructed time courses and phase synchronization measures showed increased recurrent interactions as a function of semantic integration demands. These findings demonstrate that the cortical dynamics of object processing are modulated by the complexity of semantic information required from the visual input.

  3. Self-Assembly of an α-Helical Peptide into a Crystalline Two-Dimensional Nanoporous Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magnotti, Elizabeth L.; Hughes, Spencer A.; Dillard, Rebecca S.

Sequence-specific peptides have been demonstrated to self-assemble into structurally defined nanoscale objects including nanofibers, nanotubes, and nanosheets. The latter structures display significant promise for the construction of hybrid materials for functional devices due to their extended planar geometry. Realization of this objective necessitates the ability to control the structural features of the resultant assemblies through the peptide sequence. The design of an amphiphilic peptide, 3FD-IL, is described that comprises two repeats of a canonical 18 amino acid sequence associated with straight α-helical structures. Peptide 3FD-IL displays 3-fold screw symmetry in a helical conformation and self-assembles into nanosheets based on hexagonal packing of helices. Biophysical evidence from TEM, cryo-TEM, SAXS, AFM, and STEM measurements on the 3FD-IL nanosheets supports a structural model based on a honeycomb lattice, in which the length of the peptide determines the thickness of the nanosheet and the packing of helices defines the presence of nanoscale channels that permeate the sheet. The honeycomb structure can be rationalized on the basis of geometrical packing frustration in which the channels occupy defect sites that define a periodic superlattice. In conclusion, the resultant 2D materials may have potential as materials for nanoscale transport and controlled release applications.

  4. Self-Assembly of an α-Helical Peptide into a Crystalline Two-Dimensional Nanoporous Framework

    DOE PAGES

    Magnotti, Elizabeth L.; Hughes, Spencer A.; Dillard, Rebecca S.; ...

    2016-11-22

Sequence-specific peptides have been demonstrated to self-assemble into structurally defined nanoscale objects including nanofibers, nanotubes, and nanosheets. The latter structures display significant promise for the construction of hybrid materials for functional devices due to their extended planar geometry. Realization of this objective necessitates the ability to control the structural features of the resultant assemblies through the peptide sequence. The design of an amphiphilic peptide, 3FD-IL, is described that comprises two repeats of a canonical 18 amino acid sequence associated with straight α-helical structures. Peptide 3FD-IL displays 3-fold screw symmetry in a helical conformation and self-assembles into nanosheets based on hexagonal packing of helices. Biophysical evidence from TEM, cryo-TEM, SAXS, AFM, and STEM measurements on the 3FD-IL nanosheets supports a structural model based on a honeycomb lattice, in which the length of the peptide determines the thickness of the nanosheet and the packing of helices defines the presence of nanoscale channels that permeate the sheet. The honeycomb structure can be rationalized on the basis of geometrical packing frustration in which the channels occupy defect sites that define a periodic superlattice. In conclusion, the resultant 2D materials may have potential as materials for nanoscale transport and controlled release applications.

  5. PLAID- A COMPUTER AIDED DESIGN SYSTEM

    NASA Technical Reports Server (NTRS)

    Brown, J. W.

    1994-01-01

PLAID is a three-dimensional Computer Aided Design (CAD) system which enables the user to interactively construct, manipulate, and display sets of highly complex geometric models. PLAID was initially developed by NASA to assist in the design of Space Shuttle crewstation panels, and the detection of payload object collisions. It has evolved into a more general program for convenient use in many engineering applications. Special effort was made to incorporate CAD techniques and features which minimize the user's workload in designing and managing PLAID models. PLAID consists of three major modules: the Primitive Object Generator (BUILD), the Composite Object Generator (COG), and the DISPLAY Processor. The BUILD module provides a means of constructing simple geometric objects called primitives. The primitives are created from polygons which are defined either explicitly by vertex coordinates, or graphically by use of terminal crosshairs or a digitizer. Solid objects are constructed by combining, rotating, or translating the polygons. Corner rounding, hole punching, milling, and contouring are special features available in BUILD. The COG module hierarchically organizes and manipulates primitives and other previously defined COG objects to form complex assemblies. The composite object is constructed by applying transformations to simpler objects. The transformations which can be applied are scalings, rotations, and translations. These transformations may be defined explicitly or defined graphically using the interactive COG commands. The DISPLAY module enables the user to view COG assemblies from arbitrary viewpoints (inside or outside the object) both in wireframe and hidden line renderings. The PLAID projection of a three-dimensional object can be either orthographic or with perspective. A conflict analysis option enables detection of spatial conflicts or collisions. DISPLAY provides camera functions to simulate a view of the model through different lenses. Other features include hardcopy plot generation, scaling and zoom options, distance tabulations, and descriptive text in different sizes and fonts. An object in the PLAID database is not just a collection of lines; rather, it is a true three-dimensional representation from which correct hidden line renditions can be computed for any specified eye point. The drawings produced in the various modules of PLAID can be stored in files for future use. The PLAID program product is available by license for a period of 10 years to domestic U.S. licensees. The licensed program product includes the PLAID source code, command procedures, sample applications, and one set of supporting documentation. Copies of the documentation may be purchased separately at the price indicated below. PLAID is written in FORTRAN 77 for single user interactive execution and has been implemented on a DEC VAX series computer operating under VMS with a recommended core memory of four megabytes. PLAID requires a Tektronix 4014 compatible graphics display terminal and optionally uses a Tektronix 4631 compatible graphics hardcopier. Plots of resulting PLAID displays may be produced using the Calcomp 960, HP 7221, or HP 7580 plotters. Digitizer tablets can also be supported. This program was developed in 1986.

  6. Heterocyclic Salt Synthesis and Rational Properties Tailoring (PREPRINT)

    DTIC Science & Technology

    2009-06-23

performance behavior can be tailored in a controlled manner, defines the objective of a pertinent synthesis effort. Achieving this objective by introducing structural alterations in a neutral covalent ... the structure of the anion. To illustrate this premise, four general synthesis methods to synthesize heterocyclic salts, including several new binary ...

  7. A stochastic optimal feedforward and feedback control methodology for superagility

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim; Direskeneli, Haldun; Taylor, Deborah B.

    1992-01-01

A new control design methodology is developed: Stochastic Optimal Feedforward and Feedback Technology (SOFFT). Traditional design techniques optimize a single cost function (which expresses the design objectives) to obtain both the feedforward and feedback control laws. This approach places conflicting demands on the control law such as fast tracking versus noise attenuation/disturbance rejection. In the SOFFT approach, two cost functions are defined. The feedforward control law is designed to optimize one cost function; the feedback law optimizes the other. By separating the design objectives and decoupling the feedforward and feedback design processes, both objectives can be achieved fully. A new measure of command tracking performance, Z-plots, is also developed. By analyzing these plots at off-nominal conditions, the sensitivity or robustness of the system in tracking commands can be predicted. Z-plots provide an important tool for designing robust control systems. The Variable-Gain SOFFT methodology was used to design a flight control system for the F/A-18 aircraft. It is shown that SOFFT can be used to expand the operating regime and provide greater performance (flying/handling qualities) throughout the extended flight regime. This work was performed under the NASA SBIR program. ICS plans to market the software developed as a new module in its commercial CACSD software package: ACET.
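    The separation of objectives can be illustrated with two quadratic costs of the generic LQ type, one for command tracking and one for regulation in the presence of noise and disturbances. The weights and signals below are schematic stand-ins, not the SOFFT formulation itself:

```latex
% Schematic separation of objectives into two quadratic costs (illustrative, not the SOFFT equations)
\[
J_{\mathrm{ff}} = \int_0^{T} \Big[ (y - y_{\mathrm{cmd}})^{\mathsf T} Q_1 (y - y_{\mathrm{cmd}})
                + u_{\mathrm{ff}}^{\mathsf T} R_1 u_{\mathrm{ff}} \Big] \, dt
\quad \text{(command tracking)}
\]
\[
J_{\mathrm{fb}} = \mathbb{E} \int_0^{T} \Big[ x^{\mathsf T} Q_2 x
                + u_{\mathrm{fb}}^{\mathsf T} R_2 u_{\mathrm{fb}} \Big] \, dt
\quad \text{(noise attenuation / disturbance rejection)}
\]
```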

  8. Functions of the human frontoparietal attention network: Evidence from neuroimaging

    PubMed Central

    Scolari, Miranda; Seidl-Rathkopf, Katharina N; Kastner, Sabine

    2016-01-01

    Human frontoparietal cortex has long been implicated as a source of attentional control. However, the mechanistic underpinnings of these control functions have remained elusive due to limitations of neuroimaging techniques that rely on anatomical landmarks to localize patterns of activation. The recent advent of topographic mapping via functional magnetic resonance imaging (fMRI) has allowed the reliable parcellation of the network into 18 independent subregions in individual subjects, thereby offering unprecedented opportunities to address a wide range of empirical questions as to how mechanisms of control operate. Here, we review the human neuroimaging literature that has begun to explore space-based, feature-based, object-based and category-based attentional control within the context of topographically defined frontoparietal cortex. PMID:27398396

  9. Mathematic simulation of soil-vegetation condition and land use structure applying basin approach

    NASA Astrophysics Data System (ADS)

    Mishchenko, Natalia; Shirkin, Leonid; Krasnoshchekov, Alexey

    2016-04-01

Anthropogenic transformation of ecosystems is largely connected to changes in land use structure and human impact on soil fertility. The research objective is to simulate the stationary state of river basin ecosystems. Materials and Methods. The basin approach has been applied in the research. Small river basins of the Klyazma river, situated in the central part of the Russian plain, have been chosen as the research objects. The analysis is carried out using integrated characteristics of ecosystem functioning and mathematical simulation methods. To design the mathematical simulator, functional simulation methods and principles based on regression, correlation and factor analysis have been applied. Results. The simulation defined the possible stationary conditions of the "phytocenosis-soil" system in coordinates of phytomass, phytoproductivity, and humus percentage in soil. Ecosystem productivity is determined not only by vegetation photosynthetic activity but also by the area ratio of forest and meadow phytocenoses. Local maxima associated with certain phytomass values and humus contents in soil have been identified on the basin phytoproductivity distribution diagram. We explain the local maxima by a synergetic effect that appears at a definite ratio of forest and meadow phytocenoses: in this case, the maximum phytomass values for the whole area are higher than the simple sum of the maximum phytomass values for the forest and meadow phytocenoses separately. An efficient correlation of natural forest and meadow phytocenoses has been defined for the Klyazma river. Conclusion. Mathematical simulation methods assist in forecasting ecosystem conditions under various changes of land use structure. Overgrowing of abandoned agricultural lands is currently a pressing issue in the Russian Federation. Simulation results demonstrate that the natural ratio of forest and meadow phytocenoses for the area will be restored as abandoned agricultural land becomes overgrown.

  10. Function follows form: combining nanoimprint and inkjet printing

    NASA Astrophysics Data System (ADS)

    Muehlberger, M.; Haslinger, M. J.; Kurzmann, J.; Ikeda, M.; Fuchsbauer, A.; Faury, T.; Koepplmayr, T.; Ausserhuber, H.; Kastner, J.; Woegerer, C.; Fechtig, D.

    2017-06-01

We are investigating the possibilities and the technical requirements for nanopatterning on arbitrarily curved surfaces, taking into account the opportunities offered by additive manufacturing. One of the key elements is the need to deposit material in well-defined areas of various complex 3D objects. To achieve this we are developing a robot-based inkjet printing process. We report on our progress in this respect and on our efforts to perform nanoimprinting on curved, possibly 3D-printed objects using materials that can be deposited by inkjet printing. In this article, we provide an overview of our current status, the challenges, and an outlook.

  11. Clinician-Reported Outcome Assessments of Treatment Benefit: Report of the ISPOR Clinical Outcome Assessment Emerging Good Practices Task Force

    PubMed Central

    Powers, John H.; Patrick, Donald L.; Walton, Marc K.; Marquis, Patrick; Cano, Stefan; Hobart, Jeremy; Isaac, Maria; Vamvakas, Spiros; Slagle, Ashley; Molsen, Elizabeth; Burke, Laurie B.

    2017-01-01

    A clinician-reported outcome (ClinRO) assessment is a type of clinical outcome assessment (COA). ClinRO assessments, like all COAs (patient-reported, observer-reported, or performance outcome assessments), are used to 1) measure patients’ health status and 2) define end points that can be interpreted as treatment benefits of medical interventions on how patients feel, function, or survive in clinical trials. Like other COAs, ClinRO assessments can be influenced by human choices, judgment, or motivation. A ClinRO assessment is conducted and reported by a trained health care professional and requires specialized professional training to evaluate the patient’s health status. This is the second of two reports by the ISPOR Clinical Outcomes Assessment—Emerging Good Practices for Outcomes Research Task Force. The first report provided an overview of COAs including definitions important for an understanding of COA measurement practices. This report focuses specifically on issues related to ClinRO assessments. In this report, we define three types of ClinRO assessments (readings, ratings, and clinician global assessments) and describe emerging good measurement practices in their development and evaluation. The good measurement practices include 1) defining the context of use; 2) identifying the concept of interest measured; 3) defining the intended treatment benefit on how patients feel, function, or survive reflected by the ClinRO assessment and evaluating the relationship between that intended treatment benefit and the concept of interest; 4) documenting content validity; 5) evaluating other measurement properties once content validity is established (including intra- and inter-rater reliability); 6) defining study objectives and end point(s) objectives, and defining study end points and placing study end points within the hierarchy of end points; 7) establishing interpretability in trial results; and 8) evaluating operational considerations for the implementation of ClinRO assessments used as end points in clinical trials. Applying good measurement practices to ClinRO assessment development and evaluation will lead to more efficient and accurate measurement of treatment effects. This is important beyond regulatory approval in that it provides evidence for the uptake of new interventions into clinical practice and provides justification to payers for reimbursement on the basis of the clearly demonstrated added value of the new intervention. PMID:28212963

  12. Clinician-Reported Outcome Assessments of Treatment Benefit: Report of the ISPOR Clinical Outcome Assessment Emerging Good Practices Task Force.

    PubMed

    Powers, John H; Patrick, Donald L; Walton, Marc K; Marquis, Patrick; Cano, Stefan; Hobart, Jeremy; Isaac, Maria; Vamvakas, Spiros; Slagle, Ashley; Molsen, Elizabeth; Burke, Laurie B

    2017-01-01

    A clinician-reported outcome (ClinRO) assessment is a type of clinical outcome assessment (COA). ClinRO assessments, like all COAs (patient-reported, observer-reported, or performance outcome assessments), are used to 1) measure patients' health status and 2) define end points that can be interpreted as treatment benefits of medical interventions on how patients feel, function, or survive in clinical trials. Like other COAs, ClinRO assessments can be influenced by human choices, judgment, or motivation. A ClinRO assessment is conducted and reported by a trained health care professional and requires specialized professional training to evaluate the patient's health status. This is the second of two reports by the ISPOR Clinical Outcomes Assessment-Emerging Good Practices for Outcomes Research Task Force. The first report provided an overview of COAs including definitions important for an understanding of COA measurement practices. This report focuses specifically on issues related to ClinRO assessments. In this report, we define three types of ClinRO assessments (readings, ratings, and clinician global assessments) and describe emerging good measurement practices in their development and evaluation. The good measurement practices include 1) defining the context of use; 2) identifying the concept of interest measured; 3) defining the intended treatment benefit on how patients feel, function, or survive reflected by the ClinRO assessment and evaluating the relationship between that intended treatment benefit and the concept of interest; 4) documenting content validity; 5) evaluating other measurement properties once content validity is established (including intra- and inter-rater reliability); 6) defining study objectives and end point(s) objectives, and defining study end points and placing study end points within the hierarchy of end points; 7) establishing interpretability in trial results; and 8) evaluating operational considerations for the implementation of ClinRO assessments used as end points in clinical trials. Applying good measurement practices to ClinRO assessment development and evaluation will lead to more efficient and accurate measurement of treatment effects. This is important beyond regulatory approval in that it provides evidence for the uptake of new interventions into clinical practice and provides justification to payers for reimbursement on the basis of the clearly demonstrated added value of the new intervention. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  13. Application of response surface techniques to helicopter rotor blade optimization procedure

    NASA Technical Reports Server (NTRS)

    Henderson, Joseph Lynn; Walsh, Joanne L.; Young, Katherine C.

    1995-01-01

    In multidisciplinary optimization problems, response surface techniques can be used to replace the complex analyses that define the objective function and/or constraints with simple functions, typically polynomials. In this work a response surface is applied to the design optimization of a helicopter rotor blade. In previous work, this problem has been formulated with a multilevel approach. Here, the response surface takes advantage of this decomposition and is used to replace the lower level, a structural optimization of the blade. Problems that were encountered and important considerations in applying the response surface are discussed. Preliminary results are also presented that illustrate the benefits of using the response surface.
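    As a concrete illustration of the response-surface idea (replacing an expensive lower-level analysis with a fitted polynomial surrogate), the sketch below fits a quadratic surface to sampled design points by least squares. The analysis function, sampling scheme, and variable names are hypothetical stand-ins, not the rotor-blade structural optimization used in the paper.

```python
# Minimal response-surface sketch: fit a quadratic polynomial to samples of an
# expensive analysis and reuse the cheap surrogate inside an optimization loop.
# The "expensive_analysis" function is a hypothetical placeholder.
import numpy as np

def expensive_analysis(x):               # stand-in for the lower-level structural optimization
    return 3.0 + 2.0 * x[0] - x[1] + 0.5 * x[0] * x[1] + x[1] ** 2

def quadratic_features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

rng = np.random.default_rng(0)
X_samples = rng.uniform(-1.0, 1.0, size=(30, 2))           # design-of-experiments points
y_samples = np.array([expensive_analysis(x) for x in X_samples])

coeffs, *_ = np.linalg.lstsq(quadratic_features(X_samples), y_samples, rcond=None)

def surrogate(x):
    return quadratic_features(np.atleast_2d(x)) @ coeffs   # cheap replacement for the analysis

print(surrogate(np.array([0.3, -0.2])))                     # evaluate the surrogate at a new point
```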

  14. Standardization of a Hierarchical Transactive Control System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammerstrom, Donald J.; Oliver, Terry V.; Melton, Ronald B.

    2010-12-03

The authors describe work they have conducted toward the generalization and standardization of the transactive control approach that was first demonstrated in the Olympic Peninsula Project for the management of a transmission constraint. The newly generalized approach addresses several potential shortfalls of the prior approach: First, the authors have formalized a hierarchical node structure which defines the nodes and the functional signal pathways between these nodes. Second, by fully generalizing the inputs, outputs, and functional responsibilities of each node, the authors make the approach available to a much wider set of responsive assets and operational objectives. Third, the new, generalized approach defines transactive signals that include the predicted day-ahead future. This predictive feature allows the market-like bids and offers to become resolved iteratively over time, thus allowing the behaviors of responsive assets to be called upon both for the present and as future dispatch decisions are being made. The hierarchical transactive control approach is a key feature of a proposed Pacific Northwest smart grid demonstration.

  15. SU-F-BRD-13: Quantum Annealing Applied to IMRT Beamlet Intensity Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nazareth, D; Spaans, J

Purpose: We report on the first application of quantum annealing (QA) to the process of beamlet intensity optimization for IMRT. QA is a new technology, which employs novel hardware and software techniques to address various discrete optimization problems in many fields. Methods: We apply the D-Wave Inc. proprietary hardware, which natively exploits quantum mechanical effects for improved optimization. The new QA algorithm, running on this hardware, is most similar to simulated annealing, but relies on natural processes to directly minimize the free energy of a system. A simple quantum system is slowly evolved into a classical system, representing the objective function. To apply QA to IMRT-type optimization, two prostate cases were considered. A reduced number of beamlets were employed, due to the current QA hardware limitation of ∼500 binary variables. The beamlet dose matrices were computed using CERR, and an objective function was defined based on typical clinical constraints, including dose-volume objectives. The objective function was discretized, and the QA method was compared to two standard optimization methods: simulated annealing and Tabu search, run on a conventional computing cluster. Results: Based on several runs, the average final objective function value achieved by the QA was 16.9 for the first patient, compared with 10.0 for Tabu and 6.7 for the SA. For the second patient, the values were 70.7 for the QA, 120.0 for Tabu, and 22.9 for the SA. The QA algorithm required 27–38% of the time required by the other two methods. Conclusion: In terms of objective function value, the QA performance was similar to Tabu but less effective than the SA. However, its speed was 3–4 times faster than the other two methods. This initial experiment suggests that QA-based heuristics may offer significant speedup over conventional clinical optimization methods, as quantum annealing hardware scales to larger sizes.
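    To make the kind of discretized beamlet objective concrete, the sketch below defines a weighted quadratic dose objective over a beamlet-weight vector and minimizes it with a plain simulated-annealing loop. The dose-influence matrix, prescription, weights, and discretization are synthetic toy values, not the CERR-derived data or the clinical constraints used in the study.

```python
# Toy beamlet-intensity optimization: weighted quadratic dose objective minimized by
# simulated annealing over discretized beamlet weights. All data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_voxels, n_beamlets = 200, 40
A = rng.random((n_voxels, n_beamlets)) * 0.1           # toy dose-influence matrix
prescription = np.zeros(n_voxels)
prescription[:80] = 5.0                                # desired dose in "target" voxels
weights = np.where(prescription > 0, 1.0, 0.3)         # penalize normal-tissue dose more mildly

levels = np.linspace(0.0, 2.0, 8)                      # discretized beamlet intensities

def objective(w):
    dose = A @ w
    return float(np.sum(weights * (dose - prescription) ** 2))

w = rng.choice(levels, size=n_beamlets)
best_val = objective(w)
temperature = 100.0
for _ in range(20000):
    candidate = w.copy()
    candidate[rng.integers(n_beamlets)] = rng.choice(levels)       # perturb one beamlet
    delta = objective(candidate) - objective(w)
    if delta < 0 or rng.random() < np.exp(-delta / temperature):   # Metropolis acceptance
        w = candidate
        best_val = min(best_val, objective(w))
    temperature *= 0.9995                                          # geometric cooling schedule
print(best_val)
```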

  16. Definition of chronic kidney disease and measurement of kidney function in original research papers: a review of the literature.

    PubMed

    Anderson, Jocelyn; Glynn, Liam G

    2011-09-01

    Over the past decade, chronic kidney disease (CKD) has become an area of intensive clinical and epidemiological research. Despite the clarity provided by the Kidney Disease Outcomes Quality Initiative (KDOQI) guidelines, there appears to be within the CKD research literature significant disagreement on how to define CKD and measure kidney function. The objectives of this study were to investigate the variety of methods used to define CKD and to measure kidney function in original research papers as well as to investigate whether the quality of the journal had any effect on the quality of the methodology used. This was a descriptive review and not a meta-analysis. Information was extracted from each article including publication details (including the journal's impact factor), definition of CKD, method used to estimate kidney function and quantity of serum creatinine readings used to define CKD. An electronic search of MEDLINE through OVID was completed using the search term CKD. The search was limited to articles in English published in 2009. Studies were included in the review only if they were original research articles including patients with CKD. Articles were excluded if they reported data from a paediatric population, a population solely on dialysis or if there was no full-text access through OVID. Each article was assessed for quality with respect to using KDOQI CKD definition criteria. A description of the pooled data was completed and chi-square tests were used to investigate the relation between article quality and journal quality. Analysis was carried out using SPSS (15.0) and a P-value of <0.05 was considered to indicate statistical significance. The final review included 301 articles. There were a variety of methods used to define CKD in original research articles. Less than 20% (n = 59) of the articles adhered to the established international criteria for defining CKD. The majority of articles (52.1%) did not indicate the quantity of serum creatinine measurements used to define CKD. The impact factor or specialist nature of the scientific journal appears to have no bearing on whether or not published articles use the gold standard KDOQI guidelines for labelling a patient with a diagnosis of CKD. This review of literature found that a variety of definitions are being used in original research articles to define CKD and measure kidney function which calls into question the validity and reliability of such research findings and associated clinical guidelines. International consensus is urgently required to improve validity and generalizability of CKD research findings.

  17. Definition of Specific Functions and Procedural Skills Required by Cuban Specialists in Intensive Care and Emergency Medicine.

    PubMed

    Véliz, Pedro L; Berra, Esperanza M; Jorna, Ana R

    2015-07-01

    INTRODUCTION Medical specialties' core curricula should take into account functions to be carried out, positions to be filled and populations to be served. The functions in the professional profile for specialty training of Cuban intensive care and emergency medicine specialists do not include all the activities that they actually perform in professional practice. OBJECTIVE Define the specific functions and procedural skills required of Cuban specialists in intensive care and emergency medicine. METHODS The study was conducted from April 2011 to September 2013. A three-stage methodological strategy was designed using qualitative techniques. By purposive maximum variation sampling, 82 professionals were selected. Documentary analysis and key informant criteria were used in the first stage. Two expert groups were formed in the second stage: one used various group techniques (focus group, oral and written brainstorming) and the second used a three-round Delphi method. In the final stage, a third group of experts was questioned in semistructured in-depth interviews, and a two-round Delphi method was employed to assess priorities. RESULTS Ultimately, 78 specific functions were defined: 47 (60.3%) patient care, 16 (20.5%) managerial, 6 (7.7%) teaching, and 9 (11.5%) research. Thirty-one procedural skills were identified. The specific functions and procedural skills defined relate to the profession's requirements in clinical care of the critically ill, management of patient services, teaching and research at the specialist's different occupational levels. CONCLUSIONS The specific functions and procedural skills required of intensive care and emergency medicine specialists were precisely identified by a scientific method. This product is key to improving the quality of teaching, research, administration and patient care in this specialty in Cuba. The specific functions and procedural skills identified are theoretical, practical, methodological and social contributions to inform future curricular reform and to help intensive care specialists enhance their performance in comprehensive patient care. KEYWORDS Intensive care, urgent care, emergency medicine, continuing medical education, curriculum, diagnostic techniques and procedures, medical residency, Cuba.

  18. The First Spacelab Mission

    NASA Technical Reports Server (NTRS)

    Craft, H.

    1984-01-01

    The role of the mission manager in coordinating the payload with the space transportation system is studied. The establishment of the investigators working group to assist in achieving the mission objectives is examined. Analysis of the scientific requirements to assure compatibility with available resources, and analysis of the payload in order to define orbital flight requirements are described. The training of payload specialists, launch site integration, and defining the requirements for the operation of the integrated payload and the payload operations control center are functions of the mission manager. The experiences gained from the management of the Spacelab One Mission, which can be implemented in future missions, are discussed. Examples of material processing, earth observations, and life sciences advances from the First Spacelab Mission are presented.

  19. Advanced Launch System propulsion focused technology liquid methane turbopump technical implementation plan

    NASA Technical Reports Server (NTRS)

    Csomor, A.; Nielson, C. E.

    1989-01-01

    This program will focus on the integration of all functional disciplines of the design, manufacturing, materials, fabrication and producibility to define and demonstrate a highly reliable, easily maintained, low cost liquid methane turbopump as a component for the STBE (Space Transportation Booster Engine) using the STME (main engine) oxygen turbopump. A cost model is to be developed to predict the recurring cost of production hardware and operations. A prime objective of the program is to design the liquid methane turbopump to be used in common with a LH2 turbopump optimized for the STME. Time phasing of the effort is presented and interrelationship of the tasks is defined. Major subcontractors are identified and their roles in the program are described.

  20. Artificial grasping system for the paralyzed hand.

    PubMed

    Ferrari de Castro, M C; Cliquet, A

    2000-03-01

    Neuromuscular electrical stimulation has been used in upper limb rehabilitation towards restoring motor hand function. In this work, an 8 channel microcomputer controlled stimulator with monophasic square voltage output was used. Muscle activation sequences were defined to perform palmar and lateral prehension and power grip (index finger extension type). The sequences used allowed subjects to demonstrate their ability to hold and release objects that are encountered in daily living, permitting activities such as drinking, eating, writing, and typing.

  1. Improved multi-objective ant colony optimization algorithm and its application in complex reasoning

    NASA Astrophysics Data System (ADS)

    Wang, Xinqing; Zhao, Yang; Wang, Dong; Zhu, Huijie; Zhang, Qing

    2013-09-01

The problem of fault reasoning has aroused great concern in scientific and engineering fields. However, fault investigation and reasoning of a complex system is not a simple reasoning decision-making problem. It has become a typical multi-constraint and multi-objective reticulate optimization decision-making problem under many influencing factors and constraints. So far, little research has been carried out in this field. This paper transforms the fault reasoning problem of a complex system into a path-searching problem starting from known symptoms to fault causes. Three optimization objectives are considered simultaneously: maximum probability of average fault, maximum average importance, and minimum average complexity of test. Under the constraints of both known symptoms and the causal relationship among different components, a multi-objective optimization mathematical model is set up, taking minimizing cost of fault reasoning as the target function. Since the problem is non-deterministic polynomial-hard (NP-hard), a modified multi-objective ant colony algorithm is proposed, in which a reachability matrix is set up to constrain the feasible search nodes of the ants, and a new pseudo-random-proportional rule and a pheromone adjustment mechanism are constructed to balance conflicts between the optimization objectives. At last, a Pareto optimal set is acquired. Evaluation functions based on validity and tendency of reasoning paths are defined to optimize the noninferior set, through which the final fault causes can be identified according to decision-making demands, thus realizing fault reasoning of the multi-constraint and multi-objective complex system. Reasoning results demonstrate that the improved multi-objective ant colony optimization (IMACO) can perform reasoning and locate fault positions precisely by solving the multi-objective fault diagnosis model, which provides a new method to solve the problem of multi-constraint and multi-objective fault diagnosis and reasoning of complex systems.
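    The pseudo-random-proportional rule mentioned above follows the general pattern of ant colony system transition rules: with probability q0 an ant greedily picks the best-scoring feasible node, otherwise it samples feasible nodes proportionally to pheromone and heuristic desirability. The sketch below shows that generic pattern with a reachability-matrix feasibility mask; the parameter names and scoring details are illustrative and do not reproduce the IMACO rule itself.

```python
# Generic ant-colony-system node selection with a reachability (feasibility) mask.
# tau: pheromone levels, eta: heuristic desirability; alpha, beta, q0 are ACO parameters.
# This mirrors the standard pseudo-random-proportional rule, not the IMACO specifics.
import numpy as np

def select_next_node(current, tau, eta, reachable, alpha=1.0, beta=2.0, q0=0.9, rng=None):
    rng = rng or np.random.default_rng()
    feasible = np.flatnonzero(reachable[current])          # nodes allowed by the reachability matrix
    scores = (tau[current, feasible] ** alpha) * (eta[current, feasible] ** beta)
    if rng.random() < q0:                                  # exploitation: greedy choice
        return int(feasible[np.argmax(scores)])
    probs = scores / scores.sum()                          # exploration: roulette-wheel sampling
    return int(rng.choice(feasible, p=probs))

# Tiny example with 4 nodes
reach = np.array([[0, 1, 1, 0], [0, 0, 1, 1], [0, 0, 0, 1], [0, 0, 0, 0]])
tau = np.ones((4, 4))
eta = np.array([[0, 2, 1, 0], [0, 0, 1, 3], [0, 0, 0, 2], [0, 0, 0, 0]], dtype=float)
print(select_next_node(0, tau, eta, reach))
```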

  2. Lower bound for LCD image quality

    NASA Astrophysics Data System (ADS)

    Olson, William P.; Balram, Nikhil

    1996-03-01

    The paper presents an objective lower bound for the discrimination of patterns and fine detail in images on a monochrome LCD. In applications such as medical imaging and military avionics the information of interest is often at the highest frequencies in the image. Since LCDs are sampled data systems, their output modulation is dependent on the phase between the input signal and the sampling points. This phase dependence becomes particularly significant at high spatial frequencies. In order to use an LCD for applications such as those mentioned above it is essential to have a lower (worst case) bound on the performance of the display. We address this problem by providing a mathematical model for the worst case output modulation of an LCD in response to a sine wave input. This function can be interpreted as a worst case modulation transfer function (MTF). The intersection of the worst case MTF with the contrast threshold function (CTF) of the human visual system defines the highest spatial frequency that will always be detectable. In addition to providing the worst case limiting resolution, this MTF is combined with the CTF to produce objective worst case image quality values using the modulation transfer function area (MTFA) metric.
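    The limiting-resolution criterion described here (the highest spatial frequency at which the worst-case MTF still exceeds the observer's contrast threshold) reduces to locating a curve crossing. The sketch below uses arbitrary placeholder curves for both the worst-case MTF and the CTF; the functional forms in the paper are not reproduced.

```python
# Find the highest spatial frequency where a worst-case MTF still exceeds the CTF.
# Both curves below are arbitrary placeholders, not the paper's models.
import numpy as np

f = np.linspace(0.01, 30.0, 3000)                  # spatial frequency (cycles/degree)
mtf_worst = np.exp(-f / 4.0)                       # placeholder worst-case display MTF
ctf = 1.0 / (50.0 * f * np.exp(-0.1 * f) + 1e-6)   # placeholder human contrast threshold function

visible = mtf_worst >= ctf                         # frequencies guaranteed to be detectable
limiting_resolution = f[visible].max() if visible.any() else None
print(limiting_resolution)
```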

  3. A topo-graph model for indistinct target boundary definition from anatomical images.

    PubMed

    Cui, Hui; Wang, Xiuying; Zhou, Jianlong; Gong, Guanzhong; Eberl, Stefan; Yin, Yong; Wang, Lisheng; Feng, Dagan; Fulham, Michael

    2018-06-01

It can be challenging to delineate the target object in anatomical imaging when the object boundaries are difficult to discern due to low contrast or overlapping intensity distributions from adjacent tissues. We propose a topo-graph model to address this issue. The first step is to extract a topographic representation that reflects multiple levels of topographic information in an input image. We then define two types of node connections - nesting branches (NBs) and geodesic edges (GEs). NBs connect nodes corresponding to initial topographic regions and GEs link the nodes at a detailed level. The weights for NBs are defined to measure the similarity of regional appearance, and weights for GEs are defined with geodesic and local constraints. NBs contribute to the separation of topographic regions and the GEs assist the delineation of uncertain boundaries. Final segmentation is achieved by calculating the relevance of the unlabeled nodes to the labels by the optimization of a graph-based energy function. We test our model on 47 low contrast CT studies of patients with non-small cell lung cancer (NSCLC), 10 contrast-enhanced CT liver cases and 50 breast and abdominal ultrasound images. The validation criteria are the Dice similarity coefficient and the Hausdorff distance. Student's t-tests show that our model outperformed the graph models with pixel-only, pixel and regional, neighboring and radial connections (p-values <0.05). Our findings show that the topographic representation and topo-graph model provide improved delineation and separation of objects from adjacent tissues compared to the tested models. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. A Combination of Thematic and Similarity-Based Semantic Processes Confers Resistance to Deficit Following Left Hemisphere Stroke

    PubMed Central

    Kalénine, Solène; Mirman, Daniel; Buxbaum, Laurel J.

    2012-01-01

    Semantic knowledge may be organized in terms of similarity relations based on shared features and/or complementary relations based on co-occurrence in events. Thus, relationships between manipulable objects such as tools may be defined by their functional properties (what the objects are used for) or thematic properties (e.g., what the objects are used with or on). A recent study from our laboratory used eye-tracking to examine incidental activation of semantic relations in a word–picture matching task and found relatively early activation of thematic relations (e.g., broom–dustpan), later activation of general functional relations (e.g., broom–sponge), and an intermediate pattern for specific functional relations (e.g., broom–vacuum cleaner). Combined with other recent studies, these results suggest that there are distinct semantic systems for thematic and similarity-based knowledge and that the “specific function” condition drew on both systems. This predicts that left hemisphere stroke that damages either system (but not both) may spare specific function processing. The present experiment tested these hypotheses using the same experimental paradigm with participants with left hemisphere lesions (N = 17). The results revealed that, compared to neurologically intact controls (N = 12), stroke participants showed later activation of thematic and general function relations, but activation of specific function relations was spared and was significantly earlier for stroke participants than controls. Across the stroke participants, activation of thematic and general function relations was negatively correlated, further suggesting that damage tended to affect either one semantic system or the other. These results support the distinction between similarity-based and complementarity-based semantic relations and suggest that relations that draw on both systems are relatively more robust to damage. PMID:22586383

  5. Functional Information: Towards Synthesis of Biosemiotics and Cybernetics

    PubMed Central

    Sharov, Alexei A.

    2012-01-01

    Biosemiotics and cybernetics are closely related, yet they are separated by the boundary between life and non-life: biosemiotics is focused on living organisms, whereas cybernetics is applied mostly to non-living artificial devices. However, both classes of systems are agents that perform functions necessary for reaching their goals. I propose to shift the focus of biosemiotics from living organisms to agents in general, which all belong to a pragmasphere or functional universe. Agents should be considered in the context of their hierarchy and origin because their semiosis can be inherited or induced by higher-level agents. To preserve and disseminate their functions, agents use functional information - a set of signs that encode and control their functions. It includes stable memory signs, transient messengers, and natural signs. The origin and evolution of functional information is discussed in terms of transitions between vegetative, animal, and social levels of semiosis, defined by Kull. Vegetative semiosis differs substantially from higher levels of semiosis, because signs are recognized and interpreted via direct code-based matching and are not associated with ideal representations of objects. Thus, I consider a separate classification of signs at the vegetative level that includes proto-icons, proto-indexes, and proto-symbols. Animal and social semiosis are based on classification, and modeling of objects, which represent the knowledge of agents about their body (Innenwelt) and environment (Umwelt). PMID:22368439

  6. Functional Information: Towards Synthesis of Biosemiotics and Cybernetics.

    PubMed

    Sharov, Alexei A

    2010-04-27

    Biosemiotics and cybernetics are closely related, yet they are separated by the boundary between life and non-life: biosemiotics is focused on living organisms, whereas cybernetics is applied mostly to non-living artificial devices. However, both classes of systems are agents that perform functions necessary for reaching their goals. I propose to shift the focus of biosemiotics from living organisms to agents in general, which all belong to a pragmasphere or functional universe. Agents should be considered in the context of their hierarchy and origin because their semiosis can be inherited or induced by higher-level agents. To preserve and disseminate their functions, agents use functional information - a set of signs that encode and control their functions. It includes stable memory signs, transient messengers, and natural signs. The origin and evolution of functional information is discussed in terms of transitions between vegetative, animal, and social levels of semiosis, defined by Kull. Vegetative semiosis differs substantially from higher levels of semiosis, because signs are recognized and interpreted via direct code-based matching and are not associated with ideal representations of objects. Thus, I consider a separate classification of signs at the vegetative level that includes proto-icons, proto-indexes, and proto-symbols. Animal and social semiosis are based on classification, and modeling of objects, which represent the knowledge of agents about their body (Innenwelt) and environment (Umwelt).

  7. Cardiopulmonary Exercise Testing in Patients Following Massive and Submassive Pulmonary Embolism.

    PubMed

    Albaghdadi, Mazen S; Dudzinski, David M; Giordano, Nicholas; Kabrhel, Christopher; Ghoshhajra, Brian; Jaff, Michael R; Weinberg, Ido; Baggish, Aaron

    2018-03-03

Little data exist regarding the functional capacity of patients following acute pulmonary embolism. We sought to characterize the natural history of symptom burden, right ventricular (RV) structure and function, and exercise capacity among survivors of massive and submassive pulmonary embolism. Survivors of submassive or massive pulmonary embolism (n=20, age 57±13.3 years, 8/20 female) underwent clinical evaluation, transthoracic echocardiography, and cardiopulmonary exercise testing at 1 and 6 months following hospital discharge. At 1 month, 9/20 (45%) patients had New York Heart Association II or greater symptoms, 13/20 (65%) demonstrated either persistent RV dilation or systolic dysfunction, and 14/20 (70%) had objective exercise impairment as defined by a peak oxygen consumption (V̇O2) of <80% of age-sex predicted maximal values (16.25 [13.4-20.98] mL/kg per minute). At 6 months, no appreciable improvements in symptom severity, RV structure or function, and peak V̇O2 (17.45 [14.08-22.48] mL/kg per minute, P = NS) were observed. No patients demonstrated an exercise limitation attributable to either RV/pulmonary vascular coupling, as defined by a VE/VCO2 slope >33, or a pulmonary mechanical limit to exercise at either time point. Similarly, persistent RV dilation or dysfunction was not significantly related to symptom burden or peak V̇O2 at either time point. Persistent symptoms, abnormalities of RV structure and function, and objective exercise limitation are common among survivors of massive and submassive pulmonary embolism. Functional impairment appears to be attributable to general deconditioning rather than intrinsic cardiopulmonary limitation, suggesting an important role for prescribed exercise rehabilitation as a means toward improved patient outcomes and quality of life. © 2018 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
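    The abstract's operational definitions (objective exercise impairment as peak V̇O2 below 80% of the age-sex predicted maximum, and abnormal ventilatory efficiency as a VE/VCO2 slope above 33) translate directly into a simple classification helper. The function below is only an illustrative restatement of those two thresholds, not a clinical tool.

```python
# Restates the two thresholds quoted in the abstract; purely illustrative.
def classify_cpet(peak_vo2, predicted_peak_vo2, ve_vco2_slope):
    """peak_vo2 and predicted_peak_vo2 in mL/kg/min; ve_vco2_slope is dimensionless."""
    impaired = peak_vo2 < 0.80 * predicted_peak_vo2      # <80% of age-sex predicted maximum
    abnormal_ve_vco2 = ve_vco2_slope > 33                # threshold cited for RV/pulmonary vascular coupling
    return {"exercise_impairment": impaired, "ve_vco2_abnormal": abnormal_ve_vco2}

print(classify_cpet(peak_vo2=16.3, predicted_peak_vo2=24.0, ve_vco2_slope=29))
```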

  8. The Validity of Dependence as a Health Outcome Measure in Alzheimer’s Disease

    PubMed Central

    Spackman, D. Eldon; Kadiyala, Srikanth; Neumann, Peter J.; Veenstra, David L.; Sullivan, Sean D.

    2013-01-01

    Background Relating to Alzheimer’s disease (AD), dependence has been defined as the increased need for assistance due to deterioration in cognition, physical functioning, and behavior. Our objective was to evaluate the association between dependence and measures of functional impairment. Methods Data were compiled by the National Alzheimer’s Coordinating Center. We used multinomial logistic regression to estimate the association between dependence and cognition, physical functioning, and behavior. Results The independent association with dependence was positive. Dependence was most strongly associated with physical functioning. A secondary analysis suggested a strong association of dependence with multiple impairments, as measured by the interaction terms, in more severe patients. Conclusions We find that dependence is simultaneously associated with physical functioning, cognition, and behavior, which support the construct validity of dependence. Dependence might be a more simple measure to explain the multifaceted disease progression of AD and convey the increasing need for care. PMID:23512996

  9. The validity of dependence as a health outcome measure in Alzheimer's disease.

    PubMed

    Spackman, D Eldon; Kadiyala, Srikanth; Neumann, Peter J; Veenstra, David L; Sullivan, Sean D

    2013-05-01

    Relating to Alzheimer's disease (AD), dependence has been defined as the increased need for assistance due to deterioration in cognition, physical functioning, and behavior. Our objective was to evaluate the association between dependence and measures of functional impairment. Data were compiled by the National Alzheimer's Coordinating Center. We used multinomial logistic regression to estimate the association between dependence and cognition, physical functioning, and behavior. The independent association with dependence was positive. Dependence was most strongly associated with physical functioning. A secondary analysis suggested a strong association of dependence with multiple impairments, as measured by the interaction terms, in more severe patients. We find that dependence is simultaneously associated with physical functioning, cognition, and behavior, which support the construct validity of dependence. Dependence might be a more simple measure to explain the multifaceted disease progression of AD and convey the increasing need for care.
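    The analysis described (a multinomial logistic regression of dependence level on cognition, physical functioning, and behavior measures) can be sketched with a standard statistics library. The column names and synthetic data below are hypothetical placeholders, not the National Alzheimer's Coordinating Center variables.

```python
# Sketch of a multinomial logistic regression of dependence level on three predictors.
# Data and variable names are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.normal(20, 6, n),    # cognition score
    rng.normal(50, 15, n),   # physical functioning score
    rng.normal(10, 4, n),    # behavioral symptom score
])
# Synthetic 3-level dependence outcome loosely driven by the predictors
latent = -0.10 * X[:, 0] - 0.05 * X[:, 1] + 0.08 * X[:, 2] + rng.normal(0, 1, n)
y = np.digitize(latent, np.quantile(latent, [0.33, 0.66]))   # 0 = low, 1 = medium, 2 = high dependence

model = LogisticRegression(max_iter=2000).fit(X, y)   # multinomial (softmax) with the default lbfgs solver
print(model.coef_)                                    # one coefficient row per dependence level
```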

  10. Functional remediation components: A conceptual method of evaluating the effects of remediation on risks to ecological receptors.

    PubMed

    Burger, Joanna; Gochfeld, Michael; Bunn, Amoret; Downs, Janelle; Jeitner, Christian; Pittfield, Taryn; Salisbury, Jennifer

    2016-01-01

    Governmental agencies, regulators, health professionals, tribal leaders, and the public are faced with understanding and evaluating the effects of cleanup activities on species, populations, and ecosystems. While engineers and managers understand the processes involved in different remediation types such as capping, pump and treat, and natural attenuation, there is often a disconnect between (1) how ecologists view the influence of different types of remediation, (2) how the public perceives them, and (3) how engineers understand them. The overall goal of the present investigation was to define the components of remediation types (= functional remediation). Objectives were to (1) define and describe functional components of remediation, regardless of the remediation type, (2) provide examples of each functional remediation component, and (3) explore potential effects of functional remediation components in the post-cleanup phase that may involve continued monitoring and assessment. Functional remediation components include types, numbers, and intensity of people, trucks, heavy equipment, pipes, and drill holes, among others. Several components may be involved in each remediation type, and each results in ecological effects, ranging from trampling of plants, to spreading invasive species, to disturbing rare species, and to creating fragmented habitats. In some cases remediation may exert a greater effect on ecological receptors than leaving the limited contamination in place. A goal of this conceptualization is to break down functional components of remediation such that managers, regulators, and the public might assess the effects of timing, extent, and duration of different remediation options on ecological systems.

  11. Viscoacoustic model for near-field ultrasonic levitation.

    PubMed

    Melikhov, Ivan; Chivilikhin, Sergey; Amosov, Alexey; Jeanson, Romain

    2016-11-01

Ultrasonic near-field levitation allows for contactless support and transportation of an object over a vibrating surface. We developed an accurate model predicting the pressure distribution in the gap between the surface and the levitating object. The formulation covers a wide range of air flow regimes: from viscous squeeze flow, which dominates in small gaps, to acoustic wave propagation in larger gaps. The paper explains the derivation of the governing equations from basic fluid dynamics. Nonreflective boundary conditions were developed to properly define the air flow at the outlet. Compared to direct computational fluid dynamics modeling, our approach achieves good accuracy while keeping the computational cost low. Using the model, we studied the levitation force as a function of gap distance. It was shown that there are three distinct flow regimes: purely viscous, viscoacoustic, and acoustic. The regimes are defined by the balance of viscous and inertial forces. In the viscous regime the pressure in the gap is close to uniform, while in the intermediate viscoacoustic and the acoustic regimes the pressure profile is wavy. The model was validated by a dedicated levitation experiment and compared to similar published results.

  12. Viscoacoustic model for near-field ultrasonic levitation

    NASA Astrophysics Data System (ADS)

    Melikhov, Ivan; Chivilikhin, Sergey; Amosov, Alexey; Jeanson, Romain

    2016-11-01

Ultrasonic near-field levitation allows for contactless support and transportation of an object over a vibrating surface. We developed an accurate model predicting the pressure distribution in the gap between the surface and the levitating object. The formulation covers a wide range of air flow regimes: from viscous squeeze flow, which dominates in small gaps, to acoustic wave propagation in larger gaps. The paper explains the derivation of the governing equations from basic fluid dynamics. Nonreflective boundary conditions were developed to properly define the air flow at the outlet. Compared to direct computational fluid dynamics modeling, our approach achieves good accuracy while keeping the computational cost low. Using the model, we studied the levitation force as a function of gap distance. It was shown that there are three distinct flow regimes: purely viscous, viscoacoustic, and acoustic. The regimes are defined by the balance of viscous and inertial forces. In the viscous regime the pressure in the gap is close to uniform, while in the intermediate viscoacoustic and the acoustic regimes the pressure profile is wavy. The model was validated by a dedicated levitation experiment and compared to similar published results.
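    In the purely viscous (small-gap) limit described above, squeeze-film behavior is commonly modeled with the isothermal compressible Reynolds equation of lubrication theory. The equation below is that standard form, given for orientation only and not necessarily the authors' exact formulation:

```latex
% Isothermal compressible squeeze-film (Reynolds) equation, standard lubrication-theory form,
% shown for orientation only; p: gap pressure, h(t): gap height, \mu: dynamic viscosity
\[
\frac{\partial (p h)}{\partial t}
  = \nabla \cdot \left( \frac{p h^{3}}{12 \mu} \, \nabla p \right)
\]
```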

  13. Practical Strategies for Integrating Final Ecosystem Goods and ...

    EPA Pesticide Factsheets

    The concept of Final Ecosystem Goods and Services (FEGS) explicitly connects ecosystem services to the people that benefit from them. This report presents a number of practical strategies for incorporating FEGS, and more broadly ecosystem services, into the decision-making process. Whether a decision process is in early or late stages, or whether a process includes informal or formal decision analysis, there are multiple points where ecosystem services concepts can be integrated. This report uses Structured Decision Making (SDM) as an organizing framework to illustrate the role ecosystem services can play in a values-focused decision-process, including: • Clarifying the decision context: Ecosystem services can help clarify the potential impacts of an issue on natural resources together with their spatial and temporal extent based on supply and delivery of those services, and help identify beneficiaries for inclusion as stakeholders in the deliberative process. • Defining objectives and performance measures: Ecosystem services may directly represent stakeholder objectives, or may be means toward achieving other objectives. • Creating alternatives: Ecosystem services can bring to light creative alternatives for achieving other social, economic, health, or general well-being objectives. • Estimating consequences: Ecosystem services assessments can implement ecological production functions (EPFs) and ecological benefits functions (EBFs) to link decision alt

  14. Development of microgravity, full body functional reach envelope using 3-D computer graphic models and virtual reality technology

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1994-01-01

    In microgravity conditions mobility is greatly enhanced and body stability is difficult to achieve. Because of these difficulties, optimum placement and accessibility of objects and controls can be critical to required tasks on board shuttle flights or on the proposed space station. Anthropometric measurement of the maximum reach of occupants of a microgravity environment provide knowledge about maximum functional placement for tasking situations. Calculations for a full body, functional reach envelope for microgravity environments are imperative. To this end, three dimensional computer modeled human figures, providing a method of anthropometric measurement, were used to locate the data points that define the full body, functional reach envelope. Virtual reality technology was utilized to enable an occupant of the microgravity environment to experience movement within the reach envelope while immersed in a simulated microgravity environment.

  15. A concise introduction to Colombeau generalized functions and their applications in classical electrodynamics

    NASA Astrophysics Data System (ADS)

    Gsponer, Andre

    2009-01-01

    The objective of this introduction to Colombeau algebras of generalized functions (in which distributions can be freely multiplied) is to explain in elementary terms the essential concepts necessary for their application to basic nonlinear problems in classical physics. Examples are given in hydrodynamics and electrodynamics. The problem of the self-energy of a point electric charge is worked out in detail: the Coulomb potential and field are defined as Colombeau generalized functions, and integrals of nonlinear expressions corresponding to products of distributions (such as the square of the Coulomb field and the square of the delta function) are calculated. Finally, the methods introduced in Gsponer (2007 Eur. J. Phys. 28 267, 2007 Eur. J. Phys. 28 1021 and 2007 Eur. J. Phys. 28 1241), to deal with point-like singularities in classical electrodynamics are confirmed.

  16. ASSESSMENT OF UPPER EXTREMITY IMPAIRMENT, FUNCTION, AND ACTIVITY FOLLOWING STROKE: FOUNDATIONS FOR CLINICAL DECISION MAKING

    PubMed Central

    Lang, Catherine E.; Bland, Marghuretta D.; Bailey, Ryan R.; Schaefer, Sydney Y.; Birkenmeier, Rebecca L.

    2012-01-01

    The purpose of this review is to provide a comprehensive approach for assessing the upper extremity (UE) after stroke. First, common upper extremity impairments and how to assess them are briefly discussed. While multiple UE impairments are typically present after stroke, the severity of one impairment, paresis, is the primary determinant of UE functional loss. Second, UE function is operationally defined and a number of clinical measures are discussed. It is important to consider how impairment and loss of function affect UE activity outside of the clinical environment. Thus, this review also identifies accelerometry as an objective method for assessing UE activity in daily life. Finally, the role that each of these levels of assessment should play in clinical decision making is discussed in order to optimize the provision of stroke rehabilitation services. PMID:22975740

  17. Background Noises Versus Intraseasonal Variation Signals: Small vs. Large Convective Cloud Objects From CERES Aqua Observations

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man

    2015-01-01

    During inactive phases of the Madden-Julian Oscillation (MJO), there are plenty of deep but small convective systems and far fewer deep and large ones. During active phases of MJO, an increase in the occurrence of large and deep cloud clusters results from the amplification of large-scale motions by stronger convective heating. This study is designed to quantitatively examine the roles of small and large cloud clusters during the MJO life cycle. We analyze the cloud object data from Aqua CERES (Clouds and the Earth's Radiant Energy System) observations between July 2006 and June 2010 for tropical deep convective (DC) and cirrostratus (CS) cloud object types according to the real-time multivariate MJO index, which assigns the tropics to one of the eight MJO phases each day. A cloud object is a contiguous region of the earth with a single dominant cloud-system type. The criteria for defining these cloud types are overcast footprints and cloud top pressures less than 400 hPa, but DC has higher cloud optical depths (≥10) than those of CS (<10). The size distributions, defined as the footprint numbers as a function of cloud object diameters, for particular MJO phases depart greatly from the combined (8-phase) distribution at large cloud-object diameters due to the reduced/increased numbers of cloud objects related to changes in the large-scale environments. The median diameter corresponding to the combined distribution is determined and used to partition all cloud objects into "small" and "large" groups of a particular phase. The two groups corresponding to the combined distribution have nearly equal numbers of footprints. The median diameters are 502 km for DC and 310 km for cirrostratus. The range of the variation between two extreme phases (typically, the most active and depressed phases) for the small group is 6-11% in terms of the numbers of cloud objects and the total footprint numbers. The corresponding range for the large group is 19-44%. In terms of the probability density functions of radiative and cloud physical properties, there are virtually no differences between the MJO phases for the small group, but there are significant differences for the large groups for both DC and CS types. These results suggest that the intraseasonal variation signals reside in the large cloud clusters while the small cloud clusters represent the background noise resulting from various types of tropical waves with different wavenumbers and propagation speeds/directions.

  18. Global Patterns of Guild Composition and Functional Diversity of Spiders

    PubMed Central

    Cardoso, Pedro; Pekár, Stano; Jocqué, Rudy; Coddington, Jonathan A.

    2011-01-01

    The objectives of this work are: (1) to define spider guilds for all extant families worldwide; (2) test if guilds defined at family level are good surrogates of species guilds; (3) compare the taxonomic and guild composition of spider assemblages from different parts of the world; (4) compare the taxonomic and functional diversity of spider assemblages and; (5) relate functional diversity with habitat structure. Data on foraging strategy, prey range, vertical stratification and circadian activity was collected for 108 families. Spider guilds were defined by hierarchical clustering. We searched for inconsistencies between family guild placement and the known guild of each species. Richness and abundance per guild before and after correcting guild placement were compared, as were the proportions of each guild and family between all possible pairs of sites. Functional diversity per site was calculated based on hierarchical clustering. Eight guilds were discriminated: (1) sensing, (2) sheet, (3) space, and (4) orb web weavers; (5) specialists; (6) ambush, (7) ground, and (8) other hunters. Sixteen percent of the species richness corresponding to 11% of all captured individuals was incorrectly attributed to a guild by family surrogacy; however, the correlation of uncorrected vs. corrected guilds was invariably high. The correlation of guild richness or abundances was generally higher than the correlation of family richness or abundances. Functional diversity was not always higher in the tropics than in temperate regions. Families may potentially serve as ecological surrogates for species. Different families may present similar roles in the ecosystems, with replacement of some taxa by other within the same guild. Spiders in tropical regions seem to have higher redundancy of functional roles and/or finer resource partitioning than in temperate regions. Although species and family diversity were higher in the tropics, functional diversity seems to be also influenced by altitude and habitat structure. PMID:21738772

  19. Koszul information geometry and Souriau Lie group thermodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barbaresco, Frédéric, E-mail: frederic.barbaresco@thalesgroup.com

    The François Massieu 1869 idea to derive some mechanical and thermal properties of physical systems from 'Characteristic Functions' was developed by Gibbs and Duhem in thermodynamics with the concept of potentials, and introduced by Poincaré in probability. This paper deals with generalization of this Characteristic Function concept by Jean-Louis Koszul in Mathematics and by Jean-Marie Souriau in Statistical Physics. The Koszul-Vinberg Characteristic Function (KVCF) on convex cones will be presented as the cornerstone of 'Information Geometry' theory, defining Koszul Entropy as the Legendre transform of minus the logarithm of KVCF, and Fisher Information Metrics as the Hessian of these dual functions, invariant by their automorphisms. In parallel, Souriau has extended the Characteristic Function in Statistical Physics looking for other kinds of invariances through co-adjoint action of a group on its momentum space, defining physical observables like energy, heat and momentum as pure geometrical objects. In the covariant Souriau model, Gibbs equilibrium states are indexed by a geometric parameter, the Geometric (Planck) Temperature, with values in the Lie algebra of the dynamical Galileo/Poincaré groups, interpreted as a space-time vector, giving to the metric tensor a null Lie derivative. The Fisher Information metric appears as the opposite of the derivative of the mean 'moment map' by geometric temperature, equivalent to a Geometric Capacity or Specific Heat. These elements have been developed by the author in [10][11].

  20. Associations of Mental Health and Physical Function with Colonoscopy-related Pain.

    PubMed

    Yamada, Eiji; Watanabe, Seitaro; Nakajima, Atsushi

    2017-01-01

    Objective To clarify the effects of mental health and physical function in association with colonoscopy-related pain. Methods The mental health and physical function were evaluated using the Japanese version of the SF-8 Health Survey questionnaire. Poor physical status was defined as a physical component summary (PCS) <40 and poor mental status as a mental component summary (MCS) <40. Pain was assessed using a visual analogue scale (VAS), with significant pain defined as VAS ≥70 mm and insignificant pain as VAS <70 mm. The background and colonoscopic findings were compared in patients with significant and insignificant pain. Patients This study evaluated consecutive Japanese patients who were positive on fecal occult blood tests and underwent total colonoscopy. Results Of the 100 patients, 23 had significant and 77 had insignificant colonoscopy-related pain. A multiple logistic regression analysis showed that MCS <40 [odds ratio (OR) 6.03; 95% confidence interval (CI) 1.41-25.9, p=0.0156], PCS <40 (OR 5.96; 95% CI 1.45-24.5, p=0.0133), and ≥300 seconds to reach the cecum (OR 4.13; 95% CI 1.16-14.7, p=0.0281) were independent risk factors for colonoscopy-related pain. Conclusion The mental health and physical function are important determinants of colonoscopy-related pain. Evaluating the mental health and physical function of patients prior to colonoscopy may effectively predict the degree of colonoscopy-related pain.

  1. Combining Trust and Behavioral Analysis to Detect Security Threats in Open Environments

    DTIC Science & Technology

    2010-11-01

    behavioral feature values. This would provide a baseline notional object trust and is formally defined as TO(1) ∈ [0, 1] = Σ w_t P(S) (Eq. 8) and TO(2) ∈ [0, 1] = Σ w_t P(S) · identity(O, P) (Eq. 9), respectively. The w_t weight function determines the significance of a particular behavioral feature in the final trust calculation. Note that the weight
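
    A minimal sketch of this kind of weighted trust aggregation is given below, assuming each behavioral feature contributes a probability-like score P(S) in [0, 1] together with a weight; the feature values, the normalization of the weights, and the identity term are illustrative, not taken from the report.

```python
def notional_object_trust(features, use_identity=False, identity_score=1.0):
    """Weighted aggregation of behavioral-feature scores into a trust value in [0, 1].

    features: list of (weight, probability) pairs, one per behavioral feature.
    Weights are normalized here so the result stays in [0, 1]; whether the
    original report normalizes this way is an assumption.
    """
    total_w = sum(w for w, _ in features) or 1.0
    trust = sum(w * p for w, p in features) / total_w
    if use_identity:                      # Eq. (9)-style variant with an identity term
        trust *= identity_score
    return trust

# Hypothetical example: three behavioral features with weights and scores
features = [(0.5, 0.9), (0.3, 0.6), (0.2, 0.4)]
print(notional_object_trust(features))                          # ~0.71
print(notional_object_trust(features, True, identity_score=0.8))
```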

  2. A formalism for the calculus of variations with spinors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bäckdahl, Thomas, E-mail: thobac@chalmers.se; Valiente Kroon, Juan A., E-mail: j.a.valiente-kroon@qmul.ac.uk

    2016-02-15

    We develop a frame and dyad gauge-independent formalism for the calculus of variations of functionals involving spinorial objects. As a part of this formalism, we define a modified variation operator which absorbs frame and spin dyad gauge terms. This formalism is applicable to both the standard spacetime (i.e., SL(2, ℂ)) 2-spinors as well as to space (i.e., SU(2, ℂ)) 2-spinors. We compute expressions for the variations of the connection and the curvature spinors.

  3. Drag and drop simulation: from pictures to full three-dimensional simulations

    NASA Astrophysics Data System (ADS)

    Bergmann, Michel; Iollo, Angelo

    2014-11-01

    We present a suite of methods to achieve ``drag and drop'' simulation, i.e., to fully automate the process of performing three-dimensional flow simulations around bodies defined by actual images of moving objects. The overall approach requires skeleton graph generation to obtain a level set function from the pictures, optimal transportation to obtain the body velocity on the surface, and then flow simulation using a Cartesian method based on penalization. We illustrate this paradigm by simulating the swimming of a mackerel fish.

  4. The challenge of defining risk-based metrics to improve food safety: inputs from the BASELINE project.

    PubMed

    Manfreda, Gerardo; De Cesare, Alessandra

    2014-08-01

    In 2002, the Regulation (EC) 178 of the European Parliament and of the Council states that, in order to achieve the general objective of a high level of protection of human health and life, food law shall be based on risk analysis. However, the Commission Regulation No 2073/2005 on microbiological criteria for foodstuffs requires that food business operators ensure that foodstuffs comply with the relevant microbiological criteria. Such criteria define the acceptability of a product, a batch of foodstuffs or a process, based on the absence, presence or number of micro-organisms, and/or on the quantity of their toxins/metabolites, per unit(s) of mass, volume, area or batch. The same Regulation describes a food safety criterion as a means of defining the acceptability of a product or a batch of foodstuff applicable to products placed on the market; moreover, it states a process hygiene criterion as a means of indicating the acceptable functioning of the production process. Both food safety criteria and process hygiene criteria are not based on risk analysis. On the contrary, the metrics formulated by the Codex Alimentarius Commission in 2004, named Food Safety Objective (FSO) and Performance Objective (PO), are risk-based and fit the indications of Regulation 178/2002. The main aims of this review are to illustrate the key differences between microbiological criteria and the risk-based metrics defined by the Codex Alimentarius Commission and to explore the opportunity and the possibility of implementing future European Regulations that include PO and FSO as supporting parameters to microbiological criteria. This review also clarifies the implications of defining an appropriate level of human protection, how to establish FSO and PO, and how to implement them in practice, linked to each other through quantitative risk assessment models. The contents of this review should clarify the context for application of the results collected during the EU funded project named BASELINE (www.baselineeurope.eu) as described in the papers of this special issue. Such results show how to derive POs for specific food/biological hazard combinations selected among fish, egg, dairy, meat and plant products. Copyright © 2014 Elsevier B.V. All rights reserved.
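
    The link between an FSO and the hazard levels along the food chain is commonly written (e.g., by the ICMSF) as H0 - ΣR + ΣI ≤ FSO, with all terms in log10 cfu/g. The small check below uses that relation with purely illustrative numbers; it is not a result of the BASELINE project.

```python
def meets_fso(h0, reductions, increases, fso):
    """Check the ICMSF-style relation H0 - sum(R) + sum(I) <= FSO.

    All quantities in log10 cfu/g: h0 is the initial hazard level, reductions
    and increases are lists of log reductions (e.g. cooking) and log increases
    (e.g. growth, recontamination) along the food chain.
    """
    level_at_consumption = h0 - sum(reductions) + sum(increases)
    return level_at_consumption, level_at_consumption <= fso

# Illustrative numbers only: raw level 3 log, 5-log cook kill, 1-log growth,
# and an FSO of 2 log cfu/g at the moment of consumption.
level, ok = meets_fso(h0=3.0, reductions=[5.0], increases=[1.0], fso=2.0)
print(level, ok)   # -1.0 True
```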

  5. Iron Deficiency Anemia and Cognitive Function in Infancy

    PubMed Central

    Carter, R. Colin; Jacobson, Joseph L.; Burden, Matthew J.; Armony-Sivan, Rinat; Dodge, Neil C.; Angelilli, Mary Lu; Lozoff, Betsy; Jacobson, Sandra W.

    2011-01-01

    OBJECTIVES This study examined effects of iron deficiency anemia (IDA) on specific domains of infant cognitive function and the role of IDA-related socioemotional deficits in mediating and/or moderating these effects. METHODS Infants were recruited during routine 9-month visits to an inner-city clinic. IDA was defined as hemoglobin level <110 g/L with ≥2 abnormal iron deficiency indicators (mean corpuscular volume, red cell distribution width, zinc protoporphyrin, transferrin saturation, and ferritin). At 9 and 12 months, the Fagan Test of Infant Intelligence (FTII); A-not-B task; Emotionality, Activity, and Sociability Temperament Survey; and Behavior Rating Scale were administered. Analyses were adjusted for potential confounders, including age and sociodemographic variables. RESULTS Twenty-eight infants met criteria for IDA, 28 had nonanemic iron deficiency (NA ID) and 21 had iron sufficiency (IS). There was a linear effect for object permanence at 9 months: infants with IDA were least likely to exhibit object permanence, IS most likely, and NA ID intermediate. Infants with IDA and those with hemoglobin level ≤105 g/L showed poorer recognition memory on the FTII than infants without IDA. The Behavior Rating Scale orientation/engagement measure partially mediated these effects. Stronger effects of IDA on these outcomes were seen in infants who scored more poorly on the socioemotional measures. CONCLUSIONS These data indicate poorer object permanence and short-term memory encoding and/or retrieval in infants with IDA at 9 months. These cognitive effects were attributable, in part, to IDA-related deficits in socioemotional function. Children with poor socioemotional performance seem to be more vulnerable to the effects of IDA on cognitive function. PMID:20660551
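
    The IDA definition stated above (hemoglobin <110 g/L together with ≥2 abnormal iron-deficiency indicators) can be written as a small classification helper, sketched below. The cutoffs for the individual iron indicators are not given in the abstract, so abnormality flags are passed in by the caller; the handling of anemia without iron deficiency is an assumption.

```python
def classify_iron_status(hemoglobin_g_per_l, abnormal_indicator_flags):
    """Classify iron status using the study's stated criteria.

    hemoglobin_g_per_l: hemoglobin concentration in g/L.
    abnormal_indicator_flags: booleans for MCV, RDW, zinc protoporphyrin,
        transferrin saturation, and ferritin being abnormal (the cutoffs are
        not given in the abstract, so they are supplied by the caller).
    Returns 'IDA', 'NA ID' (nonanemic iron deficiency), or 'IS' (sufficient).
    Anemia without iron deficiency is not a study group and is not
    distinguished here.
    """
    iron_deficient = sum(bool(f) for f in abnormal_indicator_flags) >= 2
    anemic = hemoglobin_g_per_l < 110
    if iron_deficient and anemic:
        return "IDA"
    if iron_deficient:
        return "NA ID"
    return "IS"

print(classify_iron_status(104, [True, True, False, False, False]))  # IDA
print(classify_iron_status(118, [True, True, True, False, False]))   # NA ID
```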

  6. Functional connectivity supporting the selective maintenance of feature-location binding in visual working memory

    PubMed Central

    Takahama, Sachiko; Saiki, Jun

    2014-01-01

    Information on an object's features bound to its location is very important for maintaining object representations in visual working memory. Interactions with dynamic multi-dimensional objects in an external environment require complex cognitive control, including the selective maintenance of feature-location binding. Here, we used event-related functional magnetic resonance imaging to investigate brain activity and functional connectivity related to the maintenance of complex feature-location binding. Participants were required to detect task-relevant changes in feature-location binding between objects defined by color, orientation, and location. We compared a complex binding task requiring complex feature-location binding (color-orientation-location) with a simple binding task in which simple feature-location binding, such as color-location, was task-relevant and the other feature was task-irrelevant. Univariate analyses showed that the dorsolateral prefrontal cortex (DLPFC), hippocampus, and frontoparietal network were activated during the maintenance of complex feature-location binding. Functional connectivity analyses indicated cooperation between the inferior precentral sulcus (infPreCS), DLPFC, and hippocampus during the maintenance of complex feature-location binding. In contrast, the connectivity for the spatial updating of simple feature-location binding determined by reanalyzing the data from Takahama et al. (2010) demonstrated that the superior parietal lobule (SPL) cooperated with the DLPFC and hippocampus. These results suggest that the connectivity for complex feature-location binding does not simply reflect general memory load and that the DLPFC and hippocampus flexibly modulate the dorsal frontoparietal network, depending on the task requirements, with the infPreCS involved in the maintenance of complex feature-location binding and the SPL involved in the spatial updating of simple feature-location binding. PMID:24917833

  7. Functional connectivity supporting the selective maintenance of feature-location binding in visual working memory.

    PubMed

    Takahama, Sachiko; Saiki, Jun

    2014-01-01

    Information on an object's features bound to its location is very important for maintaining object representations in visual working memory. Interactions with dynamic multi-dimensional objects in an external environment require complex cognitive control, including the selective maintenance of feature-location binding. Here, we used event-related functional magnetic resonance imaging to investigate brain activity and functional connectivity related to the maintenance of complex feature-location binding. Participants were required to detect task-relevant changes in feature-location binding between objects defined by color, orientation, and location. We compared a complex binding task requiring complex feature-location binding (color-orientation-location) with a simple binding task in which simple feature-location binding, such as color-location, was task-relevant and the other feature was task-irrelevant. Univariate analyses showed that the dorsolateral prefrontal cortex (DLPFC), hippocampus, and frontoparietal network were activated during the maintenance of complex feature-location binding. Functional connectivity analyses indicated cooperation between the inferior precentral sulcus (infPreCS), DLPFC, and hippocampus during the maintenance of complex feature-location binding. In contrast, the connectivity for the spatial updating of simple feature-location binding determined by reanalyzing the data from Takahama et al. (2010) demonstrated that the superior parietal lobule (SPL) cooperated with the DLPFC and hippocampus. These results suggest that the connectivity for complex feature-location binding does not simply reflect general memory load and that the DLPFC and hippocampus flexibly modulate the dorsal frontoparietal network, depending on the task requirements, with the infPreCS involved in the maintenance of complex feature-location binding and the SPL involved in the spatial updating of simple feature-location binding.

  8. The NASA Integrated Information Technology Architecture

    NASA Technical Reports Server (NTRS)

    Baldridge, Tim

    1997-01-01

    This document defines an Information Technology Architecture for the National Aeronautics and Space Administration (NASA), where Information Technology (IT) refers to the hardware, software, standards, protocols and processes that enable the creation, manipulation, storage, organization and sharing of information. An architecture provides an itemization and definition of these IT structures, a view of the relationship of the structures to each other and, most importantly, an accessible view of the whole. It is a fundamental assumption of this document that a useful, interoperable and affordable IT environment is key to the execution of the core NASA scientific and project competencies and business practices. This Architecture represents the highest level system design and guideline for NASA IT related activities and has been created on the authority of the NASA Chief Information Officer (CIO) and will be maintained under the auspices of that office. It addresses all aspects of general purpose, research, administrative and scientific computing and networking throughout the NASA Agency and is applicable to all NASA administrative offices, projects, field centers and remote sites. Through the establishment of five Objectives and six Principles this Architecture provides a blueprint for all NASA IT service providers: civil service, contractor and outsourcer. The most significant of the Objectives and Principles are the commitment to customer-driven IT implementations and the commitment to a simpler, cost-efficient, standards-based, modular IT infrastructure. In order to ensure that the Architecture is presented and defined in the context of the mission, project and business goals of NASA, this Architecture consists of four layers in which each subsequent layer builds on the previous layer. They are: 1) the Business Architecture: the operational functions of the business, or Enterprise, 2) the Systems Architecture: the specific Enterprise activities within the context of IT systems, 3) the Technical Architecture: a common, vendor-independent framework for design, integration and implementation of IT systems and 4) the Product Architecture: vendor-specific IT solutions. The Systems Architecture is effectively a description of the end-user "requirements". Generalized end-user requirements are discussed and subsequently organized into specific mission and project functions. The Technical Architecture depicts the framework, and relationship, of the specific IT components that enable the end-user functionality as described in the Systems Architecture. The primary components as described in the Technical Architecture are: 1) Applications: Basic Client Component, Object Creation Applications, Collaborative Applications, Object Analysis Applications, 2) Services: Messaging, Information Broker, Collaboration, Distributed Processing, and 3) Infrastructure: Network, Security, Directory, Certificate Management, Enterprise Management and File System. This Architecture also provides specific Implementation Recommendations, the most significant of which is the recognition of IT as core to NASA activities and defines a plan, which is aligned with the NASA strategic planning processes, for keeping the Architecture alive and useful.

  9. Bio-objects and the media: the role of communication in bio-objectification processes.

    PubMed

    Maeseele, Pieter; Allgaier, Joachim; Martinelli, Lucia

    2013-06-01

    The representation of biological innovations in and through communication and media practices is vital for understanding the nature of "bio-objects" and the process we call "bio-objectification." This paper discusses two ideal-typical analytical approaches based on different underlying communication models, i.e., the traditional (science- and media-centered) and media sociological (a multi-layered process involving various social actors in defining the meanings of scientific and technological developments) approach. In this analysis, the latter is not only found to be the most promising approach for understanding the circulation, (re)production, and (re)configuration of meanings of bio-objects, but also for interpreting the relationship between media and science. On the basis of a few selected examples, this paper highlights how media function as a primary arena for the (re)production and (re)configuration of scientific and biomedical information with regard to bio-objects in the public sphere in general, and toward decision-makers, interest groups, and the public specifically.

  10. "Phase capture" in the perception of interpolated shape: cue combination and the influence function.

    PubMed

    Levi, Dennis M; Wing-Hong Li, Roger; Klein, Stanley A

    2003-09-01

    This study was concerned with what stimulus information observers use to judge the shape of simple objects. We used a string of four Gabor patches to define a contour. A fifth, center patch served as a test pattern. The observers' task was to judge the location of the test pattern relative to the contour. The contour was either a straight line, or an arc with positive or negative curvature (the radius of curvature was either 2 or 6 deg). We asked whether phase shifts in the inner or outer pairs of patches distributed along the contour influence the perceived shape. That is, we measured the phase shift influence function. We found that shifting the inner patches of the string by 0.25 cycle results in almost complete phase capture (attraction) at the smallest separation (2 lambda), and the capture effect falls off rapidly with separation. A 0.25 cycle shift of the outer pair of patches has a much smaller effect, in the opposite direction (repulsion). In our experiments, the contour is defined by two cues--the cue provided by the Gabor carrier (the 'feature' cue) and that defined by the Gaussian envelope (the 'envelope' cue). Our phase shift influence function can be thought of as a cue combination task. An ideal observer would weight the cues by the inverse variance of the two cues. The variance in each of these cues predicts the main features of our results quite accurately.
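
    The ideal-observer rule mentioned at the end, weighting each cue by its inverse variance, has a standard closed form. The sketch below applies it to two hypothetical position cues (a reliable 'feature' cue and a noisier 'envelope' cue); the numbers are made up for illustration.

```python
import numpy as np

def combine_cues(estimates, variances):
    """Ideal-observer cue combination: weight each cue by its inverse variance.

    Returns the combined estimate and the variance of the combined estimate.
    """
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = (1.0 / variances) / np.sum(1.0 / variances)
    combined = np.sum(weights * estimates)
    combined_var = 1.0 / np.sum(1.0 / variances)
    return combined, combined_var

# Hypothetical example: the 'feature' (carrier) cue is more reliable than the
# 'envelope' cue, so the combined position is pulled toward the feature cue.
pos, var = combine_cues(estimates=[0.25, 0.00], variances=[0.01, 0.04])
print(pos, var)   # 0.20 0.008
```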

  11. Tmd Factorization and Evolution for Tmd Correlation Functions

    NASA Astrophysics Data System (ADS)

    Mert Aybat, S.; Rogers, Ted C.

    We discuss the application of transverse momentum dependent (TMD) factorization theorems to phenomenology. Our treatment relies on recent extensions of the Collins-Soper-Sterman (CSS) formalism. Emphasis is placed on the importance of using well-defined TMD parton distribution functions (PDFs) and fragmentation functions (FFs) in calculating the evolution of these objects. We explain how parametrizations of unpolarized TMDs can be obtained from currently existing fixed-scale Gaussian fits and previous implementations of the CSS formalism in the Drell-Yan process, and provide some examples. We also emphasize the importance of agreed-upon definitions for having an unambiguous prescription for calculating higher orders in the hard part, and provide examples of higher order calculations. We end with a discussion of strategies for extending the phenomenological applications of TMD factorization to situations beyond the unpolarized case.

  12. Commonalities and Differences in Functional Safety Systems Between ISS Payloads and Industrial Applications

    NASA Astrophysics Data System (ADS)

    Malyshev, Mikhail; Kreimer, Johannes

    2013-09-01

    Safety analyses for electrical, electronic and/or programmable electronic (E/E/EP) safety-related systems used in payload applications on-board the International Space Station (ISS) are often based on failure modes, effects and criticality analysis (FMECA). For industrial applications of E/E/EP safety-related systems, comparable strategies exist and are defined in the IEC-61508 standard. This standard defines some quantitative criteria based on potential failure modes (for example, Safe Failure Fraction). These criteria can be calculated for an E/E/EP system or components to assess their compliance with the requirements of a particular Safety Integrity Level (SIL). The standard defines several SILs depending on how much risk has to be mitigated by a safety-critical system. When an FMECA is available for an ISS payload or its subsystem, it may be possible to calculate the same or similar parameters as defined in the 61508 standard. One example of a payload that has a dedicated functional safety subsystem is the Electromagnetic Levitator (EML). This payload for the ISS is planned to be operated on-board starting in 2014. The EML is a high-temperature materials processing facility. The dedicated subsystem "Hazard Control Electronics" (HCE) is implemented to ensure failure tolerance in limiting sample processing parameters so that the generation of potentially toxic by-products is kept within safe limits, in line with the requirements applied to payloads by the ISS Program. The objective of this paper is to assess the implementation of the HCE in the EML against criteria for functional safety systems in the IEC-61508 standard and to evaluate commonalities and differences with respect to safety requirements levied on ISS Payloads. An attempt is made to assess the possibility of using commercially available components and systems certified for compliance with industrial functional safety standards in ISS payloads.

  13. Diabetes is associated with subclinical functional limitation in nondisabled older individuals: the Health, Aging, and Body Composition study.

    PubMed

    De Rekeneire, Nathalie; Resnick, Helaine E; Schwartz, Ann V; Shorr, Ronald I; Kuller, Lewis H; Simonsick, Eleanor M; Vellas, Bruno; Harris, Tamara B

    2003-12-01

    The aim of this study was to examine the role of comorbid conditions and body composition in the association between diabetes and subclinical functional limitation, an indication of early functional decline, in well-functioning older individuals. This was a cross-sectional analysis of 3,075 well-functioning black and white men and women aged 70-79 years, enrolled in the Health, Aging, and Body Composition study. Diabetes was defined by self-report and/or hypoglycemic medication use or fasting glucose ≥126 mg/dl. Subclinical functional limitation was defined using self-report of capacity and objective performance measures. Comorbid conditions were identified by self-reported diagnoses, medication use, and clinical measures. Body composition measures included anthropometry and total fat (dual X-ray absorptiometry). Of 2,926 participants, 1,252 (42.8%) had subclinical functional limitation at baseline. Among 2,370 individuals without diabetes, 40% had subclinical functional limitation, whereas the prevalence was 53% among the 556 diabetic participants with an age/sex/race-adjusted odds ratio (OR) 1.70 (95% CI 1.40-2.06). This association remained significant when adjusted for body composition measures (OR 1.54 [1.26-1.88]), diabetes-related comorbidities, and other potential confounders (OR 1.40 [1.14-1.73]). In the fully adjusted model, consideration of HbA1c (<7% or ≥7%) and diabetes duration showed that poor glycemic control in diabetic individuals explained the association with subclinical functional limitation. In a well-functioning older population, diabetes is associated with early indicators of functional decline, even after accounting for body composition and diabetes-related comorbidities. Poor glycemic control contributes to this relationship. Whether improvement in glycemic control in older people with diabetes would change this association should be tested.

  14. Shouldering the load: A review of Joan Stevenson's work on occupational lifting and design evaluation of load carriage equipment.

    PubMed

    Costigan, Patrick A; Morin, Evelyn L; Reid, Susan A

    2014-01-01

    In this paper, Dr. Joan Stevenson's work on assessment of the effects of lifting, supporting and transporting loads is reviewed. A defining attribute of this work is the use of objective, biomechanical measures as the basis from which a fuller understanding of all factors affecting worker performance can be obtained, and how such performance should be measured and evaluated. The central objectives and conclusions of Dr. Stevenson's research programs spanning the years from 1985 through 2012 are summarized and discussed in terms of an overall research trajectory. The guiding principle of Dr. Stevenson's work is to reduce the potential harm to which workers are exposed through the development of bona fide occupational standards, a better understanding of risk factors leading to low back pain, and the establishment of an enhanced objective design process for functional load-bearing clothing and equipment.

  15. Implementation of jump-diffusion algorithms for understanding FLIR scenes

    NASA Astrophysics Data System (ADS)

    Lanterman, Aaron D.; Miller, Michael I.; Snyder, Donald L.

    1995-07-01

    Our pattern-theoretic approach to the automated understanding of forward-looking infrared (FLIR) images brings the traditionally separate endeavors of detection, tracking, and recognition together into a unified jump-diffusion process. New objects are detected and object types are recognized through discrete jump moves. Between jumps, the location and orientation of objects are estimated via continuous diffusions. A hypothesized scene, simulated from the emissive characteristics of the hypothesized scene elements, is compared with the collected data by a likelihood function based on sensor statistics. This likelihood is combined with a prior distribution defined over the set of possible scenes to form a posterior distribution. The jump-diffusion process empirically generates the posterior distribution. Both the diffusion and jump operations involve the simulation of a scene produced by a hypothesized configuration. Scene simulation is most effectively accomplished by pipelined rendering engines such as those from Silicon Graphics. We demonstrate the execution of our algorithm on a Silicon Graphics Onyx/Reality Engine.
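
    A toy sketch of the jump-diffusion idea is given below: discrete 'jump' moves add or remove hypothesized objects while continuous 'diffusion' moves perturb their positions, and proposals are scored against a likelihood plus prior. The 1-D Gaussian-bump scene, the Poisson prior, and the simplified Metropolis acceptance rule are stand-ins for the paper's FLIR sensor likelihood and its exact jump-diffusion dynamics.

```python
import math
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)

def render(positions):
    """Hypothesized scene -> simulated signal (sum of unit Gaussian bumps)."""
    signal = np.zeros_like(x)
    for p in positions:
        signal += np.exp(-0.5 * ((x - p) / 0.3) ** 2)
    return signal

data = render([3.0, 7.0]) + rng.normal(0.0, 0.1, x.size)   # synthetic observation

def log_post(positions, sigma=0.1, rate=2.0):
    """Gaussian sensor likelihood plus a Poisson prior on the number of objects."""
    resid = data - render(positions)
    n = len(positions)
    return (-0.5 * np.sum(resid**2) / sigma**2
            + n * math.log(rate) - math.lgamma(n + 1))

state, lp = [], log_post([])
for _ in range(5000):
    move, prop = rng.integers(3), list(state)
    if move == 0:                                   # jump: birth of a new object
        prop.append(rng.uniform(0.0, 10.0))
    elif move == 1 and prop:                        # jump: death of a random object
        prop.pop(rng.integers(len(prop)))
    elif move == 2 and prop:                        # diffusion: nudge a position
        prop[rng.integers(len(prop))] += rng.normal(0.0, 0.2)
    lp_prop = log_post(prop)
    if math.log(rng.uniform()) < lp_prop - lp:      # simplified Metropolis acceptance
        state, lp = prop, lp_prop

print(len(state), sorted(round(p, 2) for p in state))   # recovers ~2 objects near 3 and 7
```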

  16. Multi-Objective Hybrid Optimal Control for Interplanetary Mission Planning

    NASA Technical Reports Server (NTRS)

    Englander, Jacob; Vavrina, Matthew; Ghosh, Alexander

    2015-01-01

    Preliminary design of low-thrust interplanetary missions is a highly complex process. The mission designer must choose discrete parameters such as the number of flybys, the bodies at which those flybys are performed and in some cases the final destination. In addition, a time-history of control variables must be chosen which defines the trajectory. There are often many thousands, if not millions, of possible trajectories to be evaluated. The customer who commissions a trajectory design is not usually interested in a point solution, but rather the exploration of the trade space of trajectories between several different objective functions. This can be a very expensive process in terms of the number of human analyst hours required. An automated approach is therefore very desirable. This work presents such an approach by posing the mission design problem as a multi-objective hybrid optimal control problem. The method is demonstrated on a hypothetical mission to the main asteroid belt.
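
    The core of exploring such a trade space is keeping only the non-dominated (Pareto-optimal) candidates. A minimal dominance filter is sketched below with hypothetical objective values (flight time and propellant mass, both minimized); it is not the hybrid optimal control solver described in the paper.

```python
import numpy as np

def pareto_front(costs):
    """Return a boolean mask of non-dominated rows (all objectives minimized).

    costs: (n_solutions, n_objectives) array. A solution is dominated if another
    one is no worse in every objective and strictly better in at least one.
    """
    costs = np.asarray(costs, dtype=float)
    keep = np.ones(costs.shape[0], dtype=bool)
    for i in range(costs.shape[0]):
        dominates_i = np.all(costs <= costs[i], axis=1) & np.any(costs < costs[i], axis=1)
        if np.any(dominates_i):
            keep[i] = False
    return keep

# Hypothetical trajectories: columns = [flight time (yr), propellant mass (kg)]
cands = np.array([[3.1, 420.0], [2.8, 510.0], [3.5, 400.0], [3.2, 430.0]])
print(cands[pareto_front(cands)])   # the last candidate is dominated and dropped
```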

  17. Surface-admittance equivalence principle for nonradiating and cloaking problems

    NASA Astrophysics Data System (ADS)

    Labate, Giuseppe; Alù, Andrea; Matekovits, Ladislau

    2017-06-01

    In this paper, we address nonradiating and cloaking problems exploiting the surface equivalence principle, by imposing at any arbitrary boundary the control of the admittance discontinuity between the overall object (with or without cloak) and the background. After a rigorous demonstration, we apply this model to a nonradiating problem, appealing for anapole modes and metamolecules modeling, and to a cloaking problem, appealing for non-Foster metasurface design. A straightforward analytical condition is obtained for controlling the scattering of a dielectric object over a surface boundary of interest. Previous quasistatic results are confirmed and a general closed-form solution beyond the subwavelength regime is provided. In addition, this formulation can be extended to other wave phenomena once the proper admittance function is defined (thermal, acoustics, elastomechanics, etc.).

  18. Minimax terminal approach problem in two-level hierarchical nonlinear discrete-time dynamical system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shorikov, A. F., E-mail: afshorikov@mail.ru

    We consider a discrete-time dynamical system consisting of three controllable objects. The motions of all objects are given by the corresponding nonlinear or linear discrete-time recurrent vector relations, and the control system has two levels: a basic (first, or I) level that is dominating and a subordinate (second, or II) level; the two levels have different criteria of functioning and are united a priori by informational and control connections defined in advance. For the dynamical system in question, we propose a mathematical formalization in the form of solving a multistep problem of two-level hierarchical minimax program control over the terminal approach process with incomplete information and give a general scheme for its solution.

  19. A stochastic conflict resolution model for trading pollutant discharge permits in river systems.

    PubMed

    Niksokhan, Mohammad Hossein; Kerachian, Reza; Amin, Pedram

    2009-07-01

    This paper presents an efficient methodology for developing pollutant discharge permit trading in river systems considering the conflicting interests of the decision-makers and stakeholders involved. In this methodology, a trade-off curve between objectives is developed using a powerful and recently developed multi-objective genetic algorithm technique known as the Nondominated Sorting Genetic Algorithm-II (NSGA-II). The best non-dominated solution on the trade-off curve is defined using the Young conflict resolution theory, which considers the utility functions of decision makers and stakeholders of the system. These utility functions are related to the total treatment cost and a fuzzy risk of violating the water quality standards. The fuzzy risk is evaluated using Monte Carlo analysis. Finally, an optimization model provides the discharge permit trading policies. The practical utility of the proposed methodology in decision-making is illustrated through a realistic example of the Zarjub River in the northern part of Iran.
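
    The Monte Carlo evaluation of the risk of violating water quality standards can be sketched as repeated sampling of uncertain inputs through a water-quality response followed by counting violations. In the sketch below, the linear dilution model and the input distributions are placeholders, not the calibrated river model or the fuzzy-risk formulation used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def violation_risk(n_samples=10_000, standard_mg_l=5.0):
    """Monte Carlo estimate of the risk of violating a water quality standard.

    The response model and the input distributions are illustrative
    placeholders, not the calibrated river model used in the paper.
    """
    # Uncertain inputs: upstream pollutant load (kg/day) and river flow (m^3/s)
    load = rng.lognormal(mean=np.log(200.0), sigma=0.3, size=n_samples)
    flow = rng.normal(loc=30.0, scale=5.0, size=n_samples).clip(min=5.0)
    # Placeholder response: concentration grows with load and is diluted by flow
    concentration = 0.8 * load / flow
    return np.mean(concentration > standard_mg_l)

print(f"estimated violation risk: {violation_risk():.3f}")
```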

  20. Fatigue in a Representative Population of Older Persons and Its Association With Functional Impairment, Functional Limitation, and Disability

    PubMed Central

    Nayfield, Susan G.; Patel, Kushang V.; Eldadah, Basil; Cesari, Matteo; Ferrucci, Luigi; Ceresini, Graziano; Guralnik, Jack M.

    2009-01-01

    Background Older persons often complain of fatigue, but the functional consequences of this symptom are unclear. The aim of the present study was to evaluate fatigue and its association with measures of physical function and disability in a representative sample of the older population. Methods Cross-sectional data from a population-based sample of 1,055 Italian men and women aged 65 and older were analyzed. Fatigue was defined according to two questions evaluating whether participants felt that “everything was an effort” and/or they “could not get going” on three or more days in the past week. Objective measures of physical function were handgrip strength, the Short Physical Performance Battery (SPPB), and 400-m walking speed. Disability was defined as the inability to complete the 400-m walk test and self-reported difficulty in activities of daily living (ADL) and instrumental activities of daily living (IADL). Results The prevalence of fatigue was higher in women (29%) than in men (15%). In age-adjusted analyses, fatigued men and women had weaker handgrip strength, lower SPPB score, slower walking speed, and higher mobility, ADL, and IADL disability than nonfatigued persons. Further adjustment for health behaviors, diseases, inflammatory markers, and thyroid function generally reduced the relationship between fatigue and functional outcomes, but fatigue remained significantly associated with SPPB score, walking speed, and mobility and IADL disability. Conclusions Older persons who report fatigue had significantly poorer functional status than those who did not report this symptom. The causal link between fatigue and these outcomes should be further investigated. PMID:19176328

  1. Psychoanalysis and social cognitive neuroscience: a new framework for a dialogue.

    PubMed

    Georgieff, Nicolas

    2011-12-01

    The fields of psychoanalysis and neuroscience use different methods of description, analysis and comprehension of reality, and because each is based on a different methodology, each approach constructs a different representation of reality. Thus, psychoanalysis could contribute to a general psychology involving neuroscience to the extent that a "psychoanalytical psychology" (the theory of mental functioning that is extrapolated from psychoanalytical practice) defines natural objects of study (mind functions) for a multidisciplinary approach. However, the so called "naturalisation" of psychoanalytical concepts (metapsychology) does not imply the reduction of these concepts to biology; rather, it suggests a search for compatibility between psychoanalytical concepts and neuroscientific description. Such compatibility would mean the search for common objects that could be described from either a psychoanalytic or a neuroscientific point of view. We suggest that inter-subjectivity, empathy or "co-thinking" processes, from early development to the psychoanalytic relationship or the interaction between the patient and the analyst, could be such a common object for cognitive social neuroscience and psychoanalysis. Together, neuroscience and psychoanalysis could then contribute to a multidisciplinary approach of psychic inter- or co-activity. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Regression analysis as a design optimization tool

    NASA Technical Reports Server (NTRS)

    Perley, R.

    1984-01-01

    The optimization concepts are described in relation to an overall design process as opposed to a detailed, part-design process where the requirements are firmly stated, the optimization criteria are well established, and a design is known to be feasible. The overall design process starts with the stated requirements. Some of the design criteria are derived directly from the requirements, but others are affected by the design concept. It is these design criteria that define the performance index, or objective function, that is to be minimized within some constraints. In general, there will be multiple objectives, some mutually exclusive, with no clear statement of their relative importance. The optimization loop that is given adjusts the design variables and analyzes the resulting design, in an iterative fashion, until the objective function is minimized within the constraints. This provides a solution, but it is only the beginning. In effect, the problem definition evolves as information is derived from the results. It becomes a learning process as we determine what the physics of the system can deliver in relation to the desirable system characteristics. As with any learning process, an interactive capability is a real attribute for investigating the many alternatives that will be suggested as learning progresses.

  3. Identification and restoration in 3D fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Dieterlen, Alain; Xu, Chengqi; Haeberle, Olivier; Hueber, Nicolas; Malfara, R.; Colicchio, B.; Jacquey, Serge

    2004-06-01

    3-D optical fluorescence microscopy has now become an efficient tool for volumetric investigation of living biological samples. The 3-D data can be acquired by optical sectioning microscopy, which is performed by axial stepping of the object versus the objective. For any instrument, each recorded image can be described by a convolution equation between the original object and the Point Spread Function (PSF) of the acquisition system. To assess performance and ensure data reproducibility, as for any 3-D quantitative analysis, system identification is mandatory. The PSF characterizes the properties of the image acquisition system; it can be computed or acquired experimentally. Statistical tools and Zernike moments are shown to be appropriate and complementary for describing a 3-D system PSF and for quantifying the variation of the PSF as a function of the optical parameters. Some critical experimental parameters can be identified with these tools. This helps biologists define an acquisition protocol that optimizes the use of the system. Reduction of out-of-focus light is the task of 3-D microscopy; it is carried out computationally by a deconvolution process. Pre-filtering the images improves the stability of the deconvolution results, which become less dependent on the regularization parameter; this helps biologists use the restoration process.
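
    The convolution model mentioned above (image = object convolved with the PSF, plus noise) leads directly to frequency-domain restoration. The sketch below shows a basic Wiener-style deconvolution in NumPy on a synthetic 3-D stack; it is a generic illustration of the inverse step, not the pre-filtered regularized method developed in the paper.

```python
import numpy as np

def psf_otf(psf, shape):
    """Zero-pad the PSF to the image shape, centre it at the origin, and return its FFT."""
    padded = np.zeros(shape, dtype=float)
    padded[tuple(slice(0, s) for s in psf.shape)] = psf
    shift = [-(s // 2) for s in psf.shape]
    return np.fft.fftn(np.roll(padded, shift, axis=tuple(range(len(shape)))))

def wiener_deconvolve(image, psf, k=0.01):
    """Wiener restoration: estimate the object from image = object (*) PSF + noise.

    k is a regularization constant standing in for the noise-to-signal ratio.
    """
    H = psf_otf(psf, image.shape)
    G = np.fft.fftn(image)
    return np.real(np.fft.ifftn(np.conj(H) * G / (np.abs(H) ** 2 + k)))

# Tiny synthetic demo: a point-like object blurred by a Gaussian PSF, then restored.
obj = np.zeros((32, 32, 32))
obj[16, 16, 16] = 1.0
zz, yy, xx = np.meshgrid(*[np.arange(-3, 4)] * 3, indexing="ij")
psf = np.exp(-(xx**2 + yy**2 + zz**2) / 2.0)
psf /= psf.sum()
blurred = np.real(np.fft.ifftn(np.fft.fftn(obj) * psf_otf(psf, obj.shape)))
restored = wiener_deconvolve(blurred, psf)
print(np.unravel_index(restored.argmax(), restored.shape))  # peak back near voxel (16, 16, 16)
```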

  4. Scenario-Based Assessment of User Needs for Point-of-Care Robots

    PubMed Central

    Lee, Hyeong Suk

    2018-01-01

    Objectives This study aimed to derive specific user requirements and barriers in a real medical environment to define the essential elements and functions of two types of point-of-care (POC) robot: a telepresence robot as a tool for teleconsultation, and a bedside robot to provide emotional care for patients. Methods An analysis of user requirements was conducted; user needs were gathered and identified, and detailed, realistic scenarios were created. The prototype robots were demonstrated in physical environments for envisioning and evaluation. In all, three nurses and three clinicians participated as evaluators to observe the demonstrations and evaluate the robot systems. The evaluators were given a brief explanation of each scene and the robots' functionality. Four major functions of the teleconsultation robot were defined and tested in the demonstration. In addition, four major functions of the bedside robot were evaluated. Results Among the desired functions for a teleconsultation robot, medical information delivery and communication had high priority. For a bedside robot, patient support, patient monitoring, and healthcare provider support were the desired functions. The evaluators reported that the teleconsultation robot can increase support from and access to specialists and resources. They mentioned that the bedside robot can improve the quality of hospital life. Problems identified in the demonstration were those of space conflict, communication errors, and safety issues. Conclusions Incorporating this technology into healthcare services will enhance communication and teamwork skills across distances and thereby facilitate teamwork. However, repeated tests will be needed to evaluate and ensure improved performance. PMID:29503748

  5. Improvement of quality of 3D printed objects by elimination of microscopic structural defects in fused deposition modeling.

    PubMed

    Gordeev, Evgeniy G; Galushko, Alexey S; Ananikov, Valentine P

    2018-01-01

    Additive manufacturing with fused deposition modeling (FDM) is currently optimized for a wide range of research and commercial applications. The major disadvantage of FDM-created products is their low quality and structural defects (porosity), which are an obstacle to using them in functional prototyping and direct digital manufacturing of objects intended to be in contact with gases and liquids. This article describes a simple and efficient approach for assessing the quality of 3D printed objects. Using this approach it was shown that the wall permeability of a printed object depends on its geometric shape and is gradually reduced in the following series: cylinder > cube > pyramid > sphere > cone. Filament feed rate, wall geometry and G-code-defined wall structure were found to be the primary parameters that influence the quality of 3D-printed products. Optimization of these parameters led to an overall increase in quality and improvement of sealing properties. It was demonstrated that high quality of 3D printed objects can be achieved using routinely available printers and standard filaments.

  6. The Human Likeness Dimension of the “Uncanny Valley Hypothesis”: Behavioral and Functional MRI Findings

    PubMed Central

    Cheetham, Marcus; Suter, Pascal; Jäncke, Lutz

    2011-01-01

    The uncanny valley hypothesis (Mori, 1970) predicts differential experience of negative and positive affect as a function of human likeness. Affective experience of humanlike robots and computer-generated characters (avatars) dominates “uncanny” research, but findings are inconsistent. Importantly, it is unknown how objects are actually perceived along the hypothesis’ dimension of human likeness (DOH), defined in terms of human physical similarity. To examine whether the DOH can also be defined in terms of effects of categorical perception (CP), stimuli from morph continua with controlled differences in physical human likeness between avatar and human faces as endpoints were presented. Two behavioral studies found a sharp category boundary along the DOH and enhanced visual discrimination (i.e., CP) of fine-grained differences between pairs of faces at the category boundary. Discrimination was better for face pairs presenting category change in the human-to-avatar than avatar-to-human direction along the DOH. To investigate brain representation of physical change and category change along the DOH, an event-related functional magnetic resonance imaging study used the same stimuli in a pair-repetition priming paradigm. Bilateral mid-fusiform areas and a different right mid-fusiform area were sensitive to physical change within the human and avatar categories, respectively, whereas entirely different regions were sensitive to the human-to-avatar (caudate head, putamen, thalamus, red nucleus) and avatar-to-human (hippocampus, amygdala, mid-insula) direction of category change. These findings show that Mori’s DOH definition does not reflect subjective perception of human likeness and suggest that future “uncanny” studies consider CP and the DOH’s category structure in guiding experience of non-human objects. PMID:22131970

  7. Fast and robust shape diameter function.

    PubMed

    Chen, Shuangmin; Liu, Taijun; Shu, Zhenyu; Xin, Shiqing; He, Ying; Tu, Changhe

    2018-01-01

    The shape diameter function (SDF) is a scalar function defined on a closed manifold surface, measuring the neighborhood diameter of the object at each point. Due to its pose oblivious property, SDF is widely used in shape analysis, segmentation and retrieval. However, computing SDF is computationally expensive since one has to place an inverted cone at each point and then average the penetration distances for a number of rays inside the cone. Furthermore, the shape diameters are highly sensitive to local geometric features as well as the normal vectors, hence diminishing their applications to real-world meshes which often contain rich geometric details and/or various types of defects, such as noise and gaps. In order to increase the robustness of SDF and promote it to a wide range of 3D models, we define SDF by offsetting the input object a little bit. This seemingly minor change brings three significant benefits: First, it allows us to compute SDF in a robust manner since the offset surface is able to give reliable normal vectors. Second, it runs many times faster since at each point we only need to compute the penetration distance along a single direction, rather than tens of directions. Third, our method does not require watertight surfaces as the input-it supports both point clouds and meshes with noise and gaps. Extensive experimental results show that the offset-surface based SDF is robust to noise and insensitive to geometric details, and it also runs about 10 times faster than the existing method. We also exhibit its usefulness using two typical applications including shape retrieval and shape segmentation, and observe a significant improvement over the existing SDF.
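
    A sketch of the single-ray variant described above (one inward ray per vertex instead of an averaged cone of rays) is given below, using the trimesh library for the ray casting. The library choice, the offset value, and the handling of missed rays are assumptions for illustration, not details of the published method.

```python
import numpy as np
import trimesh   # assumed available; any mesh library with ray casting would do

def single_ray_sdf(mesh, eps=1e-4):
    """Simplified shape-diameter estimate: one ray per vertex along the inward normal.

    Ray origins are nudged slightly inside the surface so the ray does not
    immediately re-hit its own starting triangle, mimicking the offset idea.
    Rays that miss (e.g. through gaps) are left as NaN.
    """
    origins = mesh.vertices - eps * mesh.vertex_normals      # just inside the surface
    directions = -mesh.vertex_normals                        # shoot toward the opposite side
    locations, index_ray, _ = mesh.ray.intersects_location(
        origins, directions, multiple_hits=False)
    sdf = np.full(len(mesh.vertices), np.nan)
    sdf[index_ray] = np.linalg.norm(locations - origins[index_ray], axis=1)
    return sdf

# Sanity check on a unit sphere: every diameter should be close to 2.
sphere = trimesh.creation.icosphere(subdivisions=3, radius=1.0)
print(np.nanmean(single_ray_sdf(sphere)))   # ~2.0
```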

  8. Preliminary investigation of Brain Network Activation (BNA) and its clinical utility in sport-related concussion

    PubMed Central

    Reches, A.; Kutcher, J.; Elbin, R. J.; Or-Ly, H.; Sadeh, B.; Greer, J.; McAllister, D. J.; Geva, A.; Kontos, A. P.

    2017-01-01

    ABSTRACT Background: The clinical diagnosis and management of patients with sport-related concussion is largely dependent on subjectively reported symptoms, clinical examinations, cognitive, balance, vestibular and oculomotor testing. Consequently, there is an unmet need for objective assessment tools that can identify the injury from a physiological perspective and add an important layer of information to the clinician’s decision-making process. Objective: The goal of the study was to evaluate the clinical utility of the EEG-based tool named Brain Network Activation (BNA) as a longitudinal assessment method of brain function in the management of young athletes with concussion. Methods: Athletes with concussion (n = 86) and age-matched controls (n = 81) were evaluated at four time points with symptom questionnaires and BNA. BNA scores were calculated by comparing functional networks to a previously defined normative reference brain network model to the same cognitive task. Results: Subjects above 16 years of age exhibited a significant decrease in BNA scores immediately following injury, as well as notable changes in functional network activity, relative to the controls. Three representative case studies of the tested population are discussed in detail, to demonstrate the clinical utility of BNA. Conclusion: The data support the utility of BNA to augment clinical examinations, symptoms and additional tests by providing an effective method for evaluating objective electrophysiological changes associated with sport-related concussions. PMID:28055228

  9. Valuing hydrological alteration in multi-objective water resources management

    NASA Astrophysics Data System (ADS)

    Bizzi, Simone; Pianosi, Francesca; Soncini-Sessa, Rodolfo

    2012-11-01

    The management of water through the impoundment of rivers by dams and reservoirs is necessary to support key human activities such as hydropower production, agriculture and flood risk mitigation. Advances in multi-objective optimization techniques and ever-growing computing power make it possible to design reservoir operating policies that represent Pareto-optimal tradeoffs between multiple interests. On the one hand, such optimization methods can enhance the performance of commonly targeted objectives (such as hydropower production or water supply); on the other hand, they risk strongly penalizing all the interests not directly (i.e. mathematically) included in the optimization algorithm. The alteration of the downstream hydrological regime is a well-established cause of ecological degradation, and its evaluation and rehabilitation are commonly required by recent legislation (such as the Water Framework Directive in Europe). However, it is rarely embedded in reservoir optimization routines and, even when explicitly considered, the criteria adopted for its evaluation are doubted and not commonly trusted, undermining the possibility of real implementation of environmentally friendly policies. The main challenges in defining and assessing hydrological alterations are: how to define a reference state (referencing); how to define criteria upon which to build mathematical indicators of alteration (measuring); and finally how to aggregate the indicators in a single evaluation index (valuing) that can serve as the objective function in the optimization problem. This paper aims to address these issues by: (i) discussing the benefits and constraints of different approaches to referencing, measuring and valuing hydrological alteration; (ii) testing two alternative indices of hydrological alteration, one based on the established framework of Indicators of Hydrological Alteration (Richter et al., 1996), and one satisfying the mathematical properties required by widely used optimization methods based on dynamic programming; (iii) demonstrating and discussing these indices by application to the River Ticino, in Italy; (iv) providing a framework to effectively include hydrological alteration within reservoir operation optimization.
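
    An IHA-style alteration score can be sketched, under simplifying assumptions, by comparing monthly mean flows of a regulated series against a natural reference series and averaging the normalized deviations, as below. The actual indicator set and the DP-compatible index discussed in the paper are considerably richer than this.

```python
import numpy as np

def monthly_alteration_index(natural_flows, regulated_flows, months):
    """Crude hydrological-alteration score: 0 means no alteration from the reference.

    natural_flows, regulated_flows: daily discharge series of equal length.
    months: month number (1-12) of each day.
    For every month, the relative deviation of the regulated monthly mean flow
    from the natural one is computed, then averaged over the twelve months.
    """
    natural_flows = np.asarray(natural_flows, float)
    regulated_flows = np.asarray(regulated_flows, float)
    months = np.asarray(months)
    devs = []
    for m in range(1, 13):
        sel = months == m
        ref = natural_flows[sel].mean()
        devs.append(abs(regulated_flows[sel].mean() - ref) / ref)
    return float(np.mean(devs))

# Synthetic example: a reservoir that flattens the natural seasonal cycle.
days = np.arange(365)
months = days // 30 % 12 + 1
natural = 50 + 30 * np.sin(2 * np.pi * days / 365)        # seasonal regime
regulated = np.full_like(natural, natural.mean())          # fully flattened release
print(monthly_alteration_index(natural, regulated, months))
```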

  10. Attribute conjunctions and the part configuration advantage in object category learning.

    PubMed

    Saiki, J; Hummel, J E

    1996-07-01

    Five experiments demonstrated that in object category learning people are particularly sensitive to conjunctions of part shapes and relative locations. Participants learned categories defined by a part's shape and color (part-color conjunctions) or by a part's shape and its location relative to another part (part-location conjunctions). The statistical properties of the categories were identical across these conditions, as were the salience of color and relative location. Participants were better at classifying objects defined by part-location conjunctions than objects defined by part-color conjunctions. Subsequent experiments revealed that this effect was not due to the specific color manipulation or the role of location per se. These results suggest that the shape bias in object categorization is at least partly due to sensitivity to part-location conjunctions and suggest a new processing constraint on category learning.

  11. Automatic target recognition apparatus and method

    DOEpatents

    Baumgart, Chris W.; Ciarcia, Christopher A.

    2000-01-01

    An automatic target recognition apparatus (10) is provided, having a video camera/digitizer (12) for producing a digitized image signal (20) representing an image containing therein objects which objects are to be recognized if they meet predefined criteria. The digitized image signal (20) is processed within a video analysis subroutine (22) residing in a computer (14) in a plurality of parallel analysis chains such that the objects are presumed to be lighter in shading than the background in the image in three of the chains and further such that the objects are presumed to be darker than the background in the other three chains. In two of the chains the objects are defined by surface texture analysis using texture filter operations. In another two of the chains the objects are defined by background subtraction operations. In yet another two of the chains the objects are defined by edge enhancement processes. In each of the analysis chains a calculation operation independently determines an error factor relating to the probability that the objects are of the type which should be recognized, and a probability calculation operation combines the results of the analysis chains.
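    The following sketch is a loose, hypothetical illustration of the parallel-chain idea (not the patented algorithm): three simple chains, one texture-based, one background-subtraction-based and one edge-based, each produce a pseudo-probability, and the results are fused into a single recognition score.

```python
import numpy as np

def chain_confidences(image, background):
    """Toy per-chain object evidence for a grayscale image (hypothetical).

    Mirrors the parallel-chain idea: a texture chain (local variance), a
    background-subtraction chain and an edge-enhancement chain each return
    a pseudo-probability that a target-like object is present.
    """
    img = image.astype(float)
    bkg = background.astype(float)
    texture = img.std()                                           # texture chain
    residual = np.abs(img - bkg).mean()                           # background subtraction
    edges = np.abs(np.diff(img, axis=0)).mean() + np.abs(np.diff(img, axis=1)).mean()  # edges
    scores = np.array([texture, residual, edges])
    return scores / (scores + 1.0)            # squash each score into (0, 1)

def combine(scores):
    """Fuse independent chain outputs into one recognition score (noisy-OR)."""
    return 1.0 - np.prod(1.0 - scores)

rng = np.random.default_rng(0)
background = rng.normal(100.0, 2.0, (64, 64))
frame = background.copy()
frame[20:30, 20:30] += 60.0          # bright "target" region
print(combine(chain_confidences(frame, background)))
```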

  12. Radar sea reflection for low-e targets

    NASA Astrophysics Data System (ADS)

    Chow, Winston C.; Groves, Gordon W.

    1998-09-01

    Modeling radar signal reflection from a wavy sea surface uses a realistic characterization of the large surface features and parameterizes the effect of the small roughness elements. Representation of the reflection coefficient at each point of the sea surface as a function of the Specular Deviation Angle is, to our knowledge, a novel approach. The objective is to achieve enough simplification and retain enough fidelity to obtain a practical multipath model. The 'specular deviation angle' as used in this investigation is defined and explained. Being a function of the sea elevations, which are stochastic in nature, this quantity is also random and has a probability density function. This density function depends on the relative geometry of the antenna and target positions, and together with the beam-broadening effect of the small surface ripples determines the reflectivity of the sea surface at each point. The probability density function of the specular deviation angle is derived. The distribution of the specular deviation angle as a function of position on the mean sea surface is described.
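    A minimal Monte Carlo sketch of the idea is given below; it assumes Gaussian-distributed sea-surface slopes (an assumption made here for illustration, not the paper's sea model) and estimates the empirical probability density of the deviation from the specular condition.

```python
import numpy as np

# Assumption for illustration only: sea-surface slopes are Gaussian.
# The "specular deviation angle" is taken here as the difference between the
# local surface tilt and the tilt required for perfect specular reflection
# along the antenna-target geometry.
rng = np.random.default_rng(1)
slope_std = 0.10          # rms surface slope (hypothetical)
required_tilt = 0.05      # tilt (rad) giving specular reflection (hypothetical)

slopes = rng.normal(0.0, slope_std, 100_000)
deviation = np.abs(np.arctan(slopes) - required_tilt)

# Empirical probability density of the deviation angle
density, edges = np.histogram(deviation, bins=50, density=True)
print(f"mean deviation = {deviation.mean():.4f} rad, "
      f"P(deviation < 0.02 rad) = {(deviation < 0.02).mean():.3f}")
```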

  13. [Mathematic concept model of accumulation of functional disorders associated with environmental factors].

    PubMed

    Zaĭtseva, N V; Trusov, P V; Kir'ianov, D A

    2012-01-01

    The mathematical concept model presented describes the accumulation of functional disorders associated with environmental factors, plays a predictive role and is designed for assessment of possible effects caused by heterogeneous factors with variable exposures. Considering exposure changes together with the self-restoration process opens prospects for using the model to evaluate, analyse and manage occupational risks. To develop current theoretical approaches, the authors suggest a model that considers age-related body peculiarities, systemic interactions of organs (including neuro-humoral regulation), accumulation of functional disorders due to external factors, and rehabilitation of functions during treatment. The general problem setting requires defining over a hundred unknown coefficients that characterize the speed of various processes within the body. To solve this problem, the authors used an iterative approach of successive identification, which starts from an initial approximation of the model parameters and performs subsequent updating on the basis of new theoretical and empirical knowledge.
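    The successive-identification idea can be illustrated with a toy sketch (hypothetical model and names): starting from an initial guess, the coefficients are repeatedly updated to reduce the squared mismatch between model output and observations.

```python
import numpy as np

def successive_identification(model, observed, theta0, steps=300, lr=1e-3):
    """Toy successive identification of model coefficients (hypothetical).

    Starting from an initial approximation theta0, the coefficients are
    repeatedly updated (via a numerical gradient of the squared error) to
    reduce the mismatch between model output and observations.
    """
    theta = np.array(theta0, dtype=float)
    for _ in range(steps):
        base = np.sum((model(theta) - observed) ** 2)
        grad = np.zeros_like(theta)
        for i in range(len(theta)):
            probe = theta.copy()
            probe[i] += 1e-6
            grad[i] = (np.sum((model(probe) - observed) ** 2) - base) / 1e-6
        theta -= lr * grad
    return theta

# Example: recover the rate of a simple exponential accumulation process
t = np.linspace(0.0, 10.0, 50)
observed = 1.0 - np.exp(-0.3 * t)                      # "true" rate = 0.3
fit = successive_identification(lambda th: 1.0 - np.exp(-th[0] * t), observed, [0.1])
print(fit)    # approaches [0.3]
```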

  14. Ocean data assimilation using optimal interpolation with a quasi-geostrophic model

    NASA Technical Reports Server (NTRS)

    Rienecker, Michele M.; Miller, Robert N.

    1991-01-01

    A quasi-geostrophic (QG) stream function is analyzed by optimal interpolation (OI) over a 59-day period in a 150-km-square domain off northern California. Hydrographic observations acquired over five surveys were assimilated into a QG open boundary ocean model. Assimilation experiments were conducted separately for individual surveys to investigate the sensitivity of the OI analyses to parameters defining the decorrelation scale of an assumed error covariance function. The analyses were intercompared through dynamical hindcasts between surveys. The best hindcast was obtained using the smooth analyses produced with assumed error decorrelation scales identical to those of the observed stream function. The rms difference between the hindcast stream function and the final analysis was only 23 percent of the observation standard deviation. The two sets of OI analyses were temporally smoother than the fields from statistical objective analysis and in good agreement with the only independent data available for comparison.
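    For readers unfamiliar with OI, the sketch below shows the standard analysis update with a Gaussian background-error covariance whose decorrelation scale L plays the role of the parameter varied in the experiments; all numbers and names are illustrative, not taken from the study.

```python
import numpy as np

def optimal_interpolation(x_grid, x_obs, y_obs, bg, L, obs_err=0.1, bg_err=1.0):
    """Minimal 1-D optimal interpolation sketch (illustrative only).

    Combines a background field `bg` on `x_grid` with observations `y_obs`
    at `x_obs`, using a Gaussian background-error covariance whose
    decorrelation scale `L` is the kind of parameter explored above.
    """
    def cov(a, b):
        d = a[:, None] - b[None, :]
        return bg_err ** 2 * np.exp(-0.5 * (d / L) ** 2)

    B_go = cov(x_grid, x_obs)               # grid-to-observation covariance
    B_oo = cov(x_obs, x_obs)                # observation-to-observation covariance
    R = obs_err ** 2 * np.eye(len(x_obs))   # observation-error covariance
    bg_at_obs = np.interp(x_obs, x_grid, bg)
    weights = np.linalg.solve(B_oo + R, y_obs - bg_at_obs)
    return bg + B_go @ weights              # analysis = background + correction

x = np.linspace(0.0, 100.0, 201)
background = np.zeros_like(x)
obs_x = np.array([20.0, 50.0, 80.0])
obs_y = np.array([1.0, -0.5, 0.8])
analysis = optimal_interpolation(x, obs_x, obs_y, background, L=15.0)
print(analysis[[40, 100, 160]])             # values near the three observations
```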

  15. The guidance of spatial attention during visual search for color combinations and color configurations.

    PubMed

    Berggren, Nick; Eimer, Martin

    2016-09-01

    Representations of target-defining features (attentional templates) guide the selection of target objects in visual search. We used behavioral and electrophysiological measures to investigate how such search templates control the allocation of attention in search tasks where targets are defined by the combination of 2 colors or by a specific spatial configuration of these colors. Target displays were preceded by spatially uninformative cue displays that contained items in 1 or both target-defining colors. Experiments 1 and 2 demonstrated that, during search for color combinations, attention is initially allocated independently and in parallel to all objects with target-matching colors, but is then rapidly withdrawn from objects that only have 1 of the 2 target colors. In Experiment 3, targets were defined by a particular spatial configuration of 2 colors, and could be accompanied by nontarget objects with a different configuration of the same colors. Attentional guidance processes were unable to distinguish between these 2 types of objects. Both attracted attention equally when they appeared in a cue display, and both received parallel focal-attentional processing and were encoded into working memory when they were presented in the same target display. Results demonstrate that attention can be guided simultaneously by multiple features from the same dimension, but that these guidance processes have no access to the spatial-configural properties of target objects. They suggest that attentional templates do not represent target objects in an integrated pictorial fashion, but contain separate representations of target-defining features. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  16. Healthy control subjects are poorly defined in case-control studies of irritable bowel syndrome

    PubMed Central

    Ghorbani, Shireen; Nejad, Amir; Law, David; Chua, Kathleen S.; Amichai, Meridythe M.; Pimentel, Mark

    2015-01-01

    Background Case-control studies are vital for understanding the pathophysiology of gastrointestinal disease. While the definition of disease is clear, the definition of healthy control is not. This is particularly relevant for functional bowel diseases such as irritable bowel syndrome (IBS). In this study, a systematic review formed the basis for a prospective study evaluating the effectiveness of commonly used techniques for defining healthy controls in IBS. Methods A systematic review of the literature was conducted to identify case-control studies involving functional gastrointestinal disorders. “Lack of Rome criteria”, self-description as “healthy” and the bowel disease questionnaire (BDQ) were common methods for identifying healthy controls. These 3 methods were then applied to a cohort of 53 non-patient subjects to determine their validity compared to objective outcome measures (7-day stool diary). Results “Lack of Rome criteria” and “healthy” self-description were the most common methods for identifying healthy control subjects, but many studies failed to describe the methods used. In the prospective study, more subjects were identified as non-healthy using the BDQ than using either lack of Rome criteria (P=0.01) or “healthy” self-description (P=0.026). Furthermore, stool diaries identified several subjects with abnormal stool form and/or frequency which were not identified using lack of Rome criteria or the “healthy” question. Comparisons revealed no agreement (κ) between the different methods for defining healthy controls. Conclusions The definitions of healthy controls in studies of functional bowel diseases such as IBS are inconsistent. Since functional symptoms are common, a strict definition of “normal” is needed in this area of research. PMID:25609236

  17. Culture & Cognition Laboratory

    DTIC Science & Technology

    2011-05-01

    life: Real world social-interaction cooperative tasks are inherently unequal in difficulty. Re-scoring performance on unequal tasks in order to enable...real-world situations to which this model is intended to apply, it is possible for calls for help to not be heard, or for a potential help-provider to...not have clear, well-defined objectives. Since many complex real-world tasks are not well-defined, defining a realistic objective can be considered a

  18. Research and Evaluations of the Health Aspects of Disasters, Part VII: The Relief/Recovery Framework.

    PubMed

    Birnbaum, Marvin L; Daily, Elaine K; O'Rourke, Ann P

    2016-04-01

    The principal goal of research relative to disasters is to decrease the risk that a hazard will result in a disaster. Disaster studies pursue two distinct directions: (1) epidemiological (non-interventional); and (2) interventional. Both interventional and non-interventional studies require data/information obtained from assessments of function. Non-interventional studies examine the epidemiology of disasters. Interventional studies evaluate specific interventions/responses in terms of their effectiveness in meeting their respective objectives, their contribution to the overarching goal, other effects created, their respective costs, and the efficiency with which they achieved their objectives. The results of interventional studies should contribute to evidence that will be used to inform the decisions used to define standards of care and best practices for a given setting based on these standards. Interventional studies are based on the Disaster Logic Model (DLM) and are used to change or maintain levels of function (LOFs). Relief and Recovery interventional studies seek to determine the effects, outcomes, impacts, costs, and value of the intervention provided after the onset of a damaging event. The Relief/Recovery Framework provides the structure needed to systematically study the processes involved in providing relief or recovery interventions that result in a new LOF for a given Societal System and/or its component functions. It consists of the following transformational processes (steps): (1) identification of the functional state prior to the onset of the event (pre-event); (2) assessments of the current functional state; (3) comparison of the current functional state with the pre-event state and with the results of the last assessment; (4) needs identification; (5) strategic planning, including establishing the overall strategic goal(s), objectives, and priorities for interventions; (6) identification of options for interventions; (7) selection of the most appropriate intervention(s); (8) operational planning; (9) implementation of the intervention(s); (10) assessments of the effects and changes in LOFs resulting from the intervention(s); (11) determination of the costs of providing the intervention; (12) determination of the current functional status; (13) synthesis of the findings with current evidence to define the benefits and value of the intervention to the affected population; and (14) codification of the findings into new evidence. Each of these steps in the Framework is a production function that facilitates evaluation, and the outputs of the transformation process establish the current state for the next step in the process. The evidence obtained is integrated into augmenting the respective Response Capacities of a community-at-risk. The ultimate impact of enhanced Response Capacity is determined by studying the epidemiology of the next event.

  19. Optimal Inversion Parameters for Full Waveform Inversion using OBS Data Set

    NASA Astrophysics Data System (ADS)

    Kim, S.; Chung, W.; Shin, S.; Kim, D.; Lee, D.

    2017-12-01

    In recent years, Full Waveform Inversion (FWI) has been the most researched technique in seismic data processing. It uses the residuals between observed and modeled data as an objective function; thereafter, the final subsurface velocity model is generated through a series of iterations meant to minimize the residuals. Research on FWI has expanded from acoustic media to elastic media. In acoustic media, the subsurface property is defined by P-velocity; however, in elastic media, properties are defined by multiple parameters, such as P-velocity, S-velocity, and density. Further, elastic media can also be defined by the Lamé constants and density, or by the impedances (PI, SI); consequently, research is being carried out to ascertain the optimal parameters. With advanced exploration equipment and Ocean Bottom Seismic (OBS) surveys, it is now possible to obtain multi-component seismic data. However, to perform FWI on these data and generate an accurate subsurface model, it is important to determine the optimal inversion parameters among (Vp, Vs, ρ), (λ, μ, ρ), and (PI, SI) in elastic media. In this study, a staggered-grid finite difference method was applied to simulate the OBS survey. For the inversion, the l2-norm was set as the objective function. Further, the accurate computation of the gradient direction was performed using the back-propagation technique, and its scaling was done using the pseudo-Hessian matrix. In acoustic media, only Vp is used as the inversion parameter. In contrast, various sets of parameters, such as (Vp, Vs, ρ) and (λ, μ, ρ), can be used to define the inversion in elastic media. Therefore, it is important to ascertain the parameterization that gives the most accurate result for inversion with the OBS data set. In this study, we generated Vp and Vs subsurface models by using (λ, μ, ρ) and (Vp, Vs, ρ) as inversion parameters in every iteration, and compared the final two FWI results. This research was supported by the Basic Research Project (17-3312) of the Korea Institute of Geoscience and Mineral Resources (KIGAM) funded by the Ministry of Science, ICT and Future Planning of Korea.
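    As a minimal illustration of the objective function, the sketch below evaluates the l2-norm misfit between observed and modeled traces on synthetic (hypothetical) data; it is not the study's elastic FWI code.

```python
import numpy as np

def l2_misfit(observed, modeled):
    """l2-norm objective function: half the squared data residual."""
    residual = modeled - observed
    return 0.5 * np.sum(residual ** 2), residual

# Toy traces standing in for recorded and simulated OBS data (hypothetical):
t = np.linspace(0.0, 1.0, 500)
observed = np.sin(2 * np.pi * 10.0 * t) * np.exp(-3.0 * t)
modeled = np.sin(2 * np.pi * 10.5 * t) * np.exp(-3.0 * t)   # slightly wrong model
value, residual = l2_misfit(observed, modeled)
print(value)
# In FWI the residual is back-propagated through the wave-equation operator to
# obtain the gradient with respect to the chosen parameterization, e.g.
# (Vp, Vs, rho) or (lambda, mu, rho).
```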

  20. Functional remediation components: A conceptual method of evaluating the effects of remediation on risks to ecological receptors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burger, Joanna; Gochfeld, Michael; Bunn, Amoret

    2016-08-30

    Governmental agencies, regulators, health professionals, tribal leaders, and the public are faced with understanding and evaluating the effects of cleanup activities on species, populations, and ecosystems. While engineers and managers understand the processes involved in different remediation types such as capping, pump and treat, and natural attenuation, there is often a disconnect between (1) how ecologists view the influence of different types of remediation, (2) how the public perceives them, and (3) how engineers understand them. The overall goal of the present investigation was to define the components of remediation types (= functional remediation). Objectives were to (1) define and describe functional components of remediation, regardless of the remediation type, (2) provide examples of each functional remediation component, and (3) explore potential effects of functional remediation components in the post-cleanup phase that may involve continued monitoring and assessment. Functional remediation components include types, numbers, and intensity of people, trucks, heavy equipment, pipes, and drill holes, among others. Several components may be involved in each remediation type, and each results in ecological effects, ranging from trampling of plants, to spreading invasive species, to disturbing rare species, and to creating fragmented habitats. In some cases remediation may exert a greater effect on ecological receptors than leaving the limited contamination in place. A goal of this conceptualization is to break down functional components of remediation such that managers, regulators, and the public might assess the effects of timing, extent, and duration of different remediation options on ecological systems.

  1. Targeting Treatments to Improve Cognitive Function in Mood Disorder: Suggestions From Trials Using Erythropoietin.

    PubMed

    Miskowiak, Kamilla Woznica; Rush, A John; Gerds, Thomas A; Vinberg, Maj; Kessing, Lars V

    2016-12-01

    There is no established efficacious treatment for cognitive dysfunction in unipolar and bipolar disorder. This may be partially due to lack of consensus regarding the need to screen for cognitive impairment in cognition trials or which screening criteria to use. We have demonstrated in 2 randomized placebo-controlled trials that 8 weeks of erythropoietin (EPO) treatment has beneficial effects on verbal memory across unipolar and bipolar disorder, with 58% of EPO-treated patients displaying a clinically relevant memory improvement as compared to 15% of those treated with placebo. We reassessed the data from our 2 EPO trials conducted between September 2009 and October 2012 to determine whether objective performance-based memory impairment or subjective self-rated cognitive impairment at baseline was related to the effect of EPO on cognitive function as assessed by Rey Auditory Verbal Learning Test (RAVLT) total recall with multiple logistic regression adjusted for diagnosis, age, gender, symptom severity, and education levels. We included 79 patients with an ICD-10 diagnosis of unipolar or bipolar disorder, of whom 39 received EPO and 40 received placebo (saline). For EPO-treated patients with objective memory dysfunction at baseline (n = 16) (defined as RAVLT total recall ≤ 43), the odds of a clinically relevant memory improvement were increased by a factor of 290.6 (95% CI, 2.7-31,316.4; P = .02) compared to patients with no baseline impairment (n = 23). Subjective cognitive complaints (measured with the Cognitive and Physical Functioning Questionnaire) and longer illness duration were associated with small increases in patients' chances of treatment efficacy on memory (53% and 16% increase, respectively; P ≤ .04). Diagnosis, gender, age, baseline depression severity, and number of mood episodes did not significantly change the chances of EPO treatment success (P ≥ .06). In the placebo-treated group, the odds of memory improvement were not significantly different for patients with or without objectively defined memory dysfunction (P ≥ .59) or subjective complaints at baseline (P ≥ .06). Baseline objectively assessed memory impairments and-to a lesser degree-subjective cognitive complaints increased the chances of treatment efficacy on cognition in unipolar and bipolar disorder. ClinicalTrials.gov identifier: NCT00916552. © Copyright 2016 Physicians Postgraduate Press, Inc.

  2. From scenarios to domain models: processes and representations

    NASA Astrophysics Data System (ADS)

    Haddock, Gail; Harbison, Karan

    1994-03-01

    The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.

  3. Developing CORBA-Based Distributed Scientific Applications from Legacy Fortran Programs

    NASA Technical Reports Server (NTRS)

    Sang, Janche; Kim, Chan; Lopez, Isaac

    2000-01-01

    Recent progress in distributed object technology has enabled software applications to be developed and deployed easily such that objects or components can work together across the boundaries of the network, different operating systems, and different languages. A distributed object is not necessarily a complete application but rather a reusable, self-contained piece of software that co-operates with other objects in a plug-and-play fashion via a well-defined interface. The Common Object Request Broker Architecture (CORBA), a middleware standard defined by the Object Management Group (OMG), uses the Interface Definition Language (IDL) to specify such an interface for transparent communication between distributed objects. Since IDL can be mapped to any programming language, such as C++, Java, or Smalltalk, existing applications can be integrated into a new application and hence the tasks of code re-writing and software maintenance can be reduced. Many scientific applications in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with CORBA objects can increase the codes' reusability. For example, scientists could link their scientific applications to vintage Fortran programs such as Partial Differential Equation (PDE) solvers in a plug-and-play fashion. Unfortunately, a CORBA IDL-to-Fortran mapping has not been proposed, and there seems to be no direct method of generating CORBA objects from Fortran without resorting to manually writing C/C++ wrappers. In this paper, we present an efficient methodology to integrate Fortran legacy programs into a distributed object framework. Issues and strategies regarding the conversion and decomposition of Fortran codes into CORBA objects are discussed, and a diagram of the proposed conversion and decomposition mechanism is given in the paper. Our goal is to keep the Fortran codes unmodified. The conversion-aided tool takes the Fortran application program as input and helps programmers generate the C/C++ header file and IDL file for wrapping the Fortran code. Programmers need to determine for themselves how to decompose the legacy application into several reusable components based on the cohesion and coupling factors among the functions and subroutines. However, programming effort can still be greatly reduced because function headings and types have been converted to C++ and IDL styles. Most Fortran applications use the COMMON block to facilitate the transfer of a large number of variables among several functions. The COMMON block plays a role similar to that of global variables in C. In a CORBA-compliant programming environment, global variables cannot be used to pass values between objects. One approach to dealing with this problem is to put the COMMON variables into the parameter list. We do not adopt this approach because it requires modification of the Fortran source code, which violates our design constraint. Our approach is to extract the COMMON blocks and convert them into a structure-typed attribute in C++. Through attributes, each component can initialize the variables and return the computation result back to the client. We have successfully tested the proposed conversion methodology based on the f2c converter. Since f2c only translates Fortran to C, we still needed to edit the converted code to meet the C++ and IDL syntax. For example, C++/IDL requires a tag in the structure type, while C does not.
In this paper, we identify the necessary changes to the f2c converter in order to directly generate the C++ header and the IDL file. Our future work is to add GUI interface to ease the decomposition task by simply dragging and dropping icons.

  4. Optical and Near-infrared Spectra of σ Orionis Isolated Planetary-mass Objects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zapatero Osorio, M. R.; Béjar, V. J. S.; Ramírez, K. Peña, E-mail: mosorio@cab.inta-csic.es, E-mail: vbejar@iac.es, E-mail: karla.pena@uantof.cl

    We have obtained low-resolution optical (0.7–0.98 μm) and near-infrared (1.11–1.34 μm and 0.8–2.5 μm) spectra of 12 isolated planetary-mass candidates (J = 18.2–19.9 mag) of the 3 Myr σ Orionis star cluster with the aim of determining the spectroscopic properties of very young, substellar dwarfs and assembling a complete cluster mass function. We have classified our targets by visual comparison with high- and low-gravity standards and by measuring newly defined spectroscopic indices. We derived L0–L4.5 and M9–L2.5 using high- and low-gravity standards, respectively. Our targets reveal clear signposts of youth, thus corroborating their cluster membership and planetary masses (6–13 M_Jup). These observations complete the σ Orionis mass function by spectroscopically confirming the planetary-mass domain to a confidence level of ∼75%. The comparison of our spectra with BT-Settl solar metallicity model atmospheres yields a temperature scale of 2350–1800 K and a low surface gravity of log g ≈ 4.0 [cm s^-2], as would be expected for young planetary-mass objects. We discuss the properties of the cluster’s least-massive population as a function of spectral type. We have also obtained the first optical spectrum of S Ori 70, a T dwarf in the direction of σ Orionis. Our data provide reference optical and near-infrared spectra of very young L dwarfs and a mass function that may be used as templates for future studies of low-mass substellar objects and exoplanets. The extrapolation of the σ Orionis mass function to the solar neighborhood may indicate that isolated planetary-mass objects with temperatures of ∼200–300 K and masses in the interval 6–13 M_Jup may be as numerous as very low-mass stars.

  5. Impact of chronic obstructive pulmonary diseases on left ventricular diastolic function in hospitalized elderly patients

    PubMed Central

    Huang, Ying-Shuo; Feng, Ying-Chao; Zhang, Jian; Bai, Li; Huang, Wei; Li, Min; Sun, Ying

    2015-01-01

    Objective To evaluate the impact of chronic obstructive pulmonary disease (COPD) on left ventricular (LV) diastolic function in hospitalized elderly patients. Methods This was a case–control observational study of 148 consecutive hospitalized elderly patients (≥65 years old): 73 subjects without COPD as controls and 75 patients with COPD. Mild-to-moderate COPD was defined as stages 1 and 2, while severe and very severe COPD was defined as stages 3 and 4, according to the Global Initiative for Chronic Obstructive Lung Disease guidelines. Clinical characteristics and echocardiographic parameters were analyzed and compared. Results Compared with the control group, patients with COPD had a higher frequency of LV diastolic dysfunction and heart failure with preserved ejection fraction. Smoking frequency, frequency of cerebrovascular diseases and diabetes, and serum N-terminal pro-B-type natriuretic peptide (NT-proBNP) levels were higher in the COPD group (all P<0.05). COPD patients showed more abnormalities in diastolic function (E/e′: 11.51±2.50 vs 10.42±3.25, P=0.047), but no differences in systolic function and right ventricular function (all P>0.05). Patients with severe/very severe COPD showed no differences in LV diastolic function compared to patients with mild/moderate COPD (P>0.05), but serum NT-proBNP levels were higher in severe/very severe COPD (P<0.05). Conclusion Results suggest that early-stage COPD may have an impact on the LV diastolic function. Severe COPD mainly affected right ventricular function. In hospitalized elderly patients with COPD, LV diastolic dysfunction should be taken into account together with right ventricular function. PMID:25565790

  6. The Neurologic Assessment in Neuro-Oncology (NANO) scale: a tool to assess neurologic function for integration into the Response Assessment in Neuro-Oncology (RANO) criteria.

    PubMed

    Nayak, Lakshmi; DeAngelis, Lisa M; Brandes, Alba A; Peereboom, David M; Galanis, Evanthia; Lin, Nancy U; Soffietti, Riccardo; Macdonald, David R; Chamberlain, Marc; Perry, James; Jaeckle, Kurt; Mehta, Minesh; Stupp, Roger; Muzikansky, Alona; Pentsova, Elena; Cloughesy, Timothy; Iwamoto, Fabio M; Tonn, Joerg-Christian; Vogelbaum, Michael A; Wen, Patrick Y; van den Bent, Martin J; Reardon, David A

    2017-05-01

    The Macdonald criteria and the Response Assessment in Neuro-Oncology (RANO) criteria define radiologic parameters to classify therapeutic outcome among patients with malignant glioma and specify that clinical status must be incorporated and prioritized for overall assessment. But neither provides specific parameters to do so. We hypothesized that a standardized metric to measure neurologic function will permit more effective overall response assessment in neuro-oncology. An international group of physicians including neurologists, medical oncologists, radiation oncologists, and neurosurgeons with expertise in neuro-oncology drafted the Neurologic Assessment in Neuro-Oncology (NANO) scale as an objective and quantifiable metric of neurologic function evaluable during a routine office examination. The scale was subsequently tested in a multicenter study to determine its overall reliability, inter-observer variability, and feasibility. The NANO scale is a quantifiable evaluation of 9 relevant neurologic domains based on direct observation and testing conducted during routine office visits. The score defines overall response criteria. A prospective, multinational study noted a >90% inter-observer agreement rate with kappa statistic ranging from 0.35 to 0.83 (fair to almost perfect agreement), and a median assessment time of 4 minutes (interquartile range, 3-5). The NANO scale provides an objective clinician-reported outcome of neurologic function with high inter-observer agreement. It is designed to combine with radiographic assessment to provide an overall assessment of outcome for neuro-oncology patients in clinical trials and in daily practice. Furthermore, it complements existing patient-reported outcomes and cognition testing to combine for a global clinical outcome assessment of well-being among brain tumor patients. © The Author(s) 2017. Published by Oxford University Press on behalf of the Society for Neuro-Oncology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  7. Risk perception in consumer product use.

    PubMed

    Weegels, M F; Kanis, H

    2000-05-01

    In the literature, at least two distinct connotations of risk can be found: so called objective risk, defined as the ratio of a particular number of accidents and a measure of exposure, and subjective risk, defined as the perception and awareness of risks by the person(s) involved. This article explores the significance of risk perception and awareness in understanding and clarifying how and why accidents involving consumer products occur. Based on empirical evidence from video-recorded reconstructions of accidents with consumer products, the risk perception and awareness of users in relation to featural and functional product characteristics, and their influence on actual product use culminating in an accident, is addressed. In contrast with what is usually assumed in the literature, the findings show that the majority of the subjects had no idea that they were running any risk of injuring themselves while they operated the product. In several accidents, the product either offered functionalities not anticipated in the design or did not adequately reflect its condition. The implications of the findings for design practice as well as for risk research are discussed.

  8. Accommodation requirements for microgravity science and applications research on space station

    NASA Technical Reports Server (NTRS)

    Uhran, M. L.; Holland, L. R.; Wear, W. O.

    1985-01-01

    Scientific research conducted in the microgravity environment of space represents a unique opportunity to explore and exploit the benefits of materials processing in the virtual absence of gravity-induced forces. NASA has initiated the preliminary design of a permanently manned space station that will support technological advances in process science and stimulate the development of new and improved materials having applications across the commercial spectrum. A study is performed to define, from the researchers' perspective, the requirements for laboratory equipment to accommodate microgravity experiments on the space station. The accommodation requirements focus on the microgravity science disciplines, including combustion science, electronic materials, metals and alloys, fluids and transport phenomena, glasses and ceramics, and polymer science. User requirements have been identified in eleven research classes, each of which contains an envelope of functional requirements for related experiments having similar characteristics, objectives, and equipment needs. Based on these functional requirements, seventeen items of experiment apparatus and twenty items of core supporting equipment have been defined, which represent currently identified equipment requirements for a pressurized laboratory module at the initial operating capability of the NASA space station.

  9. Uncertainty Analysis of Simulated Hydraulic Fracturing

    NASA Astrophysics Data System (ADS)

    Chen, M.; Sun, Y.; Fu, P.; Carrigan, C. R.; Lu, Z.

    2012-12-01

    Artificial hydraulic fracturing is being used widely to stimulate production of oil, natural gas, and geothermal reservoirs with low natural permeability. Optimization of field design and operation is limited by the incomplete characterization of the reservoir, as well as the complexity of hydrological and geomechanical processes that control the fracturing. Thus, there are a variety of uncertainties associated with the pre-existing fracture distribution, rock mechanics, and hydraulic-fracture engineering that require evaluation of their impact on the optimized design. In this study, a multiple-stage scheme was employed to evaluate the uncertainty. We first define the ranges and distributions of 11 input parameters that characterize the natural fracture topology, in situ stress, geomechanical behavior of the rock matrix and joint interfaces, and pumping operation, to cover a wide spectrum of potential conditions expected for a natural reservoir. These parameters were then sampled 1,000 times in an 11-dimensional parameter space constrained by the specified ranges using the Latin-hypercube method. These 1,000 parameter sets were fed into the fracture simulators, and the outputs were used to construct three designed objective functions, i.e. fracture density, opened fracture length and area density. Using PSUADE, three response surfaces (11-dimensional) of the objective functions were developed and global sensitivity was analyzed to identify the most sensitive parameters for the objective functions representing fracture connectivity, which are critical for sweep efficiency of the recovery process. The second-stage high resolution response surfaces were constructed with dimension reduced to the number of the most sensitive parameters. An additional response surface with respect to the objective function of the fractal dimension for fracture distributions was constructed in this stage. Based on these response surfaces, comprehensive uncertainty analyses were conducted among input parameters and objective functions. In addition, reduced-order emulation models resulting from this analysis can be used for optimal control of hydraulic fracturing. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
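    As an illustration of the sampling step, the sketch below implements a basic Latin-hypercube sample over a box-constrained parameter space; the three parameter ranges are placeholders, not the eleven actual ranges used in the study.

```python
import numpy as np

def latin_hypercube(n_samples, bounds, seed=None):
    """Latin-hypercube sample of a box-constrained parameter space.

    `bounds` is a list of (low, high) pairs, one per parameter; the ranges
    used below are placeholders rather than real reservoir parameters.
    Each axis is split into n_samples equal strata, one point is drawn per
    stratum, and the strata are shuffled independently for each axis.
    """
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, dim))) / n_samples
    for j in range(dim):
        u[:, j] = rng.permutation(u[:, j])
    lows = np.array([b[0] for b in bounds], dtype=float)
    highs = np.array([b[1] for b in bounds], dtype=float)
    return lows + u * (highs - lows)

# e.g. 1,000 samples of three illustrative parameters
samples = latin_hypercube(1000, [(0.0, 1.0), (10.0, 50.0), (1e-3, 1e-1)], seed=42)
print(samples.shape)        # (1000, 3)
```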

  10. Cortical mechanisms for the segregation and representation of acoustic textures.

    PubMed

    Overath, Tobias; Kumar, Sukhbinder; Stewart, Lauren; von Kriegstein, Katharina; Cusack, Rhodri; Rees, Adrian; Griffiths, Timothy D

    2010-02-10

    Auditory object analysis requires two fundamental perceptual processes: the definition of the boundaries between objects, and the abstraction and maintenance of an object's characteristic features. Although it is intuitive to assume that the detection of the discontinuities at an object's boundaries precedes the subsequent precise representation of the object, the specific underlying cortical mechanisms for segregating and representing auditory objects within the auditory scene are unknown. We investigated the cortical bases of these two processes for one type of auditory object, an "acoustic texture," composed of multiple frequency-modulated ramps. In these stimuli, we independently manipulated the statistical rules governing (1) the frequency-time space within individual textures (comprising ramps with a given spectrotemporal coherence) and (2) the boundaries between textures (adjacent textures with different spectrotemporal coherences). Using functional magnetic resonance imaging, we show mechanisms defining boundaries between textures with different coherences in primary and association auditory cortices, whereas texture coherence is represented only in association cortex. Furthermore, participants' superior detection of boundaries across which texture coherence increased (as opposed to decreased) was reflected in a greater neural response in auditory association cortex at these boundaries. The results suggest a hierarchical mechanism for processing acoustic textures that is relevant to auditory object analysis: boundaries between objects are first detected as a change in statistical rules over frequency-time space, before a representation that corresponds to the characteristics of the perceived object is formed.

  11. Method and apparatus for configuration control of redundant robots

    NASA Technical Reports Server (NTRS)

    Seraji, Homayoun (Inventor)

    1991-01-01

    A method and apparatus to control a robot or manipulator configuration over the entire motion based on augmentation of the manipulator forward kinematics is disclosed. A set of kinematic functions is defined in Cartesian or joint space to reflect the desirable configuration that will be achieved in addition to the specified end-effector motion. The user-defined kinematic functions and the end-effector Cartesian coordinates are combined to form a set of task-related configuration variables as generalized coordinates for the manipulator. A task-based adaptive scheme is then utilized to directly control the configuration variables so as to achieve tracking of some desired reference trajectories throughout the robot motion. This accomplishes the basic task of desired end-effector motion, while utilizing the redundancy to achieve any additional task through the desired time variation of the kinematic functions. The present invention can also be used for optimization of any kinematic objective function, or for satisfaction of a set of kinematic inequality constraints, as in an obstacle avoidance problem. In contrast to pseudoinverse-based methods, the configuration control scheme ensures cyclic motion of the manipulator, which is an essential requirement for repetitive operations. The control law is simple and computationally very fast, and does not require either the complex manipulator dynamic model or the complicated inverse kinematic transformation. The configuration control scheme can alternatively be implemented in joint space.
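    A minimal sketch of the augmented-kinematics idea (not the adaptive control law of the patent) is shown below: the end-effector Jacobian is stacked with the Jacobian of a user-defined kinematic function, and a damped least-squares solve yields joint velocities that track both tasks; all numerical values are hypothetical.

```python
import numpy as np

def augmented_task_velocity(J_ee, xdot_ee, J_aux, phidot, damping=1e-3):
    """Joint velocities tracking the end-effector task plus a user-defined
    kinematic function (sketch of the augmented-kinematics idea only).

    J_ee    : end-effector Jacobian, shape (m, n)
    xdot_ee : desired end-effector velocity, shape (m,)
    J_aux   : Jacobian of the additional kinematic function, shape (r, n)
    phidot  : desired rate of change of that function, shape (r,)
    """
    J = np.vstack([J_ee, J_aux])                 # augmented Jacobian
    rhs = np.concatenate([xdot_ee, phidot])
    # damped least-squares keeps the solution well behaved near singularities
    JJt = J @ J.T + damping * np.eye(J.shape[0])
    return J.T @ np.linalg.solve(JJt, rhs)

# Hypothetical 3-joint planar arm: a 2-D end-effector task plus one extra
# kinematic function (e.g. elbow height) to resolve the redundancy.
J_ee = np.array([[1.0, 0.5, 0.2],
                 [0.0, 1.0, 0.7]])
J_aux = np.array([[0.3, -0.4, 1.0]])
qdot = augmented_task_velocity(J_ee, np.array([0.1, 0.0]), J_aux, np.array([0.0]))
print(qdot)
```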

  12. Associations of Mental Health and Physical Function with Colonoscopy-related Pain

    PubMed Central

    Yamada, Eiji; Watanabe, Seitaro; Nakajima, Atsushi

    2017-01-01

    Objective To clarify the effects of mental health and physical function in association with colonoscopy-related pain. Methods The mental health and physical function were evaluated using the Japanese version of the SF-8 Health Survey questionnaire. Poor physical status was defined as a physical component summary (PCS) <40 and poor mental status as a mental component summary (MCS) <40. Pain was assessed using a visual analogue scale (VAS), with significant pain defined as VAS ≥70 mm and insignificant pain as VAS <70 mm. The background and colonoscopic findings were compared in patients with significant and insignificant pain. Patients This study evaluated consecutive Japanese patients who were positive on fecal occult blood tests and underwent total colonoscopy. Results Of the 100 patients, 23 had significant and 77 had insignificant colonoscopy-related pain. A multiple logistic regression analysis showed that MCS <40 [odds ratio (OR) 6.03; 95% confidence interval (CI) 1.41-25.9, p=0.0156], PCS <40 (OR 5.96; 95% CI 1.45-24.5, p=0.0133), and ≥300 seconds to reach the cecum (OR 4.13; 95% CI 1.16-14.7, p=0.0281) were independent risk factors for colonoscopy-related pain. Conclusion The mental health and physical function are important determinants of colonoscopy-related pain. Evaluating the mental health and physical function of patients prior to colonoscopy may effectively predict the degree of colonoscopy-related pain. PMID:28202858

  13. Iron deficiency anemia and cognitive function in infancy.

    PubMed

    Carter, R Colin; Jacobson, Joseph L; Burden, Matthew J; Armony-Sivan, Rinat; Dodge, Neil C; Angelilli, Mary Lu; Lozoff, Betsy; Jacobson, Sandra W

    2010-08-01

    This study examined effects of iron deficiency anemia (IDA) on specific domains of infant cognitive function and the role of IDA-related socioemotional deficits in mediating and/or moderating these effects. Infants were recruited during routine 9-month visits to an inner-city clinic. IDA was defined as hemoglobin level <110 g/L with ≥2 abnormal iron deficiency indicators (mean corpuscular volume, red cell distribution width, zinc protoporphyrin, transferrin saturation, and ferritin). At 9 and 12 months, the Fagan Test of Infant Intelligence (FTII); A-not-B task; Emotionality, Activity, and Sociability Temperament Survey; and Behavior Rating Scale were administered. Analyses were adjusted for potential confounders, including age and sociodemographic variables. Twenty-eight infants met criteria for IDA, 28 had nonanemic iron deficiency (NA ID) and 21 had iron sufficiency (IS). There was a linear effect for object permanence at 9 months: infants with IDA were least likely to exhibit object permanence, IS most likely, and NA ID intermediate. Infants with IDA and those with hemoglobin level ≤105 g/L showed poorer recognition memory on the FTII than infants without IDA. The Behavior Rating Scale orientation/engagement measure partially mediated these effects. Stronger effects of IDA on these outcomes were seen in infants who scored more poorly on the socioemotional measures. These data indicate poorer object permanence and short-term memory encoding and/or retrieval in infants with IDA at 9 months. These cognitive effects were attributable, in part, to IDA-related deficits in socioemotional function. Children with poor socioemotional performance seem to be more vulnerable to the effects of IDA on cognitive function.

  14. Applicability of the CIELAB and CIEDE2000 Formulae for Detection of Colour Changes in Colour-Changeable Chewing Gum for Evaluating Masticatory Function

    PubMed Central

    Yeerken, Yesiboli; Said, Mohamed; Li, Na; Taniguchi, Hisashi

    2017-01-01

    Introduction Mastication is one of the essential stomatognathic functions and is impaired when mandibulectomy is performed for removal of head and neck lesions. Aim The purpose of this study was to investigate the correlation between perceived chewing ability {Masticatory Score (MS)} and objective mixing ability (∆E) in patients who had undergone marginal mandibulectomy. Materials and Methods Twenty normal dentate subjects as a control group and twenty patients who had undergone marginal mandibulectomy and were wearing a dentomaxillary prosthesis were enrolled. Perceived chewing ability (MS) and objective mixing ability (∆E) were evaluated using a food intake questionnaire and the colour-changeable chewing gum, respectively. Subjects were instructed to chew the gum continuously for 100 strokes on their usual side. The chewed gum was measured with a colourimeter in the CIELAB colour space, and L*, a* and b* were obtained. The change in colour of the gum after chewing was calculated using the CIELAB (∆Eab) and CIEDE2000 (∆E00) formulae. The relationships of a*, ∆Eab, and ∆E00 with the MS score were analyzed using Spearman's rank correlation coefficient. Results A correlation was found between perceived chewing ability (MS) and objective mixing ability (the index of masticatory function, ∆E) in marginal mandibulectomy patients (∆E00 = 0.481, a* = 0.587, ∆Eab = 0.668). Conclusion Within the limitations of this study, it can be concluded that the CIEDE2000 formula for calculation of colour difference can be used to evaluate masticatory function in patients who have undergone marginal mandibulectomy. PMID:28571278
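    For reference, the CIE76 colour difference used as ∆Eab is simply the Euclidean distance in L*a*b* space, as in the sketch below (the numerical values are hypothetical, not measured gum colours); CIEDE2000 adds weighting and rotation terms and is normally taken from an existing colour-science library rather than re-implemented by hand.

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference: Euclidean distance between two (L*, a*, b*) colours."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

# Hypothetical colourimeter readings before and after chewing
unchewed = (70.0, -25.0, 40.0)    # greenish gum
chewed = (62.0, 10.0, 28.0)       # colour shifted toward red after mixing
print(delta_e_ab(unchewed, chewed))
```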

  15. Animal behavior as a conceptual framework for the study of obsessive-compulsive disorder (OCD).

    PubMed

    Eilam, David; Zor, Rama; Fineberg, Naomi; Hermesh, Haggai

    2012-06-01

    Research on affective disorders may benefit from the methodology of studying animal behavior, in which tools are available for qualitatively and quantitatively measuring and assessing behavior with as much sophistication and attention to detail as in the analysis of the brain. To illustrate this, we first briefly review the characteristics of obsessive-compulsive disorder (OCD), and then demonstrate how the quinpirole rat model is used as a conceptual model in studying human OCD patients. Like the rat model, the study of OCD in humans is based on video-telemetry, whereby observable, measurable, and relatively objective characteristics of OCD behavior may be extracted. In this process, OCD rituals are defined in terms of the space in which they are executed and the movements (acts) that are performed at each location or object in this space. Accordingly, OCD behavior is conceived of as comprising three hierarchical components: (i) rituals (as defined by the patients); (ii) visits to objects/locations in the environment at which the patient stops during the ritual; and (iii) acts performed at each object/location during visits. Scoring these structural components (behavioral units) is conveniently possible with readily available tools for behavioral description and analysis, providing quantitative and qualitative measures of the OCD hallmarks of repetition and addition, as well as the reduced functionality in OCD behavior. Altogether, the concept that was developed in the context of an animal model provides a useful tool that may facilitate OCD diagnosis, assessment and treatment, and may be similarly applied for other psychiatric disorders. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. Objective definition of rainfall intensity-duration thresholds for the initiation of post-fire debris flows in southern California

    USGS Publications Warehouse

    Staley, Dennis; Kean, Jason W.; Cannon, Susan H.; Schmidt, Kevin M.; Laber, Jayme L.

    2012-01-01

    Rainfall intensity–duration (ID) thresholds are commonly used to predict the temporal occurrence of debris flows and shallow landslides. Typically, thresholds are subjectively defined as the upper limit of peak rainstorm intensities that do not produce debris flows and landslides, or as the lower limit of peak rainstorm intensities that initiate debris flows and landslides. In addition, peak rainstorm intensities are often used to define thresholds, as data regarding the precise timing of debris flows and associated rainfall intensities are usually not available, and rainfall characteristics are often estimated from distant gauging locations. Here, we attempt to improve the performance of existing threshold-based predictions of post-fire debris-flow occurrence by utilizing data on the precise timing of debris flows relative to rainfall intensity, and develop an objective method to define the threshold intensities. We objectively defined the thresholds by maximizing the number of correct predictions of debris flow occurrence while minimizing the rate of both Type I (false positive) and Type II (false negative) errors. We identified that (1) there were statistically significant differences between peak storm and triggering intensities, (2) the objectively defined threshold model presents a better balance between predictive success, false alarms and failed alarms than previous subjectively defined thresholds, (3) thresholds based on measurements of rainfall intensity over shorter duration (≤60 min) are better predictors of post-fire debris-flow initiation than longer duration thresholds, and (4) the objectively defined thresholds were exceeded prior to the recorded time of debris flow at frequencies similar to or better than subjective thresholds. Our findings highlight the need to better constrain the timing and processes of initiation of landslides and debris flows for future threshold studies. In addition, the methods used to define rainfall thresholds in this study represent a computationally simple means of deriving critical values for other studies of nonlinear phenomena characterized by thresholds.
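    The objective threshold-selection idea can be sketched as a simple search over candidate intensities that maximizes a skill score balancing Type I and Type II errors; the score, data and values below are illustrative, not the study's algorithm or measurements.

```python
import numpy as np

def best_intensity_threshold(intensity, debris_flow, candidates):
    """Pick the rainfall-intensity threshold maximizing a simple skill score.

    `intensity` holds storm rainfall intensities, `debris_flow` holds 1/0
    outcomes. The true-skill statistic (hit rate minus false-alarm rate)
    balances Type II errors (misses) against Type I errors (false alarms).
    """
    intensity = np.asarray(intensity, dtype=float)
    debris_flow = np.asarray(debris_flow, dtype=bool)
    best_thr, best_score = None, -np.inf
    for thr in candidates:
        predicted = intensity >= thr
        hits = np.sum(predicted & debris_flow)
        misses = np.sum(~predicted & debris_flow)           # Type II errors
        false_alarms = np.sum(predicted & ~debris_flow)     # Type I errors
        correct_negatives = np.sum(~predicted & ~debris_flow)
        tss = (hits / max(hits + misses, 1)
               - false_alarms / max(false_alarms + correct_negatives, 1))
        if tss > best_score:
            best_thr, best_score = thr, tss
    return best_thr, best_score

# Hypothetical 15-minute intensities (mm/h) and debris-flow outcomes
i15 = [4, 6, 9, 12, 15, 18, 22, 27, 30, 35]
flows = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]
print(best_intensity_threshold(i15, flows, candidates=np.arange(1.0, 40.0)))
```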

  17. Effects of modeling errors on trajectory predictions in air traffic control automation

    NASA Technical Reports Server (NTRS)

    Jackson, Michael R. C.; Zhao, Yiyuan; Slattery, Rhonda

    1996-01-01

    Air traffic control automation synthesizes aircraft trajectories for the generation of advisories. Trajectory computation employs models of aircraft performances and weather conditions. In contrast, actual trajectories are flown in real aircraft under actual conditions. Since synthetic trajectories are used in landing scheduling and conflict probing, it is very important to understand the differences between computed trajectories and actual trajectories. This paper examines the effects of aircraft modeling errors on the accuracy of trajectory predictions in air traffic control automation. Three-dimensional point-mass aircraft equations of motion are assumed to be able to generate actual aircraft flight paths. Modeling errors are described as uncertain parameters or uncertain input functions. Pilot or autopilot feedback actions are expressed as equality constraints to satisfy control objectives. A typical trajectory is defined by a series of flight segments with different control objectives for each flight segment and conditions that define segment transitions. A constrained linearization approach is used to analyze trajectory differences caused by various modeling errors by developing a linear time varying system that describes the trajectory errors, with expressions to transfer the trajectory errors across moving segment transitions. A numerical example is presented for a complete commercial aircraft descent trajectory consisting of several flight segments.

  18. A Knowledge Management Approach to Support Software Process Improvement Implementation Initiatives

    NASA Astrophysics Data System (ADS)

    Montoni, Mariano Angel; Cerdeiral, Cristina; Zanetti, David; Cavalcanti da Rocha, Ana Regina

    The success of software process improvement (SPI) implementation initiatives depends fundamentally on the strategies adopted to support the execution of such initiatives. Therefore, it is essential to define adequate SPI implementation strategies aiming to facilitate the achievement of organizational business goals and to increase the benefits of process improvements. The objective of this work is to present an approach to support the execution of SPI implementation initiatives. We also describe a methodology applied to capture knowledge related to critical success factors that influence SPI initiatives. This knowledge was used to define effective SPI strategies aiming to increase the success of SPI initiatives coordinated by a specific SPI consultancy organization. This work also presents the functionalities of a set of tools integrated in a process-centered knowledge management environment, named CORE-KM, customized to support the presented approach.

  19. User productivity as a function of AutoCAD interface design.

    PubMed

    Mitta, D A; Flores, P L

    1995-12-01

    Increased operator productivity is a desired outcome of user-CAD interaction scenarios. Two objectives of this research were to (1) define a measure of operator productivity and (2) empirically investigate the potential effects of CAD interface design on operator productivity, where productivity is defined as the percentage of a drawing session correctly completed per unit time. Here, AutoCAD provides the CAD environment of interest. Productivity with respect to two AutoCAD interface designs (menu, template) and three task types (draw, dimension, display) was investigated. Analysis of user productivity data revealed significantly higher productivity under the menu interface condition than under the template interface condition. A significant effect of task type was also discovered, where user productivity under display tasks was higher than productivity under the draw and dimension tasks. Implications of these results are presented.

  20. Stable polyurethane coatings for electronic circuits. NASA tech briefs, fall 1982, volume 7, no. 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    One of the most severe deficiencies of polyurethanes as engineering materials for electrical applications has been their sensitivity to combined humidity and temperature environments. Gross failure by reversion of urethane connector potting materials has occurred under these conditions. This has resulted both in the scrapping of expensive hardware and, in other instances, in reduced reliability. A basic objective of this study has been to gain a more complete understanding of the mechanisms and interactions of moisture in urethane systems to guide the development of reversion-resistant materials for connector potting and conformal coating applications in high-humidity environments. Basic polymer studies of molecular weight and distribution, polymer structure, and functionality were carried out to define those areas responsible for hydrolytic instability and to define polymer structural features conducive to optimum hydrolytic stability.

  1. Repository-based software engineering program: Concept document

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This document provides the context for Repository-Based Software Engineering's (RBSE's) evolving functional and operational product requirements, and it is the parent document for development of detailed technical and management plans. When furnished, requirements documents will serve as the governing RBSE product specification. The RBSE Program Management Plan will define resources, schedules, and technical and organizational approaches to fulfilling the goals and objectives of this concept. The purpose of this document is to provide a concise overview of RBSE, describe the rationale for the RBSE Program, and define a clear, common vision for RBSE team members and customers. The document also provides the foundation for developing RBSE user and system requirements and a corresponding Program Management Plan. The concept is used to express the program mission to RBSE users and managers and to provide an exhibit for community review.

  2. Numerical model of tapered fiber Bragg gratings for comprehensive analysis and optimization of their sensing and strain-induced tunable dispersion properties.

    PubMed

    Osuch, Tomasz; Markowski, Konrad; Jędrzejewski, Kazimierz

    2015-06-10

    A versatile numerical model for spectral transmission/reflection and group delay characteristic analysis, and for the design of tapered fiber Bragg gratings (TFBGs), is presented. This approach ensures flexibility in defining both the distribution of the refractive index change of the grating (including apodization) and the shape of the taper profile. Additionally, the sensing and tunable dispersion properties of TFBGs were fully examined, considering strain-induced effects. The presented numerical approach, together with Pareto optimization, was also used to design tanh apodization profiles of the TFBG that maximize its spectral width while minimizing the group delay oscillations. Experimental verification of the model confirms its correctness. The combination of the model's versatility and the possibility of defining other objective functions for the Pareto optimization creates a universal tool for TFBG analysis and design.
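
    As a minimal illustration of the Pareto step described above, the sketch below filters a set of candidate apodization designs down to the non-dominated ones under two objectives (spectral width to maximize, group-delay ripple to minimize); the scoring function is a placeholder for the actual grating simulation, and the parameter names are hypothetical.

    ```python
    # Illustrative sketch only: extract the Pareto front from candidate TFBG
    # apodization designs scored by (spectral width to maximize, group-delay
    # ripple to minimize). score() is a stand-in for the grating model.
    import numpy as np

    rng = np.random.default_rng(0)
    designs = rng.uniform(0.1, 1.0, size=(200, 2))   # hypothetical (a, b) tanh-profile parameters

    def score(params):
        a, b = params
        width  = a / (1.0 + b)        # placeholder for simulated spectral width
        ripple = b / (0.2 + a)        # placeholder for simulated group-delay oscillation
        return width, ripple

    objs = np.array([score(p) for p in designs])     # columns: width (max), ripple (min)

    def pareto_front(objs):
        """Indices of non-dominated points: maximize column 0, minimize column 1."""
        keep = []
        for i, (w_i, r_i) in enumerate(objs):
            dominated = any(w_j >= w_i and r_j <= r_i and (w_j > w_i or r_j < r_i)
                            for j, (w_j, r_j) in enumerate(objs) if j != i)
            if not dominated:
                keep.append(i)
        return keep

    front = pareto_front(objs)
    print(f"{len(front)} non-dominated designs out of {len(designs)}")
    ```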

  3. Applications of dewetting in micro and nanotechnology.

    PubMed

    Gentili, Denis; Foschi, Giulia; Valle, Francesco; Cavallini, Massimiliano; Biscarini, Fabio

    2012-06-21

    Dewetting is a spontaneous phenomenon where a thin film on a surface ruptures into an ensemble of separated objects, like droplets, stripes, and pillars. Spatial correlations with characteristic distance and object size emerge spontaneously across the whole dewetted area, leading to regular motifs with long-range order. Characteristic length scales depend on film thickness, which is a convenient and robust technological parameter. Dewetting is therefore an attractive paradigm for organizing a material into structures of well-defined micro- or nanometre-size, precisely positioned on a surface, thus avoiding lithographical processes. This tutorial review introduces the reader to the physical-chemical basis of dewetting, shows how the dewetting process can be applied to different functional materials with relevance in technological applications, and highlights the possible strategies to control the length scales of the dewetting process.

  4. Visual Impairment at Baseline is Associated with Future Poor Physical Functioning Among Middle-Aged Women: The Study of Women's Health Across the Nation, Michigan site

    PubMed Central

    Chandrasekaran, Navasuja; Harlow, Sioban; Moroi, Sayoko; Musch, David; Peng, Qing; Karvonen-Gutierrez, Carrie

    2016-01-01

    Objectives: Emerging evidence suggests that the prevalence rates of poor functioning and of disability are increasing among middle-aged individuals. Visual impairment is associated with poor functioning among older adults but little is known about the impact of vision on functioning during midlife. The objective of this study was to assess the impact of visual impairment on future physical functioning among middle-aged women. Study design: In this longitudinal study, the sample consisted of 483 women aged 42 to 56 years, from the Michigan site of the Study of Women's Health Across the Nation. Main Outcome Measures: At baseline, distance and near vision were measured using a Titmus vision screener. Visual impairment was defined as visual acuity worse than 20/40. Physical functioning was measured up to 10 years later using performance-based measures, including a 40-foot timed walk, timed stair climb and forward reach. Results: Women with impaired distance vision at baseline had 2.81 centimeters less forward reach distance (95% confidence interval (CI): −4.19,−1.42) and 4.26 seconds longer stair climb time (95% CI: 2.73, 5.79) at follow-up than women without impaired distance vision. Women with impaired near vision also had less forward reach distance (2.26 centimeters, 95% CI: −3.30,−1.21) than those without impaired near vision. Conclusion: Among middle-aged women, visual impairment is a marker of poor physical functioning. Routine eye testing and vision correction may help improve physical functioning among midlife individuals. PMID:28041592

  5. On the Relationship Between Generalization Error, Hypothesis Complexity, and Sample Complexity for Radial Basis Functions

    DTIC Science & Technology

    1994-01-01

    … general nature. We then provide in section 3 a precise statement of a specific … first need to introduce some terminology and to define a number of … mathematical objects. A summary of the most common notations and definitions used in this pa[per] … torque at a particular joint of a robot arm, and x the set of an[gles] … sampling Y according to P(y|x). In the robot arm example described above, it would mean that one could move the robot arm into …

  6. Research in Knowledge Representation for Natural Language Communication and Planning Assistance

    DTIC Science & Technology

    1987-10-01

    … elements of PFR. Instants of time are represented as individuals, where they form a continuum. Let "seconds" map real numbers to instants, where "seconds(n)" denotes n seconds. Points in space form a 3-dimensional continuum. Changing relations are represented as functions on instants of time. Formulas and … occupies at time t. "occ.space(x)(t)" is defined iff x is a physical object, t is an instant of time, and x exists at t. Further, x must occupy a non…

  7. Guidelines for Implementing Advanced Distribution Management Systems-Requirements for DMS Integration with DERMS and Microgrids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jianhui; Chen, Chen; Lu, Xiaonan

    2015-08-01

    This guideline focuses on the integration of DMS with DERMS and microgrids connected to the distribution grid by defining generic and fundamental design and implementation principles and strategies. It starts by addressing the current status, objectives, and core functionalities of each system, and then discusses the new challenges and the common principles of DMS design and implementation for integration with DERMS and microgrids to realize enhanced grid operation reliability and quality power delivery to consumers while also achieving the maximum energy economics from the DER and microgrid connections.

  8. Building bridges between cellular and molecular structural biology.

    PubMed

    Patwardhan, Ardan; Brandt, Robert; Butcher, Sarah J; Collinson, Lucy; Gault, David; Grünewald, Kay; Hecksel, Corey; Huiskonen, Juha T; Iudin, Andrii; Jones, Martin L; Korir, Paul K; Koster, Abraham J; Lagerstedt, Ingvar; Lawson, Catherine L; Mastronarde, David; McCormick, Matthew; Parkinson, Helen; Rosenthal, Peter B; Saalfeld, Stephan; Saibil, Helen R; Sarntivijai, Sirarat; Solanes Valero, Irene; Subramaniam, Sriram; Swedlow, Jason R; Tudose, Ilinca; Winn, Martyn; Kleywegt, Gerard J

    2017-07-06

    The integration of cellular and molecular structural data is key to understanding the function of macromolecular assemblies and complexes in their in vivo context. Here we report on the outcomes of a workshop that discussed how to integrate structural data from a range of public archives. The workshop identified two main priorities: the development of tools and file formats to support segmentation (that is, the decomposition of a three-dimensional volume into regions that can be associated with defined objects), and the development of tools to support the annotation of biological structures.

  9. Penalized Nonlinear Least Squares Estimation of Time-Varying Parameters in Ordinary Differential Equations

    PubMed Central

    Cao, Jiguo; Huang, Jianhua Z.; Wu, Hulin

    2012-01-01

    Ordinary differential equations (ODEs) are widely used in biomedical research and other scientific areas to model complex dynamic systems. It is an important statistical problem to estimate parameters in ODEs from noisy observations. In this article we propose a method for estimating the time-varying coefficients in an ODE. Our method is a variation of nonlinear least squares in which penalized splines are used to model the functional parameters and the ODE solutions are also approximated using splines. We resort to the implicit function theorem to deal with the nonlinear least squares objective function, which is only defined implicitly. The proposed penalized nonlinear least squares method is applied to estimate an HIV dynamic model from a real dataset. Monte Carlo simulations show that the new method can provide much more accurate estimates of functional parameters than the existing two-step local polynomial method, which relies on estimation of the derivatives of the state function. Supplemental materials for the article are available online. PMID:23155351
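
    A much-simplified sketch of the same idea follows: a time-varying rate k(t) in dx/dt = -k(t)x is represented by values at a few knots (linear interpolation standing in for the penalized spline), and the knot values are fitted by nonlinear least squares with a roughness penalty; the paper's implicit-function-theorem machinery is not reproduced here, and all names and numbers are illustrative.

    ```python
    # Minimal illustration (much simpler than the paper's method): estimate a
    # time-varying rate k(t) in dx/dt = -k(t) * x from noisy data by representing
    # k(t) with values at a few knots (linear interpolation) and fitting them by
    # nonlinear least squares. A discrete roughness penalty on the knot values
    # plays the role of the spline penalty.
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import least_squares

    t_obs = np.linspace(0.0, 10.0, 60)
    true_k = lambda t: 0.3 + 0.2 * np.sin(0.5 * t)
    sol = solve_ivp(lambda t, x: -true_k(t) * x, (0, 10), [1.0], t_eval=t_obs)
    y_obs = sol.y[0] + np.random.default_rng(1).normal(0, 0.01, t_obs.size)

    knots = np.linspace(0.0, 10.0, 8)
    lam = 1.0   # penalty weight on the roughness of the knot values

    def residuals(k_at_knots):
        k_fun = lambda t: np.interp(t, knots, k_at_knots)          # piecewise-linear k(t)
        fit = solve_ivp(lambda t, x: -k_fun(t) * x, (0, 10), [1.0], t_eval=t_obs)
        data_res = fit.y[0] - y_obs                                # data-fit residuals
        rough_res = np.sqrt(lam) * np.diff(k_at_knots, n=2)        # second-difference penalty
        return np.concatenate([data_res, rough_res])

    est = least_squares(residuals, x0=np.full(knots.size, 0.3), bounds=(0.0, 2.0))
    print("estimated k at knots:", np.round(est.x, 3))
    ```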

  10. Using Multicriteria Analysis in Issues Concerning Adaptation of Historic Facilities for the Needs of Public Utility Buildings with a Function of a Theatre

    NASA Astrophysics Data System (ADS)

    Obracaj, Piotr; Fabianowski, Dariusz

    2017-10-01

    Adapting historic facilities to serve as public utility buildings requires resolving many complex, often conflicting expectations of future users. These concern primarily the building's function, including construction, technology and aesthetic issues, together with appropriate protection of its historic values, which differ in each case. The procedure leading to the expected solution is a multicriteria one, usually difficult to define precisely and requiring considerable design experience. An innovative approach has been used for the analysis, namely the modified EA FAHP (Extent Analysis Fuzzy Analytic Hierarchy Process) method of Chang for multicriteria assessment of complex functional and spatial issues. The selection of an optimal spatial form for an adapted historic building intended as a multi-functional public utility facility was analysed. The assumed functional flexibility covered education, conferences, and chamber performances, such as drama and concerts, in different stage-audience layouts.

  11. Quantification of soil structure based on Minkowski functions

    NASA Astrophysics Data System (ADS)

    Vogel, H.-J.; Weller, U.; Schlüter, S.

    2010-10-01

    The structure of soils and other geologic media is a complex three-dimensional object. Most of the physical material properties including mechanical and hydraulic characteristics are immediately linked to the structure given by the pore space and its spatial distribution. It is an old dream and still a formidable challenge to relate structural features of porous media to their functional properties. Using tomographic techniques, soil structure can be directly observed at a range of spatial scales. In this paper we present a scale-invariant concept to quantify complex structures based on a limited set of meaningful morphological functions. They are based on d+1 Minkowski functionals as defined for d-dimensional bodies. These basic quantities are determined as a function of pore size or aggregate size obtained by filter procedures using mathematical morphology. The resulting Minkowski functions provide valuable information on the size of pores and aggregates, the pore surface area and the pore topology having the potential to be linked to physical properties. The theoretical background and the related algorithms are presented and the approach is demonstrated for the pore structure of an arable soil and the pore structure of a sand both obtained by X-ray micro-tomography. We also analyze the fundamental problem of limited resolution which is critical for any attempt to quantify structural features at any scale using samples of different size recorded at different resolutions. The results demonstrate that objects smaller than 5 voxels are critical for quantitative analysis.
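
    A two-dimensional sketch of the idea (the paper works in three dimensions, where d+1 = 4 functionals are available) can be written with scikit-image: compute area, boundary length and Euler characteristic of a binary pore image after morphological openings of increasing size, yielding Minkowski functions of pore size. The toy random image below stands in for a tomographic section.

    ```python
    # 2-D sketch of Minkowski functions of pore size: apply morphological
    # openings of increasing radius to a binary pore image and record the three
    # 2-D Minkowski functionals (area, boundary length, Euler characteristic).
    # Requires scikit-image; the random image is a toy stand-in for real data.
    import numpy as np
    from skimage.morphology import opening, disk
    from skimage.measure import perimeter, euler_number

    rng = np.random.default_rng(0)
    pores = rng.random((256, 256)) > 0.6          # toy binary pore-space image

    for radius in (0, 1, 2, 4, 8):
        img = pores if radius == 0 else opening(pores, disk(radius))
        area  = img.sum()                         # 0th functional: pore area
        bound = perimeter(img)                    # 1st functional: boundary length
        euler = euler_number(img)                 # 2nd functional: topology (Euler number)
        print(f"opening radius {radius:2d}: area={area:6d}  perimeter={bound:8.1f}  Euler={euler:5d}")
    ```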

  12. Subgrid-scale scalar flux modelling based on optimal estimation theory and machine-learning procedures

    NASA Astrophysics Data System (ADS)

    Vollant, A.; Balarac, G.; Corre, C.

    2017-09-01

    New procedures are explored for the development of models in the context of large eddy simulation (LES) of a passive scalar. They rely on the combination of optimal estimator theory with machine-learning algorithms. The concept of the optimal estimator makes it possible to identify the most accurate set of parameters to use when deriving a model. The model itself can then be defined by training an artificial neural network (ANN) on a database derived from the filtering of direct numerical simulation (DNS) results. This procedure leads to a subgrid-scale model displaying good structural performance, which allows LESs to be performed that are very close to the filtered DNS results. However, this first procedure does not control the functional performance, so the model can fail when the flow configuration differs from the training database. Another procedure is then proposed, where the model functional form is imposed and the ANN is used only to define the model coefficients. The training step is a bi-objective optimisation in order to control both structural and functional performance. The model derived from this second procedure proves to be more robust. It also provides stable LESs for a turbulent plane jet flow configuration very far from the training database, but over-estimates the mixing process in that case.
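
    The first (structural) procedure can be caricatured in a few lines: train a small neural network to map resolved quantities to a subgrid scalar flux on a database of samples. The sketch below uses scikit-learn and synthetic data in place of filtered DNS fields, so it only illustrates the workflow, not the paper's model.

    ```python
    # Schematic only: fit a small neural network mapping resolved quantities to a
    # subgrid scalar flux. The synthetic "database" below replaces real filtered
    # DNS data, and the input/output choices are purely illustrative.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 3))                      # e.g. filtered scalar-gradient components
    true_flux = -0.1 * X[:, 0] + 0.05 * X[:, 1] * X[:, 2]
    y = true_flux + rng.normal(scale=0.01, size=X.shape[0])

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
    model.fit(X_tr, y_tr)
    print("R^2 on held-out samples:", round(model.score(X_te, y_te), 3))
    ```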

  13. CART V: recent advancements in computer-aided camouflage assessment

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    In order to facilitate systematic, computer-aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was built up for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises semi-automatic marking of target objects (ground truth generation), including their propagation over the image sequence, evaluation via user-defined feature extractors, and methods to assess the object's movement conspicuity. In this fifth part of an annual series at the SPIE conference in Orlando, this paper presents the enhancements of the past year and addresses the camouflage assessment of static and moving objects in multispectral image data that may show noise or image artefacts. The presented methods explore the interplay between image processing and camouflage assessment. A novel algorithm based on template matching is presented to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement on the camouflage effect in different environments. As the results show, the presented methods provide a significant benefit in the field of camouflage assessment.
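
    The following sketch illustrates how a normalized cross-correlation (template-matching) score could be turned into a structural-conspicuity measure: a target patch that correlates strongly with many background locations is structurally inconspicuous. It uses scikit-image's match_template on synthetic data and is not the CART implementation.

    ```python
    # Illustrative use of normalized cross-correlation as a structural-
    # inconspicuity score: correlate the marked target patch against the whole
    # frame and summarize how well the background mimics it. Sketch only.
    import numpy as np
    from skimage.feature import match_template

    rng = np.random.default_rng(0)
    scene  = rng.normal(size=(200, 200))          # stand-in for one band of a multispectral frame
    target = scene[80:112, 80:112].copy()         # marked target region (ground truth patch)

    corr = match_template(scene, target)          # normalized cross-correlation map
    background = np.delete(corr.ravel(), corr.argmax())   # drop the trivial self-match
    score = float(np.percentile(background, 99))  # high value -> target blends with background
    print(f"structural inconspicuity score: {score:.3f}")
    ```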

  14. Optimal control of switching time in switched stochastic systems with multi-switching times and different costs

    NASA Astrophysics Data System (ADS)

    Liu, Xiaomei; Li, Shengtao; Zhang, Kanjian

    2017-08-01

    In this paper, we solve an optimal control problem for a class of time-invariant switched stochastic systems with multiple switching times, where the objective is to minimise a cost functional with different costs defined on the states. In particular, we focus on problems in which a pre-specified sequence of active subsystems is given and the switching times are the only control variables. Based on the calculus of variations, we derive the gradient of the cost functional with respect to the switching times in an especially simple form, which can be used directly in gradient descent algorithms to locate the optimal switching instants. Finally, a numerical example is given, highlighting the validity of the proposed methodology.
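
    A toy deterministic analogue makes the setup tangible: with a pre-specified subsystem sequence, a single switching time tau is tuned by gradient descent on a running cost defined on the state. The paper derives the gradient analytically for the stochastic multi-switch case; the sketch below simply uses a finite-difference gradient, and every number in it is illustrative.

    ```python
    # Toy deterministic analogue: a scalar state is driven up by subsystem 1 and
    # decays under subsystem 2, the running cost penalises distance from a
    # set-point, and the single switching time tau is the only decision variable.
    import numpy as np

    A1, A2, x0, T, dt, setpoint = 0.5, -1.0, 0.2, 6.0, 0.005, 1.0

    def cost(tau):
        x, J = x0, 0.0
        for t in np.arange(0.0, T, dt):
            a = A1 if t < tau else A2            # pre-specified subsystem sequence (1 then 2)
            J += (x - setpoint) ** 2 * dt        # running cost defined on the state
            x += a * x * dt
        return J

    tau, step, eps = 1.0, 0.1, 1e-3
    for _ in range(300):
        grad = (cost(tau + eps) - cost(tau - eps)) / (2 * eps)   # finite-difference gradient
        tau = float(np.clip(tau - step * grad, 0.0, T))
    print(f"switching time after gradient descent: tau = {tau:.3f}, cost = {cost(tau):.4f}")
    ```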

  15. Comprehensive evaluation of functional and anatomical disorders of the patients with distal occlusion and accompanying obstructive sleep apnea syndrome

    NASA Astrophysics Data System (ADS)

    Nabiev, F. H.; Dobrodeev, A. S.; Libin, P. V.; Kotov, I. I.; Ovsyannikov, A. G.

    2015-11-01

    The paper defines the therapeutic and rehabilitation approach to the patients with Angle's classification Class II dento-facial anomalies, accompanied by obstructive sleep apnea (OSA). The proposed comprehensive approach to the diagnostics and treatment of patients with posterior occlusion, accompanied by OSA, allows for objective evaluation of intensity of a dento-facial anomaly and accompanying respiratory disorders in the nasal and oral pharynx, which allows for the pathophysiological mechanisms of OSA to be identified, and an optimal plan for surgical procedures to be developed. The proposed comprehensive approach to the diagnostics and treatment of patients with Angle's classification Class II dento-facial anomalies provides high functional and aesthetic results.

  16. TraceContract: A Scala DSL for Trace Analysis

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Havelund, Klaus

    2011-01-01

    In this paper we describe TRACECONTRACT, an API for trace analysis, implemented in the SCALA programming language. We argue that for certain forms of trace analysis the best weapon is a high level programming language augmented with constructs for temporal reasoning. A trace is a sequence of events, which may for example be generated by a running program, instrumented appropriately to generate events. The API supports writing properties in a notation that combines an advanced form of data parameterized state machines with temporal logic. The implementation utilizes SCALA's support for defining internal Domain Specific Languages (DSLs). Furthermore SCALA's combination of object oriented and functional programming features, including partial functions and pattern matching, makes it an ideal host language for such an API.

  17. A Vision for Spaceflight Reliability: NASA's Objectives Based Strategy

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Evans, John; Hall, Tony

    2015-01-01

    In defining the direction for a new Reliability and Maintainability standard, OSMA has extracted the essential objectives that our programs need in order to undertake a reliable mission. These objectives have been structured to lead mission planning through construction of an objective hierarchy, which defines the critical approaches for achieving high reliability and maintainability (R&M). Creating a hierarchy as a basis for assurance implementation is a proven approach; yet it also holds the opportunity to enable new directions as NASA moves forward in tackling the challenges of space exploration.

  18. [Quality management in cardiovascular echography].

    PubMed

    Gullace, Giuseppe

    2002-12-01

    The quality management of an organization can be defined as its ability to identify, plan and implement programs of measurement, analysis, verification and control that make it possible to monitor the management, resources, activities, processes and output/outcome of that organization, including customer satisfaction. Whatever model is used, the quality management system, whether for professional quality or for the organization, has been shown to be effective in health organizations at any level of organizational and structural complexity. The present paper concerns the experience of the Italian Society of Cardiovascular Echography (SIEC) with quality certification, both as a scientific society in comparison with other health organizations and with respect to cardiovascular echo laboratories, and the definition of minimum requirements for the accreditation of those laboratories. The model most frequently used for quality management is ISO 9000: Vision 2000, a management model with specific reference to the organization and customer satisfaction. Applying the model to a health structure requires a rapid change in mentality that leads operators to define, share and achieve objectives through active collaboration, group activity and the deep sense of belonging necessary to attain the expected objectives. When the model is applied by a scientific society, its different structural and functional organization and constitution must be taken into account, as must operators who differ in origin, experience, mentality and roles. The ISO 9000: Vision 2000 model can also be applied to the cardiovascular echo laboratory, which may be compared to a simple organization; for its correct functioning, SIEC has defined minimum requirements for accreditation and for the ways quality is implemented and managed. The quality system represents a new way for an organization to operate, one that enhances the capability and performance of its operators, stimulates their creativity and facilitates everyone's activities, guaranteeing both the quality of the product and the satisfaction of operators and customers.

  19. Inkjet Printing of Functional and Structural Materials: Fluid Property Requirements, Feature Stability, and Resolution

    NASA Astrophysics Data System (ADS)

    Derby, Brian

    2010-08-01

    Inkjet printing is viewed as a versatile manufacturing tool for applications in materials fabrication in addition to its traditional role in graphics output and marking. The unifying feature in all these applications is the dispensing and precise positioning of very small volumes of fluid (1-100 picoliters) on a substrate before transformation to a solid. The application of inkjet printing to the fabrication of structures for structural or functional materials applications requires an understanding as to how the physical processes that operate during inkjet printing interact with the properties of the fluid precursors used. Here we review the current state of understanding of the mechanisms of drop formation and how this defines the fluid properties that are required for a given liquid to be printable. The interactions between individual drops and the substrate as well as between adjacent drops are important in defining the resolution and accuracy of printed objects. Pattern resolution is limited by the extent to which a liquid drop spreads on a substrate and how spreading changes with the overlap of adjacent drops to form continuous features. There are clearly defined upper and lower bounds to the width of a printed continuous line, which can be defined in terms of materials and process variables. Finer-resolution features can be achieved through appropriate patterning and structuring of the substrate prior to printing, which is essential if polymeric semiconducting devices are to be fabricated. Low advancing and receding contact angles promote printed line stability but are also more prone to solute segregation or “coffee staining” on drying.

  20. The normalities and abnormalities associated with speech in psychometrically-defined schizotypy.

    PubMed

    Cohen, Alex S; Auster, Tracey L; McGovern, Jessica E; MacAulay, Rebecca K

    2014-12-01

    Speech deficits are thought to be an important feature of schizotypy--defined as the personality organization reflecting a putative liability for schizophrenia. There is reason to suspect that these deficits manifest as a function of limited cognitive resources. To evaluate this idea, we examined speech from individuals with psychometrically-defined schizotypy during a low cognitively-demanding task versus a relatively high cognitively-demanding task. A range of objective, computer-based measures of speech was employed, tapping speech production (silence, number and length of pauses, number and length of utterances), speech variability (global and local intonation and emphasis) and speech content (word fillers, idea density). Data for control (n=37) and schizotypy (n=39) groups were examined. Results did not confirm our hypotheses. While the cognitive-load task reduced speech expressivity for subjects as a group for most variables, the schizotypy group was not more pathological in speech characteristics than the control group. Interestingly, some aspects of speech in schizotypal versus control subjects were healthier under high cognitive load. Moreover, schizotypal subjects performed better, at a trend level, than controls on the cognitively demanding task. These findings hold important implications for our understanding of the neurocognitive architecture associated with the schizophrenia spectrum. Of particular note is the apparent mismatch between self-reported schizotypal traits and objective performance, and the resiliency of speech under cognitive stress in persons with high levels of schizotypy. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Judgements about the relation between force and trajectory variables in verbally described ballistic projectile motion.

    PubMed

    White, Peter A

    2013-01-01

    How accurate are explicit judgements about familiar forms of object motion, and how are they made? Participants judged the relations between force exerted in kicking a soccer ball and variables that define the trajectory of the ball: launch angle, maximum height attained, and maximum distance reached. Judgements tended to conform to a simple heuristic that judged force tends to increase as maximum height and maximum distance increase, with launch angle not being influential. Support was also found for the converse prediction, that judged maximum height and distance tend to increase as the amount of force described in the kick increases. The observed judgemental tendencies did not resemble the objective relations, in which force is a function of interactions between the trajectory variables. This adds to a body of research indicating that practical knowledge based on experiences of actions on objects is not available to the processes that generate judgements in higher cognition and that such judgements are generated by simple rules that do not capture the objective interactions between the physical variables.

  2. A systematic review of cognitive failures in daily life: Healthy populations.

    PubMed

    Carrigan, Nicole; Barkus, Emma

    2016-04-01

    Cognitive failures are minor errors in thinking reported by clinical and non-clinical individuals during everyday life. It is not yet clear how subjectively-reported cognitive failures relate to objective neuropsychological ability. We aimed to consolidate the definition of cognitive failures, outline evidence for the relationship with objective cognition, and develop a unified model of factors that increase cognitive failures. We conducted a systematic review of cognitive failures, identifying 45 articles according to the PRISMA statement. Failures were defined as reflecting proneness to errors in 'real world' planned thought and action. Vulnerability to failures was not consistently associated with objective cognitive performance. A range of stable and variable factors were linked to increased risk of cognitive failures. We conclude that cognitive failures measure real world cognitive capacity rather than pure 'unchallenged' ability. Momentary state may interact with predisposing trait factors to increase the likelihood of failures occurring. Inclusion of self-reported cognitive failures in objective cognitive research will increase the translational relevance of ability into more ecologically valid aspects of real world functioning. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Interactive model evaluation tool based on IPython notebook

    NASA Astrophysics Data System (ADS)

    Balemans, Sophie; Van Hoey, Stijn; Nopens, Ingmar; Seuntjes, Piet

    2015-04-01

    In hydrological modelling, some kind of parameter optimization is usually performed. This can be the selection of a single best parameter set, a split into behavioural and non-behavioural parameter sets based on a selected threshold, or a posterior parameter distribution derived with a formal Bayesian approach. The selection of the criterion to measure the goodness of fit (likelihood or any objective function) is an essential step in all of these methodologies and will affect the final selected parameter subset. Moreover, the discriminative power of the objective function also depends on the time period used. In practice, the optimization process is an iterative procedure. As such, in the course of the modelling process, an increasing number of simulations is performed. However, the information carried by these simulation outputs is not always fully exploited. In this respect, we developed and present an interactive environment that enables the user to intuitively evaluate the model performance. The aim is to explore the parameter space graphically and to visualize the impact of the selected objective function on model behaviour. First, a set of model simulation results is loaded along with the corresponding parameter sets and a data set of the same variable as the model outcome (mostly discharge). The ranges of the loaded parameter sets define the parameter space. The user can select which two parameters to visualise. Furthermore, an objective function and a time period of interest need to be selected. Based on this information, a two-dimensional parameter response surface is created, which is essentially a scatter plot of the parameter combinations with a color scale corresponding to the goodness of fit of each combination. Finally, a slider is available to change the color mapping of the points: it provides a threshold to exclude non-behavioural parameter sets, and the color scale is attributed only to the remaining parameter sets. As such, by interactively changing the settings and interpreting the graph, the user gains insight into the model's structural behaviour. Moreover, a more deliberate choice of objective function and of periods of high information content can be made. The environment is written in an IPython notebook and uses the interactive functions provided by the IPython community. As such, the power of the IPython notebook as a development environment for scientific computing is illustrated (Shen, 2014).
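
    A static, stripped-down version of the described display can be put together with matplotlib: scatter two parameters, colour the points by an objective function (here the RMSE of a toy recession model against synthetic observations), and apply a behavioural threshold. In a notebook, wrapping the plotting step with ipywidgets.interact recovers the slider behaviour; the model, data and parameter names below are all placeholders.

    ```python
    # Minimal static version of the parameter response surface: two sampled
    # parameters, coloured by RMSE, with non-behavioural sets greyed out.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    n_sets = 500
    params = rng.uniform([0.1, 1.0], [1.0, 10.0], size=(n_sets, 2))   # two sampled parameters
    t = np.linspace(0, 20, 100)
    obs = 5.0 * np.exp(-0.4 * t) + rng.normal(0, 0.1, t.size)         # stand-in "discharge" series

    def simulate(p):
        k, q0 = p
        return q0 * np.exp(-k * t)                                    # toy model output

    rmse = np.array([np.sqrt(np.mean((simulate(p) - obs) ** 2)) for p in params])

    threshold = np.percentile(rmse, 30)            # keep the best 30% as "behavioural"
    behavioural = rmse <= threshold
    plt.scatter(params[~behavioural, 0], params[~behavioural, 1], c="lightgrey", s=10)
    sc = plt.scatter(params[behavioural, 0], params[behavioural, 1], c=rmse[behavioural], s=20)
    plt.colorbar(sc, label="RMSE")
    plt.xlabel("recession constant k"); plt.ylabel("initial discharge q0")
    plt.show()
    ```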

  4. The fractal geometry of life.

    PubMed

    Losa, Gabriele A

    2009-01-01

    The extension of the concepts of Fractal Geometry (Mandelbrot [1983]) toward the life sciences has led to significant progress in understanding complex functional properties and architectural / morphological / structural features characterising cells and tissues during ontogenesis and both normal and pathological development processes. It has even been argued that fractal geometry could provide a coherent description of the design principles underlying living organisms (Weibel [1991]). Fractals fulfil a certain number of theoretical and methodological criteria including a high level of organization, shape irregularity, functional and morphological self-similarity, scale invariance, iterative pathways and a peculiar non-integer fractal dimension [FD]. Whereas mathematical objects are deterministic invariant or self-similar over an unlimited range of scales, biological components are statistically self-similar only within a fractal domain defined by upper and lower limits, called scaling window, in which the relationship between the scale of observation and the measured size or length of the object can be established (Losa and Nonnenmacher [1996]). Selected examples will contribute to depict complex biological shapes and structures as fractal entities, and also to show why the application of the fractal principle is valuable for measuring dimensional, geometrical and functional parameters of cells, tissues and organs occurring within the vegetal and animal realms. If the criteria for a strict description of natural fractals are met, then it follows that a Fractal Geometry of Life may be envisaged and all natural objects and biological systems exhibiting self-similar patterns and scaling properties may be considered as belonging to the new subdiscipline of "fractalomics".

  5. Unsupervised Detection of Planetary Craters by a Marked Point Process

    NASA Technical Reports Server (NTRS)

    Troglio, G.; Benediktsson, J. A.; Le Moigne, J.; Moser, G.; Serpico, S. B.

    2011-01-01

    With the launch of several planetary missions in the last decade, a large number of planetary images are being acquired. Automatic and robust processing techniques are preferable for data analysis because of the huge volume of acquired data. Here, the aim is to achieve a robust and general methodology for crater detection. A novel technique based on a marked point process is proposed. First, the contours in the image are extracted. The object boundaries are modeled as a configuration of an unknown number of random ellipses, i.e., the contour image is considered as a realization of a marked point process. Then, an energy function is defined, containing both an a priori energy and a likelihood term. The global minimum of this function is estimated by using reversible jump Markov chain Monte Carlo dynamics and a simulated annealing scheme. The main idea behind marked point processes is to model objects within a stochastic framework: marked point processes represent a very promising current approach in stochastic image modeling and provide a powerful and methodologically rigorous framework to efficiently map and detect objects and structures in an image with excellent robustness to noise. The proposed method for crater detection has several feasible applications. One such application area is image registration by matching the extracted features.

  6. Millimeter wave scattering characteristics and radar cross section measurements of common roadway objects

    NASA Astrophysics Data System (ADS)

    Zoratti, Paul K.; Gilbert, R. Kent; Majewski, Ronald; Ference, Jack

    1995-12-01

    Development of automotive collision warning systems has progressed rapidly over the past several years. A key enabling technology for these systems is millimeter-wave radar. This paper addresses a very critical millimeter-wave radar sensing issue for automotive radar, namely the scattering characteristics of common roadway objects such as vehicles, roadsigns, and bridge overpass structures. The data presented in this paper were collected on ERIM's Fine Resolution Radar Imaging Rotary Platform Facility and processed with ERIM's image processing tools. The value of this approach is that it provides system developers with a 2D radar image from which information about individual point scatterers within a single target can be extracted. This information on scattering characteristics will be utilized to refine threat assessment processing algorithms and automotive radar hardware configurations. (1) By evaluating the scattering characteristics identified in the radar image, radar signatures as a function of aspect angle can be established for common roadway objects. These signatures will aid in the refinement of threat assessment processing algorithms. (2) Utilizing ERIM's image manipulation tools, total RCS and RCS as a function of range and azimuth can be extracted from the radar image data. This RCS information will be essential in defining the operational envelope (e.g. dynamic range) within which any radar sensor hardware must be designed.

  7. Two-tier tissue decomposition for histopathological image representation and classification.

    PubMed

    Gultekin, Tunc; Koyuncu, Can Fahrettin; Sokmensuer, Cenk; Gunduz-Demir, Cigdem

    2015-01-01

    In digital pathology, devising effective image representations is crucial to design robust automated diagnosis systems. To this end, many studies have proposed to develop object-based representations, instead of directly using image pixels, since a histopathological image may contain a considerable amount of noise typically at the pixel-level. These previous studies mostly employ color information to define their objects, which approximately represent histological tissue components in an image, and then use the spatial distribution of these objects for image representation and classification. Thus, object definition has a direct effect on the way of representing the image, which in turn affects classification accuracies. In this paper, our aim is to design a classification system for histopathological images. Towards this end, we present a new model for effective representation of these images that will be used by the classification system. The contributions of this model are twofold. First, it introduces a new two-tier tissue decomposition method for defining a set of multityped objects in an image. Different than the previous studies, these objects are defined combining texture, shape, and size information and they may correspond to individual histological tissue components as well as local tissue subregions of different characteristics. As its second contribution, it defines a new metric, which we call dominant blob scale, to characterize the shape and size of an object with a single scalar value. Our experiments on colon tissue images reveal that this new object definition and characterization provides distinguishing representation of normal and cancerous histopathological images, which is effective to obtain more accurate classification results compared to its counterparts.

  8. Systems Modeling to Implement Integrated System Health Management Capability

    NASA Technical Reports Server (NTRS)

    Figueroa, Jorge F.; Walker, Mark; Morris, Jonathan; Smith, Harvey; Schmalzel, John

    2007-01-01

    ISHM capability includes: detection of anomalies, diagnosis of causes of anomalies, prediction of future anomalies, and user interfaces that enable integrated awareness (past, present, and future) by users. This is achieved by focused management of data, information and knowledge (DIaK) that will likely be distributed across networks. Management of DIaK implies storage, sharing (timely availability), maintaining, evolving, and processing. Processing of DIaK encapsulates strategies, methodologies, algorithms, etc. focused on achieving high ISHM Functional Capability Level (FCL). High FCL means a high degree of success in detecting anomalies, diagnosing causes, predicting future anomalies, and enabling health integrated awareness by the user. A model that enables ISHM capability, and hence, DIaK management, is denominated the ISHM Model of the System (IMS). We describe aspects of the IMS that focus on processing of DIaK. Strategies, methodologies, and algorithms require proper context. We describe an approach to define and use contexts, implementation in an object-oriented software environment (G2), and validation using actual test data from a methane thruster test program at NASA SSC. Context is linked to existence of relationships among elements of a system. For example, the context to use a strategy to detect leak is to identify closed subsystems (e.g. bounded by closed valves and by tanks) that include pressure sensors, and check if the pressure is changing. We call these subsystems Pressurizable Subsystems. If pressure changes are detected, then all members of the closed subsystem become suspect of leakage. In this case, the context is defined by identifying a subsystem that is suitable for applying a strategy. Contexts are defined in many ways. Often, a context is defined by relationships of function (e.g. liquid flow, maintaining pressure, etc.), form (e.g. part of the same component, connected to other components, etc.), or space (e.g. physically close, touching the same common element, etc.). The context might be defined dynamically (if conditions for the context appear and disappear dynamically) or statically. Although this approach is akin to case-based reasoning, we are implementing it using a software environment that embodies tools to define and manage relationships (of any nature) among objects in a very intuitive manner. Context for higher level inferences (that use detected anomalies or events), primarily for diagnosis and prognosis, are related to causal relationships. This is useful to develop root-cause analysis trees showing an event linked to its possible causes and effects. The innovation pertaining to RCA trees encompasses use of previously defined subsystems as well as individual elements in the tree. This approach allows more powerful implementations of RCA capability in object-oriented environments. For example, if a pressurizable subsystem is leaking, its root-cause representation within an RCA tree will show that the cause is that all elements of that subsystem are suspect of leak. Such a tree would apply to all instances of leak-events detected and all elements in all pressurizable subsystems in the system. Example subsystems in our environment to build IMS include: Pressurizable Subsystem, Fluid-Fill Subsystem, Flow-Thru-Valve Subsystem, and Fluid Supply Subsystem. The software environment for IMS is designed to potentially allow definition of any relationship suitable to create a context to achieve ISHM capability.
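
    The pressurizable-subsystem rule described above can be sketched in a few lines of Python (the actual IMS is built in the G2 environment, so the class and attribute names below are purely illustrative): a subsystem whose boundary valves are all closed and which contains a pressure sensor is pressurizable, and a falling pressure makes every element in it a leak suspect.

    ```python
    # Schematic Python sketch of the context-based leak-detection rule; names
    # and thresholds are illustrative, not taken from the NASA SSC system.
    from dataclasses import dataclass, field

    @dataclass
    class Element:
        name: str
        kind: str                      # "valve", "tank", "pipe", "sensor"
        is_closed: bool = True         # only meaningful for valves
        pressure_trend: float = 0.0    # Pa/s, only meaningful for pressure sensors

    @dataclass
    class Subsystem:
        name: str
        elements: list = field(default_factory=list)

        def is_pressurizable(self):
            """Context: all boundary valves closed and at least one pressure sensor."""
            valves_closed = all(e.is_closed for e in self.elements if e.kind == "valve")
            has_sensor = any(e.kind == "sensor" for e in self.elements)
            return valves_closed and has_sensor

        def leak_suspects(self, threshold=-50.0):
            """If any sensor shows falling pressure, all members become leak suspects."""
            if not self.is_pressurizable():
                return []
            falling = any(e.kind == "sensor" and e.pressure_trend < threshold
                          for e in self.elements)
            return [e.name for e in self.elements] if falling else []

    sub = Subsystem("LOX feed leg", [
        Element("V-101", "valve", is_closed=True),
        Element("V-102", "valve", is_closed=True),
        Element("TK-1", "tank"),
        Element("P-17", "sensor", pressure_trend=-120.0),
    ])
    print("leak suspects:", sub.leak_suspects())
    ```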

  9. Associations between neurocognitive functioning and social and occupational resilience among South African women exposed to childhood trauma.

    PubMed

    Denckla, C A; Consedine, N S; Spies, G; Cherner, M; Henderson, D C; Koenen, K C; Seedat, S

    2017-01-01

    Background : Prior research on adaptation after early trauma among black South African women typically assessed resilience in ways that lacked contextual specificity. In addition, the neurocognitive correlates of social and occupational resilience have not been investigated. Objective : The primary aim of this exploratory study was to identify domains of neurocognitive functioning associated with social and occupational resilience, defined as functioning at a level beyond what would be expected given exposure to childhood trauma. Methods : A sample of black South African women, N  = 314, completed a neuropsychological battery, a questionnaire assessing exposure to childhood trauma, and self-report measures of functional status. We generated indices of social and occupational resilience by regressing childhood trauma exposure on social and occupational functioning, saving the residuals as indices of social and occupational functioning beyond what would be expected given exposure to childhood trauma. Results : Women with lower non-verbal memory evidenced greater social and occupational resilience above and beyond the effects attributable to age, education, HIV status, and depressive and posttraumatic stress symptoms. In addition, women with greater occupational resilience exhibited lower semantic language fluency and processing speed. Conclusion : Results are somewhat consistent with prior studies implicating memory effects in impairment following trauma, though our findings suggest that reduced abilities in these domains may be associated with greater resilience. Studies that use prospective designs and objective assessment of functional status are needed to determine whether non-verbal memory, semantic fluency, and processing speed are implicated in the neural circuitry of post-traumatic exposure resilience.

  10. Multi-objective optimization to predict muscle tensions in a pinch function using genetic algorithm

    NASA Astrophysics Data System (ADS)

    Bensghaier, Amani; Romdhane, Lotfi; Benouezdou, Fethi

    2012-03-01

    This work is focused on the determination of the thumb and the index finger muscle tensions in a tip pinch task. A biomechanical model of the musculoskeletal system of the thumb and the index finger is developed. Due to the assumptions made in carrying out the biomechanical model, the formulated force analysis problem is indeterminate leading to an infinite number of solutions. Thus, constrained single and multi-objective optimization methodologies are used in order to explore the muscular redundancy and to predict optimal muscle tension distributions. Various models are investigated using the optimization process. The basic criteria to minimize are the sum of the muscle stresses, the sum of individual muscle tensions and the maximum muscle stress. The multi-objective optimization is solved using a Pareto genetic algorithm to obtain non-dominated solutions, defined as the set of optimal distributions of muscle tensions. The results show the advantage of the multi-objective formulation over the single objective one. The obtained solutions are compared to those available in the literature demonstrating the effectiveness of our approach in the analysis of the fingers musculoskeletal systems when predicting muscle tensions.

  11. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.

  12. Optimal timber harvest scheduling with spatially defined sediment objectives

    Treesearch

    Jon Hof; Michael Bevers

    2000-01-01

    This note presents a simple model formulation that focuses on the spatial relationships over time between timber harvesting and sediment levels in water runoff courses throughout the watershed being managed. A hypothetical example is developed to demonstrate the formulation and show how sediment objectives can be spatially defined anywhere in the watershed. Spatial...

  13. Physical activity participation by presence and type of functional deficits in older women: The Women's Health and Aging Studies.

    PubMed

    Jerome, Gerald J; Glass, Thomas A; Mielke, Michelle; Xue, Qian-Li; Andersen, Ross E; Fried, Linda P

    2006-11-01

    Physical activity is important for maintaining functional independence of older persons, especially for those with existing functional deficits. Since such deficits may pose barriers to activity, it would be instructive to examine activity patterns in relation to specific types of deficits to determine the amount and type of physical activity older women pursue. This study sought to identify categories of functional deficits associated with activity levels and evaluated the potential for older women to increase their physical activity levels. Community-dwelling women, aged 70-79 years, from the Women's Health and Aging Studies I and II (N = 710), were assessed for self-reported physical activity, functional deficits and chronic conditions, along with objective measures of muscle strength. Both type (household chores, exercise, and recreational activity) and amount of physical activity (min/wk) were examined. Meeting physical activity recommendations was defined as > or =150 minutes per week of moderate intensity physical activity, and inactivity was defined as no weekly moderate intensity physical activity. Hierarchical categories of functional deficits were based on self-reported difficulty in four functional domains (i.e., mobility/exercise tolerance, upper extremity, higher functioning, and self-care), and self-reports ranged from no difficulty to difficulty in all four domains. The prevalence of inactivity and meeting activity recommendations were 14.4% and 12.7%, respectively. Severity of functional deficits was associated with increased risk of inactivity (adjusted odds ratios [ORs(adj)] = 3.14-17.61) and reduced likelihood of meeting activity recommendations (ORs(adj) =.11-.40). Even among those with higher functioning or self-care difficulties, 30% reported walking for exercise. There was evidence that older women with functional deficits can remain physically active. However, for some of these women, meeting the recommended levels of activity may be unrealistic. Efforts to increase physical activity levels among older adults should include treatment or management of functional deficits, chronic conditions, and poor strength.

  14. Prior knowledge guided active modules identification: an integrated multi-objective approach.

    PubMed

    Chen, Weiqi; Liu, Jing; He, Shan

    2017-03-14

    An active module, defined as an area in a biological network that shows striking changes in molecular activity or phenotypic signatures, is important for revealing dynamic and process-specific information that is correlated with cellular or disease states. A prior-information-guided active module identification approach is proposed to detect modules that are both active and enriched by prior knowledge. We formulate the active module identification problem as a multi-objective optimisation problem, which consists of two conflicting objective functions: maximising the coverage of known biological pathways and maximising the activity of the active module. The network is constructed from a protein-protein interaction database. A beta-uniform-mixture model is used to estimate the distribution of p-values and generate scores for activity measurement from microarray data. A multi-objective evolutionary algorithm is used to search for Pareto optimal solutions. We also incorporate a novel constraint based on algebraic connectivity to ensure the connectedness of the identified active modules. Application of the proposed algorithm to a small yeast molecular network shows that it can identify modules with high activities and with more cross-talk nodes between related functional groups. The Pareto solutions generated by the algorithm provide solutions with different trade-offs between prior knowledge and novel information from the data. The approach is then applied to microarray data from diclofenac-treated yeast cells to build the network and identify modules to elucidate the molecular mechanisms of diclofenac toxicity and resistance. Gene ontology analysis is applied to the identified modules for biological interpretation. Integrating knowledge of functional groups into the identification of active modules is an effective method and provides flexible control of the balance between a purely data-driven method and prior information guidance.
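
    The connectedness constraint can be checked directly from the graph Laplacian: a candidate module is admissible only if the algebraic connectivity (second-smallest Laplacian eigenvalue) of its induced subgraph is strictly positive. The sketch below uses networkx on a toy interaction network; the node scores and the simple mean-score activity are placeholders for the beta-uniform-mixture scores described above.

    ```python
    # Sketch of the connectedness constraint only (not the full multi-objective
    # evolutionary algorithm). Requires networkx with scipy installed.
    import networkx as nx

    ppi = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("D", "E"), ("B", "E"), ("F", "G")])
    node_scores = {"A": 2.1, "B": 1.7, "C": 0.3, "D": 1.2, "E": 0.9, "F": 2.5, "G": 0.1}

    def module_ok(nodes):
        """Admissible iff the induced subgraph is connected (algebraic connectivity > 0)."""
        sub = ppi.subgraph(nodes)
        if sub.number_of_nodes() < 2 or not nx.is_connected(sub):
            return False
        return nx.algebraic_connectivity(sub) > 1e-8

    def activity(nodes):
        return sum(node_scores[n] for n in nodes) / len(nodes)   # first objective (to maximize)

    candidate = {"A", "B", "D", "E"}
    print("connected:", module_ok(candidate), " activity:", round(activity(candidate), 3))
    ```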

  15. The sense of beauty.

    PubMed

    Hagman, George

    2002-06-01

    This paper proposes an integrative psychoanalytic model of the sense of beauty. The following definition is used: beauty is an aspect of the experience of idealisation in which an object(s), sound(s) or concept(s) is believed to possess qualities of formal perfection. The psychoanalytic literature regarding beauty is explored in depth and fundamental similarities are stressed. The author goes on to discuss the following topics: (1) beauty as sublimation: beauty reconciles the polarisation of self and world; (2) idealisation and beauty: the love of beauty is an indication of the importance of idealisation during development; (3) beauty as an interactive process: the sense of beauty is interactive and intersubjective; (4) the aesthetic and non-aesthetic emotions: specific aesthetic emotions are experienced in response to the formal design of the beautiful object; (5) surrendering to beauty: beauty provides us with an occasion for transcendence and self-renewal; (6) beauty's restorative function: the preservation or restoration of the relationship to the good object is of utmost importance; (7) the self-integrative function of beauty: the sense of beauty can also reconcile and integrate self-states of fragmentation and depletion; (8) beauty as a defence: in psychopathology, beauty can function defensively for the expression of unconscious impulses and fantasies, or as protection against self-crisis; (9) beauty and mortality: the sense of beauty can alleviate anxiety regarding death and feelings of vulnerability. In closing the paper, the author offers a new understanding of Freud's emphasis on love of beauty as a defining trait of civilisation. For a people not to value beauty would mean that they cannot hope and cannot assert life over the inevitable and ubiquitous forces of entropy and death.

  16. Maximum Entropy Methods as the Bridge Between Microscopic and Macroscopic Theory

    NASA Astrophysics Data System (ADS)

    Taylor, Jamie M.

    2016-09-01

    This paper is concerned with an investigation into a function of macroscopic variables known as the singular potential, building on previous work by Ball and Majumdar. The singular potential is a function of the admissible statistical averages of probability distributions on a state space, defined so that it corresponds to the maximum possible entropy given known observed statistical averages, although non-classical entropy-like objective functions will also be considered. First the set of admissible moments must be established, and under the conditions presented in this work the set is open, bounded and convex allowing a description in terms of supporting hyperplanes, which provides estimates on the development of singularities for related probability distributions. Under appropriate conditions it is shown that the singular potential is strictly convex, as differentiable as the microscopic entropy, and blows up uniformly as the macroscopic variable tends to the boundary of the set of admissible moments. Applications of the singular potential are then discussed, and particular consideration will be given to certain free-energy functionals typical in mean-field theory, demonstrating an equivalence between certain microscopic and macroscopic free-energy functionals. This allows statements about L^1-local minimisers of Onsager's free energy to be obtained which cannot be given by two-sided variations, and overcomes the need to ensure local minimisers are bounded away from zero and +∞ before taking L^∞ variations. The analysis also permits the definition of a dual order parameter for which Onsager's free energy allows an explicit representation. Also, the difficulties in approximating the singular potential by everywhere defined functions, in particular by polynomial functions, are addressed, with examples demonstrating the failure of the Taylor approximation to preserve relevant shape properties of the singular potential.
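
    Schematically, and in notation chosen here rather than quoted from the paper, the construction described above can be written as follows: the admissible set collects the statistical averages realisable by some probability density, and the singular potential assigns to each admissible average the minimal negative entropy (equivalently, the maximal entropy) compatible with it.

    ```latex
    % Schematic statement consistent with the description above; \Omega is the
    % state space, L the vector of state functions whose averages are the
    % macroscopic variables, and f the singular potential.
    \[
      \mathcal{A} \;=\; \Bigl\{\, m \;:\; m = \int_{\Omega} L(p)\,\rho(p)\,\mathrm{d}p
          \ \text{for some probability density } \rho \ \text{on } \Omega \,\Bigr\},
    \]
    \[
      f(m) \;=\; \inf\Bigl\{\, \int_{\Omega} \rho(p)\ln\rho(p)\,\mathrm{d}p
          \;:\; \int_{\Omega} L(p)\,\rho(p)\,\mathrm{d}p = m \,\Bigr\},
      \qquad m \in \mathcal{A},
    \]
    % with f finite and convex on the open set \mathcal{A}, and blowing up as m
    % approaches the boundary of \mathcal{A}.
    ```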

  17. Are face representations depth cue invariant?

    PubMed

    Dehmoobadsharifabadi, Armita; Farivar, Reza

    2016-06-01

    The visual system can process three-dimensional depth cues defining surfaces of objects, but it is unclear whether such information contributes to complex object recognition, including face recognition. The processing of different depth cues involves both dorsal and ventral visual pathways. We investigated whether facial surfaces defined by individual depth cues resulted in meaningful face representations-representations that maintain the relationship between the population of faces as defined in a multidimensional face space. We measured face identity aftereffects for facial surfaces defined by individual depth cues (Experiments 1 and 2) and tested whether the aftereffect transfers across depth cues (Experiments 3 and 4). Facial surfaces and their morphs to the average face were defined purely by one of shading, texture, motion, or binocular disparity. We obtained identification thresholds for matched (matched identity between adapting and test stimuli), non-matched (non-matched identity between adapting and test stimuli), and no-adaptation (showing only the test stimuli) conditions for each cue and across different depth cues. We found robust face identity aftereffect in both experiments. Our results suggest that depth cues do contribute to forming meaningful face representations that are depth cue invariant. Depth cue invariance would require integration of information across different areas and different pathways for object recognition, and this in turn has important implications for cortical models of visual object recognition.

  18. Combined Economic and Hydrologic Modeling to Support Collaborative Decision Making Processes

    NASA Astrophysics Data System (ADS)

    Sheer, D. P.

    2008-12-01

    For more than a decade, the core concept of the author's efforts in support of collaborative decision making has been a combination of hydrologic simulation and multi-objective optimization. The modeling has generally been used to support collaborative decision making processes. The OASIS model developed by HydroLogics Inc. solves a multi-objective optimization at each time step using a mixed integer linear program (MILP). The MILP can be configured to include any user defined objective, including but not limited to economic objectives. For example, estimated marginal values of water for crops and M&I use were included in the objective function to drive trades in a model of the lower Rio Grande. The formulation of the MILP, constraints and objectives, in any time step is conditional: it changes based on the value of state variables and dynamic external forcing functions, such as rainfall, hydrology, market prices, arrival of migratory fish, water temperature, etc. It therefore acts as a dynamic short term multi-objective economic optimization for each time step. MILP is capable of solving a general problem that includes a very realistic representation of the physical system characteristics in addition to the normal multi-objective optimization objectives and constraints included in economic models. In all of these models, the short term objective function is a surrogate for achieving long term multi-objective results. The long term performance for any alternative (especially including operating strategies) is evaluated by simulation. An operating rule is the combination of conditions, parameters, constraints and objectives used to determine the formulation of the short term optimization in each time step. Heuristic wrappers for the simulation program have been developed to improve the parameters of an operating rule, and research is under way on a wrapper that will allow a genetic algorithm to improve the form of the rule (conditions, constraints, and short term objectives) as well. In these models, operating rules represent different models of human behavior, and the objective of the modeling is to find rules for human behavior that perform well in terms of long term human objectives. The conceptual model used to represent human behavior incorporates economic multi-objective optimization for surrogate objectives, and rules that set those objectives based on current conditions and accounting for uncertainty, at least implicitly. The author asserts that real world operating rules follow this form and have evolved because they have been perceived as successful in the past. Thus, the modeling efforts focus on human behavior in much the same way that economic models focus on human behavior. This paper illustrates the above concepts with real world examples.
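
    As a rough illustration of the per-time-step economic optimization described above, the sketch below allocates a fixed water supply between crop and municipal/industrial (M&I) uses by maximizing a marginal-value-weighted delivery objective. It is a plain linear program rather than a full MILP, and every number in it (marginal values, demands, supply) is invented for illustration.

      # Minimal single-time-step water allocation LP (illustrative; all values invented).
      from scipy.optimize import linprog

      value_crop, value_mi = 50.0, 120.0     # assumed marginal values ($ per unit of water)
      demand_crop, demand_mi = 80.0, 40.0    # assumed demand caps
      supply = 100.0                         # assumed supply available this time step

      # Decision variables x = [delivery_to_crop, delivery_to_MI].
      # linprog minimizes, so the values are negated to maximize total economic value.
      c = [-value_crop, -value_mi]
      A_ub = [[1.0, 1.0]]                    # total deliveries cannot exceed supply
      b_ub = [supply]
      bounds = [(0.0, demand_crop), (0.0, demand_mi)]

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
      print("crop:", res.x[0], "M&I:", res.x[1], "total value:", -res.fun)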

  19. Robot body self-modeling algorithm: a collision-free motion planning approach for humanoids.

    PubMed

    Leylavi Shoushtari, Ali

    2016-01-01

    Motion planning for humanoid robots is one of the critical issues due to high redundancy and to theoretical and technical considerations, e.g., stability, motion feasibility, and collision avoidance. The strategies which the central nervous system employs to plan, signal, and control human movements are a source of inspiration for dealing with these problems. Self-modeling is a concept inspired by body self-awareness in humans. In this research it is integrated into an optimal motion planning framework in order to detect and avoid collision of the manipulated object with the humanoid body while performing a dynamic task. Twelve parametric functions are designed as self-models to determine the boundary of the humanoid's body. The boundaries mathematically defined by the self-models are then employed to calculate the safe region for the box to avoid collision with the robot. Four different objective functions are employed in motion simulation to validate the robustness of the algorithm under different dynamics. The results also confirm the collision avoidance, realism, and stability of the predicted motion.
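
    The paper's twelve self-model functions are not reproduced here; the fragment below only sketches the general idea of testing whether a manipulated object stays outside a body surface described by such functions, using a single made-up ellipsoidal "torso" model and invented box corners.

      # Illustrative collision test against a parametric body self-model (not the paper's model).
      import numpy as np

      def torso_model(p, center=np.array([0.0, 0.0, 1.0]), radii=np.array([0.20, 0.15, 0.35])):
          """Toy ellipsoidal self-model: negative inside the body, positive outside."""
          return np.sum(((p - center) / radii) ** 2) - 1.0

      def box_clear_of_body(box_corners, self_models, margin=0.02):
          """True if every corner of the box lies outside every body self-model."""
          return all(model(corner) > margin for corner in box_corners for model in self_models)

      corners = np.array([[0.4, 0.0, 1.0], [0.5, 0.0, 1.0], [0.4, 0.1, 1.0], [0.5, 0.1, 1.0],
                          [0.4, 0.0, 1.2], [0.5, 0.0, 1.2], [0.4, 0.1, 1.2], [0.5, 0.1, 1.2]])
      print("collision-free:", box_clear_of_body(corners, [torso_model]))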

  20. Spectral analysis of resting cardiovascular variables and responses to oscillatory LBNP before and after 6 degree head down bedrest

    NASA Technical Reports Server (NTRS)

    Knapp, Charles F.; Evans, J. M.; Patwardhan, A.; Levenhagen, D.; Wang, M.; Charles, John B.

    1991-01-01

    A major focus of our research program is to develop noninvasive procedures for determining changes in cardiovascular function associated with the null gravity environment. We define changes in cardiovascular function to be (1) the result of the regulatory system operating at values different from 'normal' but with an overall control system basically unchanged by the null gravity exposure, or (2) the result of operating with a control system that has significantly different regulatory characteristics after an exposure. To this end, we have used a model of weightlessness that consisted of exposing humans to 2 hrs. in the launch position, followed by 20 hrs. of 6 deg head down bedrest. Our principal objective was to use this model to measure cardiovascular responses to the 6 deg head down bedrest protocol and to develop the most sensitive 'systems identification' procedure for indicating change. A second objective, related to future experiments, is to use the procedure in combination with experiments designed to determine the degree to which a regulatory pathway has been altered and to determine the mechanisms responsible for the changes.

  1. NASA Electronic Library System (NELS) database schema, version 1.2

    NASA Technical Reports Server (NTRS)

    Melebeck, Clovis J.

    1991-01-01

    The database tables used by NELS version 1.2 are discussed. To provide the current functional capability offered by NELS, nineteen tables were created with ORACLE. Each table lists the ORACLE table name and provides a brief description of the table's intended use or function. The following sections cover four basic categories of tables: NELS object classes, NELS collections, NELS objects, and NELS supplemental tables. Also included in each section is a definition and/or relationship of each field to other fields or tables. The primary key(s) for each table are indicated with a single asterisk (*), while foreign keys are indicated with double asterisks (**). The primary key(s) uniquely identify a record in that table. The foreign key(s) are used to identify additional information in other table(s) for that record. The two appendices contain the commands used to construct the ORACLE tables for NELS. Appendix A contains the commands which create the tables defined in the following sections. Appendix B contains the commands which build the indices for these tables.

  2. Design optimization of electric vehicle battery cooling plates for thermal performance

    NASA Astrophysics Data System (ADS)

    Jarrett, Anthony; Kim, Il Yong

    The performance of high-energy battery cells utilized in electric vehicles (EVs) is greatly improved by adequate temperature control. An efficient thermal management system is also desirable to avoid diverting excessive power from the primary vehicle functions. In a battery cell stack, cooling can be provided by including cooling plates: thin metal fabrications which include one or more internal channels through which a coolant is pumped. Heat is conducted from the battery cells into the cooling plate, and transported away by the coolant. The operating characteristics of the cooling plate are determined in part by the geometry of the channel; its route, width, length, etc. In this study, a serpentine-channel cooling plate is modeled parametrically and its characteristics assessed using computational fluid dynamics (CFD). Objective functions of pressure drop, average temperature, and temperature uniformity are defined and numerical optimization is carried out by allowing the channel width and position to vary. The optimization results indicate that a single design can satisfy both pressure and average temperature objectives, but at the expense of temperature uniformity.
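
    A common way to fold such responses into a single figure of merit is a weighted sum of normalized values; the sketch below is a generic illustration with invented weights and reference values, not necessarily the formulation used in the study.

      # Generic weighted-sum objective for a cooling-plate design (all values invented).
      def cooling_plate_objective(pressure_drop, mean_temp, temp_std,
                                  weights=(0.3, 0.4, 0.3),
                                  refs=(500.0, 310.0, 2.0)):
          """Lower is better: each CFD response is normalized by a reference value and weighted."""
          responses = (pressure_drop, mean_temp, temp_std)
          return sum(w * r / ref for w, r, ref in zip(weights, responses, refs))

      # Compare two hypothetical channel layouts from CFD results (numbers invented).
      design_a = cooling_plate_objective(pressure_drop=450.0, mean_temp=308.0, temp_std=2.5)
      design_b = cooling_plate_objective(pressure_drop=620.0, mean_temp=305.0, temp_std=1.8)
      print(f"design A: {design_a:.3f}, design B: {design_b:.3f}")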

  3. Threshold-free method for three-dimensional segmentation of organelles

    NASA Astrophysics Data System (ADS)

    Chan, Yee-Hung M.; Marshall, Wallace F.

    2012-03-01

    An ongoing challenge in the field of cell biology is how to quantify the size and shape of organelles within cells. Automated image analysis methods often utilize thresholding for segmentation, but the calculated surface of objects depends sensitively on the exact threshold value chosen, and this problem is generally worse at the upper and lower z-boundaries because of the anisotropy of the point spread function. We present here a threshold-independent method for extracting the three-dimensional surface of vacuoles in budding yeast whose limiting membranes are labeled with a fluorescent fusion protein. These organelles typically exist as a clustered set of 1-10 sphere-like compartments. Vacuole compartments and center points are identified manually within z-stacks taken using a spinning disk confocal microscope. A set of rays is defined originating from each center point and radiating outwards in random directions. Intensity profiles are calculated at coordinates along these rays, and intensity maxima are taken as the points where the rays cross the limiting membrane of the vacuole. These points are then fit with a weighted sum of basis functions to define the surface of the vacuole, and then parameters such as volume and surface area are calculated. This method is able to determine the volume and surface area of spherical beads (0.96 to 2 micron diameter) with less than 10% error, and validation using model convolution methods produces similar results. Thus, this method provides an accurate, automated method for measuring the size and morphology of organelles and can be generalized to measure cells and other objects on biologically relevant length-scales.
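
    The fragment below sketches the ray-casting step in simplified form: random directions are drawn from a chosen center point, intensity is sampled along each ray, and the brightest sample on each ray is taken as a membrane crossing. Voxel anisotropy, the basis-function surface fit, and all parameter values are omitted or invented, so this is only a schematic of the approach described above.

      # Simplified ray-casting step for threshold-free surface extraction (illustrative only).
      import numpy as np

      def membrane_points(stack, center, n_rays=200, max_radius=30.0, n_samples=60, seed=0):
          """Sample intensity along random rays from `center`; keep the brightest point on each ray."""
          rng = np.random.default_rng(seed)
          directions = rng.normal(size=(n_rays, 3))
          directions /= np.linalg.norm(directions, axis=1, keepdims=True)
          radii = np.linspace(1.0, max_radius, n_samples)
          points = []
          for d in directions:
              coords = center + radii[:, None] * d                    # positions along this ray
              idx = np.clip(np.round(coords).astype(int), 0, np.array(stack.shape) - 1)
              profile = stack[idx[:, 0], idx[:, 1], idx[:, 2]]        # nearest-voxel intensities
              points.append(coords[np.argmax(profile)])               # maximum ~ membrane crossing
          return np.array(points)

      # Toy z-stack: a bright spherical shell of radius 10 voxels around the center.
      zz, yy, xx = np.indices((64, 64, 64))
      r = np.sqrt((zz - 32) ** 2 + (yy - 32) ** 2 + (xx - 32) ** 2)
      stack = np.exp(-((r - 10.0) ** 2) / 2.0)
      pts = membrane_points(stack, center=np.array([32.0, 32.0, 32.0]))
      print("mean detected radius:", np.linalg.norm(pts - 32.0, axis=1).mean())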

  4. Astrophysics of Reference Frame Tie Objects

    NASA Technical Reports Server (NTRS)

    Johnston, Kenneth J.; Boboltz, David; Fey, Alan Lee; Gaume, Ralph A.; Zacharias, Norbert

    2004-01-01

    The Astrophysics of Reference Frame Tie Objects Key Science program will investigate the underlying physics of SIM grid objects. Extragalactic objects in the SIM grid will be used to tie the SIM reference frame to the quasi-inertial reference frame defined by extragalactic objects and to remove any residual frame rotation with respect to the extragalactic frame. The current realization of the extragalactic frame is the International Celestial Reference Frame (ICRF). The ICRF is defined by the radio positions of 212 extragalactic objects and is the IAU sanctioned fundamental astronomical reference frame. This key project will advance our knowledge of the physics of the objects which will make up the SIM grid, such as quasars and chromospherically active stars, and relates directly to the stability of the SIM reference frame. The following questions concerning the physics of reference frame tie objects will be investigated.

  5. DTS Raw Data Guelph, ON Canada

    DOE Data Explorer

    Thomas Coleman

    2013-07-31

    Unprocessed active distributed temperature sensing (DTS) data from 3 boreholes in the Guelph, ON Canada region. Data from borehole 1 was collected during a fluid injection while data from boreholes 2 and 3 were collected under natural gradient conditions in a lined borehole. The column labels/headers (in the first row) define the time since start of measurement in seconds and the row labels/headers (in the first column) are the object IDs that are defined in the metadata. Each object ID is a sampling location whose exact location is defined in the metadata file. Data in each cell are temperature in Celsius at time and sampling location as defined above.
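
    Assuming the raw files are plain delimited text laid out as described (first row holding elapsed seconds, first column holding object IDs, cells holding temperature in Celsius), a minimal loading step might look like the following; the file name, delimiter, and object ID are assumptions.

      # Minimal loader for a DTS raw data file laid out as described above (names assumed).
      import pandas as pd

      # Rows: object IDs (sampling locations); columns: seconds since start of measurement.
      dts = pd.read_csv("borehole1_dts_raw.csv", index_col=0)
      dts.columns = dts.columns.astype(float)      # elapsed time in seconds
      print(dts.shape)                             # (n_sampling_locations, n_time_steps)

      trace = dts.loc["OBJ_001"]                   # temperature history at a hypothetical location
      snapshot = dts.iloc[:, 0]                    # temperature profile at the first time step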

  6. Multi-Objective Hybrid Optimal Control for Multiple-Flyby Low-Thrust Mission Design

    NASA Technical Reports Server (NTRS)

    Englander, Jacob A.; Vavrina, Matthew A.; Ghosh, Alexander R.

    2015-01-01

    Preliminary design of low-thrust interplanetary missions is a highly complex process. The mission designer must choose discrete parameters such as the number of flybys, the bodies at which those flybys are performed, and in some cases the final destination. In addition, a time-history of control variables must be chosen that defines the trajectory. There are often many thousands, if not millions, of possible trajectories to be evaluated. The customer who commissions a trajectory design is not usually interested in a point solution, but rather the exploration of the trade space of trajectories between several different objective functions. This can be a very expensive process in terms of the number of human analyst hours required. An automated approach is therefore very desirable. This work presents such an approach by posing the mission design problem as a multi-objective hybrid optimal control problem. The method is demonstrated on a hypothetical mission to the main asteroid belt.

  7. Open Mobile Alliance Secure Content Exchange: Introducing Key Management Constructs and Protocols for Compromise-Resilient Easing of DRM Restrictions

    NASA Astrophysics Data System (ADS)

    Kravitz, David William

    This paper presents an insider's view of the rationale and the cryptographic mechanics of some principal elements of the Open Mobile Alliance (OMA) Secure Content Exchange (SCE) Technical Specifications. A primary goal is to enable implementation of a configurable methodology that quarantines the effects that unknown-compromised entities have on still-compliant entities in the system, while allowing import from upstream protection systems and multi-client reuse of Rights Objects that grant access to plaintext content. This has to be done without breaking compatibility with the underlying legacy OMA DRM v2.0/v2.1 Technical Specifications. It is also required that legacy devices can take at least partial advantage of the new import functionality, and can request the creation of SCE-compatible Rights Objects and utilize Rights Objects created upon request of SCE-conformant devices. This must be done in a way that the roles played by newly defined entities unrecognizable by legacy devices remain hidden.

  8. Bio-objects and the media: the role of communication in bio-objectification processes

    PubMed Central

    Maeseele, Pieter; Allgaier, Joachim; Martinelli, Lucia

    2013-01-01

    The representation of biological innovations in and through communication and media practices is vital for understanding the nature of “bio-objects” and the process we call “bio-objectification.” This paper discusses two ideal-typical analytical approaches based on different underlying communication models, ie, the traditional (science- and media-centered) and media sociological (a multi-layered process involving various social actors in defining the meanings of scientific and technological developments) approach. In this analysis, the latter is not only found to be the most promising approach for understanding the circulation, (re)production, and (re)configuration of meanings of bio-objects, but also to interpret the relationship between media and science. On the basis of a few selected examples, this paper highlights how media function as a primary arena for the (re)production and (re)configuration of scientific and biomedical information with regards to bio-objects in the public sphere in general, and toward decision-makers, interest groups, and the public in specific. PMID:23771763

  9. Barcoding Human Physical Activity to Assess Chronic Pain Conditions

    PubMed Central

    Paraschiv-Ionescu, Anisoara; Perruchoud, Christophe; Buchser, Eric; Aminian, Kamiar

    2012-01-01

    Background Modern theories define chronic pain as a multidimensional experience – the result of complex interplay between physiological and psychological factors with significant impact on patients' physical, emotional and social functioning. The development of reliable assessment tools capable of capturing the multidimensional impact of chronic pain has challenged the medical community for decades. A number of validated tools are currently used in clinical practice however they all rely on self-reporting and are therefore inherently subjective. In this study we show that a comprehensive analysis of physical activity (PA) under real life conditions may capture behavioral aspects that may reflect physical and emotional functioning. Methodology PA was monitored during five consecutive days in 60 chronic pain patients and 15 pain-free healthy subjects. To analyze the various aspects of pain-related activity behaviors we defined the concept of PA ‘barcoding’. The main idea was to combine different features of PA (type, intensity, duration) to define various PA states. The temporal sequence of different states was visualized as a ‘barcode’ which indicated that significant information about daily activity can be contained in the amount and variety of PA states, and in the temporal structure of sequence. This information was quantified using complementary measures such as structural complexity metrics (information and sample entropy, Lempel-Ziv complexity), time spent in PA states, and two composite scores, which integrate all measures. The reliability of these measures to characterize chronic pain conditions was assessed by comparing groups of subjects with clinically different pain intensity. Conclusion The defined measures of PA showed good discriminative features. The results suggest that significant information about pain-related functional limitations is captured by the structural complexity of PA barcodes, which decreases when the intensity of pain increases. We conclude that a comprehensive analysis of daily-life PA can provide an objective appraisal of the intensity of pain. PMID:22384191
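
    One of the structural complexity measures mentioned above, Lempel-Ziv complexity, simply counts the number of distinct phrases found in a left-to-right parsing of the state sequence. A minimal version of the classic counting scheme, applied to toy barcodes whose state labels are invented, might look like this:

      # Lempel-Ziv (LZ76) phrase count for a sequence of activity states (illustrative only).
      def lempel_ziv_complexity(sequence):
          """Count distinct phrases in the left-to-right LZ76 parsing of `sequence`."""
          s = list(sequence)
          n = len(s)
          i, complexity, l = 0, 1, 1
          k, k_max = 1, 1
          while True:
              if s[i + k - 1] == s[l + k - 1]:
                  k += 1
                  if l + k > n:                    # ran off the end inside a phrase
                      complexity += 1
                      break
              else:
                  k_max = max(k_max, k)
                  i += 1
                  if i == l:                       # new phrase complete
                      complexity += 1
                      l += k_max
                      if l + 1 > n:
                          break
                      i, k, k_max = 0, 1, 1
                  else:
                      k = 1
          return complexity

      # Toy activity barcodes: L = lying, S = sitting, W = walking (labels invented).
      print(lempel_ziv_complexity("LSWLSWLSWLSW"), lempel_ziv_complexity("LWSSWLWWSLSW"))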

  10. Definitely maybe: can unconscious processes perform the same functions as conscious processes?

    PubMed Central

    Hesselmann, Guido; Moors, Pieter

    2015-01-01

    Hassin recently proposed the “Yes It Can” (YIC) principle to describe the division of labor between conscious and unconscious processes in human cognition. According to this principle, unconscious processes can carry out every fundamental high-level cognitive function that conscious processes can perform. In our commentary, we argue that the author presents an overly idealized review of the literature in support of the YIC principle. Furthermore, we point out that the dissimilar trends observed in social and cognitive psychology, with respect to published evidence of strong unconscious effects, can better be explained by the way awareness is defined and measured in the two research fields. Finally, we show that the experimental paradigm chosen by Hassin to rule out remaining objections against the YIC principle is unsuited to verify the new default notion that all high-level cognitive functions can unfold unconsciously. PMID:25999896

  11. Chapter 4: A policy process and tools for international non-governmental organizations in the health sector using ISPRM as a case in point.

    PubMed

    Reinhardt, Jan D; von Groote, Per M; DeLisa, Joel A; Melvin, John L; Bickenbach, Jerome E; Stucki, Gerold

    2009-09-01

    The politics of international non-governmental organizations (NGOs) such as the International Society of Physical and Rehabilitation Medicine (ISPRM) serve the function of selecting and attaining particular socially valued goals. The selection and attainment of goals as the primary function of political action can be structured along a policy process or cycle comprising the stages of strategic goal setting and planning of strategic pathways, agenda setting, resource mobilization, implementation, evaluation and innovation. At the various stages of this policy process different policy tools or instruments, which can be used to influence citizen and organizational behaviour in the light of defined goals, can be applied. The objective of this paper is to introduce and describe policy tools of potential relevance to ISPRM with regard to different policy functions and stages of the policy process.

  12. Modeling transport kinetics in clinoptilolite-phosphate rock systems

    NASA Technical Reports Server (NTRS)

    Allen, E. R.; Ming, D. W.; Hossner, L. R.; Henninger, D. L.

    1995-01-01

    Nutrient release in clinoptilolite-phosphate rock (Cp-PR) systems occurs through dissolution and cation-exchange reactions. Investigating the kinetics of these reactions expands our understanding of nutrient release processes. Research was conducted to model transport kinetics of nutrient release in Cp-PR systems. The objectives were to identify empirical models that best describe NH4, K, and P release and define diffusion-controlling processes. Materials included a Texas clinoptilolite (Cp) and North Carolina phosphate rock (PR). A continuous-flow thin-disk technique was used. Models evaluated included zero order, first order, second order, parabolic diffusion, simplified Elovich, Elovich, and power function. The power-function, Elovich, and parabolic-diffusion models adequately described NH4, K, and P release. The power-function model was preferred because of its simplicity. Models indicated nutrient release was diffusion controlled. Primary transport processes controlling nutrient release for the time span observed were probably the result of a combination of several interacting transport mechanisms.
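
    For reference, the power-function model mentioned above has the form q = a·t^b for cumulative release q at time t; a minimal fit of that form, with invented numbers standing in for measured release data, could be carried out as follows.

      # Fit of the power-function kinetic model q = a * t**b (data values are invented).
      import numpy as np
      from scipy.optimize import curve_fit

      def power_function(t, a, b):
          """Cumulative nutrient release q at time t under the power-function model."""
          return a * t ** b

      t = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])    # elapsed time (units assumed)
      q = np.array([0.8, 1.1, 1.6, 2.3, 3.2, 4.6])          # cumulative release (units assumed)

      (a, b), _ = curve_fit(power_function, t, q, p0=(0.5, 0.5))
      print(f"a = {a:.3f}, b = {b:.3f}")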

  13. Summary of the white paper of DICOM WG24 'DICOM in Surgery'

    NASA Astrophysics Data System (ADS)

    Lemke, Heinz U.

    2007-03-01

    Standards for creating and integrating information about patients, equipment, and procedures are vitally needed when planning for an efficient Operating Room (OR). The DICOM Working Group 24 (WG24) has been established to develop DICOM objects and services related to Image Guided Surgery (IGS). To determine these standards, it is important to define day-to-day, step-by-step surgical workflow practices and create surgery workflow models per procedure or per variable case. A well-defined workflow and a high-fidelity patient model will be the basis of activities for both radiation therapy and surgery. Considering the present and future requirements for surgical planning and intervention, such a patient model must be n-dimensional, where n may include the spatial and temporal dimensions as well as a number of functional variables. As the boundaries between radiation therapy, surgery and interventional radiology are becoming less well-defined, precise patient models will become the greatest common denominator for all therapeutic disciplines. In addition to imaging, the focus of WG24 should, therefore, also be to serve the therapeutic disciplines by enabling modelling technology to be based on standards.

  14. Prevalence of swallowing dysfunction screened in Swedish cohort of COPD patients

    PubMed Central

    Gonzalez Lindh, Margareta; Blom Johansson, Monica; Jennische, Margareta; Koyi, Hirsh

    2017-01-01

    Background COPD is a common problem associated with morbidity and mortality. COPD may also affect the dynamics and coordination of functions such as swallowing. A misdirected swallow may, in turn, result in the bolus entering the airway. A growing body of evidence suggests that a subgroup of people with COPD is prone to oropharyngeal dysphagia. The aim of this study was to evaluate swallowing dysfunction in patients with stable COPD and to determine the relation between signs and symptoms of swallowing dysfunction and lung function (forced expiratory volume in 1 second percent predicted). Methods Fifty-one patients with COPD in a stable phase participated in a questionnaire survey, swallowing tests, and spirometry. A post-bronchodilator ratio of the forced expiratory volume in 1 second/best of forced vital capacity and vital capacity <0.7 was used to define COPD. Swallowing function was assessed by a questionnaire and two swallowing tests (water and cookie swallow tests). Results Sixty-five percent of the patients reported subjective signs and symptoms of swallowing dysfunction in the questionnaire and 49% showed measurable ones in the swallowing tests. For the combined subjective and objective findings, 78% had a coexisting swallowing dysfunction. No significant difference was found between male and female patients. Conclusion Swallowing function is affected in COPD patients with moderate to severe airflow limitation, and the signs and symptoms of this swallowing dysfunction were subjective, objective, or both. PMID:28176891

  15. Chairmanship of the Neptune/Pluto outer planets science working group

    NASA Astrophysics Data System (ADS)

    Stern, S. Alan

    1993-11-01

    The Outer Planets Science Working Group (OPSWG) is the NASA Solar System Exploration Division (SSED) scientific steering committee for the Outer Solar System missions. OPSWG consists of 19 members and is chaired by Dr. S. Alan Stern. This proposal summarizes the FY93 activities of OPSWG, describes a set of objectives for OPSWG in FY94, and outlines the SWG's activities for FY95. As chair of OPSWG, Dr. Stern will be responsible for: organizing priorities, setting agendas, conducting meetings of the Outer Planets SWG; reporting the results of OPSWG's work to SSED; supporting those activities relating to OPSWG work, such as briefings to the SSES, COMPLEX, and OSS; supporting the JPL/SAIC Pluto study team; and other tasks requested by SSED. As the Scientific Working Group (SWG) for Jupiter and the planets beyond, OPSWG is the SSED SWG chartered to study and develop mission plans for all missions to the giant planets, Pluto, and other distant objects in the remote outer solar system. In that role, OPSWG is responsible for: defining and prioritizing scientific objectives for missions to these bodies; defining and documenting the scientific goals and rationale behind such missions; defining and prioritizing the datasets to be obtained in these missions; defining and prioritizing measurement objectives for these missions; defining and documenting the scientific rationale for strawman instrument payloads; defining and prioritizing the scientific requirements for orbital tour and flyby encounter trajectories; defining cruise science opportunities plan; providing technical feedback to JPL and SSED on the scientific capabilities of engineering studies for these missions; providing documentation to SSED concerning the scientific goals, objectives, and rationale for the mission; interfacing with other SSED and OSS committees at the request of SSED's Director or those committee chairs; providing input to SSED concerning the structure and content of the Announcement of Opportunity for payload and scientific team selection for such missions; and providing other technical or programmatic inputs concerning outer solar system missions at the request of the Director of SSED.

  16. Chairmanship of the Neptune/Pluto outer planets science working group

    NASA Technical Reports Server (NTRS)

    Stern, S. Alan

    1993-01-01

    The Outer Planets Science Working Group (OPSWG) is the NASA Solar System Exploration Division (SSED) scientific steering committee for the Outer Solar System missions. OPSWG consists of 19 members and is chaired by Dr. S. Alan Stern. This proposal summarizes the FY93 activities of OPSWG, describes a set of objectives for OPSWG in FY94, and outlines the SWG's activities for FY95. As chair of OPSWG, Dr. Stern will be responsible for: organizing priorities, setting agendas, conducting meetings of the Outer Planets SWG; reporting the results of OPSWG's work to SSED; supporting those activities relating to OPSWG work, such as briefings to the SSES, COMPLEX, and OSS; supporting the JPL/SAIC Pluto study team; and other tasks requested by SSED. As the Scientific Working Group (SWG) for Jupiter and the planets beyond, OPSWG is the SSED SWG chartered to study and develop mission plans for all missions to the giant planets, Pluto, and other distant objects in the remote outer solar system. In that role, OPSWG is responsible for: defining and prioritizing scientific objectives for missions to these bodies; defining and documenting the scientific goals and rationale behind such missions; defining and prioritizing the datasets to be obtained in these missions; defining and prioritizing measurement objectives for these missions; defining and documenting the scientific rationale for strawman instrument payloads; defining and prioritizing the scientific requirements for orbital tour and flyby encounter trajectories; defining cruise science opportunities plan; providing technical feedback to JPL and SSED on the scientific capabilities of engineering studies for these missions; providing documentation to SSED concerning the scientific goals, objectives, and rationale for the mission; interfacing with other SSED and OSS committees at the request of SSED's Director or those committee chairs; providing input to SSED concerning the structure and content of the Announcement of Opportunity for payload and scientific team selection for such missions; and providing other technical or programmatic inputs concerning outer solar system missions at the request of the Director of SSED.

  17. Definition of the Design Trajectory and Entry Flight Corridor for the NASA Orion Exploration Mission 1 Entry Trajectory Using an Integrated Approach and Optimization

    NASA Technical Reports Server (NTRS)

    McNamara, Luke W.; Braun, Robert D.

    2014-01-01

    One of the key design objectives of NASA's Orion Exploration Mission 1 (EM-1) is to execute a guided entry trajectory demonstrating GN&C capability. The focus of this paper is defining the flyable entry corridor for EM-1, taking into account multiple subsystem constraints such as complex aerothermal heating constraints, aerothermal heating objectives, landing accuracy constraints, structural load limits, Human-System-Integration Requirements, Service Module debris disposal limits, and other flight test objectives. During the EM-1 Design Analysis Cycle 1, design challenges arose that made defining the flyable entry corridor for the EM-1 mission critical to mission success. This document details the optimization techniques that were explored for use with the 6-DOF ANTARES simulation to assist in defining the design entry interface state and entry corridor with respect to key flight test constraints and objectives.

  18. Evaluation of oral stereognostic ability after rehabilitating patients with complete dentures: in vivo study.

    PubMed

    Meenakshi, S; Gujjari, Anil Kumar; Thippeswamy, H N; Raghunath, N

    2014-12-01

    Stereognosis has been defined as the appreciation of the form of objects by palpation. Whilst this definition holds good for the manual exploration of objects, the shape of objects can also be explored intra-orally, which is referred to as oral stereognosis. To better understand patients' relative satisfaction with complete dentures, differences in oral stereognostic perception, based on the identification of 6 edible objects, were analyzed in a group of 30 edentulous individuals at 3 stages, namely, just before (pre-treatment), 30 min after (30 min post-treatment) and 1 month after (1 month post-treatment) the insertion of new dentures. The time required to identify each object was recorded and the correctness of identification of each object was scored using an oral stereognostic score. Descriptive statistics, the Wilcoxon signed rank test, Spearman's rank correlation test, and the Pearson chi-square test were used to statistically analyze the data obtained. OSA scores were significantly increased 1 month post-treatment compared to 30 min post-treatment (p < 0.05). It was found that the oral stereognostic test is reliable for measuring patients' oral stereognostic perception and may be used as one of the clinical aids in appreciating the functional limitations imposed by the prostheses.

  19. Defining a Model for Mitochondrial Function in mESC Differentiation

    EPA Science Inventory

    Defining a Model for Mitochondrial Function in mESC Differentiation. Differentiating embryonic stem cells (ESCs) undergo mitochondrial maturation leading to a switch from a system dependent upon glycolysis to a re...

  20. Ipsilateral renal function preservation after robot-assisted partial nephrectomy (RAPN): an objective analysis using mercapto-acetyltriglycine (MAG3) renal scan data and volumetric assessment.

    PubMed

    Zargar, Homayoun; Akca, Oktay; Autorino, Riccardo; Brandao, Luis Felipe; Laydner, Humberto; Krishnan, Jayram; Samarasekera, Dinesh; Stein, Robert J; Kaouk, Jihad H

    2015-05-01

    To objectively assess ipsilateral renal function (IRF) preservation and factors influencing it after robot-assisted partial nephrectomy (RAPN). Our database was queried to identify patients who had undergone RAPN from 2007 to 2013 and had complete pre- and postoperative mercapto-acetyltriglycine (MAG3) renal scan assessment. The estimated glomerular filtration rate (eGFR) for the operated kidney was calculated by multiplying the percentage of contribution from the renal scan by the total eGFR. IRF preservation was defined as a ratio of the postoperative eGFR for the operated kidney to the preoperative eGFR for the operated kidney. The percentage of total eGFR preservation was calculated in the same manner (postoperative eGFR/preoperative eGFR × 100). The amount of healthy rim of renal parenchyma removed was assessed by deducting the volume of tumour from the volume of the PN specimen assessed on pathology. Multivariable linear regression was used for analysis. In all, 99 patients were included in the analysis. The overall median (interquartile range) total eGFR preservation and IRF preservation for the operated kidney was 83.83 (75.2-94.1)% and 72 (60.3-81)%, respectively (P < 0.01). On multivariable analysis, volume of healthy rim of renal parenchyma removed, warm ischaemia time (WIT) > 30 min, body mass index (BMI) and operated kidney preoperative eGFR were predictive of IRF preservation. Using total eGFR tends to overestimate the degree of renal function preservation after RAPN. This is particularly relevant when studying factors affecting functional outcomes after nephron-sparing surgery. IRF may be a more precise assessment method in this setting. Operated kidney baseline renal function, BMI, WIT >30 min, and amount of resected healthy renal parenchyma represent the factors with a significant impact on the IRF preservation. RAPN provides significant preservation of renal function as shown by objective assessment criteria. © 2014 The Authors. BJU International © 2014 BJU International.
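
    The split-function arithmetic described above is simple enough to state directly; the numbers below are invented purely to show how the total eGFR ratio can overstate preservation relative to the ipsilateral measure.

      # Ipsilateral renal function (IRF) preservation from a MAG3 split (numbers invented).
      pre_total_egfr, pre_split = 90.0, 0.50         # total eGFR and operated-kidney fraction
      post_total_egfr, post_split = 80.0, 0.40

      pre_operated = pre_total_egfr * pre_split      # 45.0 for the operated kidney
      post_operated = post_total_egfr * post_split   # 32.0 for the operated kidney

      total_preservation = 100.0 * post_total_egfr / pre_total_egfr   # about 88.9 %
      irf_preservation = 100.0 * post_operated / pre_operated         # about 71.1 %
      print(f"total eGFR preserved: {total_preservation:.1f}%, IRF preserved: {irf_preservation:.1f}%")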

  1. Decline in Physical Function and Risk for Elder Abuse Reported to Social Services in a Community-Dwelling Population of Older Adults

    PubMed Central

    Dong, XinQi; Simon, Melissa; Evans, Denis

    2012-01-01

    Objectives Elder abuse is an important public health and human rights issue and is associated with increased morbidity and mortality. This study aimed to examine the longitudinal association between decline in physical function and the risk for elder abuse. Design Prospective population-based study. Setting Geographically defined community in Chicago. Participants Chicago Health and Aging Project (CHAP) is a population-based study (N=6,159), and we identified 143 CHAP participants who had elder abuse reported to a social services agency from 1993–2010. Participants The primary independent variable was objectively assessed physical function using decline in physical performance testing (Tandem stand, measured walk and chair stand). Secondary independent variables were assessed using the decline in self-reported Katz, Nagi, and Rosow-Breslau scales. Outcomes were reported and confirmed elder abuse and specific subtypes of elder abuse (physical, psychological, caregiver neglect and financial exploitation). Logistic regression models were used to assess the association of decline in physical function measures and risk for elder abuse. Results After adjusting for potential confounders, every 1 point decline in physical performance testing (OR, 1.13 (1.06–1.19)), Katz impairment (OR, 1.29 (1.15–1.45)), Nagi impairment (OR, 1.30 (1.13–1.49)) and Rosow-Breslau impairment (OR, 1.42 (1.15–1.74)) were associated with increased risk for elder abuse. Lowest tertiles of physical performance testing (OR, 4.92 (1.39–17.46)), highest tertiles of Katz impairment (OR, 3.99 (2.18–7.31)), Nagi impairment (OR, 2.37 (1.08–5.23)), and Rosow-Breslau impairment (OR, 2.85 (1.39–5.84)) were associated with increased risk for elder abuse. Conclusion Decline in objectively assessed physical function and self-reported physical function are associated with increased risk for elder abuse. PMID:23002901

  2. The population health record: concepts, definition, design, and implementation.

    PubMed

    Friedman, Daniel J; Parrish, R Gibson

    2010-01-01

    In 1997, the American Medical Informatics Association proposed a US information strategy that included a population health record (PopHR). Despite subsequent progress on the conceptualization, development, and implementation of electronic health records and personal health records, minimal progress has occurred on the PopHR. Adapting International Organization for Standardization electronic health record standards, we define the PopHR as a repository of statistics, measures, and indicators regarding the state of and influences on the health of a defined population, in computer processable form, stored and transmitted securely, and accessible by multiple authorized users. The PopHR is based upon an explicit population health framework and a standardized logical information model. PopHR purpose and uses, content and content sources, functionalities, business objectives, information architecture, and system architecture are described. Barriers to implementation and enabling factors and a three-stage implementation strategy are delineated.

  3. A study of characteristics of intercity transportation systems. Phase 1: Definition of transportation comparison methodology

    NASA Technical Reports Server (NTRS)

    English, J. M.; Smith, J. L.; Lifson, M. W.

    1978-01-01

    Decision making in early transportation planning must be responsive to complex value systems representing various policies and objectives. The assessment of alternative transportation concepts during the early initial phases of the system life cycle, when supportive research and technology development activities are defined, requires estimates of transportation, environmental, and socio-economic impacts throughout the system life cycle, which is a period of some 40 or 50 years. A unified methodological framework for comparing intercity passenger and freight transportation systems is described and is extended to include the comparison of long term transportation trends arising from implementation of the various R & D programs. The attributes of existing and future transportation systems are reviewed in order to establish measures for comparison, define value functions, and attribute weightings needed for comparing alternative policy actions for furthering transportation goals. Comparison criteria definitions and an illustrative example are included.

  4. Measuring case-mix complexity of tertiary care hospitals using DRGs.

    PubMed

    Park, Hayoung; Shin, Youngsoo

    2004-02-01

    The objectives of the study were to develop a model that measures and evaluates case-mix complexity of tertiary care hospitals, and to examine the characteristics of such a model. Physician panels defined three classes of case complexity and assigned disease categories represented by Adjacent Diagnosis Related Groups (ADRGs) to one of three case complexity classes. Three types of scores, indicating proportions of inpatients in each case complexity class standardized by the proportions at the national level, were defined to measure the case-mix complexity of a hospital. Discharge information for about 10% of inpatient episodes at 85 hospitals with bed size larger than 400 and their input structure and research and education activity were used to evaluate the case-mix complexity model. Results show its power to predict hospitals with the expected functions of tertiary care hospitals, i.e. resource intensive care, expensive input structure, and high levels of research and education activities.

  5. Vectors a Fortran 90 module for 3-dimensional vector and dyadic arithmetic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brock, B.C.

    1998-02-01

    A major advance contained in the new Fortran 90 language standard is the ability to define new data types and the operators associated with them. Writing computer code to implement computations with real and complex three-dimensional vectors and dyadics is greatly simplified if the equations can be implemented directly, without the need to code the vector arithmetic explicitly. The Fortran 90 module described here defines new data types for real and complex 3-dimensional vectors and dyadics, along with the common operations needed to work with these objects. Routines to allow convenient initialization and output of the new types are also included. In keeping with the philosophy of data abstraction, the details of the implementation of the data types are maintained private, and the functions and operators are made generic to simplify the combining of real, complex, single- and double-precision vectors and dyadics.
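
    The module itself is Fortran 90 and is not reproduced here. As a rough analogue of the same idea in another language, defining a vector type whose arithmetic operators are overloaded so that equations can be written directly, a Python sketch might look like the following; it mirrors only the concept, not the module's interface.

      # Rough Python analogue of a 3-D vector type with overloaded arithmetic (not the Fortran code).
      from dataclasses import dataclass

      @dataclass(frozen=True)
      class Vec3:
          x: float
          y: float
          z: float

          def __add__(self, other):
              return Vec3(self.x + other.x, self.y + other.y, self.z + other.z)

          def __mul__(self, s):        # scalar multiplication
              return Vec3(self.x * s, self.y * s, self.z * s)

          def dot(self, other):
              return self.x * other.x + self.y * other.y + self.z * other.z

          def cross(self, other):
              return Vec3(self.y * other.z - self.z * other.y,
                          self.z * other.x - self.x * other.z,
                          self.x * other.y - self.y * other.x)

      a, b = Vec3(1.0, 0.0, 0.0), Vec3(0.0, 1.0, 0.0)
      print(a + b * 2.0, a.dot(b), a.cross(b))      # equations read much as written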

  6. Experimental land observing data system feasibility study

    NASA Technical Reports Server (NTRS)

    Buckley, J. L.; Kraiman, H.

    1982-01-01

    An end-to-end data system to support a Shuttle-based Multispectral Linear Array (MLA) mission in the mid-1980s was defined. The experimental Land Observing System (ELOS) is discussed. A ground system that exploits extensive assets from the LANDSAT-D Program to effectively meet the objectives of the ELOS Mission was defined. The goal of 10 meter pixel precision, the variety of data acquisition capabilities, and the use of Shuttle are key to the mission requirements. Ground mission management functions are met through the use of GSFC's Multi-Satellite Operations Control Center (MSOCC). The MLA Image Generation Facility (MIGF) combines major hardware elements from the Applications Development Data System (ADDS) facility and LANDSAT Assessment System (LAS) with a special purpose MLA interface unit. LANDSAT-D image processing techniques, adapted to MLA characteristics, form the basis for the use of existing software and the definition of new software required.

  7. Rdesign: A data dictionary with relational database design capabilities in Ada

    NASA Technical Reports Server (NTRS)

    Lekkos, Anthony A.; Kwok, Teresa Ting-Yin

    1986-01-01

    A data dictionary is defined to be the set of all data attributes, which describe data objects in terms of their intrinsic attributes, such as name, type, size, format and definition. It is recognized as the database for Information Resource Management, to facilitate understanding and communication about the relationship between systems applications and systems data usage and to help achieve data independence by permitting systems applications to access data without knowledge of the location or storage characteristics of the data in the system. A research and development effort to use Ada has produced a data dictionary with database design capabilities. This project supports data specification and analysis and offers a choice of the relational, network, and hierarchical models for logical database design. It provides a highly integrated set of analysis and design transformation tools, ranging from templates for data element definition and spreadsheets for defining functional dependencies to normalization and a logical design generator.

  8. NEXUS - Resilient Intelligent Middleware

    NASA Astrophysics Data System (ADS)

    Kaveh, N.; Hercock, R. Ghanea

    Service-oriented computing, a composition of distributed-object computing, component-based, and Web-based concepts, is becoming the widespread choice for developing dynamic heterogeneous software assets available as services across a network. One of the major strengths of service-oriented technologies is the high abstraction layer and large granularity level at which software assets are viewed compared to traditional object-oriented technologies. Collaboration through encapsulated and separately defined service interfaces creates a service-oriented environment, whereby multiple services can be linked together through their interfaces to compose a functional system. This approach enables better integration of legacy and non-legacy services, via wrapper interfaces, and allows for service composition at a more abstract level especially in cases such as vertical market stacks. The heterogeneous nature of service-oriented technologies and the granularity of their software components makes them a suitable computing model in the pervasive domain.

  9. On the Dielectric Properties of the Martian-like Surface Sediments

    NASA Technical Reports Server (NTRS)

    Heggy, E.; Clifford, S. M.; Morris, R. V.; Paillou, P.; Ruffie, G.

    2004-01-01

    We have undertaken laboratory electromagnetic characterization of the total set of minerals identified by TES on the Martian surface in order to investigate experimentally the dielectric properties of the sediments covering it in the frequency range from 1 to 30 MHz. Volcanic rocks with a well-defined mineralogy and petrology from potential terrestrial analogue sites have also been included in the study. Our primary objective is to evaluate the range of electrical and magnetic losses that may be encountered by the various radar sounding and imaging experiments dedicated to mapping the Martian subsurface in search of underground water. The electromagnetic properties of these Mars-like materials will be presented as a function of various geophysical parameters, such as porosity, bulk density and temperature. The secondary objective is to locate regions where surface dielectric conditions are suitable for subsurface sounding.

  10. Electrograms (ECG, EEG, EMG, EOG).

    PubMed

    Reilly, Richard B; Lee, T Clive

    2010-01-01

    There is a constant need in medicine to obtain objective measurements of physical and cognitive function as the basis for diagnosis and monitoring of health. The body can be considered as a chemical and electrical system supported by a mechanical structure. Measuring and quantifying such electrical activity provides a means for objective examination of health status. The term electrogram, from the Greek electro meaning electricity and gram meaning write or record, is the broad definition given to the recording of electrical signals from the body. In order that comparisons of electrical activity can be made against normative data, certain methods and procedures have been defined for different electrograms. This paper reviews these methods and procedures for the more typical electrograms associated with some of the major organs in the body, providing a first point of reference for the reader.

  11. II.3. Electrograms (ECG, EEG, EMG, EOG).

    PubMed

    Reilly, Richard B; Lee, T Clive

    2010-01-01

    There is a constant need in medicine to obtain objective measurements of physical and cognitive function as the basis for diagnosis and monitoring of health. The body can be considered as a chemical and electrical system supported by a mechanical structure. Measuring and quantifying such electrical activity provides a means for objective examination of health status. The term electrogram, from the Greek electro meaning electricity and gram meaning write or record, is the broad definition given to the recording of electrical signals from the body. In order that comparisons of electrical activity can be made against normative data, certain methods and procedures have been defined for different electrograms. This paper reviews these methods and procedures for the more typical electrograms associated with some of the major organs in the body, providing a first point of reference for the reader.

  12. Representation Elements of Spatial Thinking

    NASA Astrophysics Data System (ADS)

    Fiantika, F. R.

    2017-04-01

    This paper aims to add a reference for revealing spatial thinking. There are several definitions of spatial thinking, but it is not easy to define. We can start by discussing the concept and its basis, the forming of representations. Initially, the five senses capture natural phenomena and forward them to memory for processing. Abstraction plays a role in processing information into a concept. There are two types of representation, namely internal representation and external representation. The internal representation is also known as mental representation; this representation is in the human mind. The external representation may include images, auditory and kinesthetic forms, which can be used to describe, explain and communicate the structure, operation, and function of the object as well as its relationships. There are two main elements, representation properties and object relationships. These elements play a role in forming a representation.

  13. Computer Vision and Machine Learning for Autonomous Characterization of AM Powder Feedstocks

    NASA Astrophysics Data System (ADS)

    DeCost, Brian L.; Jain, Harshvardhan; Rollett, Anthony D.; Holm, Elizabeth A.

    2017-03-01

    By applying computer vision and machine learning methods, we develop a system to characterize powder feedstock materials for metal additive manufacturing (AM). Feature detection and description algorithms are applied to create a microstructural scale image representation that can be used to cluster, compare, and analyze powder micrographs. When applied to eight commercial feedstock powders, the system classifies powder images into the correct material systems with greater than 95% accuracy. The system also identifies both representative and atypical powder images. These results suggest the possibility of measuring variations in powders as a function of processing history, relating microstructural features of powders to properties relevant to their performance in AM processes, and defining objective material standards based on visual images. A significant advantage of the computer vision approach is that it is autonomous, objective, and repeatable.

  14. Report on the Evaluation Workshop in the Affective Domain, July, 1970.

    ERIC Educational Resources Information Center

    Lieberman, Marcus; And Others

    A report on the evaluation Workshop to define school objectives is presented. The three-week workshop in defining and measuring objectives in the areas of interests, attitudes and values was held at Emerson School in Elmhurst, Illinois. Some questions studied by the workshop group include the following: Can interests, attitudes, and values be…

  15. An ERP Study on Self-Relevant Object Recognition

    ERIC Educational Resources Information Center

    Miyakoshi, Makoto; Nomura, Michio; Ohira, Hideki

    2007-01-01

    We performed an event-related potential study to investigate the self-relevance effect in object recognition. Three stimulus categories were prepared: SELF (participant's own objects), FAMILIAR (disposable and public objects, defined as objects with less-self-relevant familiarity), and UNFAMILIAR (others' objects). The participants' task was to…

  16. Spatial-heterodyne interferometry for transmission (SHIFT) measurements

    DOEpatents

    Bingham, Philip R.; Hanson, Gregory R.; Tobin, Ken W.

    2006-10-10

    Systems and methods are described for spatial-heterodyne interferometry for transmission (SHIFT) measurements. A method includes digitally recording a spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis using a reference beam, and an object beam that is transmitted through an object that is at least partially translucent; Fourier analyzing the digitally recorded spatially-heterodyned hologram, by shifting an original origin of the digitally recorded spatially-heterodyned hologram to sit on top of a spatial-heterodyne carrier frequency defined by an angle between the reference beam and the object beam, to define an analyzed image; digitally filtering the analyzed image to cut off signals around the original origin to define a result; and performing an inverse Fourier transform on the result.
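
    In signal-processing terms, the analysis step described above amounts to Fourier transforming the recorded fringe pattern, re-centering the spectrum on the heterodyne carrier frequency, low-pass filtering around that new origin, and inverse transforming. The numpy sketch below illustrates only that generic sequence on a synthetic hologram; the carrier frequency, filter radius, and object phase are arbitrary inventions, and nothing here reproduces the patented system.

      # Generic spatial-heterodyne demodulation sketch on synthetic data (illustrative only).
      import numpy as np

      n = 256
      y, x = np.mgrid[0:n, 0:n]
      carrier = (40, 0)                              # assumed fringe carrier (cycles per frame in x, y)
      phase_obj = 0.3 * np.exp(-((x - n / 2) ** 2 + (y - n / 2) ** 2) / (2 * 30.0 ** 2))
      hologram = 1.0 + np.cos(2 * np.pi * (carrier[0] * x + carrier[1] * y) / n + phase_obj)

      spectrum = np.fft.fft2(hologram)
      shifted = np.roll(spectrum, shift=(-carrier[1], -carrier[0]), axis=(0, 1))  # carrier -> origin

      ky, kx = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n), indexing="ij")
      lowpass = (np.sqrt(kx ** 2 + ky ** 2) < 0.08).astype(float)   # keep the sideband near the origin
      recovered = np.fft.ifft2(shifted * lowpass)

      print("recovered phase range:", np.ptp(np.angle(recovered)))  # roughly the object phase depth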

  17. Psychological status as a function of residual scarring and facial asymmetry after surgical repair of cleft lip and palate.

    PubMed

    Millar, Keith; Bell, Aileen; Bowman, Adrian; Brown, Denise; Lo, Tsz-Wai; Siebert, Paul; Simmons, David; Ayoub, Ashraf

    2013-03-01

    Objective: To objectively measure scarring and three-dimensional (3D) facial asymmetry after surgical correction of unilateral cleft lip (UCL) and unilateral cleft lip and palate (UCLP). It was hypothesized that the degree of scarring or asymmetry would be correlated with poorer psychological function. Design: In a cross-sectional design, children underwent 3D imaging of the face and completed standardized assessments of self-esteem, depression, and state and trait anxiety. Parents rated children's adjustment with a standard scale. Setting: Glasgow Dental School, School of Medicine, College of Medical, Veterinary and Life Sciences. Patients: Fifty-one children aged 10 years with UCLP and 43 with UCL were recruited from the cohort treated with the surgical protocol of the CLEFTSIS managed clinical network in Scotland. Methods: Objective assessment to determine the luminance and redness of the scar and facial asymmetry. Depression, anxiety, and a self-esteem assessment battery were used for the psychological analysis. Results: Cleft cases showed superior psychological adjustment when compared with normative data. Prevalence of depression matched the population norm. The visibility of the scar (luminance ratio) was significantly correlated with lower self-esteem and higher trait anxiety in UCLP children (P = .004). Similar but nonsignificant trends were seen in the UCL group. Parental ratings of poorer adjustment also correlated with greater luminance of the scar. Conclusions: The objectively defined degree of postoperative cleft scarring was associated with subclinical symptoms of anxiety, depression, and low self-esteem.

  18. Expert system technologies for Space Shuttle decision support: Two case studies

    NASA Technical Reports Server (NTRS)

    Ortiz, Christopher J.; Hasan, David A.

    1994-01-01

    This paper addresses the issue of integrating the C Language Integrated Production System (CLIPS) into distributed data acquisition environments. In particular, it presents preliminary results of some ongoing software development projects aimed at exploiting CLIPS technology in the new mission control center (MCC) being built at NASA Johnson Space Center. One interesting aspect of the control center is its distributed architecture; it consists of networked workstations which acquire and share data through the NASA/JSC-developed information sharing protocol (ISP). This paper outlines some approaches taken to integrate CLIPS and ISP in order to permit the development of intelligent data analysis applications which can be used in the MCC. Three approaches to CLIPS/ISP integration are discussed. The initial approach involves clearly separating CLIPS from ISP using user-defined functions for gathering and sending data to and from a local storage buffer. Memory and performance drawbacks of this design are summarized. The second approach involves taking full advantage of CLIPS and the CLIPS Object-Oriented Language (COOL) by using objects to directly transmit data and state changes from ISP to COOL. Any changes within the object slots eliminate the need for both a data structure and an external function call, thus taking advantage of the object matching capabilities within CLIPS 6.0. The final approach is to treat CLIPS and ISP as peer toolkits. Neither is embedded in the other; rather, the application interweaves calls to each directly in the application source code.

  19. An interactive visualization tool for mobile objects

    NASA Astrophysics Data System (ADS)

    Kobayashi, Tetsuo

    Recent advancements in mobile devices---such as Global Positioning System (GPS), cellular phones, car navigation system, and radio-frequency identification (RFID)---have greatly influenced the nature and volume of data about individual-based movement in space and time. Due to the prevalence of mobile devices, vast amounts of mobile objects data are being produced and stored in databases, overwhelming the capacity of traditional spatial analytical methods. There is a growing need for discovering unexpected patterns, trends, and relationships that are hidden in the massive mobile objects data. Geographic visualization (GVis) and knowledge discovery in databases (KDD) are two major research fields that are associated with knowledge discovery and construction. Their major research challenges are the integration of GVis and KDD, enhancing the ability to handle large volume mobile objects data, and high interactivity between the computer and users of GVis and KDD tools. This dissertation proposes a visualization toolkit to enable highly interactive visual data exploration for mobile objects datasets. Vector algebraic representation and online analytical processing (OLAP) are utilized for managing and querying the mobile object data to accomplish high interactivity of the visualization tool. In addition, reconstructing trajectories at user-defined levels of temporal granularity with time aggregation methods allows exploration of the individual objects at different levels of movement generality. At a given level of generality, individual paths can be combined into synthetic summary paths based on three similarity measures, namely, locational similarity, directional similarity, and geometric similarity functions. A visualization toolkit based on the space-time cube concept exploits these functionalities to create a user-interactive environment for exploring mobile objects data. Furthermore, the characteristics of visualized trajectories are exported to be utilized for data mining, which leads to the integration of GVis and KDD. Case studies using three movement datasets (personal travel data survey in Lexington, Kentucky, wild chicken movement data in Thailand, and self-tracking data in Utah) demonstrate the potential of the system to extract meaningful patterns from the otherwise difficult to comprehend collections of space-time trajectories.
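
    As an illustration of the trajectory-similarity idea mentioned above, the sketch below computes a simple locational similarity between two trajectories sampled at common time stamps. The dissertation's exact similarity definitions are not given in the abstract, so this Euclidean formulation and the example coordinates are assumptions.

```python
import numpy as np

# Illustrative locational-similarity measure between two trajectories sampled
# at the same time stamps. The exact definition used in the dissertation is
# not stated in the abstract, so this Euclidean form is an assumption.

def locational_similarity(traj_a, traj_b):
    """traj_a, traj_b: arrays of shape (n, 2) with (x, y) at common times.
    Returns a value in (0, 1]; 1 means identical positions throughout."""
    a, b = np.asarray(traj_a, float), np.asarray(traj_b, float)
    mean_dist = np.linalg.norm(a - b, axis=1).mean()
    return 1.0 / (1.0 + mean_dist)

walk_to_work = [(0, 0), (1, 1), (2, 2), (3, 3)]
bus_to_work  = [(0, 0), (2, 1), (3, 2), (3, 3)]
print(round(locational_similarity(walk_to_work, bus_to_work), 3))
```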

  20. Hand2 loss-of-function in Hand1-expressing Cells Reveals Distinct Roles In Epicardial And Coronary Vessel Development

    PubMed Central

    Barnes, Ralston M.; Firulli, Beth A.; VanDusen, Nathan J.; Morikawa, Yuka; Conway, Simon J.; Cserjesi, Peter; Vincentz, Joshua W.; Firulli, Anthony B.

    2011-01-01

    Rationale The bHLH transcription factors Hand1 and Hand2 are essential for embryonic development. Given their requirement for cardiogenesis, it is imperative to determine their impact on cardiovascular function. Objective Deduce the role of Hand2 within the epicardium. Method & Results We engineered a Hand1 allele expressing Cre recombinase. Cardiac Hand1 expression is largely limited to cells of the primary heart field, overlapping little with Hand2 expression. Hand1 is expressed within the septum transversum (ST) and the Hand1-lineage marks the proepicardial organ and epicardium. To examine Hand factor functional overlap, we conditionally deleted Hand2 from Hand1-expressing cells. Hand2 mutants display defective epicardialization and fail to form coronary arteries, coincident with altered ECM deposition and Pdgfr expression. Conclusion These data demonstrate a hierarchal relationship whereby transient Hand1 ST expression defines epicardial precursors that are subsequently dependent upon Hand2 function. PMID:21350214

  1. Toward Optimal Transport Networks

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia; Kincaid, Rex K.; Vargo, Erik P.

    2008-01-01

    Strictly evolutionary approaches to improving the air transport system, a highly complex network of interacting systems, no longer suffice in the face of demand that is projected to double or triple in the near future. Thus, evolutionary approaches should be augmented with active design methods. The ability to actively design, optimize and control a system presupposes the existence of predictive modeling and reasonably well-defined functional dependences between the controllable variables of the system and objective and constraint functions for optimization. Following recent advances in the studies of the effects of network topology structure on dynamics, we investigate the performance of dynamic processes on transport networks as a function of the first nontrivial eigenvalue of the network's Laplacian, which, in turn, is a function of the network's connectivity and modularity. The last two characteristics can be controlled and tuned via optimization. We consider design optimization problem formulations. We have developed a flexible simulation of network topology coupled with flows on the network for use as a platform for computational experiments.
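
    The quantity being tuned here, the first nontrivial eigenvalue of the graph Laplacian (the algebraic connectivity), is straightforward to compute. The sketch below does so with NumPy for a small undirected network; the five-node adjacency matrix is made up for illustration.

```python
import numpy as np

# Compute the first nontrivial (second-smallest) eigenvalue of a graph
# Laplacian -- the algebraic connectivity used as the tunable performance
# surrogate. The 5-node example adjacency matrix is invented.

A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 1, 0],
              [0, 1, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)

L = np.diag(A.sum(axis=1)) - A          # Laplacian L = D - A
eigenvalues = np.sort(np.linalg.eigvalsh(L))
lambda_2 = eigenvalues[1]               # first nontrivial eigenvalue
print(f"algebraic connectivity = {lambda_2:.4f}")
```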

  2. Adaptive critic neural network-based object grasping control using a three-finger gripper.

    PubMed

    Jagannathan, S; Galan, Gustavo

    2004-03-01

    Grasping of objects has been a challenging task for robots. The complex grasping task can be defined as object contact control and manipulation subtasks. In this paper, object contact control subtask is defined as the ability to follow a trajectory accurately by the fingers of a gripper. The object manipulation subtask is defined in terms of maintaining a predefined applied force by the fingers on the object. A sophisticated controller is necessary since the process of grasping an object without a priori knowledge of the object's size, texture, softness, gripper, and contact dynamics is rather difficult. Moreover, the object has to be secured accurately and considerably fast without damaging it. Since the gripper, contact dynamics, and the object properties are not typically known beforehand, an adaptive critic neural network (NN)-based hybrid position/force control scheme is introduced. The feedforward action generating NN in the adaptive critic NN controller compensates the nonlinear gripper and contact dynamics. The learning of the action generating NN is performed on-line based on a critic NN output signal. The controller ensures that a three-finger gripper tracks a desired trajectory while applying desired forces on the object for manipulation. Novel NN weight tuning updates are derived for the action generating and critic NNs so that Lyapunov-based stability analysis can be shown. Simulation results demonstrate that the proposed scheme successfully allows fingers of a gripper to secure objects without the knowledge of the underlying gripper and contact dynamics of the object compared to conventional schemes.

  3. Long-term outcome after resection of brainstem hemangioblastomas in von Hippel-Lindau disease

    PubMed Central

    Wind, Joshua J.; Bakhtian, Kamran D.; Sweet, Jennifer A.; Mehta, Gautam U.; Thawani, Jayesh P.; Asthagiri, Ashok R.; Oldfield, Edward H.; Lonser, Russell R.

    2016-01-01

    Object Brainstem hemangioblastomas are frequently encountered in patients with von Hippel-Lindau (VHL) disease. These tumors can cause significant morbidity, and their optimal management has not been defined. To better define the outcome and management of these tumors, the authors analyzed the long-term results in patients who underwent resection of brainstem hemangioblastomas. Methods Consecutive patients with VHL disease who underwent resection of brainstem hemangioblastomas with a follow-up of 12 months or more were included in this study. Serial functional assessments, radiographic examinations, and operative records were analyzed. Results Forty-four patients (17 male and 27 female) underwent 51 operations for resection of 71 brainstem hemangioblastomas. The most common presenting symptoms were headache, swallowing difficulties, singultus, gait difficulties, and sensory abnormalities. The mean follow-up was 5.9 ± 5.0 years (range 1.0–20.8 years). Immediately after 34 operations (66.7%), the patients remained at their preoperative functional status; they improved after 8 operations (15.7%) and worsened after 9 operations (17.6%) as measured by the McCormick scale. Eight (88.9%) of the 9 patients who were worse immediately after resection returned to their preoperative status within 6 months. Two patients experienced functional decline during long-term follow-up (beginning at 2.5 and 5 years postoperatively) caused by extensive VHL disease–associated CNS disease. Conclusions Generally, resection of symptomatic brainstem hemangioblastomas is a safe and effective management strategy in patients with VHL disease. Most patients maintain their preoperative functional status, although long-term decline in functional status may occur due to VHL disease–associated progression. PMID:20932100

  4. Physical function improvements and relief from fatigue and pain are associated with increased productivity at work and at home in rheumatoid arthritis patients treated with certolizumab pegol

    PubMed Central

    Taylor, Peter; Strand, Vibeke; Purcaru, Oana; Coteur, Geoffroy; Mease, Philip

    2010-01-01

    Objectives. To evaluate the association between improvements in physical function, fatigue and pain and improvements in productivity at work and at home in patients treated with certolizumab pegol (CZP) in combination with MTX. Methods. Physical function, fatigue and pain were assessed in two CZP clinical trials (Rheumatoid Arthritis PreventIon of structural Damage 1 and 2) using the HAQ-Disability Index (HAQ-DI), Fatigue Assessment Scale (FAS) and Patient Assessment of Pain, with minimal clinically important differences (MCIDs) defined as ≥0.22, ≥1 and ≥10 points, respectively. Work and home productivity were evaluated using the RA-specific Work Productivity Survey (WPS-RA). The odds of achieving an HAQ-DI, FAS or pain ‘response’ at Week 12, defined as improvements ≥MCID, were compared between CZP and control groups. Improvements in productivity at Week 12 were compared between CZP-treated HAQ-DI, FAS or pain responders and non-responders. Results. The odds of achieving improvements ≥MCID were five times higher for pain, and two to three times higher for physical function and fatigue, in patients receiving CZP vs control. Per month, responders reported significantly greater improvements in productivity at work and reduced interference of RA with their work productivity than non-responders. Responders also reported significantly greater improvements in productivity at home and participation in family, social and leisure activities. Conclusions. This study demonstrated a clear association between patient-reported improvements in physical function, fatigue and pain, and improvements in productivity both at work and home. PMID:20547658
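
    A minimal sketch of the Week-12 responder rule described above follows; the MCID thresholds are taken from the abstract, and the assumption that improvement corresponds to a decrease in score reflects the usual scoring direction of these instruments.

```python
# Responder rule from the abstract: an improvement at least as large as the
# minimal clinically important difference (MCID) on each instrument.
# Assumes lower scores are better on HAQ-DI, FAS, and the pain scale.

MCID = {"HAQ-DI": 0.22, "FAS": 1.0, "PAIN": 10.0}

def is_responder(baseline, week12, instrument):
    """Improvement means the score decreased by at least the MCID."""
    return (baseline - week12) >= MCID[instrument]

print(is_responder(baseline=1.5, week12=1.2, instrument="HAQ-DI"))  # True
print(is_responder(baseline=60.0, week12=55.0, instrument="PAIN"))  # False
```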

  5. Second-order optimality conditions for problems with C1 data

    NASA Astrophysics Data System (ADS)

    Ginchev, Ivan; Ivanov, Vsevolod I.

    2008-04-01

    In this paper we obtain second-order optimality conditions of Karush-Kuhn-Tucker type and of Fritz John type for a problem with inequality constraints and a set constraint in nonsmooth settings using second-order directional derivatives. In the necessary conditions we suppose that the objective function and the active constraints are continuously differentiable, but their gradients are not necessarily locally Lipschitz. In the sufficient conditions for a global minimum we assume that the objective function is differentiable and second-order pseudoconvex at the candidate point, a notion introduced by the authors [I. Ginchev, V.I. Ivanov, Higher-order pseudoconvex functions, in: I.V. Konnov, D.T. Luc, A.M. Rubinov (Eds.), Generalized Convexity and Related Topics, in: Lecture Notes in Econom. and Math. Systems, vol. 583, Springer, 2007, pp. 247-264], and that the constraints are differentiable and quasiconvex at that point. In the sufficient conditions for an isolated local minimum of order two we suppose that the problem belongs to the class C1,1. We show that they do not hold for C1 problems that are not C1,1. Finally, a new notion, the parabolic local minimum, is defined and applied to extend the sufficient conditions for an isolated local minimum from problems with C1,1 data to problems with C1 data.
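
    For orientation, the classical smooth template that such results generalize can be written as follows; this is the standard Karush-Kuhn-Tucker and second-order sufficiency statement for an inequality-constrained problem, not the paper's weaker C1 conditions based on second-order directional derivatives.

```latex
% Classical smooth template (for orientation only; the paper weakens these
% assumptions to C^1 data and second-order directional derivatives).
% Problem: minimize f(x) subject to g_i(x) <= 0, i = 1, ..., m.
\begin{align*}
\text{KKT:}\quad & \nabla f(\bar{x}) + \sum_{i=1}^{m} \lambda_i \nabla g_i(\bar{x}) = 0,
  \qquad \lambda_i \ge 0, \qquad \lambda_i\, g_i(\bar{x}) = 0, \\
\text{Second order:}\quad & d^{\top} \nabla^2_{xx} L(\bar{x}, \lambda)\, d > 0
  \quad \text{for all } d \ne 0 \text{ in the critical cone}, \\
& \text{with } L(x, \lambda) = f(x) + \sum_{i=1}^{m} \lambda_i g_i(x),
  \text{ which implies that } \bar{x} \text{ is a strict local minimizer.}
\end{align*}
```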

  6. Exploring oxidative ageing behaviour of hydrocarbons using ab initio molecular dynamics analysis

    NASA Astrophysics Data System (ADS)

    Pan, Tongyan; Cheng, Cheng

    2016-06-01

    With a proper approximate solution to the Schrödinger Equation of a multi-electron system, the method of ab initio molecular dynamics (AIMD) performs first-principles molecular dynamics analysis without pre-defining interatomic potentials as are mandatory in traditional molecular dynamics analyses. The objective of this study is to determine the oxidative-ageing pathway of petroleum asphalt as a typical hydrocarbon system, using the AIMD method. This objective was accomplished in three steps, including (1) identifying a group of representative asphalt molecules to model, (2) determining an atomistic modelling method that can effectively simulate the production of critical functional groups in oxidative ageing of hydrocarbons and (3) evaluating the oxidative-ageing pathway of a hydrocarbon system. The determination of oxidative-ageing pathway of hydrocarbons was done by tracking the generations of critical functional groups in the course of oxidative ageing. The chemical elements of carbon, nitrogen and sulphur all experience oxidative reactions, producing polarised functional groups such as ketones, aldehydes or carboxylic acids, pyrrolic groups and sulphoxides. The electrostatic forces of the polarised groups generated in oxidation are responsible for the behaviour of aged hydrocarbons. The developed AIMD model can be used for modelling the ageing of generic hydrocarbon polymers and developing antioxidants without running expensive experiments.

  7. Target objects defined by a conjunction of colour and shape can be selected independently and in parallel.

    PubMed

    Jenkins, Michael; Grubert, Anna; Eimer, Martin

    2017-11-01

    It is generally assumed that during search for targets defined by a feature conjunction, attention is allocated sequentially to individual objects. We tested this hypothesis by tracking the time course of attentional processing biases with the N2pc component in tasks where observers searched for two targets defined by a colour/shape conjunction. In Experiment 1, two displays presented in rapid succession (100 ms or 10 ms SOA) each contained a target and a colour-matching or shape-matching distractor on opposite sides. Target objects in both displays elicited N2pc components of similar size that overlapped in time when the SOA was 10 ms, suggesting that attention was allocated in parallel to both targets. Analogous results were found in Experiment 2, where targets and partially matching distractors were both accompanied by an object without target-matching features. Colour-matching and shape-matching distractors also elicited N2pc components, and the target N2pc was initially identical to the sum of the two distractor N2pcs, suggesting that the initial phase of attentional object selection was guided independently by feature templates for target colour and shape. Beyond 230 ms after display onset, the target N2pc became superadditive, indicating that attentional selection processes now started to be sensitive to the presence of feature conjunctions. Results show that independent attentional selection processes can be activated in parallel by two target objects in situations where these objects are defined by a feature conjunction.

  8. Multispectral imaging with vertical silicon nanowires

    PubMed Central

    Park, Hyunsung; Crozier, Kenneth B.

    2013-01-01

    Multispectral imaging is a powerful tool that extends the capabilities of the human eye. However, multispectral imaging systems generally are expensive and bulky, and multiple exposures are needed. Here, we report the demonstration of a compact multispectral imaging system that uses vertical silicon nanowires to realize a filter array. Multiple filter functions covering visible to near-infrared (NIR) wavelengths are simultaneously defined in a single lithography step using a single material (silicon). Nanowires are then etched and embedded into polydimethylsiloxane (PDMS), thereby realizing a device with eight filter functions. By attaching it to a monochrome silicon image sensor, we successfully realize an all-silicon multispectral imaging system. We demonstrate visible and NIR imaging. We show that the latter is highly sensitive to vegetation and furthermore enables imaging through objects opaque to the eye. PMID:23955156

  9. Protein function in precision medicine: deep understanding with machine learning.

    PubMed

    Rost, Burkhard; Radivojac, Predrag; Bromberg, Yana

    2016-08-01

    Precision medicine and personalized health efforts propose leveraging complex molecular, medical and family history, along with other types of personal data toward better life. We argue that this ambitious objective will require advanced and specialized machine learning solutions. Simply skimming some low-hanging results off the data wealth might have limited potential. Instead, we need to better understand all parts of the system to define medically relevant causes and effects: how do particular sequence variants affect particular proteins and pathways? How do these effects, in turn, cause the health or disease-related phenotype? Toward this end, deeper understanding will not simply diffuse from deeper machine learning, but from more explicit focus on understanding protein function, context-specific protein interaction networks, and impact of variation on both. © 2016 Federation of European Biochemical Societies.

  10. Synchronous Parallel Emulation and Discrete Event Simulation System with Self-Contained Simulation Objects and Active Event Objects

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in a method of performing object-oriented simulation and a system having inter-connected processor nodes operating in parallel to simulate mutual interactions of a set of discrete simulation objects distributed among the nodes as a sequence of discrete events changing state variables of respective simulation objects so as to generate new event-defining messages addressed to respective ones of the nodes. The object-oriented simulation is performed at each one of the nodes by assigning passive self-contained simulation objects to each one of the nodes, responding to messages received at one node by generating corresponding active event objects having user-defined inherent capabilities and individual time stamps and corresponding to respective events affecting one of the passive self-contained simulation objects of the one node, restricting the respective passive self-contained simulation objects to only providing and receiving information from the respective active event objects, requesting information and changing variables within a passive self-contained simulation object by the active event object, and producing corresponding messages specifying events resulting therefrom by the active event objects.
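
    The split between passive simulation objects and active, time-stamped event objects can be illustrated with a small sequential sketch. The Python below is only a single-node stand-in for the patented parallel system; the distributed message exchange between nodes is omitted, and the tank example is invented.

```python
import heapq

# Single-node, sequential sketch of the split between passive simulation
# objects (state only) and active event objects (time-stamped behavior).
# The multi-node, parallel message exchange of the patent is omitted.

class SimulationObject:
    """Passive: holds state, does nothing on its own."""
    def __init__(self, name, state):
        self.name, self.state = name, dict(state)

class EventObject:
    """Active: time-stamped, reads and changes a simulation object's state
    and may schedule new events."""
    def __init__(self, time, target, key, delta):
        self.time, self.target, self.key, self.delta = time, target, key, delta
    def __lt__(self, other):
        return self.time < other.time
    def execute(self, schedule):
        self.target.state[self.key] += self.delta
        if self.target.state[self.key] > 100:        # derived follow-on event
            heapq.heappush(schedule, EventObject(self.time + 1.0, self.target,
                                                 self.key, -50))

tank = SimulationObject("tank", {"level": 90})
schedule = [EventObject(0.0, tank, "level", 20)]
while schedule:
    event = heapq.heappop(schedule)
    event.execute(schedule)
print(tank.state)   # {'level': 60} after the fill and the automatic drain
```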

  11. Defining the Diverse Cell Populations Contributing to Lignification in Arabidopsis Stems.

    PubMed

    Smith, Rebecca A; Schuetz, Mathias; Karlen, Steven D; Bird, David; Tokunaga, Naohito; Sato, Yasushi; Mansfield, Shawn D; Ralph, John; Samuels, A Lacey

    2017-06-01

    Many land plants evolved tall and sturdy growth habits due to specialized cells with thick lignified cell walls: tracheary elements that function in water transport and fibers that function in structural support. The objective of this study was to define how and when diverse cell populations contribute lignin precursors, monolignols, to secondary cell walls during lignification of the Arabidopsis ( Arabidopsis thaliana ) inflorescence stem. Previous work demonstrated that, when lignin biosynthesis is suppressed in fiber and tracheary element cells with thickened walls, fibers become lignin-depleted while vascular bundles still lignify, suggesting that nonlignifying neighboring xylem cells are contributing to lignification. In this work, we dissect the contributions of different cell types, specifically xylary parenchyma and fiber cells, to lignification of the stem using cell-type-specific promoters to either knock down an essential monolignol biosynthetic gene or to introduce novel monolignol conjugates. Analysis of either reductions in lignin in knockdown lines, or the addition of novel monolignol conjugates, directly identifies the xylary parenchyma and fiber cell populations that contribute to the stem lignification and the developmental timing at which each contribution is most important. © 2017 American Society of Plant Biologists. All Rights Reserved.

  12. Relationship Between Psychomotor Efficiency and Sensation Seeking of People Exposed to Noise and Low Frequency Vibration Stimuli

    NASA Astrophysics Data System (ADS)

    Korchut, Aleksander; Kowalska-Koczwara, Alicja; Romanska-Zapała, Anna; Stypula, Krzysztof

    2017-10-01

    At the workplace of the machine operator, low-frequency whole-body and hand-arm vibrations are observed, and they occur together with noise. Whole-body vibration in the range of 3-25 Hz is detrimental to the human body because the resonant frequencies of its large organs lie in this range. It can therefore be assumed that people working every day in such conditions may have reduced working efficiency. The influence of low-frequency vibration and noise on the human body leads to both physiological and functional changes. The effect of noise and vibration stimuli depends largely on the specific characteristics of the subjects, including personality traits, temperament, and emotional factors. A pilot laboratory study was conducted with 30 young men. The aim of the study was to look for correlations between the subjects' need for stimulation and their psychomotor efficiency under vibration exposure, and under vibration combined with noise exposure, in a variable-conditions task. The need for stimulation, as defined in the study, is based on the theoretical assumptions of the one-dimensional model of temperament developed by Marvin Zuckerman. This theory defines the need for stimulation as the search for different, new, complex, and intense sensations, as well as the willingness to take risks. The aim of the research was to verify whether, from the four factors (the search for adventure and horror, sensation seeking, disinhibition, and susceptibility to boredom), those could be identified that, in conjunction with varying operating conditions, significantly determine performance in the task situation. The subjects performed a motor-skills test that consisted of keeping a joystick-controlled cursor within a path; the number of times the cursor left the path and its maximum deviation were recorded. The collected data were used to determine the correlation between working efficiency and the subjects' need for stimulation under the influence of vibroacoustic factors. The analysis of the results allowed a set of criteria describing arduous working conditions to be defined. The obtained results indicate the need to continue the research.

  13. Case management information systems: how to put the pieces together now and beyond year 2000.

    PubMed

    Matthews, P

    1999-01-01

    Healthcare organizations must establish the goals and objectives of their case management processes before functional and system requirements can be defined. A gap analysis will identify existing systems that can be used to support case management as well as areas in need of systems support. The gap analysis will also identify short-term tactical projects and long-term strategic initiatives supporting the automation of case management. The projects resulting from the gap analysis must be incorporated into the organization's business and information systems plan and budget to ensure appropriate funding and prioritization.

  14. Structural tailoring of advanced turboprops

    NASA Technical Reports Server (NTRS)

    Brown, K. W.; Hopkins, Dale A.

    1988-01-01

    The Structural Tailoring of Advanced Turboprops (STAT) computer program was developed to perform numerical optimization on highly swept propfan blades. The optimization procedure seeks to minimize an objective function, defined as either (1) the direct operating cost of the full-scale blade or (2) the aeroelastic differences between a blade and its scaled model, by tuning internal and external geometry variables that must satisfy realistic blade design constraints. The STAT analysis system includes an aerodynamic efficiency evaluation, a finite element stress and vibration analysis, an acoustic analysis, a flutter analysis, and a once-per-revolution forced response life prediction capability. STAT includes all relevant propfan design constraints.
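
    The formulation amounts to a constrained minimization over geometry variables with a selectable objective. The sketch below shows that shape of problem with SciPy; the objective, constraint, and bounds are placeholders, not STAT's actual aerodynamic, structural, or acoustic analyses.

```python
import numpy as np
from scipy.optimize import minimize

# Sketch of a STAT-style formulation: tune geometry variables x to minimize a
# chosen objective subject to blade design constraints. The functions below
# are placeholders, not STAT's analyses.

def direct_operating_cost(x):             # placeholder objective 1
    sweep, thickness = x
    return (sweep - 35.0) ** 2 + 50.0 * (thickness - 0.04) ** 2

def aeroelastic_mismatch(x):              # placeholder objective 2
    sweep, thickness = x
    return abs(0.8 * sweep - 30.0) + abs(thickness - 0.05)

def flutter_margin(x):                    # placeholder constraint: must be >= 0
    sweep, thickness = x
    return 0.1 + thickness - 0.002 * sweep

objective = direct_operating_cost         # or aeroelastic_mismatch
result = minimize(objective, x0=np.array([30.0, 0.05]),
                  bounds=[(20.0, 45.0), (0.02, 0.08)],
                  constraints=[{"type": "ineq", "fun": flutter_margin}])
print(result.x, result.fun)
```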

  15. Definition of avionics concepts for a heavy lift cargo vehicle, volume 2

    NASA Technical Reports Server (NTRS)

    1989-01-01

    A cost effective, multiuser simulation, test, and demonstration facility to support the development of avionics systems for future space vehicles is defined. The technology needs and requirements of future Heavy Lift Cargo Vehicles (HLCVs) are analyzed and serve as the basis for sizing of the avionics facility although the lab is not limited in use to support of HLCVs. Volume 2 is the technical volume and provides the results of the vehicle avionics trade studies, the avionics lab objectives, the lab's functional requirements and design, physical facility considerations, and a summary cost estimate.

  16. Integrated multidisciplinary design optimization of rotorcraft

    NASA Technical Reports Server (NTRS)

    Adelman, Howard M.; Mantay, Wayne R.

    1989-01-01

    The NASA/Army research plan for developing the logic elements for helicopter rotor design optimization by integrating appropriate disciplines and accounting for important interactions among the disciplines is discussed. The paper describes the optimization formulation in terms of the objective function, design variables, and constraints. The analysis aspects are discussed, and an initial effort at defining the interdisciplinary coupling is summarized. Results are presented on the achievements made in the rotor aerodynamic performance optimization for minimum hover horsepower, rotor dynamic optimization for vibration reduction, rotor structural optimization for minimum weight, and integrated aerodynamic load/dynamics optimization for minimum vibration and weight.

  17. Hard exclusive pion electroproduction at backward angles with CLAS

    NASA Astrophysics Data System (ADS)

    Park, K.; Guidal, M.; Gothe, R. W.; Pire, B.; Semenov-Tian-Shansky, K.; Laget, J.-M.; Adhikari, K. P.; Adhikari, S.; Akbar, Z.; Avakian, H.; Ball, J.; Balossino, I.; Baltzell, N. A.; Barion, L.; Battaglieri, M.; Bedlinskiy, I.; Biselli, A. S.; Briscoe, W. J.; Brooks, W. K.; Burkert, V. D.; Cao, F. T.; Carman, D. S.; Celentano, A.; Charles, G.; Chetry, T.; Ciullo, G.; Clark, L.; Cole, P. L.; Contalbrigo, M.; Crede, V.; D'Angelo, A.; Dashyan, N.; De Vita, R.; De Sanctis, E.; Defurne, M.; Deur, A.; Djalali, C.; Dupre, R.; Egiyan, H.; El Alaoui, A.; El Fassi, L.; Elouadrhiri, L.; Eugenio, P.; Fedotov, G.; Fersch, R.; Filippi, A.; Garçon, M.; Ghandilyan, Y.; Gilfoyle, G. P.; Girod, F. X.; Golovatch, E.; Griffioen, K. A.; Guo, L.; Hafidi, K.; Hakobyan, H.; Hanretty, C.; Harrison, N.; Hattawy, M.; Heddle, D.; Hicks, K.; Holtrop, M.; Hyde, C. E.; Ilieva, Y.; Ireland, D. G.; Ishkhanov, B. S.; Isupov, E. L.; Jenkins, D.; Johnston, S.; Joo, K.; Kabir, M. L.; Keller, D.; Khachatryan, G.; Khachatryan, M.; Khandaker, M.; Kim, W.; Klein, F. J.; Kubarovsky, V.; Kuhn, S. E.; Lanza, L.; Livingston, K.; MacGregor, I. J. D.; Markov, N.; McKinnon, B.; Mirazita, M.; Mokeev, V.; Montgomery, R. A.; Munoz Camacho, C.; Nadel-Turonski, P.; Niccolai, S.; Niculescu, G.; Osipenko, M.; Paolone, M.; Paremuzyan, R.; Pasyuk, E.; Phelps, W.; Pogorelko, O.; Poudel, J.; Price, J. W.; Prok, Y.; Protopopescu, D.; Ripani, M.; Rizzo, A.; Rossi, P.; Sabatié, F.; Salgado, C.; Schumacher, R. A.; Sharabian, Y.; Skorodumina, Iu.; Smith, G. D.; Sokhan, D.; Sparveris, N.; Stepanyan, S.; Strakovsky, I. I.; Strauch, S.; Taiuti, M.; Tan, J. A.; Ungaro, M.; Voskanyan, H.; Voutier, E.; Wei, X.; Zachariou, N.; Zhang, J.

    2018-05-01

    We report on the first measurement of cross sections for exclusive deeply virtual pion electroproduction off the proton, ep → e′nπ+, above the resonance region at backward pion center-of-mass angles. The φπ*-dependent cross sections were measured, from which we extracted three combinations of structure functions of the proton. Our results are compatible with calculations based on nucleon-to-pion transition distribution amplitudes (TDAs). These non-perturbative objects are defined as matrix elements of three-quark light-cone operators and characterize partonic correlations with a particular emphasis on the baryon charge distribution inside a nucleon.

  18. Structural Tailoring of Advanced Turboprops (STAT)

    NASA Technical Reports Server (NTRS)

    Brown, Kenneth W.

    1988-01-01

    This interim report describes the progress achieved in the structural Tailoring of Advanced Turboprops (STAT) program which was developed to perform numerical optimizations on highly swept propfan blades. The optimization procedure seeks to minimize an objective function, defined as either direct operating cost or aeroelastic differences between a blade and its scaled model, by tuning internal and external geometry variables that must satisfy realistic blade design constraints. This report provides a detailed description of the input, optimization procedures, approximate analyses and refined analyses, as well as validation test cases for the STAT program. In addition, conclusions and recommendations are summarized.

  19. Patient satisfaction with anaesthesia - Part 1: satisfaction as part of outcome - and what satisfies patients.

    PubMed

    Heidegger, T; Saal, D; Nübling, M

    2013-11-01

    Patients' involvement in all decision processes is becoming increasingly important in modern healthcare. Patient satisfaction is a sensitive measure of a well-functioning health service system. The objective of this review is to discuss patient satisfaction as part of outcome quality, to define the somewhat abstract term 'satisfaction', and to discuss the role of surrogate markers within the field of satisfaction with anaesthesia care. We critically discuss what is relevant to satisfy patients with anaesthesia care, and we provide guidance on improving satisfaction. © 2013 The Association of Anaesthetists of Great Britain and Ireland.

  20. Common object request broker architecture (CORBA)-based security services for the virtual radiology environment.

    PubMed

    Martinez, R; Cole, C; Rozenblit, J; Cook, J F; Chacko, A K

    2000-05-01

    The US Army Great Plains Regional Medical Command (GPRMC) has a requirement to conform to Department of Defense (DoD) and Army security policies for the Virtual Radiology Environment (VRE) Project. Within the DoD, security policy is defined as the set of laws, rules, and practices that regulate how an organization manages, protects, and distributes sensitive information. Security policy in the DoD is described by the Trusted Computer System Evaluation Criteria (TCSEC), Army Regulation (AR) 380-19, Defense Information Infrastructure Common Operating Environment (DII COE), Military Health Services System Automated Information Systems Security Policy Manual, and National Computer Security Center-TG-005, "Trusted Network Interpretation." These documents were used to develop a security policy that defines information protection requirements that are made with respect to those laws, rules, and practices that are required to protect the information stored and processed in the VRE Project. The goal of the security policy is to provide for a C2-level of information protection while also satisfying the functional needs of the GPRMC's user community. This report summarizes the security policy for the VRE and defines the CORBA security services that satisfy the policy. In the VRE, the information to be protected is embedded into three major information components: (1) Patient information consists of Digital Imaging and Communications in Medicine (DICOM)-formatted fields. The patient information resides in the digital imaging network picture archiving and communication system (DIN-PACS) networks in the database archive systems and includes (a) patient demographics; (b) patient images from x-ray, computed tomography (CT), magnetic resonance imaging (MRI), and ultrasound (US); and (c) prior patient images and related patient history. (2) Meta-Manager information to be protected consists of several data objects. This information is distributed to the Meta-Manager nodes and includes (a) radiologist schedules; (b) modality worklists; (c) routed case information; (d) DIN-PACS and Composite Health Care system (CHCS) messages, and Meta-Manager administrative and security information; and (e) patient case information. (3) Access control and communications security is required in the VRE to control who uses the VRE and Meta-Manager facilities and to secure the messages between VRE components. The CORBA Security Service Specification version 1.5 is designed to allow up to TCSEC's B2-level security for distributed objects. The CORBA Security Service Specification defines the functionality of several security features: identification and authentication, authorization and access control, security auditing, communication security, nonrepudiation, and security administration. This report describes the enhanced security features for the VRE and their implementation using commercial CORBA Security Service software products.

  1. Recording multiple spatially-heterodyned direct to digital holograms in one digital image

    DOEpatents

    Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN

    2008-03-25

    Systems and methods are described for recording multiple spatially-heterodyned direct to digital holograms in one digital image. A method includes digitally recording, at a first reference beam-object beam angle, a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram to sit on top of a first spatial-heterodyne carrier frequency defined by the first reference beam-object beam angle; digitally recording, at a second reference beam-object beam angle, a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram to sit on top of a second spatial-heterodyne carrier frequency defined by the second reference beam-object beam angle; applying a first digital filter to cut off signals around the first original origin and define a first result; performing a first inverse Fourier transform on the first result; applying a second digital filter to cut off signals around the second original origin and define a second result; and performing a second inverse Fourier transform on the second result, wherein the first reference beam-object beam angle is not equal to the second reference beam-object beam angle and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.
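
    The per-hologram processing steps named in the claim (Fourier transform, shift of the origin onto the spatial-heterodyne carrier, digital filtering, inverse transform) can be sketched with NumPy as below. The carrier location, filter radius, and input array are made-up values, and the second hologram of the multiplexed pair would be demodulated the same way with its own carrier frequency.

```python
import numpy as np

# Sketch of the per-hologram demodulation steps described in the claim:
# FFT, shift the spectrum so the spatial-heterodyne carrier sits at the
# origin, filter off the remaining terms, inverse FFT. Carrier location,
# filter radius, and the input hologram are invented.

def demodulate(hologram, carrier_row, carrier_col, keep_radius):
    spectrum = np.fft.fft2(hologram)
    # Shift so the carrier frequency becomes the new origin.
    shifted = np.roll(spectrum, (-carrier_row, -carrier_col), axis=(0, 1))
    # Keep only a small band around the new origin; this cuts off the
    # zero-order term, which now sits away from the origin.
    rows, cols = np.indices(shifted.shape)
    rows = np.minimum(rows, shifted.shape[0] - rows)   # wrap-around distance
    cols = np.minimum(cols, shifted.shape[1] - cols)
    mask = (rows ** 2 + cols ** 2) <= keep_radius ** 2
    return np.fft.ifft2(shifted * mask)     # complex object-wave estimate

rng = np.random.default_rng(0)
fake_hologram = rng.random((256, 256))
wave = demodulate(fake_hologram, carrier_row=40, carrier_col=40, keep_radius=12)
print(wave.shape, np.abs(wave).mean())
```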

  2. Reversible preoperative renal dysfunction does not add to the risk of postoperative acute kidney injury after cardiac valve surgery

    PubMed Central

    Xu, Jia-Rui; Zhuang, Ya-Min; Liu, Lan; Shen, Bo; Wang, Yi-Mei; Luo, Zhe; Teng, Jie; Wang, Chun-Sheng; Ding, Xiao-Qiang

    2017-01-01

    Objective To evaluate the impact of the renal dysfunction (RD) type and change of postoperative cardiac function on the risk of developing acute kidney injury (AKI) in patients who underwent cardiac valve surgery. Method Reversible renal dysfunction (RRD) was defined as preoperative RD in patients who had not been initially diagnosed with chronic kidney disease (CKD). Cardiac function improvement (CFI) was defined as postoperative left ventricular ejection function – preoperative left ventricular ejection function (ΔEF) >0%, and cardiac function not improved (CFNI) as ΔEF ≤0%. Results Of the 4,805 (94%) cardiac valve surgery patients, 301 (6%) were RD cases. The AKI incidence in the RRD group (n=252) was significantly lower than in the CKD group (n=49) (36.5% vs 63.3%, P=0.018). The AKI and renal replacement therapy incidences in the CFI group (n=174) were significantly lower than in the CFNI group (n=127) (33.9% vs 50.4%, P=0.004; 6.3% vs 13.4%, P=0.037). After adjustment for age, gender, and other confounding factors, CKD and CKD + CFNI were identified as independent risk factors for AKI in all patients after cardiac valve surgery. Multivariate logistic regression analysis showed that the risk factors for postoperative AKI in preoperative RD patients were age, gender (male), hypertension, diabetes, chronic heart failure, cardiopulmonary bypass time (every 1 min added), and intraoperative hypotension, while CFI after surgery could reduce the risk. Conclusion For cardiac valve surgery patients, preoperative CKD was an independent risk factor for postoperative AKI, but RRD did not add to the risk. Improved postoperative cardiac function can significantly reduce the risk of postoperative AKI. PMID:29184415

  3. One-Year Randomized Controlled Trial and Follow-Up of Integrated Neurocognitive Therapy for Schizophrenia Outpatients

    PubMed Central

    Mueller, Daniel R.; Schmidt, Stefanie J.; Roder, Volker

    2015-01-01

    Objective: Cognitive remediation (CR) approaches have demonstrated to be effective in improving cognitive functions in schizophrenia. However, there is a lack of integrated CR approaches that target multiple neuro- and social-cognitive domains with a special focus on the generalization of therapy effects to functional outcome. Method: This 8-site randomized controlled trial evaluated the efficacy of a novel CR group therapy approach called integrated neurocognitive therapy (INT). INT includes well-defined exercises to improve all neuro- and social-cognitive domains as defined by the Measurement And Treatment Research to Improve Cognition in Schizophrenia (MATRICS) initiative by compensation and restitution. One hundred and fifty-six outpatients with a diagnosis of schizophrenia or schizoaffective disorder according to DSM-IV-TR or ICD-10 were randomly assigned to receive 15 weeks of INT or treatment as usual (TAU). INT patients received 30 bi-weekly therapy sessions. Each session lasted 90min. Mixed models were applied to assess changes in neurocognition, social cognition, symptoms, and functional outcome at post-treatment and at 9-month follow-up. Results: In comparison to TAU, INT patients showed significant improvements in several neuro- and social-cognitive domains, negative symptoms, and functional outcome after therapy and at 9-month follow-up. Number-needed-to-treat analyses indicate that only 5 INT patients are necessary to produce durable and meaningful improvements in functional outcome. Conclusions: Integrated interventions on neurocognition and social cognition have the potential to improve not only cognitive performance but also functional outcome. These findings are important as treatment guidelines for schizophrenia have criticized CR for its poor generalization effects. PMID:25713462

  4. 3D shape recovery of a newborn skull using thin-plate splines.

    PubMed

    Lapeer, R J; Prager, R W

    2000-01-01

    The objective of this paper is to construct a mesh-model of a newborn skull for finite element analysis to study its deformation when subjected to the forces present during labour. The current state of medical imaging technology has reached a level which allows accurate visualisation and shape recovery of biological organs and body-parts. However, a sufficiently large set of medical images cannot always be obtained, often because of practical or ethical reasons, and the requirement to recover the shape of the biological object of interest has to be met by other means. Such is the case for a newborn skull. A method to recover the three-dimensional (3D) shape from (minimum) two orthogonal atlas images of the object of interest and a homologous object is described. This method is based on matching landmarks and curves on the orthogonal images of the object of interest with corresponding landmarks and curves on the homologous or 'master'-object which is fully defined in 3D space. On the basis of this set of corresponding landmarks, a thin-plate spline function can be derived to warp from the 'master'-object space to the 'slave'-object space. This method is applied to recover the 3D shape of a newborn skull. Images from orthogonal view-planes are obtained from an atlas. The homologous object is an adult skull, obtained from CT-images made available by the Visible Human Project. After shape recovery, a mesh-model of the newborn skull is generated.
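
    A minimal sketch of the landmark-driven thin-plate spline warp from the fully defined 'master' object space to the 'slave' object space follows, using SciPy's radial basis function interpolator with a thin-plate-spline kernel; the landmark coordinates are invented and far simpler than real cranial landmarks.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Sketch of a landmark-driven thin-plate spline warp from the 'master' object
# (adult skull) to the 'slave' object (newborn skull). Landmark coordinates
# below are invented for illustration only.

master_landmarks = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 12.0, 0.0],
                             [0.0, 0.0, 9.0], [10.0, 12.0, 9.0], [5.0, 6.0, 4.0]])
slave_landmarks  = np.array([[0.0, 0.0, 0.0], [7.0, 0.0, 0.0],  [0.0, 8.0, 0.0],
                             [0.0, 0.0, 6.0], [7.0, 8.0, 6.0],  [3.5, 4.0, 3.0]])

# Fit one 3-D thin-plate spline mapping master space -> slave space.
warp = RBFInterpolator(master_landmarks, slave_landmarks,
                       kernel="thin_plate_spline")

# Apply the warp to arbitrary points of the master surface mesh.
master_vertices = np.array([[2.0, 3.0, 1.0], [8.0, 10.0, 7.0]])
print(warp(master_vertices))
```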

  5. The visual perception of metal.

    PubMed

    Todd, James T; Norman, J Farley

    2018-03-01

    The present research was designed to examine how the presence or absence of ambient light influences the appearance of metal. The stimuli depicted three possible objects that were illuminated by three possible patterns of illumination. These were generated by a single point light source, two rectangular area lights, or projecting light onto a translucent white box that contained the object (and the camera) so that the object would be illuminated by ambient light in all directions. The materials were simulated using measured parameters of chrome with four different levels of roughness. Observers rated the metallic appearance and shininess of each depicted object using two sliders. The highest rated appearance of metal and shininess occurred for the surfaces with the lowest roughness in the ambient illumination condition, and these ratings dropped systematically as the roughness was increased. For the objects illuminated by point or area lights, the appearance of metal and shininess were significantly less than in the ambient conditions for the lowest roughness value, and significantly greater than in the ambient condition for intermediate values of roughness. We also included a control condition depicting objects with a shiny plastic reflectance function that had both diffuse and specular components. These objects were rated as highly shiny but they did not appear metallic. A theoretical hypothesis is proposed that the defining characteristic of metal (as opposed to black plastic) is the presence of specular sheen over most of the visible surface area.

  6. Collaborative WorkBench for Researchers - Work Smarter, Not Harder

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Kuo, Kwo-sen; Maskey, Manil; Lynnes, Christopher

    2014-01-01

    It is important to define some commonly used terminology related to collaboration to facilitate clarity in later discussions. We define provisioning as infrastructure capabilities such as computation, storage, data, and tools provided by some agency or similarly trusted institution. Sharing is defined as the process of exchanging data, programs, and knowledge among individuals (often strangers) and groups. Collaboration is a specialized case of sharing. In collaboration, sharing with others (usually known colleagues) is done in pursuit of a common scientific goal or objective. Collaboration entails more dynamic and frequent interactions and can occur at different speeds. Synchronous collaboration occurs in real time such as editing a shared document on the fly, chatting, video conference, etc., and typically requires a peer-to-peer connection. Asynchronous collaboration is episodic in nature based on a push-pull model. Examples of asynchronous collaboration include email exchanges, blogging, repositories, etc. The purpose of a workbench is to provide a customizable framework for different applications. Since the workbench will be common to all the customized tools, it promotes building modular functionality that can be used and reused by multiple tools. The objective of our Collaborative Workbench (CWB) is thus to create such an open and extensible framework for the Earth Science community via a set of plug-ins. Our CWB is based on the Eclipse [2] Integrated Development Environment (IDE), which is designed as a small kernel containing a plug-in loader for hundreds of plug-ins. The kernel itself is an implementation of a known specification to provide an environment for the plug-ins to execute. This design enables modularity, where discrete chunks of functionality can be reused to build new applications. The minimal set of plug-ins necessary to create a client application is called the Eclipse Rich Client Platform (RCP) [3]; The Eclipse RCP also supports thousands of community-contributed plug-ins, making it a popular development platform for many diverse applications including the Science Activity Planner developed at JPL for the Mars rovers [4] and the scientific experiment tool Gumtree [5]. By leveraging the Eclipse RCP to provide an open, extensible framework, a CWB supports customizations via plug-ins to build rich user applications specific for Earth Science. More importantly, CWB plug-ins can be used by existing science tools built off Eclipse such as IDL or PyDev to provide seamless collaboration functionalities.

  7. Defining operating rules for mitigation of drought effects on water supply systems

    NASA Astrophysics Data System (ADS)

    Rossi, G.; Caporali, E.; Garrote, L.; Federici, G. V.

    2012-04-01

    Reservoirs play a pivotal role in the regulation and management of water supply systems, especially during drought periods, and optimization of reservoir releases according to drought mitigation rules is particularly required. The hydrologic state of the system is evaluated by defining threshold values expressed in probabilistic terms. Risk deficit curves are used to reduce the ensemble of possible rules for simulation. Threshold values can be linked to specific actions in an operational context at different levels of severity, i.e. normal, pre-alert, alert, and emergency scenarios. A simplified model of the water resources system is built to evaluate the threshold values and the management rules. The threshold values are defined by considering the probability of satisfying a given fraction of the demand over a certain time horizon, and are validated with a long-term simulation that takes into account the characteristics of the evaluated system. The threshold levels determine curves that define reservoir releases as a function of the existing storage volume, and a demand reduction is associated with each threshold level. The rules to manage the system in drought conditions, the threshold levels, and the reductions are optimized using long-term simulations with different hypothesized states of the system. Synthetic flow sequences with the same statistical properties as the historical ones are produced to evaluate the system behaviour. The performance of different reduction values and different threshold curves is evaluated using different objective functions and performance indices. The methodology is applied to the Firenze-Prato-Pistoia metropolitan area in central Tuscany, Italy. The considered demand centres are Firenze and Bagno a Ripoli, which, according to the 2001 ISTAT census, have a total of 395,000 inhabitants.
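
    A threshold-based rule of the kind described, in which the fraction of demand released depends on the storage band the reservoir currently occupies, can be sketched as below; the threshold volumes and reduction fractions are invented for illustration and are exactly the quantities the paper optimizes by simulation.

```python
# Sketch of a threshold-based drought rule: the release equals the demand
# scaled by a hedging factor that depends on the storage band (normal,
# pre-alert, alert, emergency). Threshold volumes and reduction fractions
# below are invented for illustration.

THRESHOLDS = [          # (lower storage bound in Mm3, fraction of demand served)
    (60.0, 1.00),       # normal
    (40.0, 0.90),       # pre-alert
    (25.0, 0.75),       # alert
    (0.0,  0.50),       # emergency
]

def release(storage_mm3, demand_mm3):
    for lower_bound, fraction in THRESHOLDS:
        if storage_mm3 >= lower_bound:
            return min(demand_mm3 * fraction, storage_mm3)
    return 0.0

for storage in (70.0, 45.0, 30.0, 10.0):
    print(storage, "->", release(storage, demand_mm3=8.0))
```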

  8. Different demographic, genetic, and longitudinal traits in language versus memory Alzheimer's subgroups.

    PubMed

    Mez, Jesse; Cosentino, Stephanie; Brickman, Adam M; Huey, Edward D; Mayeux, Richard

    2013-01-01

    The study's objective was to compare demographics, APOE genotypes, and rate of rise over time in functional impairment in neuropsychologically defined language, typical, and memory subgroups of clinical Alzheimer's disease (AD). 1,368 participants from the National Alzheimer's Coordinating Center database with a diagnosis of probable AD (CDR 0.5-1.0) were included. A language subgroup (n = 229) was defined as having language performance >1 SD worse than memory performance. A memory subgroup (n = 213) was defined as having memory performance >1 SD worse than language performance. A typical subgroup (n = 926) was defined as having a difference in language and memory performance of <1 SD. Compared with the memory subgroup, the language subgroup was 3.7 years older and more frequently self-identified as African American (OR = 3.69). Under a dominant genetic model, the language subgroup had smaller odds of carrying at least one APOEε4 allele relative to the memory subgroup. While this difference was present for all ages, it was more striking at a younger age (OR = 0.19 for youngest tertile; OR = 0.52 for oldest tertile). Compared with the memory subgroup, the language subgroup rose 35% faster on the Functional Assessment Questionnaire and 44% faster on CDR sum of boxes over time. Among a subset of participants who underwent autopsy (n = 98), the language, memory, and typical subgroups were equally likely to have an AD pathologic diagnosis, suggesting that variation in non-AD pathologies across subtypes did not lead to the observed differences. The study demonstrates that a language subgroup of AD has different demographics, genetic profile, and disease course in addition to cognitive phenotype.
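
    The subgroup rule stated above can be written down directly; the sketch below uses z-scores as a stand-in for the study's standardized language and memory measures.

```python
# Sketch of the subgroup rule from the abstract, using z-scores as a stand-in
# for the study's standardized neuropsychological measures.

def ad_subgroup(language_z, memory_z, threshold=1.0):
    """Return 'language', 'memory', or 'typical' per the >1 SD rule."""
    if memory_z - language_z > threshold:     # language more than 1 SD worse
        return "language"
    if language_z - memory_z > threshold:     # memory more than 1 SD worse
        return "memory"
    return "typical"

print(ad_subgroup(language_z=-2.1, memory_z=-0.8))   # language
print(ad_subgroup(language_z=-0.5, memory_z=-1.9))   # memory
print(ad_subgroup(language_z=-1.0, memory_z=-1.4))   # typical
```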

  9. Measuring the construct of executive control in schizophrenia: defining and validating translational animal paradigms for discovery research.

    PubMed

    Gilmour, Gary; Arguello, Alexander; Bari, Andrea; Brown, Verity J; Carter, Cameron; Floresco, Stan B; Jentsch, David J; Tait, David S; Young, Jared W; Robbins, Trevor W

    2013-11-01

    Executive control is an aspect of cognitive function known to be impaired in schizophrenia. Previous meetings of the Cognitive Neuroscience Treatment Research to Improve Cognition in Schizophrenia (CNTRICS) group have more precisely defined executive control in terms of two constructs: "rule generation and selection", and "dynamic adjustments of control". Next, human cognitive tasks that may effectively measure performance with regard to these constructs were identified to be developed into practical and reliable measures for use in treatment development. The aim of this round of CNTRICS meetings was to define animal paradigms that have sufficient promise to warrant further investigation for their utility in measuring these constructs. Accordingly, "reversal learning" and the "attentional set-shifting task" were nominated to assess the construct of rule generation and selection, and the "stop signal task" for the construct of dynamic adjustments of control. These tasks are described in more detail here, with a particular focus on their utility for drug discovery efforts. Presently, each assay has strengths and weaknesses with regard to this point and increased emphasis on improving practical aspects of testing, understanding predictive validity, and defining biomarkers of performance represent important objectives in attaining confidence in translational validity here. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  10. Defining fire and wilderness objectives: Applying limits of acceptable change

    Treesearch

    David N. Cole

    1995-01-01

    The Limits of Acceptable Change (LAC) planning process was developed to help define objectives for recreation management in wilderness. This process can be applied to fire in wilderness if its conceptual foundation is broadened. LAC would lead decision makers to identify a compromise between the goal of allowing fire to play its natural role in wilderness and various...

  11. Multisensory Self-Motion Compensation During Object Trajectory Judgments

    PubMed Central

    Dokka, Kalpana; MacNeilage, Paul R.; DeAngelis, Gregory C.; Angelaki, Dora E.

    2015-01-01

    Judging object trajectory during self-motion is a fundamental ability for mobile organisms interacting with their environment. This fundamental ability requires the nervous system to compensate for the visual consequences of self-motion in order to make accurate judgments, but the mechanisms of this compensation are poorly understood. We comprehensively examined both the accuracy and precision of observers' ability to judge object trajectory in the world when self-motion was defined by vestibular, visual, or combined visual–vestibular cues. Without decision feedback, subjects demonstrated no compensation for self-motion that was defined solely by vestibular cues, partial compensation (47%) for visually defined self-motion, and significantly greater compensation (58%) during combined visual–vestibular self-motion. With decision feedback, subjects learned to accurately judge object trajectory in the world, and this generalized to novel self-motion speeds. Across conditions, greater compensation for self-motion was associated with decreased precision of object trajectory judgments, indicating that self-motion compensation comes at the cost of reduced discriminability. Our findings suggest that the brain can flexibly represent object trajectory relative to either the observer or the world, but a world-centered representation comes at the cost of decreased precision due to the inclusion of noisy self-motion signals. PMID:24062317

  12. Effects of Inorganic Arsenic, Methylated Arsenicals, and Arsenobetaine on Atherosclerosis in the apoE−/− Mouse Model and the Role of As3mt-Mediated Methylation

    PubMed Central

    Negro Silva, Luis Fernando; Lemaire, Maryse; Lemarié, Catherine A.; Plourde, Dany; Bolt, Alicia M.; Chiavatti, Christopher; Bohle, D. Scott; Slavkovich, Vesna; Graziano, Joseph H.; Lehoux, Stéphanie

    2017-01-01

    Background: Arsenic is metabolized through a series of oxidative methylation reactions by arsenic (3) methyltransferase (As3MT) to yield methylated intermediates. Although arsenic exposure is known to increase the risk of atherosclerosis, the contribution of arsenic methylation and As3MT remains undefined. Objectives: Our objective was to define whether methylated arsenic intermediates were proatherogenic and whether arsenic biotransformation by As3MT was required for arsenic-enhanced atherosclerosis. Methods: We utilized the apoE−/− mouse model to compare atherosclerotic plaque size and composition after inorganic arsenic, methylated arsenical, or arsenobetaine exposure in drinking water. We also generated apoE−/−/As3mt−/− double knockout mice to test whether As3MT-mediated biotransformation was required for the proatherogenic effects of inorganic arsenite. Furthermore, As3MT expression and function were assessed in in vitro cultures of plaque-resident cells. Finally, bone marrow transplantation studies were performed to define the contribution of As3MT-mediated methylation in different cell types to the development of atherosclerosis after inorganic arsenic exposure. Results: We found that methylated arsenicals, but not arsenobetaine, are proatherogenic and that As3MT is required for arsenic to induce reactive oxygen species and promote atherosclerosis. Importantly, As3MT was expressed and functional in multiple plaque-resident cell types, and transplant studies indicated that As3MT is required in extrahepatic tissues to promote atherosclerosis. Conclusion: Taken together, our findings indicate that As3MT acts to promote cardiovascular toxicity of arsenic and suggest that human AS3MT SNPs that correlate with enzyme function could predict those most at risk to develop atherosclerosis among the millions that are exposed to arsenic. https://doi.org/10.1289/EHP806 PMID:28728140

  13. Using Separation-of-Function Mutagenesis To Define the Full Spectrum of Activities Performed by the Est1 Telomerase Subunit in Vivo.

    PubMed

    Lubin, Johnathan W; Tucey, Timothy M; Lundblad, Victoria

    2018-01-01

    A leading objective in biology is to identify the complete set of activities that each gene performs in vivo. In this study, we have asked whether a genetic approach can provide an efficient means of achieving this goal, through the identification and analysis of a comprehensive set of separation-of-function (sof-) mutations in a gene. Toward this goal, we have subjected the Saccharomyces cerevisiae EST1 gene, which encodes a regulatory subunit of telomerase, to intensive mutagenesis (with an average coverage of one mutation for every 4.5 residues), using strategies that eliminated mutations that disrupted protein folding/stability. The resulting set of sof- mutations defined four biochemically distinct activities for the Est1 telomerase protein: two temporally separable steps in telomerase holoenzyme assembly, a telomerase recruitment activity, and a fourth newly discovered regulatory function. Although biochemically distinct, impairment of each of these four activities nevertheless conferred a common phenotype (critically short telomeres) comparable to that of an est1-∆ null strain. This highlights the limitations of gene deletions, even for nonessential genes; we suggest that employing a representative set of sof- mutations for each gene in future high- and low-throughput investigations will provide deeper insights into how proteins interact inside the cell. Copyright © 2018 by the Genetics Society of America.

  14. Towards Defining Deficient Emotional Self Regulation in Youth with Attention Deficit Hyperactivity Disorder Using the Child Behavior Check List: A Controlled Study

    PubMed Central

    Spencer, Thomas; Faraone, Stephen V.; Surman, Craig B.H.; Petty, Carter; Clarke, Allison; Batchelder, Holly; Wozniak, Janet; Biederman, Joseph

    2013-01-01

    Objective: Deficient emotional self regulation (DESR) is characterized by deficits in self-regulating the physiological arousal caused by strong emotions. We examined whether a unique profile of the Child Behavior Check List (CBCL) would help identify DESR in children with Attention-Deficit/Hyperactivity Disorder (ADHD). Methods: Subjects were 197 children with and 224 without ADHD. We defined DESR as an aggregate score of >180 but <210 on the Anxiety/Depression, Aggression, and Attention scales of the CBCL (CBCL-DESR). This profile was selected because of 1) its conceptual congruence with the clinical concept of DESR and 2) the previous association of its extreme (>210) form with severe mood and behavioral dysregulation in children with ADHD. All subjects were comprehensively assessed with structured diagnostic interviews and a wide range of functional measures. Results: Forty-four percent of children with ADHD had a positive CBCL-DESR profile vs. 2% of controls (p<0.001). The CBCL-DESR profile was associated with elevated rates of anxiety and disruptive behavior disorders, as well as significantly more impairments in emotional and interpersonal functioning. Conclusions: The CBCL-DESR profile helped identify a subgroup of ADHD children with a psychopathological and functional profile consistent with the clinical concept of DESR. PMID:21904086
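    A minimal sketch of the aggregate cut-off rule described above, assuming the three scale scores are supplied as CBCL T-scores (treating the inputs as T-scores is an assumption made for illustration, not stated in this record):

```python
def cbcl_desr_profile(anxiety_depression, aggression, attention):
    """Classify a CBCL profile using the aggregate cut-offs quoted in the
    abstract: CBCL-DESR when the three scale scores sum to more than 180 but
    less than 210; sums of 210 or more correspond to the more severe
    dysregulation profile. Treating the inputs as T-scores is an assumption."""
    total = anxiety_depression + aggression + attention
    if 180 < total < 210:
        return "CBCL-DESR"
    if total >= 210:
        return "severe dysregulation profile"
    return "negative"

# Example: scale scores of 65, 70, and 68 sum to 203, i.e. CBCL-DESR positive.
print(cbcl_desr_profile(65, 70, 68))
```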

  15. Future challenges for vection research: definitions, functional significance, measures, and neural bases

    PubMed Central

    Palmisano, Stephen; Allison, Robert S.; Schira, Mark M.; Barry, Robert J.

    2015-01-01

    This paper discusses four major challenges facing modern vection research. Challenge 1 (Defining Vection) outlines the different ways that vection has been defined in the literature and discusses their theoretical and experimental ramifications. The term vection is most often used to refer to visual illusions of self-motion induced in stationary observers (by moving, or simulating the motion of, the surrounding environment). However, vection is increasingly being used to also refer to non-visual illusions of self-motion, visually mediated self-motion perceptions, and even general subjective experiences (i.e., “feelings”) of self-motion. The common thread in all of these definitions is the conscious subjective experience of self-motion. Thus, Challenge 2 (Significance of Vection) tackles the crucial issue of whether such conscious experiences actually serve functional roles during self-motion (e.g., in terms of controlling or guiding the self-motion). After more than 100 years of vection research there has been surprisingly little investigation into its functional significance. Challenge 3 (Vection Measures) discusses the difficulties with existing subjective self-report measures of vection (particularly in the context of contemporary research), and proposes several more objective measures of vection based on recent empirical findings. Finally, Challenge 4 (Neural Basis) reviews the recent neuroimaging literature examining the neural basis of vection and discusses the hurdles still facing these investigations. PMID:25774143

  16. Automated Surgical Approach Planning for Complex Skull Base Targets: Development and Validation of a Cost Function and Semantic Atlas.

    PubMed

    Aghdasi, Nava; Whipple, Mark; Humphreys, Ian M; Moe, Kris S; Hannaford, Blake; Bly, Randall A

    2018-06-01

    Successful multidisciplinary treatment of skull base pathology requires precise preoperative planning. Current surgical approach (pathway) selection for these complex procedures depends on an individual surgeon's experience and background training. Because of anatomical variation in both normal tissue and pathology (eg, tumor), a successful surgical pathway used on one patient is not necessarily the best approach for another patient. The question is how to define and obtain optimized, patient-specific surgical approach pathways. In this article, we demonstrate that the surgeon's knowledge and decision making in preoperative planning can be modeled by a multiobjective cost function in a retrospective analysis of actual complex skull base cases. Two different approaches, a weighted-sum approach and Pareto optimality, were used with a defined cost function to derive optimized surgical pathways based on preoperative computed tomography (CT) scans and manually designated pathology. With the first method, the surgeon's preferences were input as a set of weights for each objective before the search. In the second approach, the surgeon's preferences were used to select a surgical pathway from the computed Pareto optimal set. Using preoperative CT and magnetic resonance imaging, the patient-specific surgical pathways derived by these methods were similar (85% agreement) to the actual approaches performed on patients. In the one case where the actual surgical approach differed, revision surgery was required and was performed utilizing the computationally derived approach pathway.
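    A hedged sketch of the two preference-handling strategies named above, using made-up pathway names and objective values rather than data from the study:

```python
import numpy as np

# Hypothetical candidate pathways, each scored on three objectives
# (e.g. distance to target, proximity to critical structures, tissue traversed);
# names and values are illustrative only.
candidates = {
    "transnasal":   np.array([0.30, 0.10, 0.55]),
    "transorbital": np.array([0.45, 0.25, 0.20]),
    "transcranial": np.array([0.70, 0.60, 0.60]),
}

def weighted_sum_choice(costs, weights):
    """Approach 1: fold the surgeon's preferences into weights before the
    search and pick the pathway with the lowest scalarised cost."""
    return min(costs, key=lambda name: float(np.dot(weights, costs[name])))

def pareto_optimal_set(costs):
    """Approach 2: return all non-dominated pathways; the surgeon then
    selects from this Pareto set according to preference."""
    names = list(costs)
    def dominated(a):
        return any(np.all(costs[b] <= costs[a]) and np.any(costs[b] < costs[a])
                   for b in names if b != a)
    return [a for a in names if not dominated(a)]

print(weighted_sum_choice(candidates, np.array([0.5, 0.3, 0.2])))  # transnasal
print(pareto_optimal_set(candidates))  # transcranial is dominated and dropped
```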

  17. Continuous representation of tumor microvessel density and detection of angiogenic hotspots in histological whole-slide images.

    PubMed

    Kather, Jakob Nikolas; Marx, Alexander; Reyes-Aldasoro, Constantino Carlos; Schad, Lothar R; Zöllner, Frank Gerrit; Weis, Cleo-Aron

    2015-08-07

    Blood vessels in solid tumors are not randomly distributed, but are clustered in angiogenic hotspots. Tumor microvessel density (MVD) within these hotspots correlates with patient survival and is widely used both in diagnostic routine and in clinical trials. Still, these hotspots are usually subjectively defined. There is no unbiased, continuous and explicit representation of tumor vessel distribution in histological whole slide images. This shortcoming distorts angiogenesis measurements and may account for ambiguous results in the literature. In the present study, we describe and evaluate a new method that eliminates this bias and makes angiogenesis quantification more objective and more efficient. Our approach involves automatic slide scanning, automatic image analysis and spatial statistical analysis. By comparing a continuous MVD function of the actual sample to random point patterns, we introduce an objective criterion for hotspot detection: An angiogenic hotspot is defined as a clustering of blood vessels that is very unlikely to occur randomly. We evaluate the proposed method in N=11 images of human colorectal carcinoma samples and compare the results to a blinded human observer. For the first time, we demonstrate the existence of statistically significant hotspots in tumor images and provide a tool to accurately detect these hotspots.
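    A rough illustration of the hotspot criterion described above, assuming vessel centroids have already been extracted by image analysis; the Gaussian kernel-density estimator and the 99th-percentile threshold are illustrative choices rather than the exact spatial statistics used by the authors:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Hypothetical vessel centroids in a unit tissue region: a diffuse background
# plus a deliberately clustered group that should register as a hotspot.
background = rng.uniform(0, 1, size=(150, 2))
cluster = rng.normal(loc=[0.7, 0.3], scale=0.03, size=(50, 2))
vessels = np.vstack([background, cluster])

grid = np.mgrid[0:1:100j, 0:1:100j].reshape(2, -1)

def mvd_surface(points):
    """Continuous microvessel-density surface via kernel density estimation."""
    return gaussian_kde(points.T)(grid)

observed = mvd_surface(vessels)

# Monte Carlo reference: density maxima of random point patterns (complete
# spatial randomness) with the same number of vessels as the real sample.
random_maxima = [mvd_surface(rng.uniform(0, 1, size=vessels.shape)).max()
                 for _ in range(200)]
threshold = np.percentile(random_maxima, 99)

# Grid cells whose density is very unlikely to occur randomly are hotspot cells.
hotspot_cells = observed > threshold
print(f"hotspot area fraction: {hotspot_cells.mean():.3f}")
```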

  18. Quality improvement for integrated management of patients with type 2 diabetes (PRIHTA project stage 1).

    PubMed

    Paccagnella, Agostino; Mauri, Alessandra; Spinella, Nello

    2012-01-01

    The purpose of the study was to show how a different collaborative relationship with family doctors and increasingly specialized diabetologists could lead to a 50% reduction in recurrent appointments due to procedural errors and a 50% reduction in the average waiting times for a specialist medical visit. The problem was defined qualitatively and quantitatively using the Lean Six Sigma method (Define); process indicators that might interfere with the study objectives were observed (Measure); descriptive statistics were used to confirm the validity and significance of the results (Analyze); and finally strategies were established to intervene on these variables (Improve). Four groups of actions led to optimization of the objectives: (1) establishing clinical protocols for primary care physicians for treating hospitalized patients with type 2 diabetes and hyperglycemia; (2) increasing the autonomy of nursing care staff; (3) reorganizing the appointments booking office; and (4) making diabetes clinics more specialized. Thanks to this project, primary care physicians have rediscovered their role and defined their diagnostic-therapeutic function under a shared scientific protocol. The model presented in this study provides scope for reflection on the role of the diabetologist, proposing an "alternative" that concerns only the care of patients with metabolic decompensation.

  19. Object apprehension using vision and touch

    NASA Technical Reports Server (NTRS)

    Bajcsy, R.; Stansfield, S. A.

    1987-01-01

    Researchers define object apprehension as the determination of the properties of an object and the relationships among these properties. They contrast this with recognition, which goes a step further to attach a label to the object as a whole. Apprehension is fundamental to manipulation. This is true whether the manipulation is being carried out by an autonomous robot or is the result of teleoperation involving sensory feedback. Researchers present an apprehension paradigm using both vision and touch. In this model, they define a representation for object apprehension in terms of a set of primitives and features, along with their relationships. This representation is the mechanism by which the data from the two modalities are combined. It is also the mechanism which drives the apprehension process.

  20. A New Glaucoma Severity Score Combining Structural and Functional Defects.

    PubMed

    Wachtl, J; Töteberg-Harms, M; Frimmel, S; Kniestedt, C

    2017-04-01

    Background: In order to assess glaucoma severity and to compare the success of surgical and medical therapy and study outcomes, an objective and independent staging tool is necessary. A combination of information from both structural and functional testing is probably the best approach to stage glaucomatous damage. There has been no universally accepted standard for glaucoma staging. The aim of this study was to develop a Glaucoma Severity Score (GSS) for objective assessment of a patient's glaucoma severity, combining both functional and structural information. Materials and Methods: The Glaucoma Severity Score includes the following 3 criteria: superior and inferior Retinal Nerve Fibre Layer (RNFL) thickness, perimetric mean defect (MD), and agreement of anatomical and perimetric defects, as assessed by two glaucoma specialists. The specialists defined a staging tool for each of the 3 criteria in a consensus process, assigning specific characteristics to a scale value between 0 and 2 or 0 and 3, respectively. The GSS ranges between 0 and 10 points. In a prospective observational study, the data of 112 glaucoma patients were assessed independently by the two specialists according to this staging tool. Results: The GSS was applied to 112 eyes and patients (59.8% female) with a mean age of 66.3 ± 13.1 years. Mean GSS was 4.73 points. Cohen's kappa coefficient was determined to measure inter-rater agreement between the glaucoma specialists for the third criterion. With κ = 0.83, the agreement was very good. Thus, all 3 criteria of the GSS may be regarded as objective. Conclusions: The Glaucoma Severity Score is an objective tool, combining both structural and functional characteristics, and permitting comparison of different patients, populations and studies. The Glaucoma Severity Score has proven effective in the objective assessment of 112 glaucoma patients and is relatively user-friendly in clinical practice. A comparative study of the GSS with the results of the FORUM® Glaucoma Workplace (Carl Zeiss Meditec AG, Jena, Germany) will be the next step. If outcomes match, the Glaucoma Severity Score can be accepted as a promising tool to stage glaucoma and monitor changes objectively in patients when comparing glaucoma progression in study analyses. Georg Thieme Verlag KG Stuttgart · New York.

  1. Automated solar collector installation design including ability to define heterogeneous design preferences

    DOEpatents

    Wayne, Gary; Frumkin, Alexander; Zaydman, Michael; Lehman, Scott; Brenner, Jules

    2014-04-29

    Embodiments may include systems and methods to create and edit a representation of a worksite, to create various data objects, and to classify such objects as various types of pre-defined "features" with attendant properties and layout constraints. As part of or in addition to classification, an embodiment may include systems and methods to create, associate, and edit intrinsic and extrinsic properties of these objects. A design engine may apply design rules to the features described above to generate one or more solar collector installation design alternatives, including generation of on-screen and/or paper representations of the physical layout or arrangement of the one or more design alternatives. Embodiments may also include definition of one or more design apertures, each of which may correspond to boundaries in which solar collector layouts should comply with distinct sets of user-defined design preferences. Distinct apertures may provide heterogeneous regions of collector layout according to the user-defined design preferences.

  2. Automated solar collector installation design including ability to define heterogeneous design preferences

    DOEpatents

    Wayne, Gary; Frumkin, Alexander; Zaydman, Michael; Lehman, Scott; Brenner, Jules

    2013-01-08

    Embodiments may include systems and methods to create and edit a representation of a worksite, to create various data objects, and to classify such objects as various types of pre-defined "features" with attendant properties and layout constraints. As part of or in addition to classification, an embodiment may include systems and methods to create, associate, and edit intrinsic and extrinsic properties of these objects. A design engine may apply design rules to the features described above to generate one or more solar collector installation design alternatives, including generation of on-screen and/or paper representations of the physical layout or arrangement of the one or more design alternatives. Embodiments may also include definition of one or more design apertures, each of which may correspond to boundaries in which solar collector layouts should comply with distinct sets of user-defined design preferences. Distinct apertures may provide heterogeneous regions of collector layout according to the user-defined design preferences.

  3. The software-defined fast post-processing for GEM soft x-ray diagnostics in the Tungsten Environment in Steady-state Tokamak thermal fusion reactor

    NASA Astrophysics Data System (ADS)

    Krawczyk, Rafał Dominik; Czarski, Tomasz; Linczuk, Paweł; Wojeński, Andrzej; Kolasiński, Piotr; Gąska, Michał; Chernyshova, Maryna; Mazon, Didier; Jardin, Axel; Malard, Philippe; Poźniak, Krzysztof; Kasprowicz, Grzegorz; Zabołotny, Wojciech; Kowalska-Strzeciwilk, Ewa; Malinowski, Karol

    2018-06-01

    This article presents novel software-defined, server-based solutions that were introduced in the fast, real-time computation systems for soft X-ray diagnostics for the WEST (Tungsten Environment in Steady-state Tokamak) reactor in Cadarache, France. The objective of the research was to provide fast processing of data at high throughput and with low latencies for investigating the interplay between particle transport and magnetohydrodynamic activity. The long-term objective is to implement a fast feedback signal in the reactor control mechanisms to sustain the fusion reaction. The implemented electronic measurement device is anticipated to be deployed in the WEST. A standalone software-defined computation engine was designed to handle data collected at high rates in the server back-end of the system. Signals are obtained from the front-end field-programmable gate array mezzanine cards, which acquire and perform a selection from the gas electron multiplier detector. A fast, custom library for plasma diagnostics was written in C++. It originated from reference offline MATLAB implementations, which were redesigned for runtime analysis during the experiment in the novel online modes of operation. The implementation allowed benchmarking, evaluation, and optimization of plasma processing algorithms, with the possibility of checking consistency against the reference MATLAB computations. The back-end software and hardware architecture are presented together with the data evaluation mechanisms. The online modes of operation for the WEST are discussed, and results concerning the performance of the processing and the introduced functionality are presented.

  4. Images in quantum entanglement

    NASA Astrophysics Data System (ADS)

    Bowden, G. J.

    2009-08-01

    A system for classifying and quantifying entanglement in spin 1/2 pure states is presented based on simple images. From the image point of view, an entangled state can be described as a linear superposition of a separable object wavefunction Ψ_O plus a portion of its own inverse image. Bell states can be defined in this way: Ψ = (Ψ_O ± Ψ_I)/√2. Using the method of images, the three-spin 1/2 system is discussed in some detail. This system can exhibit exclusive three-particle ν_123 entanglement, two-particle entanglements ν_12, ν_13, ν_23, and/or mixtures of all four. All four image states are orthogonal both to each other and to the object wavefunction. In general, five entanglement parameters ν_12, ν_13, ν_23, ν_123 and φ_123 are required to define the general entangled state. In addition, it is shown that there is considerable scope for encoding numbers, at least from the classical point of view but using quantum-mechanical principles. Methods are developed for their extraction. It is shown that concurrence can be used to extract even-partite, but not odd-partite, information. Additional relationships are also presented which can be helpful in the decoding process. However, in general, numerical methods are mandatory. A simple roulette method for decoding is presented and discussed. But it is shown that if the encoder chooses to use transcendental numbers for the angles defining the target function (α_1, β_1), etc., the method rapidly turns into the Devil's roulette, requiring finer and finer angular steps.
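    A small worked example of the image construction for two spins, reading the "inverse image" as the spin-flipped state (an illustrative reading, not taken verbatim from the paper); concurrence confirms that the object wavefunction is separable while the equal superposition is a maximally entangled Bell state:

```python
import numpy as np

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Separable two-spin "object" wavefunction and its spin-flipped "inverse image".
psi_O = np.kron(up, down)            # |up, down>
psi_I = np.kron(down, up)            # |down, up>

# Bell-type state built as Psi = (Psi_O + Psi_I) / sqrt(2).
psi = (psi_O + psi_I) / np.sqrt(2)

def concurrence(state):
    """Concurrence C = 2|ad - bc| for a two-qubit pure state (a, b, c, d)."""
    a, b, c, d = state
    return 2.0 * abs(a * d - b * c)

print(concurrence(psi_O))   # 0.0 -> separable object wavefunction
print(concurrence(psi))     # 1.0 -> maximally entangled Bell state
```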

  5. A qualitative study of women's views on medical confidentiality

    PubMed Central

    Jenkins, G; Merz, J; Sankar, P

    2005-01-01

    Context: The need to reinvigorate medical confidentiality protections is recognised as an important objective in building the patient trust necessary for successful health outcomes. Little is known about patient understanding of and expectations from medical confidentiality. Objective: To identify and describe patient views of medical confidentiality and to assess provisionally the range of these views. Design: Qualitative study using in-depth, open ended face-to-face interviews. Setting: Southeastern Pennsylvania and southern New Jersey, USA. Participants: A total of 85 women interviewed at two clinical sites and three community/research centres. Main outcome measures: Subjects' understanding of medical confidentiality, beliefs about the handling of confidential information and concerns influencing disclosure of information to doctors. Results: The subjects defined medical confidentiality as the expectation that something done or said would be kept "private" but differed on what information was confidential and the basis and methods for protecting information. Some considered all medical information as confidential and thought confidentiality protections functioned to limit its circulation to medical uses and reimbursement needs. Others defined only sensitive or potentially stigmatising information as confidential. Many of these also defined medical confidentiality as a strict limit prohibiting information release, although some noted that specific permission or urgent need could override this limit. Conclusions: Patients share a basic understanding of confidentiality as protection of information, but some might have expectations that are neither met by current practice nor anticipated by doctors. Doctors should recognise that patients might have their own medical confidentiality models. They should address divergences from current practice and provide support to those who face emotional or practical obstacles to self-revelation. PMID:16131550

  6. Functional decline after incident wrist fractures—Study of Osteoporotic Fractures: prospective cohort study

    PubMed Central

    Song, Jing; Dunlop, Dorothy D; Fink, Howard A; Cauley, Jane A

    2010-01-01

    Objective To study the effect of an incident wrist fracture on functional status in women enrolled in the Study of Osteoporotic Fractures. Design Prospective cohort study. Setting Baltimore, Minneapolis, Portland, and the Monongahela valley in Pennsylvania, USA. Participants 6107 women aged 65 years and older without previous wrist or hip fracture recruited from the community between September 1986 and October 1988. Main outcome measure Clinically important functional decline, defined as a functional deterioration of 5 points in five activities of daily living each scored from 0 to 3 (equivalent to a one standard deviation decrease in functional ability). Results Over a mean follow-up of 7.6 years, 268 women had an incident wrist fracture and 41 (15%) of these developed clinically important functional decline. Compared with women without wrist fractures, those with incident wrist fractures had greater annual functional decline after adjustment for age, body mass index, and health status. Occurrence of a wrist fracture increased the odds of having a clinically important functional decline by 48% (odds ratio 1.48, 95% confidence interval 1.04 to 2.12), even after adjustment for age, body mass index, health status, baseline functional status, lifestyle factors, comorbidities, and neuromuscular function. Conclusions Wrist fractures contribute to clinically important functional decline in older women. PMID:20616099

  7. From fields to objects: A review of geographic boundary analysis

    NASA Astrophysics Data System (ADS)

    Jacquez, G. M.; Maruca, S.; Fortin, M.-J.

    Geographic boundary analysis is a relatively new approach unfamiliar to many spatial analysts. It is best viewed as a technique for defining objects - geographic boundaries - on spatial fields, and for evaluating the statistical significance of characteristics of those boundary objects. This is accomplished using null spatial models representative of the spatial processes expected in the absence of boundary-generating phenomena. Close ties to the object-field dialectic eminently suit boundary analysis to GIS data. The majority of existing spatial methods are field-based in that they describe, estimate, or predict how attributes (variables defining the field) vary through geographic space. Such methods are appropriate for field representations but not object representations. As the object-field paradigm gains currency in geographic information science, appropriate techniques for the statistical analysis of objects are required. The methods reviewed in this paper are a promising foundation. Geographic boundary analysis is clearly a valuable addition to the spatial statistical toolbox. This paper presents the philosophy of, and motivations for geographic boundary analysis. It defines commonly used statistics for quantifying boundaries and their characteristics, as well as simulation procedures for evaluating their significance. We review applications of these techniques, with the objective of making this promising approach accessible to the GIS-spatial analysis community. We also describe the implementation of these methods within geographic boundary analysis software: GEM.

  8. Composing, Analyzing and Validating Software Models

    NASA Astrophysics Data System (ADS)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda items that were carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  9. Composing, Analyzing and Validating Software Models

    NASA Technical Reports Server (NTRS)

    Sheldon, Frederick T.

    1998-01-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda items that were carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  10. Stimulus Dependency of Object-Evoked Responses in Human Visual Cortex: An Inverse Problem for Category Specificity

    PubMed Central

    Graewe, Britta; De Weerd, Peter; Farivar, Reza; Castelo-Branco, Miguel

    2012-01-01

    Many studies have linked the processing of different object categories to specific event-related potentials (ERPs) such as the face-specific N170. Despite reports showing that object-related ERPs are influenced by visual stimulus features, there is consensus that these components primarily reflect categorical aspects of the stimuli. Here, we re-investigated this idea by systematically measuring the effects of visual feature manipulations on ERP responses elicited by both structure-from-motion (SFM)-defined and luminance-defined object stimuli. SFM objects elicited a novel component at 200–250 ms (N250) over parietal and posterior temporal sites. We found, however, that the N250 amplitude was unaffected by restructuring SFM stimuli into meaningless objects based on identical visual cues. This suggests that this N250 peak was not uniquely linked to categorical aspects of the objects, but is strongly determined by visual stimulus features. We provide strong support for this hypothesis by parametrically manipulating the depth range of both SFM- and luminance-defined object stimuli and showing that the N250 evoked by SFM stimuli as well as the well-known N170 to static faces were sensitive to this manipulation. Importantly, this effect could not be attributed to compromised object categorization in low depth stimuli, confirming a strong impact of visual stimulus features on object-related ERP signals. As ERP components linked with visual categorical object perception are likely determined by multiple stimulus features, this creates an interesting inverse problem when deriving specific perceptual processes from variations in ERP components. PMID:22363479

  11. Stimulus dependency of object-evoked responses in human visual cortex: an inverse problem for category specificity.

    PubMed

    Graewe, Britta; De Weerd, Peter; Farivar, Reza; Castelo-Branco, Miguel

    2012-01-01

    Many studies have linked the processing of different object categories to specific event-related potentials (ERPs) such as the face-specific N170. Despite reports showing that object-related ERPs are influenced by visual stimulus features, there is consensus that these components primarily reflect categorical aspects of the stimuli. Here, we re-investigated this idea by systematically measuring the effects of visual feature manipulations on ERP responses elicited by both structure-from-motion (SFM)-defined and luminance-defined object stimuli. SFM objects elicited a novel component at 200-250 ms (N250) over parietal and posterior temporal sites. We found, however, that the N250 amplitude was unaffected by restructuring SFM stimuli into meaningless objects based on identical visual cues. This suggests that this N250 peak was not uniquely linked to categorical aspects of the objects, but is strongly determined by visual stimulus features. We provide strong support for this hypothesis by parametrically manipulating the depth range of both SFM- and luminance-defined object stimuli and showing that the N250 evoked by SFM stimuli as well as the well-known N170 to static faces were sensitive to this manipulation. Importantly, this effect could not be attributed to compromised object categorization in low depth stimuli, confirming a strong impact of visual stimulus features on object-related ERP signals. As ERP components linked with visual categorical object perception are likely determined by multiple stimulus features, this creates an interesting inverse problem when deriving specific perceptual processes from variations in ERP components.

  12. Differential contributions of executive and episodic memory functions to problem solving in younger and older adults.

    PubMed

    Vandermorris, Susan; Sheldon, Signy; Winocur, Gordon; Moscovitch, Morris

    2013-11-01

    The relationship of higher order problem solving to basic neuropsychological processes likely depends on the type of problems to be solved. Well-defined problems (e.g., completing a series of errands) may rely primarily on executive functions. Conversely, ill-defined problems (e.g., navigating socially awkward situations) may, in addition, rely on medial temporal lobe (MTL) mediated episodic memory processes. Healthy young (N = 18; M = 19; SD = 1.3) and old (N = 18; M = 73; SD = 5.0) adults completed a battery of neuropsychological tests of executive and episodic memory function, and experimental tests of problem solving. Correlation analyses and age group comparisons demonstrated differential contributions of executive and autobiographical episodic memory function to well-defined and ill-defined problem solving and evidence for an episodic simulation mechanism underlying ill-defined problem solving efficacy. Findings are consistent with the emerging idea that MTL-mediated episodic simulation processes support the effective solution of ill-defined problems, over and above the contribution of frontally mediated executive functions. Implications for the development of intervention strategies that target preservation of functional independence in older adults are discussed.

  13. Membrane-Based Functions in the Origin of Cellular Life

    NASA Technical Reports Server (NTRS)

    Chipot, Christophe; New, Michael H.; Schweighofer, Karl; Pohorille, Andrew; Wilson, Michael A.

    1999-01-01

    Our objective is to help explain how the earliest ancestors of contemporary cells (protocells) performed their essential functions employing only the molecules available in the protobiological milieu. Our hypothesis is that vesicles, built of amphiphilic, membrane-forming materials, emerged early in protobiological evolution and served as precursors to protocells. We further assume that the cellular functions associated with contemporary membranes, such as capturing and transducing energy, signaling, or sequestering organic molecules and ions, evolved in these membrane environments. An alternative hypothesis is that these functions evolved in different environments and were incorporated into membrane-bound structures at some later stage of evolution. We focus on applying the fundamental principles of physics and chemistry to the formation of a primitive, functional cell. Rather than attempting to develop specific models for cellular functions and to identify the origin of the molecules which perform these functions, our goal is to define the structural and energetic conditions that any successful model must fulfill, thereby providing physico-chemical boundaries for these models. We do this by carrying out large-scale, molecular level computer simulations on systems of interest.

  14. Applying AN Object-Oriented Database Model to a Scientific Database Problem: Managing Experimental Data at Cebaf.

    NASA Astrophysics Data System (ADS)

    Ehlmann, Bryon K.

    Current scientific experiments are often characterized by massive amounts of very complex data and the need for complex data analysis software. Object-oriented database (OODB) systems have the potential of improving the description of the structure and semantics of this data and of integrating the analysis software with the data. This dissertation results from research to enhance OODB functionality and methodology to support scientific databases (SDBs) and, more specifically, to support a nuclear physics experiments database for the Continuous Electron Beam Accelerator Facility (CEBAF). This research to date has identified a number of problems related to the practical application of OODB technology to the conceptual design of the CEBAF experiments database and other SDBs: the lack of a generally accepted OODB design methodology, the lack of a standard OODB model, the lack of a clear conceptual level in existing OODB models, and the limited support in existing OODB systems for many common object relationships inherent in SDBs. To address these problems, the dissertation describes an Object-Relationship Diagram (ORD) and an Object-oriented Database Definition Language (ODDL) that provide tools that allow SDB design and development to proceed systematically and independently of existing OODB systems. These tools define multi-level, conceptual data models for SDB design, which incorporate a simple notation for describing common types of relationships that occur in SDBs. ODDL allows these relationships and other desirable SDB capabilities to be supported by an extended OODB system. A conceptual model of the CEBAF experiments database is presented in terms of ORDs and the ODDL to demonstrate their functionality and use and provide a foundation for future development of experimental nuclear physics software using an OODB approach.

  15. Fundamental Parameters Of The Lowest Mass Stars To The Highest Mass Planets

    NASA Astrophysics Data System (ADS)

    Filippazzo, Joseph C.

    2016-09-01

    The physical and atmospheric properties of ultracool dwarfs are deeply entangled due to the degenerate effects of mass, age, metallicity, clouds and dust, activity, rotation, and possibly even formation mechanism on observed spectra. Accurate determination of fundamental parameters for a wide diversity of objects at the low end of the initial mass function (IMF) is thus crucial to testing stellar and planetary formation theories. To determine these quantities, we constructed and flux calibrated nearly-complete spectral energy distributions (SEDs) for 234 M, L, T, and Y dwarfs using published parallaxes and 0.3-40 μm spectra and photometry. From these homogeneous SEDs, we calculated bolometric luminosity (L_bol), effective temperature (T_eff), mass, surface gravity, radius, spectral indexes, synthetic photometry, and bolometric corrections (BCs) for each object. We used these results to derive L_bol, T_eff, and BC polynomial relations across the entire very-low-mass star/brown dwarf/planetary mass regime. We use a subsample of objects with age constraints based on nearby young moving group membership, companionship with a young star, or spectral signatures of low surface gravity to define new age-sensitive diagnostics and characterize the reddening of young substellar atmospheres as a redistribution of flux from the near-infrared (NIR) into the mid-infrared (MIR). Consequently, we find the SED flux pivots at K-band, making BC_K as a function of spectral type a reliable, age-independent relationship. We find that young L dwarfs are systematically 300 K cooler than field age objects of the same spectral type and up to 600 K cooler than field age objects of the same absolute H magnitude. These findings are used to create prescriptions for the reliable and efficient characterization of new ultracool dwarfs using heterogeneous and limited spectral data.
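    A simplified sketch of the SED-to-parameters step described above: integrate the flux-calibrated SED to get the apparent bolometric flux, scale by the parallax distance for L_bol, and invert the Stefan-Boltzmann law for T_eff. All numerical inputs are placeholders, and in practice the radius would come from evolutionary models:

```python
import numpy as np

SIGMA_SB = 5.670374419e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
PC = 3.0856775814913673e16   # parsec in metres
R_JUP = 6.9911e7             # Jupiter radius in metres
L_SUN = 3.828e26             # solar luminosity in watts

def fundamental_parameters(wavelength_m, flux_wm3, parallax_mas, radius_rjup):
    """Integrate a flux-calibrated SED (flux per unit wavelength) to get the
    apparent bolometric flux, convert to L_bol with the parallax distance,
    and invert the Stefan-Boltzmann law for T_eff given an assumed radius."""
    dw = np.diff(wavelength_m)
    f_bol = np.sum(0.5 * (flux_wm3[1:] + flux_wm3[:-1]) * dw)   # W m^-2
    distance = PC / (parallax_mas / 1000.0)                     # metres
    l_bol = 4.0 * np.pi * distance**2 * f_bol                   # watts
    radius = radius_rjup * R_JUP
    t_eff = (l_bol / (4.0 * np.pi * radius**2 * SIGMA_SB)) ** 0.25
    return l_bol / L_SUN, t_eff

# Toy SED for a nearby ultracool dwarf (made-up numbers on a 0.3-40 micron grid).
wl = np.linspace(0.3e-6, 40e-6, 2000)
flux = 2e-10 * np.exp(-((wl - 1.3e-6) / 1.0e-6) ** 2)           # W m^-3
print(fundamental_parameters(wl, flux, parallax_mas=80.0, radius_rjup=1.0))
```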

  16. A Decision-Based Methodology for Object Oriented-Design

    DTIC Science & Technology

    1988-12-16

    willing to take the time to meet together weekly for mutual encouragement and prayer. Their friendship, uncompromising standards, and lifestyle were...assume the validity of the object-oriented and software engineering principles involved, and define and prototype a generic, language independent...meaningful labels for variables, abstraction requires the ability to define new types that relieve the programmer from having to know or mess with

  17. A Bayesian Alternative for Multi-objective Ecohydrological Model Specification

    NASA Astrophysics Data System (ADS)

    Tang, Y.; Marshall, L. A.; Sharma, A.; Ajami, H.

    2015-12-01

    Process-based ecohydrological models combine the study of hydrological, physical, biogeochemical and ecological processes of catchments, and are usually more complex and more heavily parameterised than conceptual hydrological models. Thus, appropriate calibration objectives and model uncertainty analysis are essential for ecohydrological modeling. In recent years, Bayesian inference has become one of the most popular tools for quantifying the uncertainties in hydrological modeling, with the development of Markov Chain Monte Carlo (MCMC) techniques. Our study aims to develop appropriate prior distributions and likelihood functions that minimize the model uncertainties and bias within a Bayesian ecohydrological framework. In our study, a formal Bayesian approach is implemented in an ecohydrological model which combines a hydrological model (HyMOD) and a dynamic vegetation model (DVM). Simulations based on a single-objective likelihood (streamflow or LAI) and on multi-objective likelihoods (streamflow and LAI) with different weights are compared. Uniform, weakly informative and strongly informative prior distributions are used in different simulations. The Kullback-Leibler divergence (KLD) is used to measure the (dis)similarity between different priors and the corresponding posterior distributions to examine parameter sensitivity. Results show that different prior distributions can strongly influence posterior distributions for parameters, especially when the available data are limited or parameters are insensitive to the available data. We demonstrate differences in optimized parameters and uncertainty limits for cases based on multi-objective likelihoods vs. single-objective likelihoods. We also demonstrate the importance of appropriately defining the weights of objectives in multi-objective calibration according to different data types.
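    A toy sketch of two ingredients mentioned above, with illustrative (not study-specific) error models and weights: a weighted multi-objective Gaussian log-likelihood for streamflow and LAI residuals, and a histogram-based Kullback-Leibler divergence between prior and posterior samples as a rough sensitivity measure:

```python
import numpy as np
from scipy.stats import norm, entropy

rng = np.random.default_rng(1)

def combined_log_likelihood(sim_q, obs_q, sim_lai, obs_lai,
                            w_q=0.5, w_lai=0.5, sigma_q=1.0, sigma_lai=0.3):
    """Weighted multi-objective Gaussian log-likelihood for streamflow (q)
    and LAI; the weights and error models are assumptions for illustration."""
    ll_q = norm.logpdf(obs_q, loc=sim_q, scale=sigma_q).sum()
    ll_lai = norm.logpdf(obs_lai, loc=sim_lai, scale=sigma_lai).sum()
    return w_q * ll_q + w_lai * ll_lai

def kld_from_samples(prior_samples, posterior_samples, bins=30):
    """Histogram-based KLD(posterior || prior): a large divergence means the
    data moved the posterior far from the prior, i.e. a sensitive parameter."""
    lo = min(prior_samples.min(), posterior_samples.min())
    hi = max(prior_samples.max(), posterior_samples.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(posterior_samples, bins=edges, density=True)
    q, _ = np.histogram(prior_samples, bins=edges, density=True)
    return entropy(p + 1e-12, q + 1e-12)

# A well-constrained posterior diverges far more from a flat prior than a
# posterior that the data barely moved.
prior = rng.uniform(0, 1, 5000)
print(kld_from_samples(prior, rng.normal(0.4, 0.05, 5000).clip(0, 1)))
print(kld_from_samples(prior, rng.uniform(0, 1, 5000)))

# Example evaluation of the combined likelihood for toy observations.
print(combined_log_likelihood(2.0, rng.normal(2.0, 1.0, 50),
                              3.0, rng.normal(3.0, 0.3, 12)))
```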

  18. Fast and fuzzy multi-objective radiotherapy treatment plan generation for head and neck cancer patients with the lexicographic reference point method (LRPM)

    NASA Astrophysics Data System (ADS)

    van Haveren, Rens; Ogryczak, Włodzimierz; Verduijn, Gerda M.; Keijzer, Marleen; Heijmen, Ben J. M.; Breedveld, Sebastiaan

    2017-06-01

    Previously, we have proposed Erasmus-iCycle, an algorithm for fully automated IMRT plan generation based on prioritised (lexicographic) multi-objective optimisation with the 2-phase ɛ-constraint (2pɛc) method. For each patient, the output of Erasmus-iCycle is a clinically favourable, Pareto optimal plan. The 2pɛc method uses a list of objective functions that are consecutively optimised, following a strict, user-defined prioritisation. The novel lexicographic reference point method (LRPM) is capable of solving multi-objective problems in a single optimisation, using a fuzzy prioritisation of the objectives. Trade-offs are made globally, aiming for large favourable gains for lower prioritised objectives at the cost of only slight degradations for higher prioritised objectives, or vice versa. In this study, the LRPM is validated for 15 head and neck cancer patients receiving bilateral neck irradiation. The generated plans using the LRPM are compared with the plans resulting from the 2pɛc method. Both methods were capable of automatically generating clinically relevant treatment plans for all patients. For some patients, the LRPM allowed large favourable gains in some treatment plan objectives at the cost of only small degradations for the others. Moreover, because of the applied single optimisation instead of multiple optimisations, the LRPM reduced the average computation time from 209.2 to 9.5 min, a speed-up factor of 22 relative to the 2pɛc method.
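    A minimal sketch of the prioritised, ɛ-constraint-style optimisation that the 2pɛc method is built on, using two toy convex objectives instead of clinical dose objectives; the LRPM itself replaces this staged procedure with a single optimisation:

```python
import numpy as np
from scipy.optimize import minimize

def f_high(x):
    """Higher-priority objective (toy quadratic stand-in)."""
    return (x[0] - 1.0) ** 2 + (x[1] - 1.0) ** 2

def f_low(x):
    """Lower-priority objective (toy quadratic stand-in)."""
    return (x[0] + 1.0) ** 2 + x[1] ** 2

def lexicographic_eps_constraint(slack=0.05):
    """First minimise the high-priority objective, then minimise the
    low-priority objective subject to the first staying within a small
    slack of its optimum (the epsilon constraint)."""
    stage1 = minimize(f_high, np.zeros(2))
    bound = stage1.fun + slack
    stage2 = minimize(f_low, stage1.x,
                      constraints=[{"type": "ineq",
                                    "fun": lambda x: bound - f_high(x)}])
    return stage2.x

print(lexicographic_eps_constraint())
```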

  19. Program Evaluation: An Overview.

    ERIC Educational Resources Information Center

    McCluskey, Lawrence

    1973-01-01

    Various models of educational evaluation are presented. These include: (1) the classical type model, which contains the following guidelines: formulate objectives, classify objectives, define objectives in behavioral terms, suggest situations in which achievement of objectives will be shown, develop or select appraisal techniques, and gather and…

  20. Understanding New Types of Evidence Ready for Translation into Nursing Informatics.

    PubMed

    McCormick, Kathleen

    2016-01-01

    Nurses are the primary deliverers of patient care and observers of patient side effects of medications. The primary objective of this tutorial is to bring participants up to date on genomic applications for nursing from birth until death. A secondary objective is to define at least 17 pharmacogenomics evidence guidelines ready for implementation in the Electronic Health Record. The target audience is nurses in practice, implementers of EHRs, nurses in leadership and policy-making positions, those focused on defining new areas for nursing research, and educators who need to define criteria for integrating genomics into nursing education.
