Sample records for models identifying objects

  1. Modeling the long-term evolution of space debris

    DOEpatents

    Nikolaev, Sergei; De Vries, Willem H.; Henderson, John R.; Horsley, Matthew A.; Jiang, Ming; Levatin, Joanne L.; Olivier, Scot S.; Pertica, Alexander J.; Phillion, Donald W.; Springer, Harry K.

    2017-03-07

    A space object modeling system that models the evolution of space debris is provided. The modeling system simulates the interaction of space objects at simulation times throughout a simulation period. The modeling system includes a propagator that calculates the position of each object at each simulation time based on orbital parameters. The modeling system also includes a collision detector that, for each pair of objects at each simulation time, performs a collision analysis. When the distance between objects satisfies a conjunction criterion, the modeling system calculates a local minimum distance between the pair of objects by fitting a curve to identify a time of closest approach between simulation times and calculating the positions of the objects at the identified time. When the local minimum distance satisfies a collision criterion, the modeling system models the debris created by the collision of the pair of objects.
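
    The closest-approach refinement described above can be sketched as follows. The straight-line trajectories, the three-point quadratic fit, and all values here are illustrative assumptions, not the patent's actual propagator or thresholds.

```python
def dist2(t):
    """Squared distance between two hypothetical objects on straight-line
    paths (stand-ins for orbit-propagated positions at simulation time t)."""
    ax, ay = 1.0 * t, 2.0     # object A moves along x
    bx, by = 3.0, 1.0 * t     # object B moves along y
    return (ax - bx) ** 2 + (ay - by) ** 2

def time_of_closest_approach(t0, t1, t2):
    """Fit a parabola through the squared distance at three equally spaced
    simulation times and return its vertex, refining the coarse minimum."""
    d0, d1, d2 = dist2(t0), dist2(t1), dist2(t2)
    h = t1 - t0
    a = (d0 - 2 * d1 + d2) / (2 * h * h)      # curvature of the fit
    b = (d2 - d0) / (2 * h) - 2 * a * t1      # slope term at t1, re-centered
    return -b / (2 * a)                       # vertex: minimum of the parabola

t_star = time_of_closest_approach(1.0, 2.0, 3.0)
```

    Because the toy squared distance is itself quadratic, the fit recovers the exact minimum; in practice the fit is evaluated only for pairs that already satisfy the conjunction criterion.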

  2. Application of a single-objective, hybrid genetic algorithm approach to pharmacokinetic model building.

    PubMed

    Sherer, Eric A; Sale, Mark E; Pollock, Bruce G; Belani, Chandra P; Egorin, Merrill J; Ivy, Percy S; Lieberman, Jeffrey A; Manuck, Stephen B; Marder, Stephen R; Muldoon, Matthew F; Scher, Howard I; Solit, David B; Bies, Robert R

    2012-08-01

    A limitation in traditional stepwise population pharmacokinetic model building is the difficulty in handling interactions between model components. To address this issue, a method was previously introduced which couples NONMEM parameter estimation and model fitness evaluation to a single-objective, hybrid genetic algorithm for global optimization of the model structure. In this study, the generalizability of this approach for pharmacokinetic model building is evaluated by comparing (1) correct and spurious covariate relationships in a simulated dataset resulting from automated stepwise covariate modeling, Lasso methods, and single-objective hybrid genetic algorithm approaches to covariate identification and (2) information criteria values, model structures, convergence, and model parameter values resulting from manual stepwise versus single-objective, hybrid genetic algorithm approaches to model building for seven compounds. Both manual stepwise and single-objective, hybrid genetic algorithm approaches to model building were applied, blinded to the results of the other approach, for selection of the compartment structure as well as inclusion and model form of inter-individual and inter-occasion variability, residual error, and covariates from a common set of model options. For the simulated dataset, stepwise covariate modeling identified three of four true covariates and two spurious covariates; Lasso identified two of four true and 0 spurious covariates; and the single-objective, hybrid genetic algorithm identified three of four true covariates and one spurious covariate. 
For the clinical datasets, the Akaike information criterion was a median of 22.3 points lower (range of 470.5 point decrease to 0.1 point decrease) for the best single-objective, hybrid genetic algorithm candidate model versus the final manual stepwise model: the Akaike information criterion was lower by greater than 10 points for four compounds and differed by less than 10 points for three compounds. The root mean squared error and absolute mean prediction error of the best single-objective, hybrid genetic algorithm candidates were a median of 0.2 points higher (range of 38.9 point decrease to 27.3 point increase) and 0.02 points lower (range of 0.98 point decrease to 0.74 point increase), respectively, than those of the final stepwise models. In addition, the best single-objective, hybrid genetic algorithm candidate models had successful convergence and covariance steps for each compound, used the same compartment structure as the manual stepwise approach for 6 of 7 (86 %) compounds, and identified 54 % (7 of 13) of covariates included by the manual stepwise approach and 16 covariate relationships not included by manual stepwise models. The model parameter values between the final manual stepwise and best single-objective, hybrid genetic algorithm models differed by a median of 26.7 % (q₁ = 4.9 % and q₃ = 57.1 %). Finally, the single-objective, hybrid genetic algorithm approach was able to identify models capable of estimating absorption rate parameters for four compounds that the manual stepwise approach did not identify. The single-objective, hybrid genetic algorithm represents a general pharmacokinetic model building methodology whose ability to rapidly search the feasible solution space leads to nearly equivalent or superior model fits to pharmacokinetic data.
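
    The search strategy in this record can be illustrated with a bare-bones single-objective genetic algorithm over a binary covariate-inclusion chromosome. The fitness function below is a stand-in (mismatches against an invented "true" covariate set), not a NONMEM objective function; in the actual method, fitness comes from NONMEM parameter estimation and model-fitness evaluation.

```python
import random

rng = random.Random(42)
TRUE_BITS = [1, 0, 1, 1, 0, 0]   # hypothetical "correct" covariate set

def fitness(chrom):
    # Stand-in objective to minimize: mismatches with the true covariate set.
    return sum(a != b for a, b in zip(chrom, TRUE_BITS))

def evolve(pop_size=20, n_gen=50, n_bits=6):
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)      # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:              # occasional point mutation
                child[rng.randrange(n_bits)] ^= 1
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
```

    The "hybrid" aspect of the published method (local search coupled to the GA) is omitted here; this sketch only shows the chromosome-encoding and selection loop.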

  3. A dual-process account of auditory change detection.

    PubMed

    McAnally, Ken I; Martin, Russell L; Eramudugolla, Ranmalee; Stuart, Geoffrey W; Irvine, Dexter R F; Mattingley, Jason B

    2010-08-01

    Listeners can be "deaf" to a substantial change in a scene comprising multiple auditory objects unless their attention has been directed to the changed object. It is unclear whether auditory change detection relies on identification of the objects in pre- and post-change scenes. We compared the rates at which listeners correctly identify changed objects with those predicted by change-detection models based on signal detection theory (SDT) and high-threshold theory (HTT). Detected changes were not identified as accurately as predicted by models based on either theory, suggesting that some changes are detected by a process that does not support change identification. Undetected changes were identified as accurately as predicted by the HTT model but much less accurately than predicted by the SDT models. The process underlying change detection was investigated further by determining receiver operating characteristics (ROCs). The ROCs did not conform to those predicted by either an SDT or an HTT model but were well modeled by a dual-process model that incorporated HTT and SDT components. The dual-process model also accurately predicted the rates at which detected and undetected changes were correctly identified.
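
    The dual-process idea can be sketched numerically: an all-or-none high-threshold detection component with probability R, mixed with an equal-variance Gaussian SDT component of sensitivity d'. The parameter values below are illustrative, not the paper's fits.

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def dual_process_roc(R, d_prime, criteria):
    """ROC points for a dual-process model: with probability R the change is
    detected outright (high-threshold component); otherwise a Gaussian
    familiarity signal of sensitivity d_prime is compared to criterion c."""
    points = []
    for c in criteria:
        fa = phi(-c)                              # false-alarm rate
        hit = R + (1.0 - R) * phi(d_prime - c)    # threshold + SDT mixture
        points.append((fa, hit))
    return points

roc = dual_process_roc(0.3, 1.0, [0.0, 0.5, 1.0])
```

    Sweeping the criterion traces the ROC; the nonzero intercept contributed by R is what distinguishes this curve from a pure SDT ROC.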

  4. Model Based Usability Heuristics for Constructivist E-Learning

    ERIC Educational Resources Information Center

    Katre, Dinesh S.

    2007-01-01

    Many e-learning applications and games have been studied to identify the common interaction models of constructivist learning, namely: 1. Move the object to appropriate location; 2. Place objects in appropriate order and location(s); 3. Click to identify; 4. Change the variable factors to observe the effects; and 5. System personification and…

  5. Modeling of turbulence and transition

    NASA Technical Reports Server (NTRS)

    Shih, Tsan-Hsing

    1992-01-01

    The first objective is to evaluate current two-equation and second order closure turbulence models using available direct numerical simulations and experiments, and to identify the models which represent the state of the art in turbulence modeling. The second objective is to study the near-wall behavior of turbulence, and to develop reliable models for an engineering calculation of turbulence and transition. The third objective is to develop a two-scale model for compressible turbulence.

  6. Preliminary design specifications of a calcium model

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A list of objectives, requirements, and guidelines is given for a calcium model. Existing models are reviewed and evaluated in relation to the stated objectives and requirements. The reviewed models were either too abstract or apparently invalidated. A technical approach to the design of a desirable model is identified.

  7. Research on Objectives for High-School Biology

    ERIC Educational Resources Information Center

    Korgan, John J., Jr.; Wilson, John T.

    1973-01-01

    Describes procedures to develop instructional objectives for high school biology. Two kinds of objectives are identified as pre-objectives and performance objectives. Models to classify these in branches of biology and to ensure quality control are provided. (PS)

  8. Managing the travel model process : small and medium-sized MPOs. Instructor guide.

    DOT National Transportation Integrated Search

    2013-09-01

    The learning objectives of this course were to: explain fundamental travel model concepts; describe the model development process; identify key inputs and describe the quality control process; and identify and manage resources.

  9. Managing the travel model process : small and medium-sized MPOs. Participant handbook.

    DOT National Transportation Integrated Search

    2013-09-01

    The learning objectives of this course were to: explain fundamental travel model concepts; describe the model development process; identify key inputs and describe the quality control process; and identify and manage resources.

  10. Using Model Point Spread Functions to Identify Binary Brown Dwarf Systems

    NASA Astrophysics Data System (ADS)

    Matt, Kyle; Stephens, Denise C.; Lunsford, Leanne T.

    2017-01-01

    A Brown Dwarf (BD) is a celestial object that is not massive enough to undergo hydrogen fusion in its core. BDs can form in pairs called binaries. Due to the great distances between Earth and these BDs, they act as point sources of light, and the angular separation between binary BDs can be small enough that they appear as a single, unresolved object in images, according to the Rayleigh criterion. It is not currently possible to resolve some of these objects into separate light sources. Stephens and Noll (2006) developed a method that used model point spread functions (PSFs) to identify binary Trans-Neptunian Objects; we will use this method to identify binary BD systems in the Hubble Space Telescope archive. The method works by comparing model PSFs of single and binary sources to the observed PSFs. We also use a method that compares model spectral data for single and binary fits to determine the best parameter values for each component of the system. We describe these methods, their challenges, and other possible uses in this poster.
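
    The PSF-comparison idea reduces to fitting an observed profile with a single-source model and a two-source model and comparing residuals. The 1-D Gaussian PSFs, grid, and source parameters below are simplifying assumptions for illustration, not the HST PSF models used in the actual method.

```python
import math

def gauss(x, mu, amp, sigma=1.0):
    """Toy Gaussian PSF (stand-in for a real instrument PSF model)."""
    return amp * math.exp(-0.5 * ((x - mu) / sigma) ** 2)

grid = [i * 0.25 for i in range(-20, 21)]
# "Observed" profile: a blend of two unresolved sources 0.8 units apart.
observed = [gauss(x, -0.4, 1.0) + gauss(x, 0.4, 0.6) for x in grid]

def sse(model):
    """Sum of squared residuals between the observed and model profiles."""
    return sum((o - m) ** 2 for o, m in zip(observed, model))

single = [gauss(x, 0.0, 1.6) for x in grid]                          # one source
binary = [gauss(x, -0.4, 1.0) + gauss(x, 0.4, 0.6) for x in grid]    # two sources
```

    A markedly lower residual for the binary model is the signature that flags a likely unresolved pair; in practice both models' parameters would be optimized before comparison.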

  11. Investigation of metabolic objectives in cultured hepatocytes.

    PubMed

    Uygun, Korkut; Matthew, Howard W T; Huang, Yinlun

    2007-06-15

    Using optimization-based methods to predict fluxes in metabolic flux balance models has been a successful approach for some microorganisms, enabling construction of in silico models and even inference of some regulatory motifs. However, this success has not been translated to mammalian cells. The lack of knowledge about metabolic objectives in mammalian cells is a major obstacle that prevents utilization of various metabolic engineering tools and methods for tissue engineering and biomedical purposes. In this work, we investigate and identify possible metabolic objectives for hepatocytes cultured in vitro. To achieve this goal, we present a special data-mining procedure for identifying metabolic objective functions in mammalian cells. This multi-level, optimization-based algorithm enables identification of the major fluxes in the metabolic objective from MFA data in the absence of information about critical active constraints of the system. Further, once the objective is determined, active flux constraints can also be identified and analyzed. This information can potentially be used in a predictive manner to improve cell culture results or clinical metabolic outcomes. Applying this method, it was found that in vitro cultured hepatocytes maximize oxygen uptake, coupling of the urea and TCA cycles, and synthesis of serine and urea. Selection of these fluxes as the metabolic objective enables accurate prediction of the flux distribution in the system given a limited amount of flux data, thus presenting a workable in silico model for cultured hepatocytes. An overall picture of homeostasis also emerges from these findings.

  12. Trajectory Recognition as the Basis for Object Individuation: A Functional Model of Object File Instantiation and Object-Token Encoding

    PubMed Central

    Fields, Chris

    2011-01-01

    The perception of persisting visual objects is mediated by transient intermediate representations, object files, that are instantiated in response to some, but not all, visual trajectories. The standard object file concept does not, however, provide a mechanism sufficient to account for all experimental data on visual object persistence, object tracking, and the ability to perceive spatially disconnected stimuli as continuously existing objects. Based on relevant anatomical, functional, and developmental data, a functional model is constructed that bases visual object individuation on the recognition of temporal sequences of apparent center-of-mass positions that are specifically identified as trajectories by dedicated “trajectory recognition networks” downstream of the medial–temporal motion-detection area. This model is shown to account for a wide range of data, and to generate a variety of testable predictions. Individual differences in the recognition, abstraction, and encoding of trajectory information are expected to generate distinct object persistence judgments and object recognition abilities. Dominance of trajectory information over feature information in stored object tokens during early infancy, in particular, is expected to disrupt the ability to re-identify human and other individuals across perceptual episodes, and lead to developmental outcomes with characteristics of autism spectrum disorders. PMID:21716599

  13. On Multi-Objective Based Constitutive Modelling Methodology and Numerical Validation in Small-Hole Drilling of Al6063/SiCp Composites

    PubMed Central

    Xiang, Junfeng; Xie, Lijing; Gao, Feinong; Zhang, Yu; Yi, Jie; Wang, Tao; Pang, Siqin; Wang, Xibin

    2018-01-01

    Discrepancies in capturing the behavior of some materials, such as Particulate Reinforced Metal Matrix Composites, using the conventional ad hoc strategy challenge the applicability of the Johnson-Cook constitutive model. Despite efforts to extend its formalism, additional fitting parameters increase the difficulty of identifying constitutive parameters. A weighted multi-objective strategy for identifying any constitutive formalism is developed to predict mechanical behavior in static and dynamic loading conditions equally well. The varying weightings are based on a Gaussian-distributed noise evaluation of the experimentally obtained stress-strain data in quasi-static or dynamic mode. This universal method can be used to determine quickly and directly whether a constitutive formalism is suitable for describing the material's constitutive behavior, by measuring goodness of fit. A quantitative comparison of different fitting strategies for identifying Al6063/SiCp's material parameters is made in terms of performance evaluation, including noise elimination, correlation, and reliability. Finally, a three-dimensional (3D) FE model of small-hole drilling of Al6063/SiCp composites, using the multi-objective-identified constitutive formalism, is developed. Comparison with experimental observations of thrust force, torque, and chip morphology provides valid evidence of the applicability of the developed multi-objective identification strategy for identifying constitutive parameters. PMID:29324688
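
    The weighted-fitting idea can be sketched in miniature: fit a Johnson-Cook-like strain-hardening term sigma = A + B * eps**n to stress-strain points with per-point weights reflecting each data set's noise level. The data, the fixed exponent n, and the two-parameter reduction are illustrative assumptions, not the paper's full multi-objective procedure.

```python
def weighted_linear_fit(points, weights, n=0.5):
    """With n fixed, sigma = A + B * eps**n is linear in (A, B), so the
    weighted least-squares normal equations can be solved in closed form."""
    s_w = s_x = s_y = s_xx = s_xy = 0.0
    for (eps, sigma), w in zip(points, weights):
        x = eps ** n
        s_w += w
        s_x += w * x
        s_y += w * sigma
        s_xx += w * x * x
        s_xy += w * x * sigma
    det = s_w * s_xx - s_x * s_x
    A = (s_xx * s_y - s_x * s_xy) / det
    B = (s_w * s_xy - s_x * s_y) / det
    return A, B

# Synthetic noise-free data generated from A=100, B=50, n=0.5 (illustrative).
pts = [(e, 100.0 + 50.0 * e ** 0.5) for e in (0.01, 0.05, 0.1, 0.2)]
A, B = weighted_linear_fit(pts, [1.0, 1.0, 1.0, 1.0])
```

    In the paper's setting, the quasi-static and dynamic data sets would carry different weights derived from their estimated noise, and n itself would be part of the search.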

  14. Four Single-Page Learning Models.

    ERIC Educational Resources Information Center

    Hlynka, Denis

    1979-01-01

    Identifies four models of single-page learning systems that can streamline lengthy, complex prose: Information Mapping, Focal Press Model, Behavioral Objectives Model, and School Mathematics Model. (CMV)

  15. The use of music therapy within the SCERTS model for children with Autism Spectrum Disorder.

    PubMed

    Walworth, Darcy DeLoach

    2007-01-01

    The SCERTS model is a new, comprehensive curriculum designed to assess and identify treatment goals and objectives within a multidisciplinary team of clinicians and educators for children with Autism Spectrum Disorders (ASD). This model is an ongoing assessment tool from which goals and objectives are derived. Because music therapy offers a unique interaction setting for children with ASD to elicit communication skills, music therapists will need to be an integral part of the multidisciplinary assessment team using the SCERTS model, which is projected to become the primary nationwide curriculum for children with ASD. The purpose of this paper is to assist music therapists in transitioning to this model by providing an overview and explanation of the SCERTS model and by identifying how music therapists are currently providing clinical services incorporated in the SCERTS model for children with ASD. In order to formulate comprehensive transitional suggestions, a national survey of music therapists working with clients at risk for or diagnosed with ASD was conducted to: (a) identify the areas of the SCERTS assessment model that music therapists are currently addressing within their written goals for clients with ASD, (b) identify current music therapy activities that address various SCERTS goals and objectives, and (c) provide demographic information about settings, length, and tools used in music therapy interventions for clients with ASD.

  16. A simulation study of detection of weapon of mass destruction based on radar

    NASA Astrophysics Data System (ADS)

    Sharifahmadian, E.; Choi, Y.; Latifi, S.

    2013-05-01

    Typical systems used for detection of Weapons of Mass Destruction (WMD) are based on sensing objects using gamma rays or neutrons. Nonetheless, depending on environmental conditions, current methods for detecting fissile materials have a limited distance of effectiveness. Moreover, gamma-ray radiation can be easily shielded. Here, detecting concealed WMD from a distance is simulated and studied based on radar, especially WideBand (WB) technology. The WB-based method capitalizes on the fact that electromagnetic waves penetrate different materials at different rates. While low-frequency waves can pass through objects more easily, high-frequency waves have a higher rate of absorption by objects, making object recognition easier. Measuring the penetration depth allows one to identify the sensed material. During simulation, the radar waves and the propagation area, including free space and the objects in the scene, are modeled. Each material is modeled as a layer with a certain thickness. At the start of the simulation, a modeled radar wave is radiated toward the layers. At the receiver side, each layer can be identified based on the signals received from it. When an electromagnetic wave passes through an object, the wave's power is subject to a certain level of attenuation depending on the object's characteristics. Simulation is performed using radar signals with different frequencies (MHz-GHz range) and powers to identify the different layers.
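
    The per-layer attenuation idea can be sketched with a Beer-Lambert-style model: each layer multiplies the wave's power by an exponential loss factor. The attenuation coefficients and thicknesses below are invented for illustration, not measured material properties.

```python
import math

def received_power(p_in, layers):
    """Propagate input power through a stack of material layers.
    layers: list of (attenuation_coeff_per_m, thickness_m) tuples."""
    p = p_in
    for alpha, d in layers:
        p *= math.exp(-alpha * d)   # exponential loss in each layer
    return p
```

    Comparing the received power against candidate per-material coefficients is one simple way to infer which layer stack the wave penetrated; frequency dependence of alpha is what the wideband approach exploits.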

  17. Strategy Generalization across Orientation Tasks: Testing a Computational Cognitive Model

    DTIC Science & Technology

    2008-07-01

    arranged in groups (clusters). The space, itself, was divided into four quadrants, which had 1, 2, 3, and 4 objects, respectively. The arrangement of... clusters, of objects play an important role in the model's performance, by providing some context for narrowing the search for the target to a portion of the... model uses a hierarchical approach to accomplish this. First, the model identifies a group or cluster of objects that contains the target. The number of

  18. Surface versus Edge-Based Determinants of Visual Recognition.

    ERIC Educational Resources Information Center

    Biederman, Irving; Ju, Ginny

    1988-01-01

    The latency at which objects could be identified by 126 subjects was compared for line drawings (edge-based depiction) and color photographs (surface depiction). The line drawing was identified about as quickly as the photograph; primal access to a mental representation of an object can be modeled from an edge-based description. (SLD)

  19. Possible Content Areas for Implementation of the Basic Life Functions Instructional Program Model.

    ERIC Educational Resources Information Center

    Wisconsin State Dept. of Public Instruction, Madison. Div. for Handicapped Children.

    Identified are curricular items intended to develop skills pertinent to the 12 broad instructional objectives of the Basic Life Functions Instructional Program Model, a program for trainable mentally retarded children. The 12 instructional objectives are: communicating ideas, self-understanding, interacting with others, traveling, adapting to and…

  20. Model-based occluded object recognition using Petri nets

    NASA Astrophysics Data System (ADS)

    Zhou, Chuan; Hura, Gurdeep S.

    1998-09-01

    This paper discusses the use of Petri nets to model the process of object matching between an image and a model under different 2D geometric transformations. Such matching finds applications in sensor-based robot control, flexible manufacturing systems, and industrial inspection. A description approach for object structure is presented via its topological structure relation, called the Point-Line Relation Structure (PLRS). It is shown how Petri nets can be used to model the matching process, and an optimal or near-optimal matching can be obtained by tracking the reachability graph of the net. Experimental results show that objects can be successfully identified and located under 2D transformations such as translations, rotations, scale changes, and distortions due to partial occlusion.

  21. A Comparative Analysis of Financial Reporting Models for Private and Public Sector Organizations.

    DTIC Science & Technology

    1995-12-01

    The objective of this thesis was to describe and compare different existing and evolving financial reporting models used in both the public and... private sector. To accomplish the objective, this thesis identified the existing financial reporting models for private sector business organizations... private sector nonprofit organizations, and state and local governments, as well as the evolving financial reporting model for the federal government.

  22. Decision Support for Renewal of Wastewater Collection and Water Distribution Systems

    EPA Science Inventory

    The objective of this study was to identify the current decision support methodologies, models and approaches being used for determining how to rehabilitate or replace underground utilities; identify the critical gaps of these current models through comparison with case history d...

  23. Electromagnetic Test-Facility characterization: an identification approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zicker, J.E.; Candy, J.V.

    The response of an object subjected to high-energy, transient electromagnetic (EM) fields, sometimes called electromagnetic pulses (EMP), is an important issue in the survivability of electronic systems (e.g., aircraft), especially when the field has been generated by a high-altitude nuclear burst. The characterization of transient response information is a matter of national concern. In this report we discuss techniques to: (1) improve signal processing at a test facility; and (2) parameterize a particular object response. First, we discuss the application of identification-based signal processing techniques to improve signal levels at the Lawrence Livermore National Laboratory (LLNL) EM Transient Test Facility. We identify models of test equipment and then use these models to deconvolve the input/output sequences for the object under test. A parametric model of the object is identified from this data. The model can be used to extrapolate the response to threat-level EMP. Also discussed is the development of a facility simulator (EMSIM) useful for experimental design and calibration, and a deconvolution algorithm (DECONV) useful for removing probe effects from the measured data.

  24. Challenges in Requirements Engineering: A Research Agenda for Conceptual Modeling

    NASA Astrophysics Data System (ADS)

    March, Salvatore T.; Allen, Gove N.

    Domains for which information systems are developed deal primarily with social constructions—conceptual objects and attributes created by human intentions and for human purposes. Information systems play an active role in these domains. They document the creation of new conceptual objects, record and ascribe values to their attributes, initiate actions within the domain, track activities performed, and infer conclusions based on the application of rules that govern how the domain is affected when socially-defined and identified causal events occur. Emerging applications of information technologies evaluate such business rules, learn from experience, and adapt to changes in the domain. Conceptual modeling grammars aimed at representing their system requirements must include conceptual objects, socially-defined events, and the rules pertaining to them. We identify challenges to conceptual modeling research and pose an ontology of the artificial as a step toward meeting them.

  25. Using multi-objective robust decision making to support seasonal water management in the Chao Phraya River basin, Thailand

    NASA Astrophysics Data System (ADS)

    Riegels, Niels; Jessen, Oluf; Madsen, Henrik

    2016-04-01

    A multi-objective robust decision making approach is demonstrated that supports seasonal water management in the Chao Phraya River basin in Thailand. The approach uses multi-objective optimization to identify a Pareto-optimal set of management alternatives. Ensemble simulation is used to evaluate how each member of the Pareto set performs under a range of uncertain future conditions, and a robustness criterion is used to select a preferred alternative. Data mining tools are then used to identify ranges of uncertain factor values that lead to unacceptable performance for the preferred alternative. The approach is compared to a multi-criteria scenario analysis approach to estimate whether the introduction of additional complexity has the potential to improve decision making. Dry season irrigation in Thailand is managed through non-binding recommendations about the maximum extent of rice cultivation along with incentives for less water-intensive crops. Management authorities lack authority to prevent river withdrawals for irrigation when rice cultivation exceeds recommendations. In practice, this means that water must be provided to irrigate the actual planted area because of downstream municipal water supply requirements and water quality constraints. This results in dry season reservoir withdrawals that exceed planned withdrawals, reducing carryover storage to hedge against insufficient wet season runoff. The dry season planning problem in Thailand can therefore be framed in terms of decisions, objectives, constraints, and uncertainties. Decisions include recommendations about the maximum extent of rice cultivation and incentives for growing less water-intensive crops. Objectives are to maximize benefits to farmers, minimize the risk of inadequate carryover storage, and minimize incentives. Constraints include downstream municipal demands and water quality requirements. 
Uncertainties include the actual extent of rice cultivation, dry season precipitation, and precipitation in the following wet season. The multi-objective robust decision making approach is implemented as follows. First, three baseline simulation models are developed: a crop water demand model, a river basin simulation model, and a model of the impact of incentives on cropping patterns. The crop water demand model estimates irrigation water demands; the river basin simulation model estimates the reservoir drawdown required to meet demands given forecasts of precipitation, evaporation, and runoff; and the incentive-impact model estimates the cost of incentives as a function of marginal changes in rice yields. Optimization is used to find a set of non-dominated alternatives as a function of the rice area and incentive decisions. An ensemble of uncertain model inputs is generated to represent uncertain hydrological and crop area forecasts. An ensemble of indicator values is then generated for each of the decision objectives: farmer benefits, end-of-wet-season reservoir storage, and the cost of incentives. A single alternative is selected from the Pareto set using a robustness criterion. Threshold values are defined for each of the objectives to identify ensemble members for which objective values are unacceptable, and the PRIM data mining algorithm is then used to identify input values associated with unacceptable model outcomes.
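
    The selection step can be sketched with a maximin robustness criterion: evaluate each Pareto alternative across an ensemble of uncertain scenarios and keep the one whose worst-case benefit is best. The alternative names and ensemble values below are invented for illustration, not basin results.

```python
def maximin_choice(alternatives):
    """alternatives: dict name -> list of benefit values across the ensemble.
    Returns the alternative whose worst-case (minimum) benefit is largest."""
    return max(alternatives, key=lambda a: min(alternatives[a]))

# Hypothetical Pareto set, scored under three ensemble scenarios.
pareto = {
    "large_rice_area":   [9.0, 2.0, 8.5],   # high upside, poor dry-year outcome
    "moderate_area":     [7.0, 5.5, 7.0],
    "strong_incentives": [5.0, 5.0, 5.2],
}
preferred = maximin_choice(pareto)
```

    Maximin is only one of several robustness criteria; regret-based or satisficing criteria would rank the same ensemble differently, which is why the choice of criterion is itself a modeling decision.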

  26. Automatic pole-like object modeling via 3D part-based analysis of point cloud

    NASA Astrophysics Data System (ADS)

    He, Liu; Yang, Haoxiang; Huang, Yuchun

    2016-10-01

    Pole-like objects, including trees, lampposts, and traffic signs, are an indispensable part of urban infrastructure. With the advance of vehicle-based laser scanning (VLS), massive point clouds of roadside urban areas are increasingly applied in 3D digital city modeling. Based on the property that different pole-like objects have various canopy parts but similar trunk parts, this paper proposes a 3D part-based shape analysis to robustly extract, identify, and model pole-like objects. The proposed method includes 3D clustering and recognition of trunks, voxel growing, and part-based 3D modeling. After preprocessing, a trunk center is identified as a point that has a local density peak and the largest minimum inter-cluster distance. Starting from the trunk centers, the remaining points are iteratively clustered to the same centers as their nearest point with higher density. To eliminate noisy points, cluster borders are refined by trimming boundary outliers. Then, candidate trunks are extracted from the clustering results in three orthogonal planes by shape analysis. Voxel growing obtains the complete pole-like objects regardless of overlap. Finally, the entire trunk, branch, and crown parts are analyzed to obtain seven feature parameters. These parameters are used to model the three parts respectively and obtain a single part-assembled 3D model. The proposed method is tested using a VLS-based point cloud of Wuhan University, China. The point cloud includes many kinds of trees, lampposts, and other pole-like posts under different occlusions and overlaps. Experimental results show that the proposed method can extract the exact attributes and model roadside pole-like objects efficiently.
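
    The trunk-seeding step follows the density-peak idea: a candidate trunk center is a point that is both locally dense and far from any denser point. The 2-D toy coordinates, the radius, and the density-times-separation score below are illustrative simplifications of the paper's 3-D procedure.

```python
def density_peaks(points, radius=1.5):
    """Rank points by (local density x separation from denser points);
    top-ranked points are candidate trunk centers."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    # Local density: neighbors within the radius.
    rho = [sum(1 for q in points if q is not p and dist(p, q) < radius)
           for p in points]
    scores = []
    for i, p in enumerate(points):
        higher = [dist(p, points[j]) for j in range(len(points)) if rho[j] > rho[i]]
        # Separation: distance to the nearest denser point (or the farthest
        # point overall for the global density peak).
        delta = min(higher) if higher else max(dist(p, q) for q in points)
        scores.append(rho[i] * delta)
    return sorted(range(len(points)), key=lambda i: scores[i], reverse=True)

# Two toy "trunk" clusters on the ground plane (illustrative coordinates).
pts = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5), (0.4, 0.4),
       (10.0, 10.0), (10.5, 10.0), (10.0, 10.5)]
ranking = density_peaks(pts)
```

    The denser four-point cluster supplies the top-ranked seed; the remaining points would then be assigned to seeds by following nearest-denser-neighbor links, as the abstract describes.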

  27. On the effect of model parameters on forecast objects

    NASA Astrophysics Data System (ADS)

    Marzban, Caren; Jones, Corinne; Li, Ning; Sandgathe, Scott

    2018-04-01

    Many physics-based numerical models produce a gridded, spatial field of forecasts, e.g., a temperature map. The field for some quantities generally consists of spatially coherent and disconnected objects. Such objects arise in many problems, including precipitation forecasts in atmospheric models, eddy currents in ocean models, and models of forest fires. Certain features of these objects (e.g., location, size, intensity, and shape) are generally of interest. Here, a methodology is developed for assessing the impact of model parameters on the features of forecast objects. The main ingredients of the methodology include the use of (1) Latin hypercube sampling for varying the values of the model parameters, (2) statistical clustering algorithms for identifying objects, (3) multivariate multiple regression for assessing the impact of multiple model parameters on the distribution (across the forecast domain) of object features, and (4) methods for reducing the number of hypothesis tests and controlling the resulting errors. The final output of the methodology is a series of box plots and confidence intervals that visually display the sensitivities. The methodology is demonstrated on precipitation forecasts from a mesoscale numerical weather prediction model.
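
    Step (1) of the methodology, Latin hypercube sampling of the model-parameter space, can be sketched as follows; the parameter ranges are invented for illustration.

```python
import random

def latin_hypercube(n, ranges, seed=0):
    """Draw n samples so that each parameter's range is cut into n equal
    strata and every stratum is hit exactly once, giving more even coverage
    of parameter space than plain random draws."""
    rng = random.Random(seed)
    cols = []
    for lo, hi in ranges:
        strata = list(range(n))
        rng.shuffle(strata)                  # random pairing across dimensions
        width = (hi - lo) / n
        cols.append([lo + (s + rng.random()) * width for s in strata])
    return list(zip(*cols))                  # one (p1, p2, ...) tuple per sample

samples = latin_hypercube(10, [(0.0, 1.0), (100.0, 200.0)])
```

    Each sampled parameter vector would drive one forecast run, and the resulting object features feed the regression-based sensitivity analysis described above.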

  8. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.

  9. Modeling Of Object- And Scene-Prototypes With Hierarchically Structured Classes

    NASA Astrophysics Data System (ADS)

    Ren, Z.; Jensch, P.; Ameling, W.

    1989-03-01

    The success of knowledge-based image analysis methodology and implementation tools depends largely on an appropriately and efficiently built model that encodes the domain-specific context and the inherent structure of the observed image scene. To identify an object in an application environment, a computer vision system needs to know, first, the description of the object to be found in an image or image sequence and, second, the corresponding relationships between object descriptions within the image sequence. This paper presents models of image objects and scenes by means of hierarchically structured classes. Using the topovisual formalism of graphs and higraphs, we principally study the relational aspects and data abstraction of the modeling, in order to visualize the structure inherent in image objects and scenes and to formalize their descriptions. The goal is to expose the structure of the image scene and the correspondence of image objects in the low-level image interpretation process. The object-based system design approach has been applied to build the model base. We use the object-oriented programming language C++ for designing, testing, and implementing the abstracted entity classes and operation structures that have been modeled topovisually. The reference images used for modeling prototypes of objects and scenes come from industrial environments as well as medical applications.

  10. A practical approach to object based requirements analysis

    NASA Technical Reports Server (NTRS)

    Drew, Daniel W.; Bishop, Michael

    1988-01-01

    Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain oriented analysis and development concept is based on entity relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object oriented design is also promoted by having requirements described in terms of objects. Presented is a five step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a base line requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model and a brief discussion on how this approach might be used in a large scale development effort.

  11. Parameter Estimation of Computationally Expensive Watershed Models Through Efficient Multi-objective Optimization and Interactive Decision Analytics

    NASA Astrophysics Data System (ADS)

    Akhtar, Taimoor; Shoemaker, Christine

    2016-04-01

    Watershed model calibration is inherently a multi-criteria problem. Conflicting trade-offs exist between different quantifiable calibration criteria, indicating that no single optimal parameterization exists. Hence, many experts prefer a manual approach to calibration, where the inherently multi-objective nature of the calibration problem is addressed through an interactive, subjective, time-intensive, and complex decision-making process. Multi-objective optimization can be used to efficiently identify multiple plausible calibration alternatives and assist calibration experts during the parameter estimation process. However, there are key challenges to the use of multi-objective optimization in parameter estimation: 1) multi-objective optimization usually requires many model simulations, which is difficult for complex simulation models that are computationally expensive; and 2) selecting one from the numerous calibration alternatives provided by multi-objective optimization is non-trivial. This study proposes a "Hybrid Automatic Manual Strategy" (HAMS) for watershed model calibration to specifically address these challenges. HAMS employs a three-stage framework for parameter estimation. Stage 1 uses an efficient surrogate multi-objective algorithm, GOMORS, to identify numerous calibration alternatives within a limited simulation evaluation budget. The novelty of HAMS lies in Stages 2 and 3, where an interactive visual and metric-based analytics framework serves as a decision support tool for choosing a single calibration from the numerous alternatives identified in Stage 1. Stage 2 of HAMS provides a goodness-of-fit, metric-based interactive framework for identifying a small subset (typically fewer than 10) of meaningful and diverse calibration alternatives from those obtained in Stage 1. Stage 3 uses an interactive visual analytics framework for decision support in selecting one parameter combination from the alternatives identified in Stage 2. HAMS is applied to calibrate the flow parameters of a SWAT (Soil and Water Assessment Tool) model designed to simulate flow in the Cannonsville watershed in upstate New York. Results from the application of HAMS to Cannonsville indicate that efficient multi-objective optimization and interactive visual and metric-based analytics can bridge the gap between the effective use of automatic and manual strategies for parameter estimation of computationally expensive watershed models.

  12. An integrated approach to engineering curricula improvement with multi-objective decision modeling and linear programming

    NASA Astrophysics Data System (ADS)

    Shea, John E.

    The structure of engineering curricula currently in place at most colleges and universities has existed since the early 1950s and reflects a historical emphasis on a solid foundation in math, science, and engineering science. However, there is often not a close match between elements of the traditional engineering education and the skill sets that graduates need for success in the industrial environment. Considerable progress has been made in restructuring engineering courses and curricula. What is lacking, however, are tools and methodologies that incorporate the many dimensions of college courses and how they are structured to form a curriculum. If curriculum changes are to be made, the first objective must be to determine what knowledge and skills engineering graduates need to possess. To accomplish this, a set of engineering competencies was developed from the existing literature and used in the development of a comprehensive mail survey of alumni, employers, students, and faculty. Respondents proposed some changes to the topics in the curriculum and recommended that work to improve the curriculum be focused on communication, problem solving, and people skills. The process of designing a curriculum is similar to engineering design, with requirements that must be met and objectives that must be optimized. From this similarity came the idea of developing a linear, additive, multi-objective model that identifies the objectives to be considered when designing a curriculum and contains the mathematical relationships necessary to quantify the value of a specific alternative. The model incorporates the three primary objectives of engineering topics, skills, and curriculum design principles and uses data from the survey. It was used to design new courses, to evaluate various curriculum alternatives, and to conduct sensitivity analysis to better understand their differences.
Using the multi-objective model to identify the highest scoring curriculum from a catalog of courses is difficult because of the many factors being considered. To assist this process, the multi-objective model and the curriculum requirements were incorporated in a linear program to select the "optimum" curriculum. The application of this tool was also beneficial in identifying the active constraints that limit curriculum development and content.

  13. Modeling Real-Time Human-Automation Collaborative Scheduling of Unmanned Vehicles

    DTIC Science & Technology

    2013-06-01

    ...that they can only take into account those quantifiable variables, parameters, objectives, and constraints identified in the design stages that were deemed to be critical. Previous ... increased training and operating costs (Haddal & Gertler, 2010) and challenges in meeting the ever-increasing demand for more UV operations (U.S. Air...

  14. Training Module on the Development of Best Modeling Practices

    EPA Pesticide Factsheets

    This module continues the fundamental concepts outlined in the previous modules. Objectives are to identify the ‘best modeling practices’ and strategies for the Development Stage of the model life-cycle and define the steps of model development.

  15. Rethinking Trends in Instructional Objectives: Exploring the Alignment of Objectives with Activities and Assessment in Higher Education--A Case Study

    ERIC Educational Resources Information Center

    Yamanaka, Akio; Wu, Leon Yufeng

    2014-01-01

    This study explored higher education level syllabi to identify trends in educational objectives. Bloom's Taxonomy and various strategic models were used to classify 714 objectives from 114 sections of courses administered through a Midwest teacher education institution in the United States. 1229 verbs and verb phrases were classified through the…

  16. An object-based approach to weather analysis and its applications

    NASA Astrophysics Data System (ADS)

    Troemel, Silke; Diederich, Malte; Horvath, Akos; Simmer, Clemens; Kumjian, Matthew

    2013-04-01

    The research group 'Object-based Analysis and SEamless prediction' (OASE) within the Hans Ertel Centre for Weather Research programme (HErZ) pursues an object-based approach to weather analysis. The object-based tracking approach adopts the Lagrangian perspective by identifying convective events and following their development over the course of their lifetimes. Prerequisites of the object-based analysis are a highly resolved observational database and a tracking algorithm. A near-real-time radar and satellite remote-sensing-driven 3D observation-microphysics composite covering Germany, currently under development, contains gridded observations and estimated microphysical quantities. A 3D scale-space tracking identifies convective rain events in the dual composite and monitors their development over the course of their lifetimes. The OASE group exploits the object-based approach in several fields of application: (1) for a better understanding and analysis of the precipitation processes responsible for extreme weather events, (2) in nowcasting, (3) as a novel approach to validating meso-γ atmospheric models, and (4) in data assimilation. Results from the different fields of application will be presented. The basic idea of the object-based approach is to identify a small set of radar- and satellite-derived descriptors which characterize the temporal development of the precipitation systems that constitute the objects. So-called proxies of the precipitation process are, e.g., the temporal change of the brightband, vertically extensive columns of enhanced differential reflectivity ZDR, or the cloud-top temperatures and heights identified in the 4D field of ground-based radar reflectivities and satellite retrievals generated by a cell during its lifetime. They quantify (micro-)physical differences among rain events and relate to the precipitation yield.
Analyses on the informative content of ZDR columns as precursor for storm evolution for example will be presented to demonstrate the use of such system-oriented predictors for nowcasting. Columns of differential reflectivity ZDR measured by polarimetric weather radars are prominent signatures associated with thunderstorm updrafts. Since greater vertical velocities can loft larger drops and water-coated ice particles to higher altitudes above the environmental freezing level, the integrated ZDR column above the freezing level increases with increasing updraft intensity. Validation of atmospheric models concerning precipitation representation or prediction is usually confined to comparisons of precipitation fields or their temporal and spatial statistics. A comparison of the rain rates alone, however, does not immediately explain discrepancies between models and observations, because similar rain rates might be produced by different processes. Within the event-based approach for validation of models both observed and modeled rain events are analyzed by means of proxies of the precipitation process. Both sets of descriptors represent the basis for model validation since different leading descriptors - in a statistical sense- hint at process formulations potentially responsible for model failures.

  17. Chapter Innovators Guide, 2000: Models of Innovation Award Winners.

    ERIC Educational Resources Information Center

    National FFA Organization, Indianapolis, IN.

    This guide presents the Future Farmers of America (FFA) 2000 Model of Innovation award winners' projects. Chapters demonstrated abilities to identify goals and objectives, create a workable plan of action, attain and evaluate results, and identify items learned and ways to improve. Chapter 1 discusses the FFA National Chapter Award program that…

  18. Training Module on the Evaluation of Best Modeling Practices

    EPA Pesticide Factsheets

    Building upon the fundamental concepts outlined in previous modules, the objectives of this module are to explore the topic of model evaluation and identify the 'best modeling practices' and strategies for the Evaluation Stage of the model life-cycle.

  19. Data Relationships: Towards a Conceptual Model of Scientific Data Catalogs

    NASA Astrophysics Data System (ADS)

    Hourcle, J. A.

    2008-12-01

    As the amount of data and the variety of processing and storage formats increase, the total number of record permutations increases dramatically. The result is an overwhelming number of records that makes identifying the best data object to answer a user's needs more difficult. The issue is further complicated because each archive's data catalog may be designed around different concepts: anything from individual files to be served, to series of similarly generated and processed data, to something else entirely. Catalogs may not only be flat tables but may be structured as multiple tables, with each table being a different data series, or as a normalized structure of the individual data files. Merging federated search results from archives with different catalog designs can create situations where the data object of interest is difficult to find amid an overwhelming number of seemingly similar or entirely unwanted records. We present a reference model for discussing data catalogs and the complex relationships between similar data objects. We show how the model can be used to improve scientists' ability to quickly identify the best data object for their purposes, and we discuss the technical issues involved in using this model in a federated system.

  20. Objective Model Selection for Identifying the Human Feedforward Response in Manual Control.

    PubMed

    Drop, Frank M; Pool, Daan M; van Paassen, Marinus René M; Mulder, Max; Bülthoff, Heinrich H

    2018-01-01

    Realistic manual control tasks typically involve predictable target signals and random disturbances. The human controller (HC) is hypothesized to use a feedforward control strategy for target-following, in addition to feedback control for disturbance-rejection. Little is known about human feedforward control, partly because common system identification methods have difficulty determining whether, and if so how, the HC applies a feedforward strategy. In this paper, an identification procedure is presented that aims at objective model selection for identifying the human feedforward response, using linear time-invariant autoregressive with exogenous input (ARX) models. A new model selection criterion is proposed to decide on the model order (number of parameters) and the presence of feedforward in addition to feedback. For a range of typical control tasks, it is shown by means of Monte Carlo computer simulations that the classical Bayesian information criterion (BIC) leads to selecting models that contain a feedforward path from data generated by a pure feedback model: "false-positive" feedforward detection. To eliminate these false positives, the modified BIC includes an additional penalty on model complexity. The appropriate weighting is found through computer simulations with a hypothesized HC model prior to performing a tracking experiment. Experimental human-in-the-loop data will be considered in future work. With appropriate weighting, the method correctly identifies the HC dynamics in a wide range of control tasks, without false-positive results.
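The abstract does not give the exact form of the modified criterion, but the idea of penalizing model complexity more heavily than the classical BIC can be illustrated by putting an adjustable weight on the BIC's complexity term. In this sketch the weight `alpha` and the function names are assumptions for illustration, not the paper's definition:

```python
import numpy as np

def modified_bic(residuals, n_params, alpha=1.0):
    """BIC with an adjustable weight on the complexity penalty.
    alpha = 1 recovers the classical BIC; alpha > 1 penalizes extra
    parameters more heavily, suppressing spurious feedforward terms."""
    n = len(residuals)
    rss = float(np.sum(np.square(residuals)))
    return n * np.log(rss / n) + alpha * n_params * np.log(n)

def select_model(candidates, alpha=1.0):
    """Pick the candidate (residuals, n_params, name) with the lowest
    criterion value."""
    return min(candidates, key=lambda c: modified_bic(c[0], c[1], alpha))[2]
```

With `alpha = 1`, a feedforward model whose extra parameters buy only a small residual reduction can still win; raising `alpha` makes the simpler feedback-only model win instead, which is the false-positive suppression the abstract describes.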

  1. Identification of breeding objectives for Begait goat in western Tigray, North Ethiopia.

    PubMed

    Abraham, Hagos; Gizaw, Solomon; Urge, Mengistu

    2018-06-21

    A sound breeding objective is the basis for genetic improvement in the overall economic merit of farm animals. The Begait goat is one of the identified breeds in Ethiopia, a multipurpose breed that serves as a source of cash income and of food (meat and milk). Despite its importance, no formal breeding objectives exist for the Begait goat. The objective of the present study was to identify breeding objectives for the breed through two approaches, an own-flock ranking experiment and deterministic bio-economic models, as a preliminary step towards designing sustainable breeding programs for the breed. In the own-flock ranking experiment, a total of 45 households were visited at their homesteads and asked to select, with reasons, the first-, second-, and third-best does and the most inferior doe from their own flock. The age and previous reproduction and production information of the identified animals were recorded, and live body weight and some linear body measurements were taken. The bio-economic model included performance traits (weights, daily weight gain, kidding interval, litter size, milk yield, kid mortality, pregnancy rate, and replacement rate) and economic parameters (revenue and costs). There was close agreement between the farmers' rankings and the bio-economic model results. In general, the results of the present study indicate that Begait goat owners could improve the performance of their goats and the profitability of their farms by selecting for 6-month weight, litter size, pre-weaning kid survival rate, and milk yield.

  2. Releases of whooping cranes to the Florida nonmigratory flock: a structured decision-making approach: report to the International Whooping Crane Recovery Team, September 22, 2008

    USGS Publications Warehouse

    Moore, Clinton T.; Converse, Sarah J.; Folk, Martin J.; Boughton, Robin; Brooks, Bill; French, John B.; O'Meara, Timothy; Putnam, Michael; Rodgers, James; Spalding, Marilyn

    2008-01-01

    We used a structured decision-making approach to inform the decision of whether the Florida Fish and Wildlife Conservation Commission should request of the International Whooping Crane Recovery Team that additional whooping crane chicks be released into the Florida Non-Migratory Population (FNMP). Structured decision-making is an application of decision science that strives to produce transparent, replicable, and defensible decisions that recognize the appropriate roles of management policy and science in decision-making. We present a multi-objective decision framework, where management objectives include successful establishment of a whooping crane population in Florida, minimization of costs, positive public relations, information gain, and providing a supply of captive-reared birds to alternative crane release projects, such as the Eastern Migratory Population. We developed models to predict the outcome relative to each of these objectives under 29 different scenarios of the release methodology used from 1993 to 2004, including options of no further releases and variable numbers of releases per year over the next 5-30 years. In particular, we developed a detailed set of population projection models, which make substantially different predictions about the probability of successful establishment of the FNMP. We used expert elicitation to develop prior model weights (measures of confidence in population model predictions); the results of the population model weighting and model-averaging exercise indicated that the probability of successful establishment of the FNMP ranged from 9% if no additional releases are made, to as high as 41% with additional releases. We also used expert elicitation to develop weights (relative values) on the set of identified objectives, and we then used a formal optimization technique for identifying the optimal decision, which considers the tradeoffs between objectives. 
The optimal decision was identified as release of 3 cohorts (24 birds) per year over the next 10 years. However, any decision that involved release of 1-3 cohorts (8-24 birds) per year over the next 5 to 20 years, as well as decisions that involve skipping releases in every other year, performed better in our analysis than the alternative of no further releases. These results were driven by the relatively high objective weights that experts placed on the population objective (i.e., successful establishment of the FNMP) and the information gain objective (where releases are expected to accelerate learning on what was identified as a primary uncertainty: the demographic performance of wild-hatched birds). Additional considerations that were not formally integrated into the analysis are also discussed.
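The model-averaging step above (expert-elicited prior weights applied to each population model's prediction) amounts to a weighted mean over the candidate models. A minimal sketch with invented numbers, not the report's elicited weights or predictions:

```python
def model_average(weights, predictions):
    """Weighted model average: expert confidence weights (any positive
    scale; normalized here) applied to each model's predicted
    probability of successful establishment."""
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, predictions)) / total
```

For example, three hypothetical models predicting establishment probabilities of 0.1, 0.4, and 0.5 with raw confidence weights 5, 3, and 2 average to 0.27; the report performed this kind of averaging separately for the no-release and release scenarios.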

  3. Processing Satellite Imagery To Detect Waste Tire Piles

    NASA Technical Reports Server (NTRS)

    Skiles, Joseph; Schmidt, Cynthia; Wuinlan, Becky; Huybrechts, Catherine

    2007-01-01

    A methodology for processing commercially available satellite spectral imagery has been developed to enable identification and mapping of waste tire piles in California. The California Integrated Waste Management Board initiated the project and provided funding for the method's development. The methodology combines commercially available image-processing and georeferencing software to develop a model that specifically distinguishes between tire piles and other objects. The methodology reduces the time that must be spent initially surveying a region for tire sites, thereby increasing the time inspectors and managers have available for remediation of the sites. Remediation is needed because millions of used tires are discarded every year, waste tire piles pose fire hazards, and mosquitoes often breed in water trapped in tires. It should be possible to adapt the methodology to regions outside California by modifying some of the algorithms implemented in the software to account for geographic differences in spectral characteristics associated with terrain and climate. The task of identifying tire piles in satellite imagery is uniquely challenging because of their low reflectance levels: tires tend to be spectrally confused with shadows and deep water, both of which reflect little light to satellite-borne imaging systems. In this methodology, the challenge is met, in part, by use of software that implements the Tire Identification from Reflectance (TIRe) model. The development of the TIRe model incorporated lessons learned in previous research on the detection and mapping of tire piles by manual/visual and/or computational analysis of aerial and satellite imagery. The TIRe model is a computational model for identifying tire piles and discriminating between tire piles and other objects. 
The input to the TIRe model is the georeferenced but otherwise raw satellite spectral imagery of the geographic region to be surveyed. The TIRe model identifies the darkest objects in the images and, on the basis of spatial and spectral image characteristics, discriminates against other dark objects, which can include vegetation, some bodies of water, and dark soils. The TIRe model can identify piles of as few as 100 tires. The output of the TIRe model is a binary mask showing areas containing suspected tire piles and spectrally similar features. This mask is overlaid on the original satellite imagery and examined by a trained image analyst, who strives to further discriminate against non-tire objects that the TIRe model tentatively identified as tire piles. After the analyst has made adjustments, the mask is used to create a synoptic, geographically accurate tire-pile survey map, which can be overlaid with a road map and/or any other map or set of georeferenced data, according to a customer's preferences.
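The TIRe model's core idea, flagging the darkest pixels and then keeping only dark regions large enough to plausibly be a pile, can be sketched as follows. This is not the TIRe implementation; the threshold values, the 4-connected flood fill, and all names are illustrative assumptions:

```python
import numpy as np

def dark_object_mask(image, reflectance_cut, min_pixels):
    """Binary mask of dark regions: threshold the darkest pixels, then
    keep only connected components with at least min_pixels pixels."""
    dark = image < reflectance_cut
    mask = np.zeros_like(dark)
    seen = np.zeros_like(dark)
    h, w = dark.shape
    for i in range(h):
        for j in range(w):
            if dark[i, j] and not seen[i, j]:
                # Collect one 4-connected dark component via flood fill
                stack, comp = [(i, j)], []
                seen[i, j] = True
                while stack:
                    y, x = stack.pop()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and dark[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                if len(comp) >= min_pixels:          # drop tiny dark specks
                    for y, x in comp:
                        mask[y, x] = True
    return mask
```

The real model additionally uses spectral characteristics to reject vegetation, water, and dark soils before the analyst's review; this sketch covers only the darkness-and-size screening.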

  4. A biological hierarchical model based underwater moving object detection.

    PubMed

    Shen, Jie; Fan, Tanghuai; Tang, Min; Zhang, Qian; Sun, Zhen; Huang, Fengchen

    2014-01-01

    Underwater moving object detection is key for many underwater computer vision tasks, such as object recognition, locating, and tracking. Given the superior visual sensing abilities of underwater animals, their visual mechanisms are generally regarded as cues for establishing bionic models that are better adapted to underwater environments. However, low accuracy rates and the absence of prior-knowledge learning limit the adoption of such models in underwater applications. To address the problems caused by inhomogeneous illumination and unstable backgrounds, the visual information sensing and processing pattern of the frog eye is imitated to produce a hierarchical background model for detecting underwater objects. First, the image is segmented into several sub-blocks, and intensity information is extracted to establish a background model that roughly separates object and background regions. The texture feature of each pixel in the rough object region is then analyzed to generate the object contour precisely. Experimental results demonstrate that the proposed method performs well: compared to the traditional Gaussian background model, the completeness of the object detection is 97.92%, with only 0.94% of the background region included in the detection results.
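As a rough illustration of the first (block-wise intensity) level of such a hierarchical model, a background can be estimated per sub-block from a frame history and compared against the current frame. This sketch is not the paper's frog-eye-inspired method, and the names, block size, and threshold are assumptions:

```python
import numpy as np

def block_background_model(frames, block=8):
    """Coarse background: mean intensity per sub-block over the frame
    history. Returns an array with one value per block."""
    h, w = frames[0].shape
    bg = np.mean(frames, axis=0)
    return bg.reshape(h // block, block, w // block, block).mean(axis=(1, 3))

def rough_foreground(frame, bg_blocks, block=8, thresh=20.0):
    """Flag blocks whose current mean intensity deviates from the
    background model; a later, finer stage would refine contours."""
    h, w = frame.shape
    cur = frame.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    return np.abs(cur - bg_blocks) > thresh
```

The paper's second stage then analyzes per-pixel texture inside the flagged regions to recover precise object contours, which this intensity-only sketch does not attempt.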

  5. A Biological Hierarchical Model Based Underwater Moving Object Detection

    PubMed Central

    Shen, Jie; Fan, Tanghuai; Tang, Min; Zhang, Qian; Sun, Zhen; Huang, Fengchen

    2014-01-01

    Underwater moving object detection is key for many underwater computer vision tasks, such as object recognition, locating, and tracking. Given the superior visual sensing abilities of underwater animals, their visual mechanisms are generally regarded as cues for establishing bionic models that are better adapted to underwater environments. However, low accuracy rates and the absence of prior-knowledge learning limit the adoption of such models in underwater applications. To address the problems caused by inhomogeneous illumination and unstable backgrounds, the visual information sensing and processing pattern of the frog eye is imitated to produce a hierarchical background model for detecting underwater objects. First, the image is segmented into several sub-blocks, and intensity information is extracted to establish a background model that roughly separates object and background regions. The texture feature of each pixel in the rough object region is then analyzed to generate the object contour precisely. Experimental results demonstrate that the proposed method performs well: compared to the traditional Gaussian background model, the completeness of the object detection is 97.92%, with only 0.94% of the background region included in the detection results. PMID:25140194

  6. SU-E-I-58: Objective Models of Breast Shape Undergoing Mammography and Tomosynthesis Using Principal Component Analysis.

    PubMed

    Feng, Ssj; Sechopoulos, I

    2012-06-01

    To develop an objective model of the shape of the compressed breast undergoing mammographic or tomosynthesis acquisition. Automated thresholding and edge detection were performed on 984 anonymized digital mammograms (492 craniocaudal (CC) view and 492 mediolateral oblique (MLO) view) to extract the edge of each breast. Principal component analysis (PCA) was performed on these edge vectors to identify a limited set of parameters and eigenvectors that characterize the breast shapes. These parameters and eigenvectors comprise a model that can be used to describe the breast shapes present in acquired mammograms and to generate realistic models of breasts undergoing acquisition. Sample breast shapes were then generated from this model and evaluated. The mammograms in the database were previously acquired for a separate study and authorized for use in further research. The PCA successfully identified two principal components and their corresponding eigenvectors, forming the basis of the breast shape model. The simulated breast shapes generated from the model are reasonable approximations of clinically acquired mammograms. Using PCA, we have obtained models of the compressed breast undergoing mammographic or tomosynthesis acquisition based on objective analysis of a large image database. Until now, the breast in the CC view has been approximated as a semi-circular tube, and there has been no objectively obtained model for the MLO-view breast shape. Such models can be used in various breast imaging research applications, such as x-ray scatter estimation and correction, dosimetry estimates, and computer-aided detection and diagnosis. © 2012 American Association of Physicists in Medicine.
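The PCA step can be sketched directly: treat each extracted breast edge as a vector, compute the mean and the leading eigenvectors of the edge-vector covariance (the abstract retains two components), and synthesize new contours from component weights. The function names are illustrative, not the authors':

```python
import numpy as np

def fit_shape_model(edge_vectors, n_components=2):
    """PCA shape model: mean edge plus the leading principal directions
    of the centered edge vectors (via SVD)."""
    X = np.asarray(edge_vectors, dtype=float)
    mean = X.mean(axis=0)
    # SVD of the centered data; rows of vt are principal directions
    _, s, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components], s[:n_components]

def generate_shape(mean, components, weights):
    """Synthesize a new edge contour from component weights."""
    return mean + np.dot(weights, components)
```

Sampling the weights from the distribution observed in the training set would yield the kind of simulated breast shapes the abstract evaluates.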

  7. Object-oriented integrated approach for the design of scalable ECG systems.

    PubMed

    Boskovic, Dusanka; Besic, Ingmar; Avdagic, Zikrija

    2009-01-01

    The paper presents the implementation of Object-Oriented (OO) integrated approaches to the design of scalable Electro-Cardio-Graph (ECG) Systems. The purpose of this methodology is to preserve real-world structure and relations with the aim to minimize the information loss during the process of modeling, especially for Real-Time (RT) systems. We report on a case study of the design that uses the integration of OO and RT methods and the Unified Modeling Language (UML) standard notation. OO methods identify objects in the real-world domain and use them as fundamental building blocks for the software system. The gained experience based on the strongly defined semantics of the object model is discussed and related problems are analyzed.

  8. Innovative travel data collection recommendations : final report.

    DOT National Transportation Integrated Search

    2016-12-06

    This study had the following objectives: 1. To identify and clarify two emerging effects, real-time data and changing culture; 2. To identify the shifts in data collection and transportation modeling that must take place to assist in i...

  9. 3D design terrain models for construction plans and GPS control of highway construction equipment.

    DOT National Transportation Integrated Search

    2010-03-01

    Research was conducted with the objectives of 1) identifying and characterizing benefits and technological, institutional, cultural, and legal impediments associated with adoption of 3D design and construction technologies, identifying strategies t...

  10. Rorschach assessment of cognitive impairment from an object relations perspective.

    PubMed

    Lerner, P M

    1996-01-01

    In 1986, H. Lerner and P. Lerner proposed an object relations model of thinking that integrated Piaget's theory of early cognitive development with Mahler's theory of separation-individuation. They identified three distinct, interdigitated stages, outlined the cognitive task for each stage, detailed the necessary role and function of the stage-specific caregiving object, and suggested potential cognitive impairments associated with the object not fulfilling its function. Herein, this conceptual model is extended to the Rorschach. Rorschach indices of cognitive impairments associated with each stage were developed. The indices are then applied to the Rorschach records of children who were selected as prototypical of specific developmental disorders.

  11. Human sera IgE reacts with a Metarhizium anisopliae fungal catalase

    EPA Science Inventory

    Background: Previous studies have demonstrated that Metarhizium anisopliae extract can induce immune responses in a mouse model that are characteristic of human allergic asthma. Objectives: The objective of this study was to identify and characterize the extract proteins t...

  12. Objects prompt authentic scientific activities among learners in a museum programme

    NASA Astrophysics Data System (ADS)

    Achiam, Marianne; Simony, Leonora; Kramer Lindow, Bent Erik

    2016-04-01

    Although the scientific disciplines conduct practical work in different ways, all consider practical work as the essential way of connecting objects and phenomena with ideas and the abstract. Accordingly, practical work is regarded as central to science education as well. We investigate a practical, object-based palaeontology programme at a natural history museum to identify how palaeontological objects prompt scientific activity among upper secondary school students. We first construct a theoretical framework based on an analysis of the programme's palaeontological content. From this, we build our reference model, which considers the specimens used in the programme, possible palaeontological interpretations of these specimens, and the conditions inherent in the programme. We use the reference model to analyse the activities of programme participants, and illustrate how these activities are palaeontologically authentic. Finally, we discuss our findings, examining the mechanism by which the specimens prompt scientific activities. We also discuss our discipline-based approach, and how it allows us to positively identify participants' activities as authentic. We conclude by discussing the implications of our findings.

  13. Blended near-optimal tools for flexible water resources decision making

    NASA Astrophysics Data System (ADS)

    Rosenberg, David

    2015-04-01

    State-of-the-art systems analysis techniques focus on efficiently finding optimal solutions. Yet an optimal solution is optimal only for the issues as modelled, and managers often seek near-optimal alternatives that address un-modelled or changing objectives, preferences, limits, uncertainties, and other issues. Early on, Modelling to Generate Alternatives (MGA) formalized near-optimal as performance within a tolerable deviation from the optimal objective function value and identified a few maximally different alternatives that addressed select un-modelled issues. This paper presents new stratified Markov chain Monte Carlo sampling and parallel-coordinate plotting tools that generate and communicate the structure and full extent of the near-optimal region of an optimization problem. Plot controls allow users to interactively explore the region features of most interest. Controls also streamline the process of eliciting un-modelled issues and updating the model formulation in response. Applied to a single-objective water quality management problem at Echo Reservoir, Utah, the tools identify numerous, flexible practices that reduce the phosphorus load to the reservoir while maintaining close-to-optimal performance. Compared to MGA, the new blended tools generate more alternatives faster, show the near-optimal region more fully, help elicit a larger set of un-modelled issues, and offer managers greater flexibility to cope in a changing world.
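The core near-optimal idea (keep every alternative whose objective value lies within a tolerable deviation of the best) can be sketched with plain random sampling. The objective surface, bounds, and tolerance below are hypothetical stand-ins for the paper's stratified MCMC machinery.

```python
import random

def objective(x, y):
    # Hypothetical single-objective cost surface; optimum is 0 at (1, 2).
    return (x - 1) ** 2 + (y - 2) ** 2

def near_optimal_region(n_samples=5000, tolerance=0.5, seed=0):
    rng = random.Random(seed)
    pts = [(rng.uniform(-5, 5), rng.uniform(-5, 5)) for _ in range(n_samples)]
    vals = [objective(x, y) for x, y in pts]
    best = min(vals)
    # Near-optimal: everything within a tolerable deviation of the best value,
    # not just the single optimizer.
    return [p for p, v in zip(pts, vals) if v <= best + tolerance]

region = near_optimal_region()
```

Each point in `region` is a distinct alternative a manager could adopt at little objective-value cost.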

  14. A Single-System Model Predicts Recognition Memory and Repetition Priming in Amnesia

    PubMed Central

    Kessels, Roy P.C.; Wester, Arie J.; Shanks, David R.

    2014-01-01

    We challenge the claim that there are distinct neural systems for explicit and implicit memory by demonstrating that a formal single-system model predicts the pattern of recognition memory (explicit) and repetition priming (implicit) in amnesia. In the current investigation, human participants with amnesia categorized pictures of objects at study and then, at test, identified fragmented versions of studied (old) and nonstudied (new) objects (providing a measure of priming), and made a recognition memory judgment (old vs new) for each object. Numerous results in the amnesic patients were predicted in advance by the single-system model, as follows: (1) deficits in recognition memory and priming were evident relative to a control group; (2) items judged as old were identified at greater levels of fragmentation than items judged new, regardless of whether the items were actually old or new; and (3) the magnitude of the priming effect (the identification advantage for old vs new items) was greater for items judged old than for items judged new. Model evidence measures also favored the single-system model over two formal multiple-systems models. The findings support the single-system model, which explains the pattern of recognition and priming in amnesia primarily as a reduction in the strength of a single dimension of memory strength, rather than as a selective explicit memory system deficit. PMID:25122896
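A minimal simulation of the single-system idea: one latent strength value per item drives both the recognition judgment (old if strength exceeds a criterion) and the priming advantage (old item identified more easily than a new one). Amnesia is modeled simply as a smaller study-phase strength boost; all parameters are illustrative, not the paper's fitted values.

```python
import random

def simulate(boost, n=20000, criterion=0.5, seed=1):
    rng = random.Random(seed)
    hits = primed = 0
    for _ in range(n):
        strength = rng.gauss(boost, 1.0)   # studied (old) item
        baseline = rng.gauss(0.0, 1.0)     # nonstudied (new) item
        hits += strength > criterion       # recognition: "old" judgment
        primed += strength > baseline      # priming: old item wins identification
    return hits / n, primed / n            # (recognition rate, priming rate)

control = simulate(boost=1.5)   # full study boost
amnesic = simulate(boost=0.5)   # reduced boost = weakened single memory strength
```

Shrinking the one strength parameter degrades recognition and priming together, while priming stays above chance, which is the qualitative pattern reported.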

  15. Out of Sorts? Some Remedies for Theories of Object Concepts: A Reply to Rhemtulla and Xu (2007)

    ERIC Educational Resources Information Center

    Blok, Sergey V.; Newman, George E.; Rips, Lance J.

    2007-01-01

    Responds to comments made by Rhemtulla and Xu on the current authors' original paper Concepts of individual objects (e.g., a favorite chair or pet) include knowledge that allows people to identify these objects, sometimes after long stretches of time. In an earlier article, the authors set out experimental findings and mathematical modeling to…

  16. Multi-criteria analysis for PM10 planning

    NASA Astrophysics Data System (ADS)

    Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa

    To implement sound air quality policies, Regulatory Agencies require tools to evaluate outcomes and costs associated to different emission reduction strategies. These tools are even more useful when considering atmospheric PM10 concentrations due to the complex nonlinear processes that affect production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (Integrated Assessment Modeling) are mainly cost-benefit and cost-effective analysis. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The multi-objective problem solution provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between the air quality benefit and the internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis are identified processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
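The non-dominated filtering at the heart of such a multi-objective solution can be sketched directly: a scenario survives only if no other scenario is at least as good on both objectives and strictly better on one. The scenario values (air-quality indicator, cost) below are made up.

```python
def pareto_front(scenarios):
    # Each scenario is (air-quality indicator, cost); lower is better on both.
    front = []
    for s in scenarios:
        # s is dominated if some other scenario is <= on both coordinates
        # and differs (hence strictly better on at least one).
        dominated = any(
            o != s and o[0] <= s[0] and o[1] <= s[1]
            for o in scenarios
        )
        if not dominated:
            front.append(s)
    return front

scenarios = [(10, 5), (8, 7), (12, 4), (9, 9)]
front = pareto_front(scenarios)
```

Here (9, 9) is dominated by (8, 7); the remaining three scenarios form the efficient trade-off set handed to the decision maker.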

  17. Variability of Massive Young Stellar Objects in Cygnus-X

    NASA Astrophysics Data System (ADS)

    Thomas, Nancy H.; Hora, J. L.; Smith, H. A.

    2013-01-01

    Young stellar objects (YSOs) are stars in the process of formation. Several recent investigations have shown a high rate of photometric variability in YSOs at near- and mid-infrared wavelengths. Theoretical models for the formation of massive stars (1-10 solar masses) remain highly idealized, and little is known about the mechanisms that produce the variability. An ongoing Spitzer Space Telescope program is studying massive star formation in the Cygnus-X region. In conjunction with the Spitzer observations, we have conducted a ground-based near-infrared observing program of the Cygnus-X DR21 field using PAIRITEL, the automated infrared telescope at Whipple Observatory. Using the Stetson index for variability, we identified a number of variable objects, including variable YSOs, in our time-series PAIRITEL data of DR21. We have searched for periodicity among our variable objects using the Lomb-Scargle algorithm, and identified periodic variable objects with an average period of 8.07 days. Characterization of these variable and periodic objects will help constrain models of star formation. This work is supported in part by the NSF REU and DOD ASSURE programs under NSF grant no. 0754568 and by the Smithsonian Institution.
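Period searches of this kind rely on the Lomb-Scargle periodogram, which handles the unevenly sampled times typical of ground-based photometry. A bare-bones classical implementation is sketched below on synthetic, noise-free data with an 8.07-day period; the sampling times and trial-period grid are invented.

```python
import math

def lomb_scargle(t, y, omega):
    ybar = sum(y) / len(y)
    dy = [v - ybar for v in y]
    # Time offset tau makes the sine and cosine terms orthogonal (classical form).
    s2 = sum(math.sin(2 * omega * ti) for ti in t)
    c2 = sum(math.cos(2 * omega * ti) for ti in t)
    tau = math.atan2(s2, c2) / (2 * omega)
    c = [math.cos(omega * (ti - tau)) for ti in t]
    s = [math.sin(omega * (ti - tau)) for ti in t]
    pc = sum(d * ci for d, ci in zip(dy, c)) ** 2 / sum(ci * ci for ci in c)
    ps = sum(d * si for d, si in zip(dy, s)) ** 2 / sum(si * si for si in s)
    return 0.5 * (pc + ps)

def best_period(t, y, periods):
    return max(periods, key=lambda p: lomb_scargle(t, y, 2 * math.pi / p))

# Irregularly sampled, noise-free sinusoid with an 8.07-day period.
t = [2.0 * i + 0.7 * math.sin(3.1 * i) for i in range(30)]
y = [math.sin(2 * math.pi * ti / 8.07) for ti in t]
period = best_period(t, y, [p / 10 for p in range(40, 160)])
```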

  18. X-ray detection of ingested non-metallic foreign bodies.

    PubMed

    Saps, Miguel; Rosen, John M; Ecanow, Jacob

    2014-05-08

    To determine the utility of X-ray in identifying non-metallic foreign bodies (FBs) and to assess inter-radiologist agreement in identifying non-metal FBs. Focus groups of nurses, fellows, and attending physicians were conducted to determine commonly ingested objects suitable for inclusion. Twelve potentially ingested objects (clay, plastic bead, crayon, plastic ring, plastic army figure, glass bead, paperclip, drywall anchor, eraser, Lego™, plastic triangle toy, and barrette) were embedded in a gelatin slab placed on top of a water-equivalent phantom to simulate the density of a child's abdomen. The items were selected for their wide availability and appropriate size for accidental pediatric ingestion. Plain radiography of the embedded FBs was obtained. Five experienced radiologists blinded to the number and types of objects were asked to identify the FBs. Each radiologist was first asked to count the number of visible items, then to identify the shape of each item and describe it to a study investigator who recorded all responses. Overall inter-rater reliability was analyzed using percent agreement and the κ coefficient. We calculated the P value to assess the probability of error involved in accepting the κ value. Fourteen objects were radiographed, including the 12 original objects and 2 duplicates. The model's validity was supported by clear identification of a radio-opaque paperclip as a positive control, and lack of identification of plastic beads (negative control) despite repeated inclusion. Each radiologist identified 7-9 of the 14 objects (mean 8, 67%). Six unique objects (50%) were identified by all radiologists and four unique objects (33%) were not identified by any radiologist (plastic bead, Lego™, plastic triangle toy, and barrette). Identification of objects that were not present (false positives) occurred 1-2 times per radiologist (mean 1.4). An additional 17% of unique objects were identified by fewer than half of the radiologists.
Agreement between radiologists was considered almost perfect (kappa 0.86 ± 0.08, P < 0.0001). We demonstrate potential non-identification of commonly ingested non-metal FBs in children. A registry for radiographic visibility of ingested objects should be created to improve clinical decision-making.
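Chance-corrected agreement of the kind reported here (kappa) is observed agreement minus chance agreement, normalized by the maximum possible improvement over chance. A two-rater Cohen's kappa on made-up yes/no identifications illustrates the calculation (the study pooled five raters; this sketch shows only the pairwise statistic).

```python
def cohens_kappa(rater_a, rater_b):
    # Binary (1 = identified, 0 = not identified) judgments from two raters.
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal "identified" rate.
    pa = sum(rater_a) / n
    pb = sum(rater_b) / n
    chance = pa * pb + (1 - pa) * (1 - pb)
    return (observed - chance) / (1 - chance)

kappa = cohens_kappa([1, 1, 1, 0, 0, 0, 1, 0],
                     [1, 1, 0, 0, 0, 0, 1, 0])
```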

  19. A general model-based design of experiments approach to achieve practical identifiability of pharmacokinetic and pharmacodynamic models.

    PubMed

    Galvanin, Federico; Ballan, Carlo C; Barolo, Massimiliano; Bezzo, Fabrizio

    2013-08-01

    The use of pharmacokinetic (PK) and pharmacodynamic (PD) models is a common and widespread practice in the preliminary stages of drug development. However, PK-PD models may be affected by structural identifiability issues intrinsically related to their mathematical formulation. A preliminary structural identifiability analysis is usually carried out to check if the set of model parameters can be uniquely determined from experimental observations under the ideal assumptions of noise-free data and no model uncertainty. However, even for structurally identifiable models, real-life experimental conditions and model uncertainty may strongly affect the practical possibility to estimate the model parameters in a statistically sound way. A systematic procedure coupling the numerical assessment of structural identifiability with advanced model-based design of experiments formulations is presented in this paper. The objective is to propose a general approach to design experiments in an optimal way, detecting a proper set of experimental settings that ensure the practical identifiability of PK-PD models. Two simulated case studies based on in vitro bacterial growth and killing models are presented to demonstrate the applicability and generality of the methodology to tackle model identifiability issues effectively, through the design of feasible and highly informative experiments.
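A practical-identifiability check of the kind such design procedures build on can be illustrated with a one-compartment decay model y = A·exp(-k·t): form the sensitivity (Jacobian) rows at the candidate sampling times and inspect the determinant of the resulting Fisher information matrix. The model and all values are hypothetical; a (near-)zero determinant means the design cannot estimate A and k jointly.

```python
import math

def fim_det(times, A=10.0, k=0.3):
    # Sensitivities of y = A*exp(-k*t): (dy/dA, dy/dk) at each sampling time.
    J = [(math.exp(-k * t), -A * t * math.exp(-k * t)) for t in times]
    faa = sum(r[0] * r[0] for r in J)
    fkk = sum(r[1] * r[1] for r in J)
    fak = sum(r[0] * r[1] for r in J)
    return faa * fkk - fak * fak  # determinant of the 2x2 Fisher information

singular = fim_det([2.0])          # one sample: A and k are confounded
informative = fim_det([1.0, 8.0])  # two well-spread samples separate them
```

Maximizing such a determinant over candidate designs (D-optimality) is one standard way to make a structurally identifiable model practically identifiable.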

  20. The past and future of modeling forest dynamics: from growth and yield curves to forest landscape models

    Treesearch

    Stephen R. Shifley; Hong S. He; Heike Lischke; Wen J. Wang; Wenchi Jin; Eric J. Gustafson; Jonathan R. Thompson; Frank R. Thompson; William D. Dijak; Jian Yang

    2017-01-01

    Context. Quantitative models of forest dynamics have followed a progression toward methods with increased detail, complexity, and spatial extent. Objectives. We highlight milestones in the development of forest dynamics models and identify future research and application opportunities. Methods. We reviewed...

  1. Integrity Determination for Image Rendering Vision Navigation

    DTIC Science & Technology

    2016-03-01

    identifying an object within a scene, tracking a SIFT feature between frames or matching images and/or features for stereo vision applications. This... object level, either in 2-D or 3-D, versus individual features. There is a breadth of information, largely from the machine vision community...matching or image rendering image correspondence approach is based upon using either 2-D or 3-D object models or templates to perform object detection or

  2. A Bayesian alternative for multi-objective ecohydrological model specification

    NASA Astrophysics Data System (ADS)

    Tang, Yating; Marshall, Lucy; Sharma, Ashish; Ajami, Hoori

    2018-01-01

    Recent studies have identified the importance of vegetation processes in terrestrial hydrologic systems. Process-based ecohydrological models combine hydrological, physical, biochemical and ecological processes of the catchments, and as such are generally more complex and parametric than conceptual hydrological models. Thus, appropriate calibration objectives and model uncertainty analysis are essential for ecohydrological modeling. In recent years, Bayesian inference has become one of the most popular tools for quantifying the uncertainties in hydrological modeling with the development of Markov chain Monte Carlo (MCMC) techniques. The Bayesian approach offers an appealing alternative to traditional multi-objective hydrologic model calibrations by defining proper prior distributions that can be considered analogous to the ad-hoc weighting often prescribed in multi-objective calibration. Our study aims to develop appropriate prior distributions and likelihood functions that minimize the model uncertainties and bias within a Bayesian ecohydrological modeling framework based on a traditional Pareto-based model calibration technique. In our study, a Pareto-based multi-objective optimization and a formal Bayesian framework are implemented in a conceptual ecohydrological model that combines a hydrological model (HYMOD) and a modified Bucket Grassland Model (BGM). Simulations focused on one objective (streamflow/LAI) and multiple objectives (streamflow and LAI) with different emphasis defined via the prior distribution of the model error parameters. Results show more reliable outputs for both predicted streamflow and LAI using Bayesian multi-objective calibration with specified prior distributions for error parameters based on results from the Pareto front in the ecohydrological modeling. 
The methodology implemented here provides insight into the usefulness of multi-objective Bayesian calibration for ecohydrologic systems and the importance of appropriate prior distributions in such approaches.

  3. Final Report - Enhanced LAW Glass Property - Composition Models - Phase 1 VSL-13R2940-1, Rev. 0, dated 9/27/2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kruger, Albert A.; Muller, I.; Gilbo, K.

    2013-11-13

    The objectives of this work are aimed at the development of enhanced LAW property-composition models that expand the composition region covered by the models. The models of interest include PCT, VHT, viscosity and electrical conductivity. This is planned as a multi-year effort that will be performed in phases with the objectives listed below for the current phase.  Incorporate property-composition data from the new glasses into the database.  Assess the database and identify composition spaces in the database that need augmentation.  Develop statistically-designed composition matrices to cover the composition regions identified in the above analysis.  Prepare crucible melts of glass compositions from the statistically-designed composition matrix and measure the properties of interest.  Incorporate the above property-composition data into the database.  Assess existing models against the complete dataset and, as necessary, start development of new models.

  4. Virtual expansion of the technical vision system for smart vehicles based on multi-agent cooperation model

    NASA Astrophysics Data System (ADS)

    Krapukhina, Nina; Senchenko, Roman; Kamenov, Nikolay

    2017-12-01

    Road safety and driving in dense traffic flows pose challenges in receiving information about surrounding moving objects, some of which may be in the vehicle's blind spot. This work suggests an approach to virtual monitoring of the objects in the current road scene via a system of cooperating smart vehicles that exchange information. It also describes the intelligent agent model, and provides methods and algorithms for identifying and evaluating various characteristics of moving objects in a video stream. The authors also suggest ways of integrating information from the technical vision system into the model, further expanding virtual monitoring to the system's objects. Implementation of this approach can help expand the virtual field of view of a technical vision system.

  5. Enhanced detection and visualization of anomalies in spectral imagery

    NASA Astrophysics Data System (ADS)

    Basener, William F.; Messinger, David W.

    2009-05-01

    Anomaly detection algorithms applied to hyperspectral imagery are able to reliably identify man-made objects in a natural environment based on statistical/geometric likelihood. The process is more robust than target identification, which requires precise prior knowledge of the object of interest, but has an inherently higher false alarm rate. Standard anomaly detection algorithms measure deviation of pixel spectra from a parametric model (either statistical or linear mixing) estimating the image background. The topological anomaly detector (TAD) creates a fully non-parametric, graph theory-based, topological model of the image background and measures deviation from this background using codensity. In this paper we present a large-scale comparative test of TAD against 80+ targets in four full HYDICE images using the entire canonical target set for generation of ROC curves. TAD will be compared against several statistics-based detectors including local RX and subspace RX. Even a perfect anomaly detection algorithm would have a high practical false alarm rate in most scenes simply because the user/analyst is not interested in every anomalous object. To assist the analyst in identifying and sorting objects of interest, we investigate coloring of the anomalies with principal component projections using statistics computed from the anomalies. This gives a very useful colorization of anomalies in which objects of similar material tend to have the same color, enabling an analyst to quickly sort and identify anomalies of highest interest.
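The statistical baseline mentioned above (global RX) scores each pixel by its Mahalanobis distance from the scene mean and covariance; pixels far from the background distribution are flagged as anomalies. A two-band toy version with a closed-form 2x2 covariance inverse is sketched below (real hyperspectral data has hundreds of bands).

```python
def rx_scores(pixels):
    # Global RX: Mahalanobis distance of each two-band "spectrum" from the
    # scene mean, using the scene covariance.
    n = len(pixels)
    mx = sum(p[0] for p in pixels) / n
    my = sum(p[1] for p in pixels) / n
    sxx = sum((p[0] - mx) ** 2 for p in pixels) / n
    syy = sum((p[1] - my) ** 2 for p in pixels) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in pixels) / n
    det = sxx * syy - sxy * sxy
    # Closed-form inverse of the 2x2 covariance matrix.
    ixx, iyy, ixy = syy / det, sxx / det, -sxy / det
    scores = []
    for x, y in pixels:
        dx, dy = x - mx, y - my
        scores.append(ixx * dx * dx + 2 * ixy * dx * dy + iyy * dy * dy)
    return scores

# Seven background pixels clustered near (10, 10) plus one anomalous spectrum.
scores = rx_scores([(10, 10), (11, 9), (9, 11), (10, 11),
                    (11, 10), (9, 9), (10, 9), (50, -20)])
```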

  6. Aggregation in Network Models for Transportation Planning

    DOT National Transportation Integrated Search

    1978-02-01

    This report documents research performed on techniques of aggregation applied to network models used in transportation planning. The central objective of this research has been to identify, extend, and evaluate methods of aggregation so as to improve...

  7. "Failure-to-Identify" Hunting Incidents: A Resilience Engineering Approach.

    PubMed

    Bridges, Karl E; Corballis, Paul M; Hollnagel, Erik

    2018-03-01

    Objective The objective was to develop an understanding, using the Functional Resonance Analysis Method (FRAM), of the factors that could cause a deer hunter to misidentify their intended target. Background Hunting is a popular activity in many communities. However, hunters vary considerably in training, experience, and expertise. Surprisingly, safety in hunting has not received much attention, especially failure-to-identify hunting incidents. These are incidents in which the hunter, after spotting and targeting their quarry, discharges their firearm only to discover they have been spotting and targeting another human, an inanimate object, or flora by mistake. The hunter must consider environment, target, time of day, weather, and many other factors, continuously evaluating whether the hunt should continue. Understanding how these factors relate to one another is fundamental to understanding how incidents happen. Method Workshops with highly experienced and active hunters led to the development of a FRAM model detailing the functions of a "Hunting FRAM." The model was evaluated for correctness based on confidential and anonymous near-miss event submissions by hunters. Results A FRAM model presenting the functions of a hunt was produced, evaluated, and accepted. Using the model, potential sources of incidents or other unintended outcomes were identified, which in turn helped to improve the model. Conclusion Utilizing principles of understanding and visualization tools of the FRAM, the findings create a foundation for safety improvements, potentially through training or safety messages, based on an increased understanding of the complexity of hunting.

  8. Assessing cognitive dysfunction in Parkinson's disease: An online tool to detect visuo‐perceptual deficits

    PubMed Central

    Schwarzkopf, Dietrich S.; Bahrami, Bahador; Fleming, Stephen M.; Jackson, Ben M.; Goch, Tristam J. C.; Saygin, Ayse P.; Miller, Luke E.; Pappa, Katerina; Pavisic, Ivanna; Schade, Rachel N.; Noyce, Alastair J.; Crutch, Sebastian J.; O'Keeffe, Aidan G.; Schrag, Anette E.; Morris, Huw R.

    2018-01-01

    ABSTRACT Background: People with Parkinson's disease (PD) who develop visuo‐perceptual deficits are at higher risk of dementia, but we lack tests that detect subtle visuo‐perceptual deficits and can be performed by untrained personnel. Hallucinations are associated with cognitive impairment and typically involve perception of complex objects. Changes in object perception may therefore be a sensitive marker of visuo‐perceptual deficits in PD. Objective: We developed an online platform to test visuo‐perceptual function. We hypothesised that (1) visuo‐perceptual deficits in PD could be detected using online tests, (2) object perception would be preferentially affected, and (3) these deficits would be caused by changes in perception rather than response bias. Methods: We assessed 91 people with PD and 275 controls. Performance was compared using classical frequentist statistics. We then fitted a hierarchical Bayesian signal detection theory model to a subset of tasks. Results: People with PD were worse than controls at object recognition, showing no deficits in other visuo‐perceptual tests. Specifically, they were worse at identifying skewed images (P < .0001); at detecting hidden objects (P = .0039); at identifying objects in peripheral vision (P < .0001); and at detecting biological motion (P = .0065). In contrast, people with PD were not worse at mental rotation or subjective size perception. Using signal detection modelling, we found this effect was driven by change in perceptual sensitivity rather than response bias. Conclusions: Online tests can detect visuo‐perceptual deficits in people with PD, with object recognition particularly affected. Ultimately, visuo‐perceptual tests may be developed to identify at‐risk patients for clinical trials to slow PD dementia. © 2018 The Authors. Movement Disorders published by Wiley Periodicals, Inc. on behalf of International Parkinson and Movement Disorder Society. PMID:29473691

  9. Synthetic Helizyme Enzymes.

    DTIC Science & Technology

    1987-08-18

    NOTATION 17. COSATI CODES 18. SUBJECT TERMS (Continue on reverse if necessary and identify by block number) FIELD GROUP SUB-GROUP I Synthetic enzymes...chymotrypsin; molecular modeling; 03 peptide synthesis 19. ABSTRACT (Continue on reverse if necessary and identify by block number) The object of this...for AChE. Additionally, synthetic models of α-chymotrypsin built using cyclodextrins show catalytic activity over a limited pH range.2 Using L

  10. Brain regions involved in subprocesses of small-space episodic object-location memory: a systematic review of lesion and functional neuroimaging studies.

    PubMed

    Zimmermann, Kathrin; Eschen, Anne

    2017-04-01

    Object-location memory (OLM) enables us to keep track of the locations of objects in our environment. The neurocognitive model of OLM (Postma, A., Kessels, R. P. C., & Van Asselen, M. (2004). The neuropsychology of object-location memory. In G. L. Allen (Ed.), Human spatial memory: Remembering where (pp. 143-160). Mahwah, NJ: Lawrence Erlbaum, Postma, A., Kessels, R. P. C., & Van Asselen, M. (2008). How the brain remembers and forgets where things are: The neurocognition of object-location memory. Neuroscience & Biobehavioral Reviews, 32, 1339-1345. doi: 10.1016/j.neubiorev.2008.05.001 ) proposes that distinct brain regions are specialised for different subprocesses of OLM (object processing, location processing, and object-location binding; categorical and coordinate OLM; egocentric and allocentric OLM). It was based mainly on findings from lesion studies. However, recent episodic memory studies point to a contribution of additional or different brain regions to object and location processing within episodic OLM. To evaluate and update the neurocognitive model of OLM, we therefore conducted a systematic literature search for lesion as well as functional neuroimaging studies contrasting small-space episodic OLM with object memory or location memory. We identified 10 relevant lesion studies and 8 relevant functional neuroimaging studies. We could confirm some of the proposals of the neurocognitive model of OLM, but also differing hypotheses from episodic memory research, about which brain regions are involved in the different subprocesses of small-space episodic OLM. In addition, we were able to identify new brain regions as well as important research gaps.

  11. An export coefficient based inexact fuzzy bi-level multi-objective programming model for the management of agricultural nonpoint source pollution under uncertainty

    NASA Astrophysics Data System (ADS)

    Cai, Yanpeng; Rong, Qiangqiang; Yang, Zhifeng; Yue, Wencong; Tan, Qian

    2018-02-01

    In this research, an export coefficient based inexact fuzzy bi-level multi-objective programming (EC-IFBLMOP) model was developed through integrating the export coefficient model (ECM), interval parameter programming (IPP) and fuzzy parameter programming (FPP) within a bi-level multi-objective programming framework. The proposed EC-IFBLMOP model can effectively deal with the multiple uncertainties expressed as discrete intervals and fuzzy membership functions. Also, the complexities in agricultural systems, such as the cooperation and gaming relationship between the decision makers at different levels, can be fully considered in the model. The developed model was then applied to identify the optimal land use patterns and BMP implementing levels for agricultural nonpoint source (NPS) pollution management in a subcatchment in the upper stream watershed of the Miyun Reservoir in north China. The results of the model showed that the desired optimal land use patterns and implementing levels of best management practices (BMPs) would be obtained. These patterns represent the gaming result between the upper- and lower-level decision makers when the allowable discharge amounts of NPS pollutants are limited. Moreover, results corresponding to different decision scenarios could provide a set of decision alternatives for the upper- and lower-level decision makers to identify the most appropriate management strategy. The model has good applicability and can be effectively utilized for agricultural NPS pollution management.

  12. Localization and tracking of moving objects in two-dimensional space by echolocation.

    PubMed

    Matsuo, Ikuo

    2013-02-01

    Bats use frequency-modulated echolocation to identify and capture moving objects in real three-dimensional space. Experimental evidence indicates that bats are capable of locating static objects with a range accuracy of less than 1 μs. A previously introduced model estimates the ranges of multiple static objects using linear frequency modulation (LFM) sound and Gaussian chirplets with a carrier frequency compatible with bat emission sweep rates. The delay time for a single object was estimated with an accuracy of about 1.3 μs by measuring the echo at a low signal-to-noise ratio (SNR). The range accuracy was dependent not only on the SNR but also on the Doppler shift, which depended on the object's motion. However, it was unclear whether this model could estimate a moving object's range at each timepoint. In this study, echoes were measured from a rotating pole at two receiving points by intermittently emitting LFM sounds. The model was shown to localize moving objects in two-dimensional space by accurately estimating the object's range at each timepoint.
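In its simplest discrete form, delay (and hence range) estimation of this kind reduces to matched filtering: correlate the emitted LFM chirp against the received echo and locate the correlation peak. The sample rate and chirp parameters below are invented for illustration; real systems add sub-sample interpolation (and, as above, chirplet fitting) to reach microsecond accuracy.

```python
import math

def lfm_chirp(n, fs, f0, f1):
    # n samples of a linear frequency sweep from f0 to f1 Hz, sampled at fs.
    k = (f1 - f0) / (n / fs)  # sweep rate in Hz per second
    return [math.sin(2 * math.pi * (f0 * t + 0.5 * k * t * t))
            for t in (i / fs for i in range(n))]

def estimate_delay(emitted, echo, fs):
    # Matched filter: the correlation peak marks the echo arrival time.
    best_lag, best_val = 0, float("-inf")
    for lag in range(len(echo) - len(emitted) + 1):
        val = sum(e * echo[lag + i] for i, e in enumerate(emitted))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag / fs

fs = 100_000                             # 100 kHz sampling (illustrative)
chirp = lfm_chirp(200, fs, 20_000, 40_000)
echo = [0.0] * 57 + chirp + [0.0] * 100  # noiseless echo delayed by 57 samples
delay = estimate_delay(chirp, echo, fs)
```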

  13. Salient Object Detection via Structured Matrix Decomposition.

    PubMed

    Peng, Houwen; Li, Bing; Ling, Haibin; Hu, Weiming; Xiong, Weihua; Maybank, Stephen J

    2016-05-04

    Low-rank recovery models have shown potential for salient object detection, where a matrix is decomposed into a low-rank matrix representing image background and a sparse matrix identifying salient objects. Two deficiencies, however, still exist. First, previous work typically assumes the elements in the sparse matrix are mutually independent, ignoring the spatial and pattern relations of image regions. Second, when the low-rank and sparse matrices are relatively coherent, e.g., when there are similarities between the salient objects and background or when the background is complicated, it is difficult for previous models to disentangle them. To address these problems, we propose a novel structured matrix decomposition model with two structural regularizations: (1) a tree-structured sparsity-inducing regularization that captures the image structure and enforces patches from the same object to have similar saliency values, and (2) a Laplacian regularization that enlarges the gaps between salient objects and the background in feature space. Furthermore, high-level priors are integrated to guide the matrix decomposition and boost the detection. We evaluate our model for salient object detection on five challenging datasets including single object, multiple objects and complex scene images, and show competitive results as compared with 24 state-of-the-art methods in terms of seven performance metrics.
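The low-rank-plus-sparse idea that these saliency models build on can be sketched with a plain alternating scheme: a truncated SVD recovers the background, and soft-thresholding of the residual recovers the sparse (salient) part. This is only the generic decomposition on synthetic data; the paper's tree-structured sparsity and Laplacian regularizations are not reproduced here, and the rank/threshold values are arbitrary.

```python
import numpy as np

def soft(x, tau):
    """Elementwise soft-thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def lowrank_sparse(M, rank, tau, iters=50):
    """Alternate a rank-`rank` SVD fit (background) with soft-thresholding
    of the residual (sparse foreground). A crude stand-in for robust PCA."""
    S = np.zeros_like(M)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        S = soft(M - L, tau)
    return L, S

# Synthetic demo: rank-1 background plus 20 large sparse spikes.
rng = np.random.default_rng(1)
L0 = rng.standard_normal((50, 1)) @ rng.standard_normal((1, 50))
S0 = np.zeros((50, 50))
idx = rng.choice(2500, size=20, replace=False)
S0.flat[idx] = 8.0
L, S = lowrank_sparse(L0 + S0, rank=1, tau=1.0)
```

On this easy synthetic case the planted spikes dominate the recovered sparse matrix; the paper's structural regularizers exist precisely to handle the hard cases where background and foreground are coherent.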

  14. Task 21 - Development of Systems Engineering Applications for Decontamination and Decommissioning Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, T.A.

    1998-11-01

    The objectives of this task are to: Develop a model (paper) to estimate the cost and waste generation of cleanup within the Environmental Management (EM) complex; Identify technologies applicable to decontamination and decommissioning (D and D) operations within the EM complex; Develop a database of facility information as linked to project baseline summaries (PBSs). The above objectives are carried out through the following four subtasks: Subtask 1--D and D Model Development; Subtask 2--Technology List; Subtask 3--Facility Database; and Subtask 4--Incorporation into a User Model.

  15. A scoping review about conference objectives and evaluative practices: how do we get more out of them?

    PubMed

    Neves, Justin; Lavis, John N; Ranson, M Kent

    2012-08-02

    Large multi-day conferences have often been criticized as ineffective ways to improve social outcomes and to influence policy or practice. Unfortunately, many conference evaluations have also been inadequate in determining the impact of a conference on its associated social sector, with little evidence gathered or analyzed to substantiate or refute these criticisms. The aim of this scoping review is to investigate and report stakeholders' objectives for planning or participating in large multi-day conferences and how these objectives are being evaluated. We conducted a scoping review supplemented by a small number of key informant interviews. Eight bibliographic databases were systematically searched to identify papers describing conference objectives and/or evaluations. We developed a conference evaluation framework based on theoretical models and empirical findings, which structured the descriptive synthesis of the data. We identified 3,073 potential papers for review, of which 44 were included in this study. Our evaluation framework connects five key elements in planning a conference and its evaluation (number in brackets refers to number of themes identified): conference objectives (8), purpose of evaluation (7), evaluation methods (5), indicators of success (9) and theories/models (8). Further analysis of indicators of success identified three categories of indicators with differing scopes (i.e. immediate, prospective or follow-up) as well as empirical links between the purpose of evaluations and these indicators. Conference objectives and evaluations were largely correlated with the type of conference (i.e. academic, political/governmental or business) but diverse overall. While much can be done to improve the quality and usefulness of conference evaluations, there are innovative assessments that are currently being utilized by some conferences and warrant further investigation. 
This review provides conference evaluators and organizers a simple resource to improve their own assessments by highlighting and categorizing potential objectives and evaluation strategies.

  16. A scoping review about conference objectives and evaluative practices: how do we get more out of them?

    PubMed Central

    2012-01-01

    Large multi-day conferences have often been criticized as ineffective ways to improve social outcomes and to influence policy or practice. Unfortunately, many conference evaluations have also been inadequate in determining the impact of a conference on its associated social sector, with little evidence gathered or analyzed to substantiate or refute these criticisms. The aim of this scoping review is to investigate and report stakeholders’ objectives for planning or participating in large multi-day conferences and how these objectives are being evaluated. We conducted a scoping review supplemented by a small number of key informant interviews. Eight bibliographic databases were systematically searched to identify papers describing conference objectives and/or evaluations. We developed a conference evaluation framework based on theoretical models and empirical findings, which structured the descriptive synthesis of the data. We identified 3,073 potential papers for review, of which 44 were included in this study. Our evaluation framework connects five key elements in planning a conference and its evaluation (number in brackets refers to number of themes identified): conference objectives (8), purpose of evaluation (7), evaluation methods (5), indicators of success (9) and theories/models (8). Further analysis of indicators of success identified three categories of indicators with differing scopes (i.e. immediate, prospective or follow-up) as well as empirical links between the purpose of evaluations and these indicators. Conference objectives and evaluations were largely correlated with the type of conference (i.e. academic, political/governmental or business) but diverse overall. While much can be done to improve the quality and usefulness of conference evaluations, there are innovative assessments that are currently being utilized by some conferences and warrant further investigation. 
This review provides conference evaluators and organizers a simple resource to improve their own assessments by highlighting and categorizing potential objectives and evaluation strategies. PMID:22857399

  17. How Do They Manage? An Investigation of Early Childhood Leadership

    ERIC Educational Resources Information Center

    Aubrey, Carol; Godfrey, Ray; Harris, Alma

    2013-01-01

    Early childhood (EC) leadership literature indicates few theoretically based studies identifying and testing different models and characteristics of leadership. Objectives were thus to identify, describe and analyse what leadership meant to key EC participants; to consider roles, responsibilities and characteristics; to investigate core…

  18. An Analysis of Health Manpower Models: Volume I.

    ERIC Educational Resources Information Center

    Bonder, Seth; And Others

    Objectives of the project were to identify and describe problem areas and policy issues confronting health manpower planning agencies at all levels, compile an inventory of models and evaluate their usefulness, and to evaluate the potential usefulness of two models (developed under contract to the Bureau of Health Resources Development) designed…

  19. Project Simu-School Component Washington State University

    ERIC Educational Resources Information Center

    Glass, Thomas E.

    1976-01-01

    This component of the project attempts to facilitate planning by furnishing models that manage cumbersome and complex data, supply an objectivity that identifies all relationships between elements of the model, and provide a quantitative model allowing for various forecasting techniques that describe the long-range impact of decisions. (Author/IRT)

  20. Neuronal encoding of object and distance information: a model simulation study on naturalistic optic flow processing

    PubMed Central

    Hennig, Patrick; Egelhaaf, Martin

    2011-01-01

    We developed a model of the input circuitry of the FD1 cell, an identified motion-sensitive interneuron in the blowfly's visual system. The model circuit successfully reproduces the FD1 cell's most conspicuous property: its larger responses to objects than to spatially extended patterns. The model circuit also mimics the time-dependent responses of FD1 to dynamically complex naturalistic stimuli, shaped by the blowfly's saccadic flight and gaze strategy: the FD1 responses are enhanced when, as a consequence of self-motion, a nearby object crosses the receptive field during intersaccadic intervals. Moreover, the model predicts that these object-induced responses are overlaid by pronounced pattern-dependent fluctuations during movements on virtual test flights in a three-dimensional environment with systematic modifications of the environmental patterns. Hence, the FD1 cell is predicted not to detect objects unambiguously on the basis of the spatial layout of the environment alone, but also to be sensitive to objects distinguished by textural features. These ambiguous detection abilities suggest an encoding of information about objects (irrespective of the features by which the objects are defined) by a population of cells, with the FD1 cell presumably playing a prominent role in such an ensemble. PMID:22461769

  1. Under what conditions is recognition spared relative to recall after selective hippocampal damage in humans?

    PubMed

    Holdstock, J S; Mayes, A R; Roberts, N; Cezayirli, E; Isaac, C L; O'Reilly, R C; Norman, K A

    2002-01-01

    The claim that recognition memory is spared relative to recall after focal hippocampal damage has been disputed in the literature. We examined this claim by investigating object and object-location recall and recognition memory in a patient, YR, who has adult-onset selective hippocampal damage. Our aim was to identify the conditions under which recognition was spared relative to recall in this patient. She showed unimpaired forced-choice object recognition but clearly impaired recall, even when her control subjects found the object recognition task to be numerically harder than the object recall task. However, on two other recognition tests, YR's performance was not relatively spared. First, she was clearly impaired at an equivalently difficult yes/no object recognition task, but only when targets and foils were very similar. Second, YR was clearly impaired at forced-choice recognition of object-location associations. This impairment was also unrelated to difficulty because this task was no more difficult than the forced-choice object recognition task for control subjects. The clear impairment of yes/no, but not of forced-choice, object recognition after focal hippocampal damage, when targets and foils are very similar, is predicted by the neural network-based Complementary Learning Systems model of recognition. This model postulates that recognition is mediated by hippocampally dependent recollection and cortically dependent familiarity; thus hippocampal damage should not impair item familiarity. The model postulates that familiarity is ineffective when very similar targets and foils are shown one at a time and subjects have to identify which items are old (yes/no recognition). In contrast, familiarity is effective in discriminating which of similar targets and foils, seen together, is old (forced-choice recognition). Independent evidence from the remember/know procedure also indicates that YR's familiarity is normal. 
The Complementary Learning Systems model can also accommodate the clear impairment of forced-choice object-location recognition memory if it incorporates the view that the most complete convergence of spatial and object information, represented in different cortical regions, occurs in the hippocampus.

  2. Mathematical analysis techniques for modeling the space network activities

    NASA Technical Reports Server (NTRS)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, in particular the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to better understand the space network. Finally, a small-scale version of the system was modeled, variables were identified, data were gathered, and comparisons were made between actual and theoretical data.

  3. CrossTalk. The Journal of Defense Software Engineering. Volume 25, Number 3

    DTIC Science & Technology

    2012-06-01

    OMG) standard Business Process Modeling and Notation (BPMN) [6] graphical notation. I will address each of these: identify and document steps...to a value stream map using BPMN and textual process narratives. The resulting process narratives or process metadata includes key information...objectives. Once the processes are identified, we can graphically document them, capturing the process using BPMN (see Figure 1). The BPMN models

  4. Modeling and forecasting US presidential election using learning algorithms

    NASA Astrophysics Data System (ADS)

    Zolghadr, Mohammad; Niaki, Seyed Armin Akhavan; Niaki, S. T. A.

    2017-09-01

    The primary objective of this research is to obtain an accurate forecasting model for the US presidential election. To identify a reliable model, artificial neural network (ANN) and support vector regression (SVR) models are compared based on specified performance measures. Moreover, six independent variables, including GDP, unemployment rate, and the president's approval rate, are considered in a stepwise regression to identify significant variables. The president's approval rate is identified as the most significant variable, based on which eight other variables are identified and considered in the model development. Preprocessing methods are applied to prepare the data for the learning algorithms. The proposed procedure significantly increases the accuracy of the model by 50%. The learning algorithms (ANN and SVR) proved to be superior to linear regression based on each method's calculated performance measures. The SVR model is identified as the most accurate of the models, as it successfully predicted the outcome of the last three elections (2004, 2008, and 2012). The proposed approach significantly increases the accuracy of the forecast.
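The stepwise variable screening described here can be sketched as greedy forward selection on residual sum of squares: repeatedly add the predictor that most improves the fit, stopping when no candidate helps appreciably. The 1% stopping rule and the synthetic data are invented for illustration; the study's actual criterion and electoral variables are not reproduced.

```python
import numpy as np

def forward_stepwise(X, y, max_vars=None):
    """Greedy forward selection: at each step add the predictor that most
    reduces the residual sum of squares (RSS); stop when the best candidate
    improves RSS by less than 1% (an illustrative threshold)."""
    n, p = X.shape
    chosen, best_rss = [], float(np.sum((y - y.mean()) ** 2))
    while len(chosen) < (max_vars or p):
        trial = []
        for j in range(p):
            if j in chosen:
                continue
            cols = np.column_stack([np.ones(n)] + [X[:, k] for k in chosen + [j]])
            beta, *_ = np.linalg.lstsq(cols, y, rcond=None)
            trial.append((float(np.sum((y - cols @ beta) ** 2)), j))
        rss, j = min(trial)
        if rss > 0.99 * best_rss:
            break
        chosen.append(j)
        best_rss = rss
    return chosen

# Synthetic demo: y truly depends on predictors 0 and 3 only.
rng = np.random.default_rng(2)
X = rng.standard_normal((200, 6))
y = 3 * X[:, 0] - 2 * X[:, 3] + 0.1 * rng.standard_normal(200)
sel = forward_stepwise(X, y)
```

On this data the first two selected predictors are the two true ones, mirroring how the approval rate emerged first in the study's screening.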

  5. Deep Uncertainties in Sea-Level Rise and Storm Surge Projections: Implications for Coastal Flood Risk Management.

    PubMed

    Oddo, Perry C; Lee, Ben S; Garner, Gregory G; Srikrishnan, Vivek; Reed, Patrick M; Forest, Chris E; Keller, Klaus

    2017-09-05

    Sea levels are rising in many areas around the world, posing risks to coastal communities and infrastructures. Strategies for managing these flood risks present decision challenges that require a combination of geophysical, economic, and infrastructure models. Previous studies have broken important new ground on the considerable tensions between the costs of upgrading infrastructure and the damages that could result from extreme flood events. However, many risk-based adaptation strategies remain silent on certain potentially important uncertainties, as well as the tradeoffs between competing objectives. Here, we implement and improve on a classic decision-analytical model (Van Dantzig 1956) to: (i) capture tradeoffs across conflicting stakeholder objectives, (ii) demonstrate the consequences of structural uncertainties in the sea-level rise and storm surge models, and (iii) identify the parametric uncertainties that most strongly influence each objective using global sensitivity analysis. We find that the flood adaptation model produces potentially myopic solutions when formulated using traditional mean-centric decision theory. Moving from a single-objective problem formulation to one with multiobjective tradeoffs dramatically expands the decision space, and highlights the need for compromise solutions to address stakeholder preferences. We find deep structural uncertainties that have large effects on the model outcome, with the storm surge parameters accounting for the greatest impacts. Global sensitivity analysis effectively identifies important parameter interactions that local methods overlook, and that could have critical implications for flood adaptation strategies. © 2017 Society for Risk Analysis.
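The global sensitivity analysis step can be illustrated with a pick-and-freeze Monte Carlo estimator of first-order Sobol indices. The two-input linear toy function below merely stands in for the flood-risk model; the inputs, their distributions, and the sample size are all assumptions for the sketch.

```python
import numpy as np

def first_order_indices(f, n=100_000, seed=0):
    """First-order Sobol indices for f of two independent standard-normal
    inputs, via a Saltelli-style pick-and-freeze estimator:
    S_i = E[f(A) * (f(B with column i from A) - f(B))] / Var[f]."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n, 2))
    B = rng.standard_normal((n, 2))
    fA = f(A)
    var = fA.var()
    out = []
    for i in range(2):
        AB = B.copy()
        AB[:, i] = A[:, i]   # freeze input i at A's values
        out.append(float(np.mean(fA * (f(AB) - f(B))) / var))
    return out

# Toy model: input 1 dominates. Analytically S1 = 16/17, S2 = 1/17.
S = first_order_indices(lambda X: 4 * X[:, 0] + X[:, 1])
```

For the linear toy the estimates land close to the analytic indices; on a real flood model the same estimator exposes which surge and sea-level parameters drive each objective.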

  6. System and method for automated object detection in an image

    DOEpatents

    Kenyon, Garrett T.; Brumby, Steven P.; George, John S.; Paiton, Dylan M.; Schultz, Peter F.

    2015-10-06

    A contour/shape detection model may use relatively simple and efficient kernels to detect target edges in an object within an image or video. A co-occurrence probability may be calculated for two or more edge features in an image or video using an object definition. Edge features may be differentiated between in response to measured contextual support, and prominent edge features may be extracted based on the measured contextual support. The object may then be identified based on the extracted prominent edge features.

  7. Bottlenose dolphins perceive object features through echolocation.

    PubMed

    Harley, Heidi E; Putman, Erika A; Roitblat, Herbert L

    2003-08-07

    How organisms (including people) recognize distant objects is a fundamental question. The correspondence between object characteristics (distal stimuli), like visual shape, and sensory characteristics (proximal stimuli), like retinal projection, is ambiguous. The view that sensory systems are 'designed' to 'pick up' ecologically useful information is vague about how such mechanisms might work. In echolocating dolphins, which are studied as models for object recognition sonar systems, the correspondence between echo characteristics and object characteristics is less clear. Many cognitive scientists assume that object characteristics are extracted from proximal stimuli, but evidence for this remains ambiguous. For example, a dolphin may store 'sound templates' in its brain and identify whole objects by listening for a particular sound. Alternatively, a dolphin's brain may contain algorithms, derived through natural endowments or experience or both, which allow it to identify object characteristics based on sounds. The standard method used to address this question in many species is indirect and has led to equivocal results with dolphins. Here we outline an appropriate method and test it to show that dolphins extract object characteristics directly from echoes.

  8. Landscape silviculture for late-successional reserve management

    Treesearch

    S Hummel; R.J. Barbour

    2007-01-01

    The effects of different combinations of multiple, variable-intensity silvicultural treatments on fire and habitat management objectives were evaluated for a ±6,000 ha forest reserve using simulation models and optimization techniques. Our methods help identify areas within the reserve where opportunities exist to minimize conflict between the dual landscape objectives...

  9. Southwest Ecosystem Services Project (SwESP): Identifying Ecosystems Services Based on Tribal Values

    EPA Science Inventory

    The new strategic focus of the USEPA Office of Research and Development (ORD) is the measurement of ecosystem benefits and services. The primary objective of the Ecosystem Services Research Program (ESRP) is to identify, measure, monitor, model and map ecosystem services and to enable their ...

  10. The Design, Implementation, and Evaluation of Online Credit Nutrition Courses: A Systematic Review

    ERIC Educational Resources Information Center

    Cohen, Nancy L.; Carbone, Elena T.; Beffa-Negrini, Patricia A.

    2011-01-01

    Objective: To assess how postsecondary online nutrition education courses (ONEC) are delivered, determine ONEC effectiveness, identify theoretical models used, and identify future research needs. Design: Systematic search of database literature. Setting: Postsecondary education. Participants: Nine research articles evaluating postsecondary ONEC.…

  11. Generalisation, decision making, and embodiment effects in mental rotation: A neurorobotic architecture tested with a humanoid robot.

    PubMed

    Seepanomwan, Kristsana; Caligiore, Daniele; Cangelosi, Angelo; Baldassarre, Gianluca

    2015-12-01

    Mental rotation, a classic experimental paradigm of cognitive psychology, tests the capacity of humans to mentally rotate a seen object to decide if it matches a target object. In recent years, mental rotation has been investigated with brain imaging techniques to identify the brain areas involved. Mental rotation has also been investigated through the development of neural-network models, used to identify the specific mechanisms that underlie its process, and with neurorobotics models to investigate its embodied nature. Current models, however, have limited capacities to relate to neuro-scientific evidence, to generalise mental rotation to new objects, to suitably represent decision making mechanisms, and to allow the study of the effects of overt gestures on mental rotation. The work presented in this study overcomes these limitations by proposing a novel neurorobotic model that has a macro-architecture constrained by knowledge of the brain, encompasses a rather general mental rotation mechanism, and incorporates a biologically plausible decision making mechanism. The model was tested using the humanoid robot iCub in tasks requiring the robot to mentally rotate 2D geometrical images appearing on a computer screen. The results show that the robot gained an enhanced capacity to generalise mental rotation to new objects and to express the possible effects of overt movements of the wrist on mental rotation. The model also represents a further step in the identification of the embodied neural mechanisms that may underlie mental rotation in humans and might also give hints to enhance robots' planning capabilities. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. Hamiltonian dynamics of extended objects

    NASA Astrophysics Data System (ADS)

    Capovilla, R.; Guven, J.; Rojas, E.

    2004-12-01

    We consider relativistic extended objects described by a reparametrization-invariant local action that depends on the extrinsic curvature of the worldvolume swept out by the object as it evolves. We provide a Hamiltonian formulation of the dynamics of such higher-derivative models which is motivated by the ADM formulation of general relativity. The canonical momenta are identified by looking at boundary behaviour under small deformations of the action; the relationship between the momentum conjugate to the embedding functions and the conserved momentum density is established. The canonical Hamiltonian is constructed explicitly; the constraints on the phase space, both primary and secondary, are identified and the role they play in the theory is described. The multipliers implementing the primary constraints are identified in terms of the ADM lapse and shift variables and Hamilton's equations are shown to be consistent with the Euler-Lagrange equations.

  13. Automatically Determining Scale Within Unstructured Point Clouds

    NASA Astrophysics Data System (ADS)

    Kadamen, Jayren; Sithole, George

    2016-06-01

    Three-dimensional models obtained from imagery have an arbitrary scale and therefore have to be scaled. Automatically scaling these models requires the detection of objects in them, which can be computationally intensive. Real-time object detection may pose problems for applications such as indoor navigation. This investigation poses the idea that relational cues, specifically height ratios, within indoor environments may offer an easier means to obtain scales for models created using imagery. The investigation aimed to show two things: (a) that the size of objects, especially their height off the ground, is consistent within an environment, and (b) that based on this consistency, objects can be identified and their general size used to scale a model. To test the idea, a hypothesis was first tested on a terrestrial lidar scan of an indoor environment. Later, as a proof of concept, the same test was applied to a model created using imagery. The most notable finding was that objects can be detected more readily by studying the ratio between the dimensions of objects whose dimensions are defined by human physiology. For example, the dimensions of desks and chairs are related to the height of an average person. In the test, the difference between generalised and actual dimensions of objects was assessed. A maximum difference of 3.96% (2.93 cm) was observed from automated scaling. By analysing the ratio between the heights (distance from the floor) of the tops of objects in a room, identification was also achieved.
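Once an object of known real-world size is identified, the scaling step itself is a single division: the ratio of real height to measured model height gives the metres-per-unit scale factor. The desk height and model measurements below are invented for illustration.

```python
def scale_from_height(model_height_units, real_height_m):
    """Scale factor (metres per model unit) inferred from one recognized
    object whose real-world height is assumed known."""
    return real_height_m / model_height_units

# A desk top measured at 0.18 arbitrary model units; assume a typical desk
# height of 0.72 m (both numbers hypothetical).
s = scale_from_height(0.18, 0.72)   # 4.0 m per model unit
room_width_m = 1.2 * s              # a 1.2-unit span scales to 4.8 m
```

In practice the estimate would be averaged over several recognized objects, which is where the height-ratio consistency the paper reports matters.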

  14. A formal theory of feature binding in object perception.

    PubMed

    Ashby, F G; Prinzmetal, W; Ivry, R; Maddox, W T

    1996-01-01

    Visual objects are perceived correctly only if their features are identified and then bound together. Illusory conjunctions result when feature identification is correct but an error occurs during feature binding. A new model is proposed that assumes feature binding errors occur because of uncertainty about the location of visual features. This model accounted for data from 2 new experiments better than a model derived from A. M. Treisman and H. Schmidt's (1982) feature integration theory. The traditional method for detecting the occurrence of true illusory conjunctions is shown to be fundamentally flawed. A reexamination of 2 previous studies provided new insights into the role of attention and location information in object perception and a reinterpretation of the deficits in patients who exhibit attentional disorders.

  15. Modeling recall memory for emotional objects in Alzheimer's disease.

    PubMed

    Sundstrøm, Martin

    2011-07-01

    To examine whether emotional memory (EM) for objects with self-reference in Alzheimer's disease (AD) can be modeled with binomial logistic regression in a free recall and an object recognition test to predict EM enhancement. Twenty patients with AD and twenty healthy controls were studied. Six objects (three presented as gifts) were shown to each participant. Ten minutes later, a free recall and a recognition test were applied. The recognition test had target objects mixed with six similar distracter objects. Participants were asked to name any object in the recall test and to identify each object in the recognition test as known or unknown. The proportion of gift objects recalled by AD patients (41.6%) was larger than that of neutral objects (13.3%), and a significant EM recall effect for gifts was found (Wilcoxon: p < .003). EM was not found for recognition in AD patients due to a ceiling effect. Healthy older adults scored higher overall in recall and recognition but showed no EM enhancement, also due to a ceiling effect. A logistic regression showed that the likelihood of emotional recall memory can be modeled as a function of MMSE score (p < .014) and object status (p < .0001) as gift or non-gift. Recall memory was enhanced in AD patients for emotional objects, indicating that EM in mild to moderate AD, although impaired, can be provoked with a strong emotional load. The logistic regression model suggests that EM declines with the progression of AD rather than being disrupted, and may be a useful tool for evaluating the magnitude of emotional load.
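The binomial logistic regression described here, recall probability as a function of MMSE score and gift status, can be sketched with plain gradient ascent on the log-likelihood. The data below are synthetic (roughly unit-scaled predictors and invented coefficients), not the study's; they only demonstrate the model form.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=5000):
    """Plain gradient-ascent logistic regression with an intercept.
    Predictors should be roughly unit-scaled for this fixed learning rate."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(Xb @ w)))
        w += lr * Xb.T @ (y - p) / len(y)
    return w  # [intercept, coef_mmse, coef_gift]

# Synthetic demo: recall odds rise with (scaled) MMSE and with gift status.
rng = np.random.default_rng(0)
n = 1000
mmse = rng.uniform(-1.0, 1.0, n)            # centered/scaled MMSE proxy
gift = rng.integers(0, 2, n).astype(float)  # 1 = presented as a gift
p_true = 1.0 / (1.0 + np.exp(-(-0.5 + 1.0 * mmse + 1.5 * gift)))
y = (rng.random(n) < p_true).astype(float)
w = fit_logistic(np.column_stack([mmse, gift]), y)
```

The fitted signs reproduce the abstract's qualitative finding: both a higher MMSE score and gift status increase the predicted probability of recall.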

  16. Building an Ontology for Identity Resolution in Healthcare and Public Health

    PubMed Central

    Duncan, Jeffrey; Eilbeck, Karen; Narus, Scott P.; Clyde, Stephen; Thornton, Sidney; Staes, Catherine

    2015-01-01

    Integration of disparate information from electronic health records, clinical data warehouses, birth certificate registries and other public health information systems offers great potential for clinical care, public health practice, and research. Such integration, however, depends on correctly matching patient-specific records using demographic identifiers. Without standards for these identifiers, record linkage is complicated by issues of structural and semantic heterogeneity. Objectives: Our objectives were to: 1) identify components of identity and events subsequent to birth that result in creation, change, or sharing of identity information; 2) develop an ontology to facilitate data integration from multiple healthcare and public health sources; and 3) validate the ontology’s ability to model identity-changing events over time. Methods: We interviewed domain experts in area hospitals and public health programs and developed process models describing the creation and transmission of identity information among various organizations for activities subsequent to a birth event. We searched for existing relevant ontologies. We validated the content of our ontology with simulated identity information conforming to scenarios identified in our process models. Results: We chose the Simple Event Model (SEM) to describe events in early childhood and integrated the Clinical Element Model (CEM) for demographic information. We demonstrated the ability of the combined SEM-CEM ontology to model identity events over time. Conclusion: The use of an ontology can overcome issues of semantic and syntactic heterogeneity to facilitate record linkage. PMID:26392849

  17. A multiobjective modeling approach to locate multi-compartment containers for urban-sorted waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tralhao, Lino, E-mail: lmlrt@inescc.p; Coutinho-Rodrigues, Joao, E-mail: coutinho@dec.uc.p; Alcada-Almeida, Luis, E-mail: alcada@inescc.p

    2010-12-15

    The location of multi-compartment sorted waste containers for recycling purposes in cities is an important problem in the context of urban waste management. The costs associated with those facilities and the impacts placed on populations are important concerns. This paper introduces a mixed-integer, multiobjective programming approach to identify the locations and capacities of such facilities. The approach incorporates an optimization model in a Geographical Information System (GIS)-based interactive decision support system that includes four objectives. The first objective minimizes the total investment cost; the second one minimizes the average distance from dwellings to the respective multi-compartment container; the last two objectives address the 'pull' and 'push' characteristics of the decision problem, one by minimizing the number of individuals too close to any container, and the other by minimizing the number of dwellings too far from the respective multi-compartment container. The model determines the number of facilities to be opened, the respective container capacities, their locations, their respective shares of the total waste of each type to be collected, and the dwellings assigned to each facility. The approach proposed was tested with a case study for the historical center of Coimbra city, Portugal, where a large urban renovation project, addressing about 800 buildings, is being undertaken. This paper demonstrates that the models and techniques incorporated in the interactive decision support system (IDSS) can be used to assist a decision maker (DM) in analyzing this complex problem in a realistically sized urban application. Ten solutions, consisting of different combinations of underground containers for the disposal of four types of sorted waste in 12 candidate sites, were generated. These solutions and tradeoffs among the objectives are presented to the DM via tables, graphs, color-coded maps and other graphics. 
The DM can then use this information to 'guide' the IDSS in identifying additional solutions of potential interest. Nevertheless, this research showed that a particular solution with a better balance among the objectives can be identified. The actual sequence of additional solutions generated will depend upon the objectives and preferences of the DM in a specific application.
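
    As an illustration of the weighted multiobjective search sketched in this record, the toy example below scores candidate container portfolios against the four stated objectives. All sites, dwellings, costs and thresholds are hypothetical, and the brute-force scan stands in for the paper's actual mixed-integer programming solver:

```python
from itertools import combinations

# Hypothetical toy data: 4 candidate sites and 6 dwellings on a line
# (1-D coordinates keep the example readable); real instances would use
# street-network distances from a GIS.
SITES = {"A": 0.0, "B": 3.0, "C": 6.0, "D": 9.0}
DWELLINGS = [0.5, 1.0, 2.5, 5.0, 7.5, 8.0]
OPEN_COST = 10.0   # cost of opening one container site
TOO_FAR = 2.0      # "push": dwellings farther than this are penalised
TOO_CLOSE = 0.3    # "pull": dwellings nearer than this are penalised

def evaluate(open_sites):
    """Return the four objective values for a set of open sites."""
    dists = [min(abs(d - SITES[s]) for s in open_sites) for d in DWELLINGS]
    cost = OPEN_COST * len(open_sites)           # objective 1: investment
    avg_dist = sum(dists) / len(dists)           # objective 2: average distance
    n_close = sum(d < TOO_CLOSE for d in dists)  # objective 3: too close
    n_far = sum(d > TOO_FAR for d in dists)      # objective 4: too far
    return cost, avg_dist, n_close, n_far

def best_portfolio(weights):
    """Exhaustively scan all non-empty site subsets for the best weighted sum."""
    best, best_score = None, float("inf")
    for k in range(1, len(SITES) + 1):
        for subset in combinations(SITES, k):
            score = sum(w * v for w, v in zip(weights, evaluate(subset)))
            if score < best_score:
                best, best_score = subset, score
    return best

# A cost-dominated weighting opens few sites; a distance-dominated one opens more.
print(best_portfolio((1.0, 1.0, 5.0, 5.0)))
print(best_portfolio((0.1, 10.0, 5.0, 5.0)))
```

    Re-running `best_portfolio` with different weight vectors mimics how the DM's preferences steer the IDSS toward different trade-off solutions.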

  18. Automatic Adviser on stationary devices status identification and anticipated change

    NASA Astrophysics Data System (ADS)

    Shabelnikov, A. N.; Liabakh, N. N.; Gibner, Ya M.; Pushkarev, E. A.

    2018-05-01

    The task is defined as synthesizing an Automatic Adviser that identifies the status of stationary automation-system devices using an autoregressive model of changes in their key parameters. The choice of model type is justified, and an algorithm for monitoring the research objects is developed. A complex for simulating the operation status of mobile objects and analyzing prediction results is proposed. The research results are illustrated with a specific example of a hump yard compressor station. The work was supported by the Russian Fundamental Research Fund, project No. 17-20-01040.
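
    The autoregressive monitoring idea can be sketched as follows. This is an illustrative example only (toy readings, an invented 3-sigma rule), not the authors' model: fit a first-order AR model x[t] = phi*x[t-1] + c to a device's key parameter by least squares, then flag a status change when the one-step prediction residual exceeds three standard deviations of the training residuals:

```python
def fit_ar1(series):
    """Least-squares fit of x[t] = phi * x[t-1] + c, plus residual sigma."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    phi = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
           / sum((x - mx) ** 2 for x in xs))
    c = my - phi * mx
    resid = [y - (phi * x + c) for x, y in zip(xs, ys)]
    sigma = (sum(r * r for r in resid) / n) ** 0.5
    return phi, c, sigma

# Stable compressor-pressure readings, then a jump suggesting a status change.
train = [5.0, 5.1, 4.9, 5.05, 4.95, 5.0, 5.1, 4.9, 5.0, 5.05]
phi, c, sigma = fit_ar1(train)

def anomalous(prev, current, k=3.0):
    """Flag a reading whose one-step prediction error exceeds k * sigma."""
    return abs(current - (phi * prev + c)) > k * sigma

print(anomalous(5.0, 5.05), anomalous(5.0, 7.5))
```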

  19. Object-orientated DBMS techniques for time-oriented medical record.

    PubMed

    Pinciroli, F; Combi, C; Pozzi, G

    1992-01-01

    In implementing time-orientated medical record (TOMR) management systems, the relational model has played a major role. Many applications have been developed to extend query and data manipulation languages to the temporal aspects of information. Our experience in developing a TOMR revealed some deficiencies in the relational model, such as: (a) abstract data type definition; (b) a unified view of data at the programming level; (c) management of temporal data; (d) management of signals and images. We identified some initial topics to address through an object-orientated approach to database design. This paper describes the first steps in designing and implementing a TOMR with an object-orientated DBMS.

  20. 3-D Object Recognition from Point Cloud Data

    NASA Astrophysics Data System (ADS)

    Smith, W.; Walker, A. S.; Zhang, B.

    2011-09-01

    The market for real-time 3-D mapping includes not only traditional geospatial applications but also navigation of unmanned autonomous vehicles (UAVs). Massively parallel processes such as graphics processing unit (GPU) computing make real-time 3-D object recognition and mapping achievable. Geospatial technologies such as digital photogrammetry and GIS offer advanced capabilities to produce 2-D and 3-D static maps using UAV data. The goal is to develop real-time UAV navigation through increased automation. It is challenging for a computer to identify a 3-D object such as a car, a tree or a house, yet automatic 3-D object recognition is essential to increasing the productivity of geospatial data such as 3-D city site models. In the past three decades, researchers have used radiometric properties to identify objects in digital imagery with limited success, because these properties vary considerably from image to image. Consequently, our team has developed software that recognizes certain types of 3-D objects within 3-D point clouds. Although our software is developed for modeling, simulation and visualization, it has the potential to be valuable in robotics and UAV applications. The locations and shapes of 3-D objects such as buildings and trees are easily recognizable by a human from a brief glance at a representation of a point cloud such as terrain-shaded relief. The algorithms to extract these objects have been developed and require only the point cloud and minimal human inputs such as a set of limits on building size and a request to turn on a squaring option. The algorithms use both digital surface model (DSM) and digital elevation model (DEM), so software has also been developed to derive the latter from the former. The process continues through the following steps: identify and group 3-D object points into regions; separate buildings and houses from trees; trace region boundaries; regularize and simplify boundary polygons; construct complex roofs. 
Several case studies have been conducted using a variety of point densities, terrain types and building densities. The results have been encouraging. More work is required for better processing of, for example, forested areas, buildings with sides that are not at right angles or are not straight, and single trees that impinge on buildings. Further work may also be required to ensure that the buildings extracted are of fully cartographic quality. A first version will be included in production software later in 2011. In addition to the standard geospatial applications and the UAV navigation, the results have a further advantage: since LiDAR data tends to be accurately georeferenced, the building models extracted can be used to refine image metadata whenever the same buildings appear in imagery for which the GPS/IMU values are poorer than those for the LiDAR.
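
    The DSM/DEM step described in this record can be sketched as follows; this is a hedged illustration with toy grids and an invented 2 m threshold, not the production algorithm. Object points are cells where the normalised DSM (DSM minus DEM) exceeds a height threshold, and they are grouped into regions by a 4-connected flood fill:

```python
from collections import deque

# Toy 6x8 DSM/DEM grids (metres). The normalised DSM (nDSM = DSM - DEM)
# isolates above-ground objects; all values are illustrative.
DEM = [[100.0] * 8 for _ in range(6)]
DSM = [row[:] for row in DEM]
for r in range(1, 3):          # a flat-roofed "building"
    for c in range(1, 4):
        DSM[r][c] = 106.0
DSM[4][6] = 109.0              # a single "tree" crown point

def object_regions(dsm, dem, min_height=2.0):
    """Group above-ground cells into 4-connected regions via flood fill."""
    rows, cols = len(dsm), len(dsm[0])
    mask = [[dsm[r][c] - dem[r][c] > min_height for c in range(cols)]
            for r in range(rows)]
    seen, regions = set(), []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and (r, c) not in seen:
                region, queue = [], deque([(r, c)])
                seen.add((r, c))
                while queue:
                    cr, cc = queue.popleft()
                    region.append((cr, cc))
                    for nr, nc in ((cr+1, cc), (cr-1, cc), (cr, cc+1), (cr, cc-1)):
                        if (0 <= nr < rows and 0 <= nc < cols
                                and mask[nr][nc] and (nr, nc) not in seen):
                            seen.add((nr, nc))
                            queue.append((nr, nc))
                regions.append(region)
    return regions

regions = object_regions(DSM, DEM)
print([len(reg) for reg in regions])   # region sizes: building block and tree
```

    Subsequent steps in the pipeline (separating trees from buildings, tracing and regularizing boundaries, roof construction) would operate on these regions.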

  1. Project Photofly: New 3d Modeling Online Web Service (case Studies and Assessments)

    NASA Astrophysics Data System (ADS)

    Abate, D.; Furini, G.; Migliori, S.; Pierattini, S.

    2011-09-01

    In summer 2010, Autodesk released a still-ongoing project called Project Photofly, freely downloadable from the Autodesk Labs web site until August 1, 2011. Based on computer vision and photogrammetric principles, and exploiting the power of cloud computing, Project Photofly is a web service able to convert collections of photographs into 3D models. The aim of our research was to evaluate Project Photofly, through different case studies, for the 3D modeling of cultural heritage monuments and objects, and mostly to identify the goals and objects for which it is suitable. The automatic approach is the main focus of the analysis.

  2. Detecting abandoned objects using interacting multiple models

    NASA Astrophysics Data System (ADS)

    Becker, Stefan; Münch, David; Kieritz, Hilke; Hübner, Wolfgang; Arens, Michael

    2015-10-01

    In recent years, the wide use of video surveillance systems has caused an enormous increase in the amount of data that has to be stored, monitored, and processed. As a consequence, it is crucial to support human operators with automated surveillance applications. Towards this end an intelligent video analysis module for real-time alerting in case of abandoned objects in public spaces is proposed. The overall processing pipeline consists of two major parts. First, person motion is modeled using an Interacting Multiple Model (IMM) filter. The IMM filter estimates the state of a person according to a finite-state, discrete-time Markov chain. Second, the location of persons that stay at a fixed position defines a region of interest, in which a nonparametric background model with dynamic per-pixel state variables identifies abandoned objects. In case of a detected abandoned object, an alarm event is triggered. The effectiveness of the proposed system is evaluated on the PETS 2006 dataset and the i-Lids dataset, both reflecting prototypical surveillance scenarios.
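
    The heart of the first stage is the IMM mode-probability recursion. The sketch below illustrates just that step for two motion hypotheses ("moving", "stationary") of a tracked person; the transition matrix and Gaussian measurement models are invented for the example, and a full IMM would additionally mix and run one Kalman filter per mode:

```python
import math

MODES = ("moving", "stationary")
# Hypothetical Markov chain over the two motion modes.
TRANS = {"moving": {"moving": 0.95, "stationary": 0.05},
         "stationary": {"moving": 0.05, "stationary": 0.95}}

def gaussian(x, mean, std):
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def update_mode_probs(probs, displacement):
    """One IMM mode-probability step: Markov prediction, then likelihood update."""
    predicted = {j: sum(TRANS[i][j] * probs[i] for i in MODES) for j in MODES}
    # Hypothetical measurement models: moving ~1 px/frame, stationary ~0 px.
    like = {"moving": gaussian(displacement, 1.0, 0.5),
            "stationary": gaussian(displacement, 0.0, 0.2)}
    post = {j: like[j] * predicted[j] for j in MODES}
    norm = sum(post.values())
    return {j: p / norm for j, p in post.items()}

probs = {"moving": 0.5, "stationary": 0.5}
for d in [1.1, 0.9, 0.05, 0.0, 0.02]:        # person walks, then stops
    probs = update_mode_probs(probs, d)
print(probs["stationary"] > 0.9)              # stationary hypothesis dominates
```

    In the proposed system, a dominant "stationary" mode is what triggers the region-of-interest analysis for abandoned objects.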

  3. Unanticipated Learning Outcomes Associated with Commitment to Change in Continuing Medical Education

    ERIC Educational Resources Information Center

    Dolcourt, Jack L.; Zuckerman, Grace

    2003-01-01

    Introduction: Educator-derived, predetermined instructional objectives are integral to the traditional instructional model and form the linkage between instructional design and postinstruction evaluation. The traditional model does not consider unanticipated learning outcomes. We explored the contribution of learner-identified desired outcomes…

  4. General Aviation Interior Noise. Part 1; Source/Path Identification

    NASA Technical Reports Server (NTRS)

    Unruh, James F.; Till, Paul D.; Palumbo, Daniel L. (Technical Monitor)

    2002-01-01

    There were two primary objectives of the research effort reported herein. The first objective was to identify and evaluate noise source/path identification technology applicable to single engine propeller driven aircraft that can be used to identify interior noise sources originating from structure-borne engine/propeller vibration, airborne propeller transmission, airborne engine exhaust noise, and engine case radiation. The approach taken to identify the contributions of each of these possible sources was first to conduct a Principal Component Analysis (PCA) of an in-flight noise and vibration database acquired on a Cessna Model 182E aircraft. The second objective was to develop and evaluate advanced technology for noise source ranking of interior panel groups such as the aircraft windshield, instrument panel, firewall, and door/window panels within the cabin of a single engine propeller driven aircraft. The technology employed was that of Acoustic Holography (AH). AH was applied to the test aircraft by acquiring a series of in-flight microphone array measurements within the aircraft cabin and correlating the measurements via PCA. The source contributions of the various panel groups leading to the array measurements were then synthesized by solving the inverse problem using the boundary element model.

  5. Cultural Resource Predictive Modeling

    DTIC Science & Technology

    2017-10-01

    property to manage? a. Yes 2) Do you use CRPM (Cultural Resource Predictive Modeling)? No, but I use predictive modelling informally. For example… resource program and provide support to the test ranges for their missions. This document will provide information such as lessons learned, points of contact, and resources to the range cultural resource managers. Objective/Scope: Identify existing cultural resource predictive models and…

  6. A University-Industry Cooperation Model for Small and Medium Enterprises: The Case of Chengdu KEDA Optoelectronic Technology Ltd.

    ERIC Educational Resources Information Center

    Peng, Shanzhong; Ferreira, Fernando A. F.; Zheng, He

    2017-01-01

    In this study, we develop a firm-dominated incremental cooperation model. Following the critical review of current literature and various cooperation models, we identified a number of strengths and shortcomings that form the basis for our framework. The objective of our theoretical model is to contribute to overcome the existing gap within…

  7. Objectively classifying Southern Hemisphere extratropical cyclones

    NASA Astrophysics Data System (ADS)

    Catto, Jennifer

    2016-04-01

    There has been a long tradition in attempting to separate extratropical cyclones into different classes depending on their cloud signatures, airflows, synoptic precursors, or upper-level flow features. Depending on these features, the cyclones may have different impacts, for example in their precipitation intensity. It is important, therefore, to understand how the distribution of different cyclone classes may change in the future. Many of the previous classifications have been performed manually. In order to be able to evaluate climate models and understand how extratropical cyclones might change in the future, we need to be able to use an automated method to classify cyclones. Extratropical cyclones have been identified in the Southern Hemisphere from the ERA-Interim reanalysis dataset with a commonly used identification and tracking algorithm that employs 850 hPa relative vorticity. A clustering method applied to large-scale fields from ERA-Interim at the time of cyclone genesis (when the cyclone is first detected), has been used to objectively classify identified cyclones. The results are compared to the manual classification of Sinclair and Revell (2000) and the four objectively identified classes shown in this presentation are found to match well. The relative importance of diabatic heating in the clusters is investigated, as well as the differing precipitation characteristics. The success of the objective classification shows its utility in climate model evaluation and climate change studies.
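
    The record does not name the clustering algorithm, but the idea of objectively grouping cyclones by genesis-time fields can be illustrated with a minimal k-means sketch on two synthetic, standardised features (the feature meanings and values are entirely hypothetical):

```python
import random

random.seed(0)

# Two synthetic "classes" of cyclone genesis environments in a 2-feature
# space (arbitrary standardised units), well separated for the example.
class_a = [(random.gauss(0, 0.3), random.gauss(0, 0.3)) for _ in range(30)]
class_b = [(random.gauss(3, 0.3), random.gauss(3, 0.3)) for _ in range(30)]
points = class_a + class_b

def kmeans(points, k, iters=20):
    """Plain Lloyd's algorithm: assign to nearest centroid, recompute means."""
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda j: (p[0] - centroids[j][0]) ** 2
                                            + (p[1] - centroids[j][1]) ** 2)
            clusters[j].append(p)
        centroids = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centroids[j]
            for j, c in enumerate(clusters)]
    return centroids, clusters

centroids, clusters = kmeans(points, 2)
print(sorted(len(c) for c in clusters))   # the two genesis classes recovered
```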

  8. Thermal barrier coating life prediction model

    NASA Technical Reports Server (NTRS)

    Pilsner, B. H.; Hillery, R. V.; Mcknight, R. L.; Cook, T. S.; Kim, K. S.; Duderstadt, E. C.

    1986-01-01

    The objectives of this program are to determine the predominant modes of degradation of a plasma-sprayed thermal barrier coating system, and then to develop and verify life prediction models accounting for these degradation modes. The program is divided into two phases, each consisting of several tasks. The work in Phase 1 is aimed at identifying the relative importance of the various failure modes, and at developing and verifying life prediction model(s) for the predominant mode for a thermal barrier coating system. Two possible predominant failure mechanisms being evaluated are bond coat oxidation and bond coat creep. The work in Phase 2 will develop design-capable, causal life prediction models for thermomechanical and thermochemical failure modes, and for the exceptional conditions of foreign object damage and erosion.

  9. Development of an objective tool for the diagnosis of myxedema coma.

    PubMed

    Chiong, Yien V; Bammerlin, Elaine; Mariash, Cary N

    2015-09-01

    Myxedema coma, a rare entity with a reported 25%-65% mortality, had no objective diagnostic criteria when we began our study. We developed an objective screening tool for myxedema coma to more easily identify patients and to examine the best treatment method in future prospective studies, with the aim of reducing the mortality of this entity. We conducted a retrospective chart review to find all patients aged ≥18 years admitted with myxedema coma from January 1, 2005 through June 13, 2010 at Indiana University Health Methodist Hospital. On the basis of both our retrospective chart review and literature accounts, we identified 6 criteria to diagnose myxedema coma. We identified 10 patients initially diagnosed with myxedema coma and established a control group consisting of 13 patients identified with altered mental status and increased thyroid-stimulating hormone (TSH) levels. The 6 variables we created for the screening tool were heart rate, temperature, Glasgow coma scale, TSH, free thyroxine, and precipitating factors. The screening tool has a sensitivity and specificity of about 80%. We ran a logistic regression model using the 10 study patients and 13 controls with the 6 variables. No variable alone significantly contributed to the model. However, the overall model was highly significant (P = 0.012), providing strong support for a scoring system that uses these variables simultaneously. This screening tool enables physicians to rapidly diagnose myxedema coma and expedite treatment. A more refined diagnostic tool may be used in future clinical studies designed to determine the optimal treatment.
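
    The modeling step can be sketched with plain gradient-descent logistic regression on a miniature invented dataset; the two features stand in for two of the six screening variables (standardised temperature and heart rate), and none of the values or coefficients are from the authors' 23-patient cohort:

```python
import math

# Hypothetical miniature dataset: (temperature, heart rate) standardised,
# label 1 = myxedema coma, 0 = control. Values are invented.
DATA = [((-1.5, -1.2), 1), ((-1.0, -0.8), 1), ((-0.8, -1.5), 1),
        ((0.5, 0.7), 0), ((1.0, 0.2), 0), ((0.8, 1.1), 0)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(data, lr=0.5, epochs=500):
    """Stochastic gradient descent on the log-loss (bias + two weights)."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for (x1, x2), y in data:
            p = sigmoid(w[0] + w[1] * x1 + w[2] * x2)
            err = y - p
            w[0] += lr * err
            w[1] += lr * err * x1
            w[2] += lr * err * x2
    return w

w = fit_logistic(DATA)

def score(x1, x2):
    """Screening score: probability that a patient has myxedema coma."""
    return sigmoid(w[0] + w[1] * x1 + w[2] * x2)

print(score(-1.2, -1.0) > 0.5, score(0.9, 0.8) < 0.5)
```

    As in the record, the variables act simultaneously through the fitted weights rather than as individual cut-offs.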

  10. Source detection in astronomical images by Bayesian model comparison

    NASA Astrophysics Data System (ADS)

    Frean, Marcus; Friedlander, Anna; Johnston-Hollitt, Melanie; Hollitt, Christopher

    2014-12-01

    The next generation of radio telescopes will generate exabytes of data on hundreds of millions of objects, making automated methods for the detection of astronomical objects ("sources") essential. Of particular importance are faint, diffuse objects embedded in noise. There is a pressing need for source finding software that identifies these sources, involves little manual tuning, yet is tractable to calculate. We first give a novel image discretisation method that incorporates uncertainty about how an image should be discretised. We then propose a hierarchical prior for astronomical images, which leads to a Bayes factor indicating how well a given region conforms to a model of source that is exceptionally unconstrained, compared to a model of background. This enables the efficient localisation of regions that are "suspiciously different" from the background distribution, so our method looks not for brightness but for anomalous distributions of intensity, which is much more general. The model of background can be iteratively improved by removing the influence on it of sources as they are discovered. The approach is evaluated by identifying sources in real and simulated data, and performs well on these measures: the Bayes factor is maximized at most real objects, while returning only a moderate number of false positives. In comparison to a catalogue constructed by widely-used source detection software with manual post-processing by an astronomer, our method found a number of dim sources that were missing from the "ground truth" catalogue.
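
    The core comparison can be illustrated per region: a Bayes factor of a "source" model whose mean is essentially unconstrained (broad uniform prior, marginalised by grid integration) against a fixed background model. The priors, widths, and toy pixel values below are invented for the sketch and are not the paper's hierarchical prior:

```python
import math

def log_normal_pdf(x, mu, sigma):
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def log_bayes_factor(pixels, bg_mu=0.0, bg_sigma=1.0, prior_width=10.0):
    """log BF of 'source' (unknown mean, broad uniform prior) vs background.

    The source marginal likelihood integrates the Gaussian likelihood over a
    uniform prior on the mean, done here on a coarse grid; the background
    model has its mean and width fixed in advance.
    """
    log_bg = sum(log_normal_pdf(x, bg_mu, bg_sigma) for x in pixels)
    grid = [bg_mu - prior_width / 2 + prior_width * i / 200 for i in range(201)]
    step_prior = (prior_width / 200) * (1.0 / prior_width)  # d(mu) * prior density
    terms = [sum(log_normal_pdf(x, mu, bg_sigma) for x in pixels) for mu in grid]
    m = max(terms)
    log_src = m + math.log(sum(math.exp(t - m) for t in terms)) + math.log(step_prior)
    return log_src - log_bg

background_patch = [0.1, -0.3, 0.2, -0.1, 0.0, 0.15]
source_patch = [3.1, 2.7, 3.4, 2.9, 3.2, 3.0]
print(log_bayes_factor(background_patch) < 0)   # Occam penalty: background wins
print(log_bayes_factor(source_patch) > 0)       # anomalous patch flagged
```

    Note how the flexible model loses on background-like data (the Occam penalty of its broad prior), which is what keeps false positives moderate.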

  11. Objective quantification of the tinnitus decompensation by synchronization measures of auditory evoked single sweeps.

    PubMed

    Strauss, Daniel J; Delb, Wolfgang; D'Amelio, Roberto; Low, Yin Fen; Falkai, Peter

    2008-02-01

    Large-scale neural correlates of the tinnitus decompensation might be used for an objective evaluation of therapies and neurofeedback based therapeutic approaches. In this study, we try to identify large-scale neural correlates of the tinnitus decompensation using wavelet phase stability criteria of single sweep sequences of late auditory evoked potentials as synchronization stability measure. The extracted measure provided an objective quantification of the tinnitus decompensation and allowed for a reliable discrimination between a group of compensated and decompensated tinnitus patients. We provide an interpretation for our results by a neural model of top-down projections based on the Jastreboff tinnitus model combined with the adaptive resonance theory which has not been applied to model tinnitus so far. Using this model, our stability measure of evoked potentials can be linked to the focus of attention on the tinnitus signal. It is concluded that the wavelet phase stability of late auditory evoked potential single sweeps might be used as objective tinnitus decompensation measure and can be interpreted in the framework of the Jastreboff tinnitus model and adaptive resonance theory.

  12. Object-Based Classification as an Alternative Approach to the Traditional Pixel-Based Classification to Identify Potential Habitat of the Grasshopper Sparrow

    NASA Astrophysics Data System (ADS)

    Jobin, Benoît; Labrecque, Sandra; Grenier, Marcelle; Falardeau, Gilles

    2008-01-01

    The traditional method of identifying wildlife habitat distribution over large regions consists of pixel-based classification of satellite images into a suite of habitat classes used to select suitable habitat patches. Object-based classification is a newer method that can achieve the same objective, based on segmenting the spectral bands of the image into polygons that are homogeneous with regard to spatial or spectral characteristics. The segmentation algorithm does not rely solely on single pixel values, but also on shape, texture, and pixel spatial continuity. Object-based classification is a knowledge-based process in which an interpretation key is developed using ground control points, and objects are assigned to specific classes according to threshold values of selected spectral and/or spatial attributes. We developed a model using the eCognition software to identify suitable habitats for the Grasshopper Sparrow, a rare and declining species found in southwestern Québec. The model was developed in a region with known breeding sites and applied to other images covering adjacent regions where potential breeding habitats may be present. We were successful in locating potential habitats in areas where dairy farming prevailed but failed in an adjacent region covered by a distinct Landsat scene and dominated by annual crops. We discuss the added value of this method, such as the possibility of using the contextual information associated with objects and the ability to eliminate unsuitable areas during the segmentation and land cover classification processes, as well as technical and logistical constraints. A series of recommendations on the use of this method and on conservation issues of Grasshopper Sparrow habitat is also provided.

  13. Database Technology Activities and Assessment for Defense Modeling and Simulation Office (DMSO) (August 1991-November 1992). A Documented Briefing

    DTIC Science & Technology

    1994-01-01

    databases and identifying new data entities, data elements, and relationships. - Standard data naming conventions, schema, and definition processes… management system. The use of such a tool could offer: (1) structured support for representation of objects and their relationships to each other (and… their relationships to related multimedia objects such as an engineering drawing of the tank object or a satellite image that contains the installation

  14. 2MASS J11151597+1937266: A Young, Dusty, Isolated, Planetary-mass Object with a Potential Wide Stellar Companion

    NASA Astrophysics Data System (ADS)

    Theissen, Christopher A.; Burgasser, Adam J.; Bardalez Gagliuffi, Daniella C.; Hardegree-Ullman, Kevin K.; Gagné, Jonathan; Schmidt, Sarah J.; West, Andrew A.

    2018-01-01

    We present 2MASS J11151597+1937266, a recently identified low-surface-gravity L dwarf, classified as an L2γ based on Sloan Digital Sky Survey optical spectroscopy. We confirm this spectral type with near-infrared spectroscopy, which provides further evidence that 2MASS J11151597+1937266 is a low-surface-gravity L dwarf. This object also shows significant excess mid-infrared flux, indicative of circumstellar material; and its strong Hα emission (EW(Hα) = 560 ± 82 Å) is an indicator of enhanced magnetic activity or weak accretion. Comparison of its spectral energy distribution to model photospheres yields an effective temperature of 1724 (+184/−38) K. We also provide a revised distance estimate of 37 ± 6 pc using a spectral type–luminosity relationship for low-surface-gravity objects. The three-dimensional galactic velocities and positions of 2MASS J11151597+1937266 do not match any known young association or moving group. Assuming a probable age in the range of 5–45 Myr, the model-dependent estimated mass of this object is between 7 and 21 M_Jup, making it a potentially isolated planetary-mass object. We also identify a candidate co-moving, young stellar companion, 2MASS J11131089+2110086.

  15. Student Satisfaction in Higher Education: A Meta-Analytic Study

    ERIC Educational Resources Information Center

    Santini, Fernando de Oliveira; Ladeira, Wagner Junior; Sampaio, Claudio Hoffmann; da Silva Costa, Gustavo

    2017-01-01

    This paper discusses the results of a meta-analysis performed to identify key antecedent and consequent constructs of satisfaction in higher education. We offer an integrated model to achieve a better understanding of satisfaction in the context of higher education. To accomplish this objective, we identified 83 studies that were valid and…

  16. Service-based analysis of biological pathways

    PubMed Central

    Zheng, George; Bouguettaya, Athman

    2009-01-01

    Background Computer-based pathway discovery is concerned with two important objectives: pathway identification and analysis. Conventional mining and modeling approaches aimed at pathway discovery are often effective at achieving either objective, but not both. Such limitations can be effectively tackled by leveraging a Web service-based modeling and mining approach. Results Inspired by molecular recognition and drug discovery processes, we developed a Web service mining tool, named PathExplorer, to discover potentially interesting biological pathways linking service models of biological processes. The tool uses an innovative approach to identify useful pathways based on graph-based hints and service-based simulation verifying the user's hypotheses. Conclusion Web service modeling of biological processes allows the easy access and invocation of these processes on the Web. Web service mining techniques described in this paper enable the discovery of biological pathways linking these process service models. Algorithms presented in this paper for automatically highlighting interesting subgraphs within an identified pathway network enable the user to formulate hypotheses, which can be tested using the simulation algorithm that is also described in this paper. PMID:19796403

  17. An object-oriented framework for distributed hydrologic and geomorphic modeling using triangulated irregular networks

    NASA Astrophysics Data System (ADS)

    Tucker, Gregory E.; Lancaster, Stephen T.; Gasparini, Nicole M.; Bras, Rafael L.; Rybarczyk, Scott M.

    2001-10-01

    We describe a new set of data structures and algorithms for dynamic terrain modeling using a triangulated irregular network (TIN). The framework provides an efficient method for storing, accessing, and updating a Delaunay triangulation and its associated Voronoi diagram. The basic data structure consists of three interconnected data objects: triangles, nodes, and directed edges. Encapsulating each of these geometric elements within a data object makes it possible to essentially decouple the TIN representation from the modeling applications that make use of it. Both the triangulation and its corresponding Voronoi diagram can be rapidly retrieved or updated, making these methods well suited to adaptive remeshing schemes. We develop a set of algorithms for defining drainage networks and identifying closed depressions (e.g., lakes) for hydrologic and geomorphic modeling applications. We also outline simple numerical algorithms for solving network routing and 2D transport equations within the TIN framework. The methods are illustrated with two example applications, a landscape evolution model and a distributed rainfall-runoff model.
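
    The three interlinked data objects (nodes, directed edges, triangles) can be sketched as below, using a single hard-coded triangle. The field names are illustrative, not the authors' actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    x: float
    y: float
    z: float                       # elevation carried by the node

@dataclass
class Edge:                        # directed edge of the triangulation
    origin: Node
    dest: Node
    twin: "Edge" = None            # oppositely directed companion edge

@dataclass
class Triangle:
    edges: list = field(default_factory=list)   # three directed edges, CCW

def make_triangle(a, b, c):
    """Build one triangle plus its directed edges and their twins."""
    fwd = [Edge(a, b), Edge(b, c), Edge(c, a)]
    back = [Edge(e.dest, e.origin) for e in fwd]
    for e, t in zip(fwd, back):
        e.twin, t.twin = t, e
    return Triangle(edges=fwd)

tri = make_triangle(Node(0, 0, 10.0), Node(1, 0, 12.0), Node(0, 1, 11.0))
start = tri.edges[0].origin
print(tri.edges[2].dest is start)   # the directed-edge cycle closes
```

    Because each element only holds references to its neighbours, local updates (edge flips, point insertions) touch a handful of objects, which is what makes adaptive remeshing cheap.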

  18. Integrated wetland management for waterfowl and shorebirds at Mattamuskeet National Wildlife Refuge, North Carolina

    USGS Publications Warehouse

    Tavernia, Brian G.; Stanton, John D.; Lyons, James E.

    2017-11-22

    Mattamuskeet National Wildlife Refuge (MNWR) offers a mix of open water, marsh, forest, and cropland habitats on 20,307 hectares in coastal North Carolina. In 1934, Federal legislation (Executive Order 6924) established MNWR to benefit wintering waterfowl and other migratory bird species. On an annual basis, the refuge staff decide how to manage 14 impoundments to benefit not only waterfowl during the nonbreeding season, but also shorebirds during fall and spring migration. In making these decisions, the challenge is to select a portfolio, or collection, of management actions for the impoundments that optimizes use by the three groups of birds while respecting budget constraints. In this study, a decision support tool was developed for these annual management decisions. Within the decision framework, there are three different management objectives: shorebird-use days during fall and spring migrations, and waterfowl-use days during the nonbreeding season. Sixteen potential management actions were identified for impoundments; each action represents a combination of hydroperiod and vegetation manipulation. Example hydroperiods include semi-permanent and seasonal drawdowns, and vegetation manipulations include mechanical-chemical treatment, burning, disking, and no action. Expert elicitation was used to build a Bayesian Belief Network (BBN) model that predicts shorebird- and waterfowl-use days for each potential management action. The BBN was parameterized for a representative impoundment, MI-9, and predictions were re-scaled for this impoundment to predict outcomes at other impoundments on the basis of size. Parameter estimates in the BBN model can be updated using observations from ongoing monitoring that is part of the Integrated Waterbird Management and Monitoring (IWMM) program. The optimal portfolio of management actions depends on the importance, that is, the weights, assigned to the three objectives, as well as the budget. 
Five scenarios with a variety of objective weights and budgets were developed. Given the large number of possible portfolios (16^14), a heuristic genetic algorithm was used to identify a management action portfolio that maximized use-day objectives while respecting budget constraints. The genetic algorithm identified a portfolio of management actions for each of the five scenarios, enabling refuge staff to explore the sensitivity of their management decisions to objective weights and budget constraints. The decision framework developed here provides a transparent, defensible, and testable foundation for decision making at MNWR. The BBN model explicitly structures and parameterizes a mental model previously used by an expert to assign management actions to the impoundments. With ongoing IWMM monitoring, predictions from the model can be tested, and model parameters updated, to reflect empirical observations. This framework is intended to be a living document that can be updated to reflect changes in the decision context (for example, new objectives or constraints, or new models to compete with the current BBN model). Rather than a mandate to refuge staff, this framework is intended to be a decision support tool; tool outputs can become part of the deliberations of refuge staff when making difficult management decisions for multiple objectives.
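
    The genetic-algorithm search over action portfolios can be sketched as follows. All benefits, costs, sizes and GA settings are invented (and scaled down to 6 impoundments and 4 actions); the real benefit values would come from the BBN predictions:

```python
import random

random.seed(3)

# Hypothetical per-impoundment benefit (weighted use-days) and cost for each
# candidate action; the real problem has 16 actions and 14 impoundments.
N_IMP, ACTIONS = 6, 4
BENEFIT = [[random.uniform(0, 100) for _ in range(ACTIONS)] for _ in range(N_IMP)]
COST = [[random.uniform(0, 30) for _ in range(ACTIONS)] for _ in range(N_IMP)]
BUDGET = 100.0

def fitness(portfolio):
    """Total benefit of one action per impoundment; infeasible scores zero."""
    cost = sum(COST[i][a] for i, a in enumerate(portfolio))
    if cost > BUDGET:
        return 0.0
    return sum(BENEFIT[i][a] for i, a in enumerate(portfolio))

def evolve(pop_size=40, generations=80):
    pop = [[random.randrange(ACTIONS) for _ in range(N_IMP)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            pa, pb = random.sample(parents, 2)
            cut = random.randrange(1, N_IMP)      # one-point crossover
            child = pa[:cut] + pb[cut:]
            if random.random() < 0.2:             # mutation
                child[random.randrange(N_IMP)] = random.randrange(ACTIONS)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best) > 0 and sum(COST[i][a] for i, a in enumerate(best)) <= BUDGET)
```

    Re-running `evolve` under different objective weights (folded into BENEFIT) and budgets mirrors the five scenarios explored for the refuge.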

  19. Knowledge modeling in image-guided neurosurgery: application in understanding intraoperative brain shift

    NASA Astrophysics Data System (ADS)

    Cohen-Adad, Julien; Paul, Perrine; Morandi, Xavier; Jannin, Pierre

    2006-03-01

    During an image-guided neurosurgery procedure, the neuronavigation system is subject to inaccuracy because of anatomical deformations, which induce a gap between the preoperative images and their anatomical reality. Thus, the objective of many research teams is to quantify these deformations in order to update the preoperative images. Intraoperative anatomical deformations correspond to a complex spatio-temporal phenomenon. Our objective is to identify the parameters implicated in these deformations and to use these parameters as constraints for systems dedicated to updating preoperative images. In order to identify these deformation parameters we followed the iterative methodology used for cognitive system conception: identification, conceptualization, formalization, implementation and validation. A state of the art on cortical deformations was established in order to identify relevant parameters probably involved in the deformations. As a first step, 30 parameters were identified and described following an ontological approach. They were formalized into a Unified Modeling Language (UML) class diagram. We implemented that model in a web-based application in order to fill a database. Two surgical cases have been studied so far. After entering enough surgical cases for data mining purposes, we expect to identify the most relevant and influential parameters and to gain a better ability to understand the deformation phenomenon. This original approach is part of a global system aiming at quantifying and correcting anatomical deformations.

  20. Understanding identifiability as a crucial step in uncertainty assessment

    NASA Astrophysics Data System (ADS)

    Jakeman, A. J.; Guillaume, J. H. A.; Hill, M. C.; Seo, L.

    2016-12-01

    The topic of identifiability analysis offers concepts and approaches to identify why unique model parameter values cannot be identified, and can suggest possible responses that either increase uniqueness or help to understand the effect of non-uniqueness on predictions. Identifiability analysis typically involves evaluation of the model equations and the parameter estimation process. Non-identifiability can have a number of undesirable effects. In terms of model parameters these effects include: parameters not being estimated uniquely even with ideal data; wildly different values being returned for different initialisations of a parameter optimisation algorithm; and parameters not being physically meaningful in a model attempting to represent a process. This presentation illustrates some of the drastic consequences of ignoring model identifiability analysis. It argues for a more cogent framework and use of identifiability analysis as a way of understanding model limitations and systematically learning about sources of uncertainty and their importance. The presentation specifically distinguishes between five sources of parameter non-uniqueness (and hence uncertainty) within the modelling process, pragmatically capturing key distinctions within existing identifiability literature. It enumerates many of the various approaches discussed in the literature. Admittedly, improving identifiability is often non-trivial. It requires thorough understanding of the cause of non-identifiability, and the time, knowledge and resources to collect or select new data, modify model structures or objective functions, or improve conditioning. But ignoring these problems is not a viable solution. Even simple approaches such as fixing parameter values or naively using a different model structure may have significant impacts on results which are too often overlooked because identifiability analysis is neglected.
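
    The "wildly different values for different initialisations" symptom is easy to reproduce with a minimal over-parameterised model (all data synthetic): in y = a*b*x only the product a*b is structurally identifiable, so gradient descent started from different points returns very different (a, b) pairs that fit equally well:

```python
# Synthetic data generated with a*b = 6.
DATA = [(x, 6.0 * x) for x in (1.0, 2.0, 3.0)]

def fit(a, b, lr=1e-4, steps=5000):
    """Gradient descent on squared error for the over-parameterised model."""
    for _ in range(steps):
        ga = gb = 0.0
        for x, y in DATA:
            err = a * b * x - y
            ga += 2 * err * b * x
            gb += 2 * err * a * x
        a -= lr * ga
        b -= lr * gb
    return a, b

fit1 = fit(1.0, 1.0)
fit2 = fit(12.0, 0.1)
print(fit1, fit2)
# The individual parameters disagree wildly, but the identifiable
# combination a*b agrees in both fits:
print(abs(fit1[0] * fit1[1] - 6.0) < 1e-3, abs(fit2[0] * fit2[1] - 6.0) < 1e-3)
```

    Identifiability analysis would diagnose this before fitting (here, by inspecting the model structure) and suggest reparameterising on the product, exactly the kind of response to non-uniqueness discussed above.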

  1. Integrated models to support multiobjective ecological restoration decisions.

    PubMed

    Fraser, Hannah; Rumpff, Libby; Yen, Jian D L; Robinson, Doug; Wintle, Brendan A

    2017-12-01

    Many objectives motivate ecological restoration, including improving vegetation condition, increasing the range and abundance of threatened species, and improving species richness and diversity. Although models have been used to examine the outcomes of ecological restoration, few researchers have attempted to develop models to account for multiple, potentially competing objectives. We developed a combined state-and-transition, species-distribution model to predict the effects of restoration actions on vegetation condition and extent, bird diversity, and the distribution of several bird species in southeastern Australian woodlands. The actions reflected several management objectives. We then validated the models against an independent data set and investigated how the best management decision might change when objectives were valued differently. We also used model results to identify effective restoration options for vegetation and bird species under a constrained budget. In the examples we evaluated, no one action (improving vegetation condition and extent, increasing bird diversity, or increasing the probability of occurrence for threatened species) provided the best outcome across all objectives. In agricultural lands, the optimal management actions for promoting the occurrence of the Brown Treecreeper (Climacteris picumnus), an iconic threatened species, resulted in little improvement in the extent of the vegetation and a high probability of decreased vegetation condition. This result highlights that the best management action in any situation depends on how much the different objectives are valued. In our example scenario, no management or weed control were most likely to be the best management options to satisfy multiple restoration objectives. 
Our approach to exploring trade-offs in management outcomes through integrated modeling and structured decision-support approaches has wide application for situations in which trade-offs exist between competing conservation objectives. © 2017 Society for Conservation Biology.

  2. VAS: A Vision Advisor System combining agents and object-oriented databases

    NASA Technical Reports Server (NTRS)

    Eilbert, James L.; Lim, William; Mendelsohn, Jay; Braun, Ron; Yearwood, Michael

    1994-01-01

A model-based approach to identifying and finding the orientation of non-overlapping parts on a tray has been developed. The part models contain both exact and fuzzy descriptions of part features, and are stored in an object-oriented database. Full identification of the parts involves several interacting tasks, each of which is handled by a distinct agent. Using fuzzy information stored in the model allowed part features that were essentially at the noise level to be extracted and used for identification. This was done by focusing attention on the portion of the part where the feature must be found if the current hypothesis of the part ID is correct. In going from one set of parts to another, the only thing that needs to be changed is the database of part models. This work is part of an effort to develop a Vision Advisor System (VAS) that combines agents and object-oriented databases.

  3. Model Strategies for the Recruitment and Retention of Undergraduate Criminal Justice Students.

    ERIC Educational Resources Information Center

    Positive Futures, Inc., Washington, DC.

    Components for a model strategy/program for the recruitment and retention of students in criminal justice (CJ) programs are presented to stimulate planning activity. These 24 general examples of approaches identify the strategy, state the objectives, provide a rationale, describe implementation, discuss intervention activities, and delineate the…

  4. Agricultural Policy Environmental eXtender simulation of three adjacent row-crop watersheds in the claypan region

    USDA-ARS?s Scientific Manuscript database

The Agricultural Policy Environmental eXtender (APEX) model can simulate crop yields and pollutant loadings in whole farms or small watersheds under a variety of management practices. The study objectives were to identify sensitive parameters and to parameterize, calibrate and validate the APEX model fo...

  5. Combustion Technology for Incinerating Wastes from Air Force Industrial Processes.

    DTIC Science & Technology

    1984-02-01

The assumption of equilibrium between environmental compartments. * The statistical extrapolations yielding "safe" doses of various constituents...would be contacted to identify the assumptions and data requirements needed to design, construct and implement the model. The model’s primary objective...Recovery Planning Model (RRPLAN) is described. This section of the paper summarizes the model’s assumptions, major components and modes of operation

  6. Identifying Bottom-Up and Top-Down Components of Attentional Weight by Experimental Analysis and Computational Modeling

    ERIC Educational Resources Information Center

    Nordfang, Maria; Dyrholm, Mads; Bundesen, Claus

    2013-01-01

    The attentional weight of a visual object depends on the contrast of the features of the object to its local surroundings (feature contrast) and the relevance of the features to one's goals (feature relevance). We investigated the dependency in partial report experiments with briefly presented stimuli but unspeeded responses. The task was to…

  7. Language Delay in Severely Neglected Children: A Cumulative or Specific Effect of Risk Factors?

    ERIC Educational Resources Information Center

    Sylvestre, Audette; Merette, Chantal

    2010-01-01

    Objectives: This research sought to determine if the language delay (LD) of severely neglected children under 3 years old was better explained by a cumulative risk model or by the specificity of risk factors. The objective was also to identify the risk factors with the strongest impact on LD among various biological, psychological, and…

  8. Predicting Parental Monitoring Behaviours for Sugar-Sweetened Beverages in Parents of School-Aged Children: An Application of the Integrative Behavioural Model

    ERIC Educational Resources Information Center

    Housely, Alexandra; Branscum, Paul; Cheney, Marshall; Hofford, Craig

    2016-01-01

    Objective: The objective of this study was to identify theory-based psychosocial and environmental determinants of parental monitoring practices related to child sugar-sweetened beverage consumption. Design: Cross-sectional design. Method: Data were obtained from a convenience sample of parents (n = 270) with children attending an after-school…

  9. The Systematic Development of an Internet-Based Smoking Cessation Intervention for Adults.

    PubMed

    Dalum, Peter; Brandt, Caroline Lyng; Skov-Ettrup, Lise; Tolstrup, Janne; Kok, Gerjo

    2016-07-01

Objectives: The objective of this project was to determine whether intervention mapping is a suitable strategy for developing an Internet- and text message-based smoking cessation intervention. Method: We used the Intervention Mapping framework for planning health promotion programs. After a needs assessment, we identified important changeable determinants of cessation behavior, specified objectives for the intervention, selected theoretical methods for meeting our objectives, and operationalized change methods into practical intervention strategies. Results: We found that "social cognitive theory," the "transtheoretical model/stages of change," "self-regulation theory," and "appreciative inquiry" were relevant theories for smoking cessation interventions. From these theories, we selected modeling/behavioral journalism, feedback, planning coping responses/if-then statements, gain frame/positive imaging, consciousness-raising, helping relationships, stimulus control, and goal-setting as suitable methods for an Internet- and text-based adult smoking cessation program. Furthermore, we identified computer tailoring as a useful strategy for adapting the intervention to individual users. Conclusion: The Intervention Mapping method, with a clear link between behavioral goals, theoretical methods, and practical strategies and materials, proved useful for systematic development of a digital smoking cessation intervention for adults. © 2016 Society for Public Health Education.

  10. Design of an air traffic computer simulation system to support investigation of civil tiltrotor aircraft operations

    NASA Technical Reports Server (NTRS)

    Rogers, Ralph V.

    1993-01-01

The TATSS Project's goal was to develop a design for computer software that would support the attainment of the following objectives for the air traffic simulation model: (1) Full freedom of movement for each aircraft object in the simulation model. Each aircraft object may follow any designated flight plan or flight path necessary as required by the experiment under consideration. (2) Object position precision up to +/- 3 meters vertically and +/- 15 meters horizontally. (3) Aircraft maneuvering in three-dimensional space with the object position precision identified above. (4) Air traffic control operations and procedures. (5) Radar, communication, navaid, and landing aid performance. (6) Weather. (7) Ground obstructions and terrain. (8) Detection and recording of separation violations. (9) Measures of performance including deviations from flight plans, air space violations, air traffic control messages per aircraft, and traditional temporal based measures.

  11. Progress in simulating industrial flows using two-equation models: Can more be achieved with further research?

    NASA Technical Reports Server (NTRS)

    Haroutunian, Vahe

    1995-01-01

This viewgraph presentation provides a brief review of two-equation eddy-viscosity models (TEMs) from the perspective of applied CFD. It provides an objective assessment of both well-known and newer models, compares model predictions from various TEMs with experiments, identifies sources of modeling error and gives a historical perspective on their effects on model performance and assessment, and recommends directions for future research on TEMs.

  12. A Reduced-Order Model For Zero-Mass Synthetic Jet Actuators

    NASA Technical Reports Server (NTRS)

    Yamaleev, Nail K.; Carpenter, Mark H.; Vatsa, Veer S.

    2007-01-01

Accurate details of the general performance of fluid actuators are desirable over a range of flow conditions, within some predetermined error tolerance. Designers typically model actuators with different levels of fidelity depending on the acceptable level of error in each circumstance. Crude properties of the actuator (e.g., peak mass rate and frequency) may be sufficient for some designs, while detailed information is needed for other applications (e.g., multiple actuator interactions). This work attempts to address two primary objectives. The first objective is to develop a systematic methodology for approximating realistic 3-D fluid actuators, using quasi-1-D reduced-order models. Near full fidelity can be achieved with this approach at a fraction of the cost of full simulation and only a modest increase in cost relative to most actuator models used today. The second objective, which is a direct consequence of the first, is to determine the approximate magnitude of errors committed by actuator model approximations of various fidelities. This objective attempts to identify which model (ranging from simple orifice exit boundary conditions to full numerical simulations of the actuator) is appropriate for a given error tolerance.

  13. Developing the snow component of a distributed hydrological model: a step-wise approach based on multi-objective analysis

    NASA Astrophysics Data System (ADS)

    Dunn, S. M.; Colohan, R. J. E.

    1999-09-01

    A snow component has been developed for the distributed hydrological model, DIY, using an approach that sequentially evaluates the behaviour of different functions as they are implemented in the model. The evaluation is performed using multi-objective functions to ensure that the internal structure of the model is correct. The development of the model, using a sub-catchment in the Cairngorm Mountains in Scotland, demonstrated that the degree-day model can be enhanced for hydroclimatic conditions typical of those found in Scotland, without increasing meteorological data requirements. An important element of the snow model is a function to account for wind re-distribution. This causes large accumulations of snow in small pockets, which are shown to be important in sustaining baseflows in the rivers during the late spring and early summer, long after the snowpack has melted from the bulk of the catchment. The importance of the wind function would not have been identified using a single objective function of total streamflow to evaluate the model behaviour.
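For reference, the degree-day approach that the authors enhance can be sketched in a few lines; the degree-day factor and temperature threshold below are hypothetical illustration values, not those calibrated for the Cairngorm sub-catchment.

```python
# Minimal degree-day snowmelt sketch; the degree-day factor and threshold
# are hypothetical illustration values, not calibrated for the Cairngorms.
def degree_day_melt(swe_mm, temp_c, ddf=3.0, t_thresh=0.0):
    """One day of melt: proportional to degrees above threshold, capped by SWE."""
    potential = max(0.0, ddf * (temp_c - t_thresh))
    melt = min(swe_mm, potential)
    return melt, swe_mm - melt

swe = 60.0  # snow water equivalent, mm
for temp in [-2.0, 1.0, 3.0, 5.0, 2.0, 0.5, 4.0]:  # daily mean temperatures, deg C
    _, swe = degree_day_melt(swe, temp)
print(swe)  # 13.5
```

The enhancements described in the abstract, such as wind re-distribution of accumulated snow, add spatial terms on top of this pointwise melt calculation.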

  14. Identification of Bouc-Wen hysteretic parameters based on enhanced response sensitivity approach

    NASA Astrophysics Data System (ADS)

    Wang, Li; Lu, Zhong-Rong

    2017-05-01

This paper aims to identify the parameters of the Bouc-Wen hysteretic model using time-domain measured data. It follows a general inverse identification procedure: identifying the model parameters is treated as an optimization problem with a nonlinear least squares objective function. Then, the enhanced response sensitivity approach, which has been shown to be convergent and well suited to this kind of problem, is adopted to solve the optimization problem. Numerical tests are undertaken to verify the proposed identification approach.
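The inverse procedure can be sketched generically. The snippet below is an illustration, not the authors' enhanced response sensitivity approach: it uses a plain scipy.optimize.least_squares fit of a forward-Euler Bouc-Wen simulation (exponent n = 1) to synthetic noisy data.

```python
import numpy as np
from scipy.optimize import least_squares

# Bouc-Wen hysteresis with exponent n = 1:
#   dz/dt = A*xd - beta*|xd|*z - gamma*xd*|z|
# Forward-Euler simulation; parameters identified by nonlinear least squares.
dt = 0.01
t = np.arange(0.0, 10.0, dt)
x = np.sin(2 * np.pi * 0.5 * t)   # imposed displacement
xd = np.gradient(x, dt)           # velocity

def simulate(params):
    A, beta, gamma = params
    z = np.zeros_like(t)
    for i in range(len(t) - 1):
        dz = A * xd[i] - beta * abs(xd[i]) * z[i] - gamma * xd[i] * abs(z[i])
        z[i + 1] = z[i] + dt * dz
    return z

true_params = np.array([1.0, 0.5, 0.3])
z_meas = simulate(true_params) + np.random.default_rng(1).normal(0.0, 1e-3, t.size)

# Identification: minimise the residual between simulated and "measured" z(t).
fit = least_squares(lambda p: simulate(p) - z_meas, x0=[0.5, 0.1, 0.1])
print(np.round(fit.x, 2))
```

Because the same simulator generates and fits the data, the recovered parameters land close to the true values; with real measurements, model error makes the sensitivity enhancements the paper proposes far more important.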

  15. Tactile Recognition and Localization Using Object Models: The Case of Polyhedra on a Plane.

    DTIC Science & Technology

    1983-03-01

    poor force resolution, but high spatial resolution. We feel that the viability of this recognition approach has important implications on the design of...of the touched object: 1. Surface point - On the basis of sensor readings, some points on the sensor can be identified as being in contact with...the sensor’s shape and location in space are known, one can determine the position of some point on the touched object, to within some uncertainty

  16. A Prognostic Model for One-year Mortality in Patients Requiring Prolonged Mechanical Ventilation

    PubMed Central

    Carson, Shannon S.; Garrett, Joanne; Hanson, Laura C.; Lanier, Joyce; Govert, Joe; Brake, Mary C.; Landucci, Dante L.; Cox, Christopher E.; Carey, Timothy S.

    2009-01-01

Objective: A measure that identifies patients who are at high risk of mortality after prolonged ventilation will help physicians communicate prognosis to patients or surrogate decision-makers. Our objective was to develop and validate a prognostic model for 1-year mortality in patients ventilated for 21 days or more. Design: Prospective cohort study. Setting: University-based tertiary care hospital. Patients: 300 consecutive medical, surgical, and trauma patients requiring mechanical ventilation for at least 21 days were prospectively enrolled. Measurements and Main Results: Predictive variables were measured on day 21 of ventilation for the first 200 patients and entered into logistic regression models with 1-year and 3-month mortality as outcomes. Final models were validated using data from 100 subsequent patients. One-year mortality was 51% in the development set and 58% in the validation set. Independent predictors of mortality included requirement for vasopressors, hemodialysis, platelet count ≤150 × 10⁹/L, and age ≥50. Areas under the ROC curve for the development model and validation model were 0.82 (se 0.03) and 0.82 (se 0.05), respectively. The model had sensitivity of 0.42 (se 0.12) and specificity of 0.99 (se 0.01) for identifying patients who had ≥90% risk of death at 1 year. Observed mortality was highly consistent with both 3- and 12-month predicted mortality. These four predictive variables can be used in a simple prognostic score that clearly identifies low-risk patients (no risk factors, 15% mortality) and high-risk patients (3 or 4 risk factors, 97% mortality). Conclusions: Simple clinical variables measured on day 21 of mechanical ventilation can identify patients at highest and lowest risk of death from prolonged ventilation. PMID:18552692
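The resulting score is simple enough to express directly. Below is a sketch using the four predictors and the mortality figures quoted in the abstract; the values for 1 and 2 risk factors are hypothetical interpolations, not from the study.

```python
# Sketch of the four-variable day-21 prognostic score described above.
# The 0 and 3-4 risk-factor mortality figures come from the abstract;
# the values for 1 and 2 risk factors are hypothetical interpolations.
RISK_FACTORS = ("vasopressors", "hemodialysis", "platelets_le_150", "age_ge_50")

def risk_count(patient):
    """Count how many of the four day-21 risk factors are present."""
    return sum(bool(patient.get(f)) for f in RISK_FACTORS)

MORTALITY_BY_COUNT = {0: 0.15, 1: 0.40, 2: 0.70, 3: 0.97, 4: 0.97}

patient = {"vasopressors": True, "age_ge_50": True}
print(risk_count(patient), MORTALITY_BY_COUNT[risk_count(patient)])  # 2 0.7
```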

  17. Modelling high data rate communication network access protocol

    NASA Technical Reports Server (NTRS)

    Khanna, S.; Foudriat, E. C.; Paterra, Frank; Maly, Kurt J.; Overstreet, C. Michael

    1990-01-01

Modeling of high data rate communication systems differs from modeling of low data rate systems. Three simulations were built during the development phase of Carrier Sensed Multiple Access/Ring Network (CSMA/RN) modeling. The first was a model written in SIMSCRIPT based upon the determination and processing of each event at each node. The second simulation was developed in C based upon isolating the distinct objects that can be identified as the ring, the message, the node, and the set of critical events. The third model further distilled the basic network functionality by creating a single object, the node, which includes the set of critical events that occur at the node; the ring structure is implicit in the node structure. This model was also built in C. Each model is discussed and their features compared. It should be stated that each language was selected mainly because of the model developer's familiarity with it. Further, the models were not built with the intent to compare either structure or language; rather, because of the complexity of the problem, and because initial results contained obvious errors, alternative models were built to isolate, determine, and correct programming and modeling errors. The CSMA/RN protocol is discussed in sufficient detail to understand the modeling complexities. Each model is described along with its features and problems. The models are compared, and concluding observations and remarks are presented.

  18. Economic and environmental optimization of waste treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Münster, M.; Ravn, H.; Hedegaard, K.

    2015-04-15

Highlights: • Optimizing waste treatment by incorporating LCA methodology. • Applying different objectives (minimizing costs or GHG emissions). • Prioritizing multiple objectives given different weights. • Optimum depends on objective and assumed displaced electricity production. - Abstract: This article presents the new systems engineering optimization model, OptiWaste, which incorporates a life cycle assessment (LCA) methodology and captures important characteristics of waste management systems. As part of the optimization, the model identifies the most attractive waste management options. The model renders it possible to apply different optimization objectives, such as minimizing costs or greenhouse gas emissions, or to prioritize several objectives given different weights. A simple illustrative case is analysed, covering alternative treatments of one tonne of residual household waste: incineration of the full amount, or sorting out organic waste for biogas production for either combined heat and power generation or as fuel in vehicles. The case study illustrates that the optimal solution depends on the objective and on assumptions regarding the background system, illustrated here with different assumptions regarding displaced electricity production. The article shows that it is feasible to combine LCA methodology with optimization. Furthermore, it highlights the need for including the integrated waste and energy system in the model.
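The weighted prioritization of objectives described above can be illustrated with a toy version of the choice among the three treatments; the per-tonne costs and emissions below are hypothetical illustration values, not OptiWaste data.

```python
# Toy weighted-sum choice among waste treatments for one tonne of waste.
# Cost (EUR) and GHG (kg CO2e) values are hypothetical illustration values.
options = {
    "incineration":        {"cost": 60.0, "ghg": 120.0},
    "biogas_chp":          {"cost": 75.0, "ghg": 80.0},
    "biogas_vehicle_fuel": {"cost": 85.0, "ghg": 60.0},
}

def best_option(w_cost, w_ghg):
    """Minimise a weighted sum of objectives, each normalised to [0, 1]."""
    costs = [o["cost"] for o in options.values()]
    ghgs = [o["ghg"] for o in options.values()]
    def score(o):
        c = (o["cost"] - min(costs)) / (max(costs) - min(costs))
        g = (o["ghg"] - min(ghgs)) / (max(ghgs) - min(ghgs))
        return w_cost * c + w_ghg * g
    return min(options, key=lambda k: score(options[k]))

print(best_option(1.0, 0.0))  # cost only -> incineration
print(best_option(0.0, 1.0))  # GHG only  -> biogas_vehicle_fuel
```

As in the article's case study, the optimum flips as the weights change, which is exactly why the assumed background system (e.g. displaced electricity) matters.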

  19. Maturity Models of Healthcare Information Systems and Technologies: a Literature Review.

    PubMed

    Carvalho, João Vidal; Rocha, Álvaro; Abreu, António

    2016-06-01

Maturity models are instruments that facilitate organizational management, including management of the information systems function, and are also used in hospitals. The objective of this article is to identify and compare maturity models for the management of information systems and technologies (IST) in healthcare. For each maturity model, we identify the development and validation methodology, as well as its scope, stages, and their characteristics by dimensions or influence factors. This study revealed the need to develop a maturity model based on a holistic approach, one that includes a comprehensive set of influencing factors so as to reach all areas and subsystems of healthcare organizations.

  20. Object Extraction in Cluttered Environments via a P300-Based IFCE

    PubMed Central

    He, Huidong; Xian, Bin; Zeng, Ming; Zhou, Huihui; Niu, Linwei; Chen, Genshe

    2017-01-01

    One of the fundamental issues for robot navigation is to extract an object of interest from an image. The biggest challenges for extracting objects of interest are how to use a machine to model the objects in which a human is interested and extract them quickly and reliably under varying illumination conditions. This article develops a novel method for segmenting an object of interest in a cluttered environment by combining a P300-based brain computer interface (BCI) and an improved fuzzy color extractor (IFCE). The induced P300 potential identifies the corresponding region of interest and obtains the target of interest for the IFCE. The classification results not only represent the human mind but also deliver the associated seed pixel and fuzzy parameters to extract the specific objects in which the human is interested. Then, the IFCE is used to extract the corresponding objects. The results show that the IFCE delivers better performance than the BP network or the traditional FCE. The use of a P300-based IFCE provides a reliable solution for assisting a computer in identifying an object of interest within images taken under varying illumination intensities. PMID:28740505

  1. Development of a Landforms Model for Puerto Rico and its Application for Land Cover Change Analysis

    Treesearch

    Sebastian Martinuzzi; William A. Gould; Olga M. Ramos Gonzalez; Brook E. Edwards

    2007-01-01

Comprehensive analysis of land morphology is essential to supporting a wide range of environmental studies. We developed a landforms model that identifies eleven landform units for Puerto Rico based on parameters of land position and slope. The model is capable of extracting operational information in a simple way and is adaptable to different environments and objectives...

  2. Videogame Construction by Engineering Students for Understanding Modelling Processes: The Case of Simulating Water Behaviour

    ERIC Educational Resources Information Center

    Pretelín-Ricárdez, Angel; Sacristán, Ana Isabel

    2015-01-01

    We present some results of an ongoing research project where university engineering students were asked to construct videogames involving the use of physical systems models. The objective is to help them identify and understand the elements and concepts involved in the modelling process. That is, we use game design as a constructionist approach…

  3. A Multi-Objective, Hub-and-Spoke Supply Chain Design Model For Densified Biomass

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Md S. Roni; Sandra Eksioglu; Kara G. Cafferty

In this paper we propose a model to design the supply chain for densified biomass. Rail is typically used for long-haul, high-volume shipment of densified biomass, which is why a hub-and-spoke network structure is used to model this supply chain. The model is formulated as a multi-objective, mixed-integer programming problem under economic, environmental, and social criteria. The goal is to identify the feasibility of meeting the Renewable Fuel Standard (RFS) by using biomass for production of cellulosic ethanol. The focus is not just on the costs associated with meeting these standards, but also on exploring the social and environmental benefits that biomass production and processing offer by creating new jobs and reducing greenhouse gas (GHG) emissions. We develop an augmented ε-constraint method to find the exact Pareto solution to this optimization problem. We develop a case study using data from the Mid-West. The model identifies the number, capacity and location of biorefineries needed to make use of the biomass available in the region. The model estimates the delivery cost of cellulosic ethanol under different scenarios, the number of new jobs created and the GHG emission reductions in the supply chain.

  4. A Multi-Objective, Hub-and-Spoke Supply Chain Design Model for Densified Biomass

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacob J. Jacobson; Md. S. Roni; Kara G. Cafferty

In this paper we propose a model to design the supply chain for densified biomass. Rail is typically used for long-haul, high-volume shipment of densified biomass, which is why a hub-and-spoke network structure is used to model this supply chain. The model is formulated as a multi-objective, mixed-integer programming problem under economic, environmental, and social criteria. The goal is to identify the feasibility of meeting the Renewable Fuel Standard (RFS) by using biomass for production of cellulosic ethanol. The focus is not just on the costs associated with meeting these standards, but also on exploring the social and environmental benefits that biomass production and processing offer by creating new jobs and reducing greenhouse gas (GHG) emissions. We develop an augmented ε-constraint method to find the exact Pareto solution to this optimization problem. We develop a case study using data from the Mid-West. The model identifies the number, capacity and location of biorefineries needed to make use of the biomass available in the region. The model estimates the delivery cost of cellulosic ethanol under different scenarios, the number of new jobs created and the GHG emission reductions in the supply chain.
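The ε-constraint idea used in both records above can be sketched on a toy two-objective design problem. The candidate designs and their (cost, GHG) values below are hypothetical, and the small slack reward stands in for the "augmented" correction.

```python
# Sketch of an (augmented) epsilon-constraint sweep on a toy two-objective
# problem: minimise cost subject to GHG <= eps, for a grid of eps values.
# Candidate designs and their (cost $M, GHG kt) values are hypothetical.
designs = {
    "one_large": (40.0, 90.0),
    "two_medium": (55.0, 70.0),
    "three_small": (75.0, 50.0),
}

def eps_constraint(eps, delta=1e-3):
    """Minimise cost s.t. ghg <= eps; the small slack reward (-delta * slack)
    is the 'augmented' part, steering away from weakly Pareto-optimal points."""
    feasible = {k: v for k, v in designs.items() if v[1] <= eps}
    if not feasible:
        return None
    return min(feasible, key=lambda k: feasible[k][0] - delta * (eps - feasible[k][1]))

# Sweeping eps traces the Pareto front of cost-vs-GHG trade-offs.
front = [eps_constraint(eps) for eps in (50.0, 70.0, 90.0)]
print(front)  # ['three_small', 'two_medium', 'one_large']
```

In the papers the inner problem is a mixed-integer program rather than this enumeration, but the sweep over ε plays the same role of tracing exact Pareto solutions.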

  5. Evaluation of the 29-km Eta Model. Part 1; Objective Verification at Three Selected Stations

    NASA Technical Reports Server (NTRS)

    Nutter, Paul A.; Manobianco, John; Merceret, Francis J. (Technical Monitor)

    1998-01-01

    This paper describes an objective verification of the National Centers for Environmental Prediction (NCEP) 29-km eta model from May 1996 through January 1998. The evaluation was designed to assess the model's surface and upper-air point forecast accuracy at three selected locations during separate warm (May - August) and cool (October - January) season periods. In order to enhance sample sizes available for statistical calculations, the objective verification includes two consecutive warm and cool season periods. Systematic model deficiencies comprise the larger portion of the total error in most of the surface forecast variables that were evaluated. The error characteristics for both surface and upper-air forecasts vary widely by parameter, season, and station location. At upper levels, a few characteristic biases are identified. Overall however, the upper-level errors are more nonsystematic in nature and could be explained partly by observational measurement uncertainty. With a few exceptions, the upper-air results also indicate that 24-h model error growth is not statistically significant. In February and August 1997, NCEP implemented upgrades to the eta model's physical parameterizations that were designed to change some of the model's error characteristics near the surface. The results shown in this paper indicate that these upgrades led to identifiable and statistically significant changes in forecast accuracy for selected surface parameters. While some of the changes were expected, others were not consistent with the intent of the model updates and further emphasize the need for ongoing sensitivity studies and localized statistical verification efforts. Objective verification of point forecasts is a stringent measure of model performance, but when used alone, is not enough to quantify the overall value that model guidance may add to the forecast process. 
Therefore, results from a subjective verification of the meso-eta model over the Florida peninsula are discussed in the companion paper by Manobianco and Nutter. Overall verification results presented here and in part two should establish a reasonable benchmark from which model users and developers may pursue the ongoing eta model verification strategies in the future.

  6. The National Evaluation of School Nutrition Programs. Final Report - Executive Summary.

    ERIC Educational Resources Information Center

    Radzikowski, Jack

    This is a summary of the final report of a study (begun in 1979) of the National School Lunch, School Breakfast, and Special Milk Programs. The major objectives of the evaluation were to (1) identify existing information on the school nutrition programs; (2) identify determinants of participation in the programs and develop statistical models for…

  7. Identification of vehicle suspension parameters by design optimization

    NASA Astrophysics Data System (ADS)

    Tey, J. Y.; Ramli, R.; Kheng, C. W.; Chong, S. Y.; Abidin, M. A. Z.

    2014-05-01

The design of a vehicle suspension system through simulation requires accurate representation of the design parameters. These parameters are usually difficult to measure or sometimes unavailable. This article proposes an efficient approach to identifying the unknown parameters through optimization based on experimental results, where the covariance matrix adaptation evolution strategy (CMA-ES) is utilized to improve the agreement between simulation and experimental results in the kinematic and compliance tests. This speeds up the design and development cycle by recovering all the unknown data with respect to a set of kinematic measurements through a single optimization process. As a case study, a McPherson strut suspension system is modelled as a multi-body dynamic system. Three kinematic and compliance tests are examined, namely vertical parallel wheel travel, opposite wheel travel and single wheel travel. The problem is formulated as a multi-objective optimization problem with 40 objectives and 49 design parameters. A hierarchical clustering method based on global sensitivity analysis is used to reduce the number of objectives to 30 by grouping correlated objectives together. Then, a dynamic summation of rank values is used as a pseudo-objective function to reformulate the multi-objective optimization as a single-objective optimization problem. The optimized results show a significant improvement in the correlation between the simulated model and the experimental model. Once an accurate representation of the vehicle suspension model is achieved, further analyses, such as ride and handling performance, can be implemented for further optimization.
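The rank-summation step described above, which collapses many objectives into a single pseudo-objective, can be sketched as follows; the objective values are hypothetical, not from the suspension study.

```python
import numpy as np

# Sketch of summation of rank values as a pseudo-objective: each candidate
# design is ranked on every objective (0 = best), and the rank sum is
# minimised. Objective values below are hypothetical.
F = np.array([            # rows = candidate designs, cols = objectives
    [0.12, 3.4, 7.0],
    [0.10, 3.9, 6.5],
    [0.15, 3.1, 6.8],
    [0.11, 3.5, 7.2],
])

ranks = F.argsort(axis=0).argsort(axis=0)  # per-objective ranks, 0 = best
rank_sum = ranks.sum(axis=1)               # single pseudo-objective
best = int(rank_sum.argmin())
print(best, rank_sum.tolist())  # 1 [5, 3, 4, 6]
```

Because ranks are scale-free, this avoids hand-tuning weights across objectives with very different units, which is one reason rank summation is attractive for problems with dozens of objectives.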

  8. An insect-inspired model for visual binding I: learning objects and their characteristics.

    PubMed

    Northcutt, Brandon D; Dyhr, Jonathan P; Higgins, Charles M

    2017-04-01

    Visual binding is the process of associating the responses of visual interneurons in different visual submodalities all of which are responding to the same object in the visual field. Recently identified neuropils in the insect brain termed optic glomeruli reside just downstream of the optic lobes and have an internal organization that could support visual binding. Working from anatomical similarities between optic and olfactory glomeruli, we have developed a model of visual binding based on common temporal fluctuations among signals of independent visual submodalities. Here we describe and demonstrate a neural network model capable both of refining selectivity of visual information in a given visual submodality, and of associating visual signals produced by different objects in the visual field by developing inhibitory neural synaptic weights representing the visual scene. We also show that this model is consistent with initial physiological data from optic glomeruli. Further, we discuss how this neural network model may be implemented in optic glomeruli at a neuronal level.

  9. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-03

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
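    The mapping from a Model View Definition concept to a translation and transformation function might be sketched like this (the attribute names and the unit conversion are hypothetical, not from the patent):

```python
# Hypothetical mapping table: each extracted IFC attribute name points at the
# transformation needed by the target simulation file format.
def metres_to_feet(value):
    return value * 3.28084

def identity(value):
    return value

MVD_MAP = {
    "WallHeight": metres_to_feet,  # assume the target simulator expects feet
    "ZoneName":   identity,
}

def translate(extracted):
    """Apply the mapped transformation to each extracted attribute."""
    return {key: MVD_MAP.get(key, identity)(val) for key, val in extracted.items()}

out = translate({"WallHeight": 3.0, "ZoneName": "Lobby"})
```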

  10. An investigation on fatality of drivers in vehicle-fixed object accidents on expressways in China: Using multinomial logistic regression model.

    PubMed

    Peng, Yong; Peng, Shuangling; Wang, Xinghua; Tan, Shiyang

    2018-06-01

    This study aims to identify the effects of vehicle, roadway, driver, and environment characteristics on driver fatality in vehicle-fixed object accidents on expressways in the Changsha-Zhuzhou-Xiangtan district of Hunan province, China, by developing multinomial logistic regression models. For this purpose, 121 vehicle-fixed object accidents from 2011 to 2017 are included in the modeling process. First, a descriptive statistical analysis is conducted to understand the main characteristics of the vehicle-fixed object crashes. Then, 19 explanatory variables are selected, and pairwise correlation analysis is conducted to choose the variables to be included. Finally, five multinomial logistic regression models with different independent variables are compared, and the model with the best fit and prediction capability is chosen as the final model. The results show that swerving to avoid the fixed object raised the likelihood of driver fatality. About 64% of the drivers who died were ejected from the car, and 50% of these had not been wearing a seatbelt. Drivers are more likely to die when they encounter bad weather on the expressway, and drivers with less than 10 years of driving experience are more likely to die in these accidents. Fatigued or distracted driving is also a significant factor in driver fatality. Findings from this research provide insight into reducing driver fatalities in vehicle-fixed object accidents.
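    A multinomial logit model assigns each outcome class a linear score and converts the scores to probabilities with a softmax; a minimal sketch with illustrative (not fitted) coefficients:

```python
import math

def softmax(z):
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

def multinomial_logit(features, coef):
    """P(class k | x) from one linear score per outcome class."""
    scores = [sum(b * x for b, x in zip(beta, features)) for beta in coef]
    return softmax(scores)

# Outcome classes: no injury / injury / fatality.
# Features: [1 (intercept), ejected, bad_weather]; coefficients are invented.
coef = [[0.0, 0.0, 0.0],    # baseline class
        [-0.5, 1.0, 0.4],
        [-2.0, 2.5, 0.8]]
p_ejected = multinomial_logit([1.0, 1.0, 0.0], coef)
p_not     = multinomial_logit([1.0, 0.0, 0.0], coef)
```

    With these toy coefficients, ejection raises the modelled fatality probability, mirroring the direction of the reported finding.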

  11. Fuzzy Similarity and Fuzzy Inclusion Measures in Polyline Matching: A Case Study of Potential Streams Identification for Archaeological Modelling in GIS

    NASA Astrophysics Data System (ADS)

    Ďuračiová, Renata; Rášová, Alexandra; Lieskovský, Tibor

    2017-12-01

    When combining spatial data from various sources, it is often important to determine the similarity or identity of spatial objects. Besides the differences in geometry, representations of spatial objects are inevitably more or less uncertain. Fuzzy set theory can be used both to model the uncertainty of spatial objects and to determine the identity, similarity, and inclusion of two sets as fuzzy identity, fuzzy similarity, and fuzzy inclusion. In this paper, we propose using fuzzy measures to determine the similarity or identity of two uncertain spatial object representations in geographic information systems. Labelling the spatial objects by the degree of their similarity or inclusion measure makes the identification process more efficient and reduces the need for manual control, simplifying the updating of spatial datasets from external data sources. We use this approach to obtain an accurate and correct representation of historical streams derived from a contemporary digital elevation model, i.e. we identify the segments that are similar to the streams depicted on historical maps.
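    The standard min/max fuzzy similarity and inclusion measures can be sketched as follows (membership values are invented, and the paper's exact measure definitions may differ):

```python
def fuzzy_similarity(a, b):
    """Jaccard-style similarity of two fuzzy sets given as membership lists:
    |A intersect B| / |A union B|, with min as intersection and max as union."""
    inter = sum(min(x, y) for x, y in zip(a, b))
    union = sum(max(x, y) for x, y in zip(a, b))
    return inter / union if union else 1.0

def fuzzy_inclusion(a, b):
    """Degree to which fuzzy set A is included in B: |A intersect B| / |A|."""
    card_a = sum(a)
    return sum(min(x, y) for x, y in zip(a, b)) / card_a if card_a else 1.0

# Membership of each polyline segment in "modelled stream" / "historical stream".
stream_modelled = [0.9, 0.8, 0.1, 0.0]
stream_historic = [0.8, 0.9, 0.0, 0.0]
sim = fuzzy_similarity(stream_modelled, stream_historic)
```

    Segments whose similarity exceeds a chosen threshold would then be flagged as matches, limiting manual checking to the ambiguous cases.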

  12. On the objective identification of flood seasons

    NASA Astrophysics Data System (ADS)

    Cunderlik, Juraj M.; Ouarda, Taha B. M. J.; BobéE, Bernard

    2004-01-01

    The determination of seasons of high and low probability of flood occurrence is a task with many practical applications in contemporary hydrology and water resources management. Flood seasons are generally identified subjectively by visually assessing the temporal distribution of flood occurrences and then, at a regional scale, verified by comparing that distribution with distributions obtained at hydrologically similar neighboring sites. This approach is subjective, time consuming, and potentially unreliable. The main objective of this study is therefore to introduce a new, objective, and systematic method for the identification of flood seasons. The proposed method tests the significance of flood seasons by comparing the observed variability of flood occurrences with the theoretical flood variability in a nonseasonal model. The method also addresses the uncertainty resulting from sampling variability by quantifying the probability associated with the identified flood seasons. The performance of the method was tested on an extensive number of samples with different record lengths generated from several theoretical models of flood seasonality. The proposed approach was then applied to real data from a large set of sites with different flood regimes across Great Britain. The results show that the method can efficiently identify flood seasons from both theoretical and observed distributions of flood occurrence. The results were used for the determination of the main flood seasonality types in Great Britain.
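    The core test, comparing the observed concentration of flood occurrences against a nonseasonal (uniform) null model, can be sketched with a Monte-Carlo p-value (the 3-month-window statistic here is an illustrative choice, not the paper's):

```python
import random

def seasonality_strength(months):
    """Fraction of floods falling in the most frequent 3-month window."""
    counts = [0] * 12
    for m in months:
        counts[m] += 1
    best = max(sum(counts[(s + k) % 12] for k in range(3)) for s in range(12))
    return best / len(months)

def season_p_value(months, trials=2000, seed=1):
    """Monte-Carlo p-value: probability that a nonseasonal (uniform) model
    shows at least the observed concentration of flood occurrences."""
    rng = random.Random(seed)
    obs = seasonality_strength(months)
    n = len(months)
    hits = sum(
        seasonality_strength([rng.randrange(12) for _ in range(n)]) >= obs
        for _ in range(trials)
    )
    return hits / trials

winter_floods = [11, 0, 1, 0, 11, 1, 0, 0, 1, 11] * 3  # months 0-11, all Dec-Feb
p = season_p_value(winter_floods)  # tiny p-value: a clear winter flood season
```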

  13. Clinical governance in Scotland: an educational model.

    PubMed Central

    Lough, Murray; Kelly, Diane; Taylor, Mike; Snadden, David; Patterson, Bill; McNamara, Iain; Murray, Stuart

    2002-01-01

    The concepts underpinning clinical governance are similar throughout the United Kingdom but models for its implementation will differ widely. This model aims to enable practices to identify areas for further learning and development against specific outcomes. Criteria sets and standards are suggested and a governance plan is used to allow practices to prioritise their objectives. Resourcing will always be a major issue and such a model should be fully evaluated. PMID:11942453

  14. Clinical governance in Scotland: an educational model.

    PubMed

    Lough, Murray; Kelly, Diane; Taylor, Mike; Snadden, David; Patterson, Bill; McNamara, Iain; Murray, Stuart

    2002-04-01

    The concepts underpinning clinical governance are similar throughout the United Kingdom but models for its implementation will differ widely. This model aims to enable practices to identify areas for further learning and development against specific outcomes. Criteria sets and standards are suggested and a governance plan is used to allow practices to prioritise their objectives. Resourcing will always be a major issue and such a model should be fully evaluated.

  15. Improving Hydrological Simulations by Incorporating GRACE Data for Parameter Calibration

    NASA Astrophysics Data System (ADS)

    Bai, P.

    2017-12-01

    Hydrological model parameters are commonly calibrated against observed streamflow data. This calibration strategy is questionable when the modeled hydrological variables of interest are not limited to streamflow: well-performed streamflow simulations do not guarantee the reliable reproduction of other hydrological variables, in part because the model parameters are not reasonably identified. The Gravity Recovery and Climate Experiment (GRACE) satellite-derived total water storage change (TWSC) data provide an opportunity to constrain hydrological model parameterizations in combination with streamflow observations. We constructed a multi-objective calibration scheme based on GRACE-derived TWSC and streamflow observations, with the aim of improving the parameterizations of hydrological models. The multi-objective calibration scheme was compared with the traditional single-objective scheme, which is based only on streamflow observations. Two monthly hydrological models were employed on 22 Chinese catchments with different hydroclimatic conditions. Model evaluation was performed using observed streamflow, GRACE-derived TWSC, and evapotranspiration (ET) estimates from flux towers and from the water-balance approach. Results showed that, compared with the single-objective calibration, the multi-objective calibration provided more reliable TWSC and ET simulations without significantly degrading the accuracy of the streamflow simulations. In addition, the improvements in TWSC and ET simulations were more pronounced in relatively dry catchments than in relatively wet catchments. This study highlights the importance of including additional constraints besides streamflow observations in parameter estimation to improve the performance of hydrological models.
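    A multi-objective calibration of this kind typically combines goodness-of-fit scores for both variables into one cost; a minimal sketch using Nash-Sutcliffe efficiency and an assumed 50/50 weighting (not the study's actual scheme):

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def multi_objective_cost(q_obs, q_sim, twsc_obs, twsc_sim, w=0.5):
    """Single cost to minimise: weighted distance of both NSE scores from 1."""
    return w * (1 - nse(q_obs, q_sim)) + (1 - w) * (1 - nse(twsc_obs, twsc_sim))
```

    A parameter search (e.g. any global optimizer) would then minimise this cost instead of the streamflow-only term.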

  16. Using Brigham Young University's Orson Pratt Observatory 16" telescope to identify possible transiting planets discovered by the Kilodegree Extremely Little Telescope

    NASA Astrophysics Data System (ADS)

    Matt, Kyle; Stephens, Denise C.; Gaillard, Clement; KELT-North

    2016-01-01

    We use a 16" telescope on the Brigham Young University (BYU) campus to follow up on the Kilodegree Extremely Little Telescope (KELT) survey and identify possible transiting planets. KELT is an all-sky survey that monitors the same areas of the sky throughout the year to identify stars that exhibit a change in brightness. Objects whose brightness variations resemble predicted models of transiting planets are sent to the ground-based follow-up team, where we obtain high-precision differential photometry to determine whether a transit is occurring and whether the transiting object is a planet or a companion star. If a planetary transit is found, the object is forwarded for radial-velocity follow-up and could eventually be published as a KELT planet. In this poster we present light curves from possible planets we have identified, as well as eclipsing binary systems and non-detections. We will highlight features of our telescope and camera and the basic steps for data reduction and analysis.

  17. A generalized fuzzy linear programming approach for environmental management problem under uncertainty.

    PubMed

    Fan, Yurui; Huang, Guohe; Veawab, Amornvadee

    2012-01-01

    In this study, a generalized fuzzy linear programming (GFLP) method was developed to deal with uncertainties expressed as fuzzy sets that exist in the constraints and objective function. A stepwise interactive algorithm (SIA) was advanced to solve GFLP model and generate solutions expressed as fuzzy sets. To demonstrate its application, the developed GFLP method was applied to a regional sulfur dioxide (SO2) control planning model to identify effective SO2 mitigation polices with a minimized system performance cost under uncertainty. The results were obtained to represent the amount of SO2 allocated to different control measures from different sources. Compared with the conventional interval-parameter linear programming (ILP) approach, the solutions obtained through GFLP were expressed as fuzzy sets, which can provide intervals for the decision variables and objective function, as well as related possibilities. Therefore, the decision makers can make a tradeoff between model stability and the plausibility based on solutions obtained through GFLP and then identify desired policies for SO2-emission control under uncertainty.

  18. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assessing the image quality of Computed Tomography (CT) systems is required across a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used in security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses of different CT imaging technologies in transportation security. To that end, we have designed, developed, and constructed phantoms that allow systematic and repeatable measurement of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a MATLAB-based image analysis toolkit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates PCA components with sparse loadings, used in conjunction with the Hotelling T2 statistic to compare, qualify, and detect faults in the tested systems.

  19. Designing a Predictive Model of Student Satisfaction in Online Learning

    ERIC Educational Resources Information Center

    Parahoo, Sanjai K; Santally, Mohammad Issack; Rajabalee, Yousra; Harvey, Heather Lea

    2016-01-01

    Higher education institutions consider student satisfaction to be one of the major elements in determining the quality of their programs. The objective of the study was to develop a model of student satisfaction to identify the influencers that emerged in online higher education settings. The study adopted a mixed method approach to identify…

  20. A SYSTEMS APPROACH TO CHARACTERIZING AND PREDICTING THYROID TOXICITY USING AN AMPHIBIAN MODEL

    EPA Science Inventory

    The EPA was recently mandated to evaluate the potential effects of chemicals on endocrine function and has identified Xenopus as a model organism to use as the basis for a thyroid disruption screening assay. The main objective of this work is to develop a hypothalamic-pituitary-t...

  1. Principles of E-network modelling of heterogeneous systems

    NASA Astrophysics Data System (ADS)

    Tarakanov, D.; Tsapko, I.; Tsapko, S.; Buldygin, R.

    2016-04-01

    The present article is concerned with the analytical and simulation modelling of heterogeneous technical systems using the E-network mathematical apparatus (an extension of Petri nets). The distinguishing feature of the given system is the presence of a module that identifies the parameters of the controlled object as well as the external environment.
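    For readers unfamiliar with the underlying formalism, a minimal ordinary Petri-net firing rule (not the full E-network extension) can be sketched as:

```python
def enabled(marking, pre):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Consume `pre` tokens and produce `post` tokens; returns a new marking."""
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Illustrative transition: consume sensor data while idle, start identifying.
m0 = {"sensor_data": 1, "idle": 1}
m1 = fire(m0, pre={"sensor_data": 1, "idle": 1}, post={"identifying": 1})
```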

  2. Capacity Levels of Academic Staff in a Malaysian Public University: Students' Perspective

    ERIC Educational Resources Information Center

    Tajuddin, Muhammad Jawad; Ghani, Muhammad Faizal A.; Siraj, Saedah; Saifuldin, Mohd Helmi Firdaus; Kenayatulla, Husaina Banu; Elham, Faisol

    2013-01-01

    This research aims to develop a competency model for staff of higher education institutions in Malaysia. The model involves the listing of the main features and implementation strategy for the development of academic competence. Specifically, this research aims to achieve the following research objectives: a) to identify if there is any…

  3. Identifying biological concepts from a protein-related corpus with a probabilistic topic model

    PubMed Central

    Zheng, Bin; McLean, David C; Lu, Xinghua

    2006-01-01

    Background Biomedical literature, e.g., MEDLINE, contains a wealth of knowledge regarding functions of proteins. Major recurring biological concepts within such text corpora represent the domains of this body of knowledge. The goal of this research is to identify the major biological topics/concepts from a corpus of protein-related MEDLINE© titles and abstracts by applying a probabilistic topic model. Results The latent Dirichlet allocation (LDA) model was applied to the corpus. Based on the Bayesian model selection, 300 major topics were extracted from the corpus. The majority of identified topics/concepts was found to be semantically coherent and most represented biological objects or concepts. The identified topics/concepts were further mapped to the controlled vocabulary of the Gene Ontology (GO) terms based on mutual information. Conclusion The major and recurring biological concepts within a collection of MEDLINE documents can be extracted by the LDA model. The identified topics/concepts provide parsimonious and semantically-enriched representation of the texts in a semantic space with reduced dimensionality and can be used to index text. PMID:16466569
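    LDA inference of the kind described above is commonly implemented with collapsed Gibbs sampling; a tiny illustrative sampler on an invented corpus (not the paper's setup, which used 300 topics on MEDLINE text):

```python
import random

def lda_gibbs(docs, n_topics, n_iter=200, alpha=0.1, beta=0.01, seed=0):
    """Minimal collapsed Gibbs sampler for LDA (illustration only)."""
    rng = random.Random(seed)
    vocab = sorted({w for d in docs for w in d})
    w2i = {w: i for i, w in enumerate(vocab)}
    V, D = len(vocab), len(docs)
    ndk = [[0] * n_topics for _ in range(D)]  # document-topic counts
    nkw = [[0] * V for _ in range(n_topics)]  # topic-word counts
    nk = [0] * n_topics
    z = []
    for d, doc in enumerate(docs):            # random initial assignments
        zs = []
        for w in doc:
            k = rng.randrange(n_topics)
            zs.append(k)
            ndk[d][k] += 1; nkw[k][w2i[w]] += 1; nk[k] += 1
        z.append(zs)
    for _ in range(n_iter):                   # resample each token's topic
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]; v = w2i[w]
                ndk[d][k] -= 1; nkw[k][v] -= 1; nk[k] -= 1
                weights = [(ndk[d][t] + alpha) * (nkw[t][v] + beta) / (nk[t] + V * beta)
                           for t in range(n_topics)]
                k = rng.choices(range(n_topics), weights)[0]
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][v] += 1; nk[k] += 1
    phi = [[(nkw[k][v] + beta) / (nk[k] + V * beta) for v in range(V)]
           for k in range(n_topics)]          # topic-word distributions
    return vocab, phi, ndk

docs = [["gene", "protein", "gene"], ["protein", "dna", "gene"],
        ["star", "galaxy", "star"], ["galaxy", "orbit", "star"]]
vocab, phi, ndk = lda_gibbs(docs, n_topics=2)
```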

  4. Addressing wild turkey population declines using structured decision making

    USGS Publications Warehouse

    Robinson, Kelly F.; Fuller, Angela K.; Schiavone, Michael V.; Swift, Bryan L.; Diefenbach, Duane R.; Siemer, William F.; Decker, Daniel J.

    2017-01-01

    We present a case study from New York, USA, of the use of structured decision making (SDM) to identify fall turkey harvest regulations that best meet stakeholder objectives, in light of recent apparent declines in abundance of wild turkeys in the northeastern United States. We used the SDM framework to incorporate the multiple objectives associated with turkey hunting, stakeholder desires, and region-specific ecological and environmental factors that could influence fall harvest. We identified a set of 4 fall harvest regulations, composed of different season lengths and bag limits, and evaluated their relative achievement of the objectives. We used a stochastic turkey population model, statistical modeling, and expert elicitation to evaluate the consequences of each harvest regulation on each of the objectives. We conducted a statewide mail survey of fall turkey hunters in New York to gather the necessary information to evaluate tradeoffs among multiple objectives associated with hunter satisfaction. The optimal fall harvest regulation was a 2-week season and allowed for the harvest of 1 bird/hunter. This regulation was the most conservative of those evaluated, reflecting the concerns about recent declines in turkey abundance among agency wildlife biologists and the hunting public. Depending on the region of the state, the 2-week, 1-bird regulation was predicted to result in 7–32% more turkeys on the landscape after 5 years. The SDM process provided a transparent framework for setting fall turkey harvest regulations and reduced potential stakeholder conflict by explicitly taking the multiple objectives of different stakeholder groups into account.

  5. On the methodology of Engineering Geodesy

    NASA Astrophysics Data System (ADS)

    Brunner, Fritz K.

    2007-09-01

    Textbooks on geodetic surveying usually describe a very small number of principles which should provide the foundation of geodetic surveying. Here, the author argues that an applied field, such as engineering geodesy, has a methodology as foundation rather than a few principles. Ten methodological elements (ME) are identified: (1) Point discretisation of natural surfaces and objects, (2) distinction between coordinate and observation domain, (3) definition of reference systems, (4) specification of unknown parameters and desired precisions, (5) geodetic network and observation design, (6) quality control of equipment, (7) quality control of measurements, (8) establishment of measurement models, (9) establishment of parameter estimation models, (10) quality control of results. Each ME consists of a suite of theoretical developments, geodetic techniques and calculation procedures, which will be discussed. This paper is to be considered a first attempt at identifying the specific elements of the methodology of engineering geodesy. A better understanding of this methodology could lead to an increased objectivity, to a transformation of subjective practical experiences into objective working methods, and consequently to a new structure for teaching this rather diverse subject.

  6. Exploiting range imagery: techniques and applications

    NASA Astrophysics Data System (ADS)

    Armbruster, Walter

    2009-07-01

    Practically no applications exist for which automatic processing of 2D intensity imagery can equal human visual perception. This is not the case for range imagery. The paper gives examples of 3D laser radar applications, for which automatic data processing can exceed human visual cognition capabilities and describes basic processing techniques for attaining these results. The examples are drawn from the fields of helicopter obstacle avoidance, object detection in surveillance applications, object recognition at high range, multi-object-tracking, and object re-identification in range image sequences. Processing times and recognition performances are summarized. The techniques used exploit the bijective continuity of the imaging process as well as its independence of object reflectivity, emissivity and illumination. This allows precise formulations of the probability distributions involved in figure-ground segmentation, feature-based object classification and model based object recognition. The probabilistic approach guarantees optimal solutions for single images and enables Bayesian learning in range image sequences. Finally, due to recent results in 3D-surface completion, no prior model libraries are required for recognizing and re-identifying objects of quite general object categories, opening the way to unsupervised learning and fully autonomous cognitive systems.

  7. Model Study for an Economic Data Program on the Conditions of Arts and Cultural Institutions. Final Report.

    ERIC Educational Resources Information Center

    Deane, Robert T.; And Others

    The development of econometric models and a data base to predict the responsiveness of arts institutions to changes in the economy is reported. The study focused on models for museums, theaters (profit and non-profit), symphony, ballet, opera, and dance. The report details four objectives of the project: to identify useful databases and studies on…

  8. Neurally and ocularly informed graph-based models for searching 3D environments

    NASA Astrophysics Data System (ADS)

    Jangraw, David C.; Wang, Jun; Lance, Brent J.; Chang, Shih-Fu; Sajda, Paul

    2014-08-01

    Objective. As we move through an environment, we are constantly making assessments, judgments and decisions about the things we encounter. Some are acted upon immediately, but many more become mental notes or fleeting impressions—our implicit ‘labeling’ of the world. In this paper, we use physiological correlates of this labeling to construct a hybrid brain-computer interface (hBCI) system for efficient navigation of a 3D environment. Approach. First, we record electroencephalographic (EEG), saccadic and pupillary data from subjects as they move through a small part of a 3D virtual city under free-viewing conditions. Using machine learning, we integrate the neural and ocular signals evoked by the objects they encounter to infer which ones are of subjective interest to them. These inferred labels are propagated through a large computer vision graph of objects in the city, using semi-supervised learning to identify other, unseen objects that are visually similar to the labeled ones. Finally, the system plots an efficient route to help the subjects visit the ‘similar’ objects it identifies. Main results. We show that by exploiting the subjects’ implicit labeling to find objects of interest instead of exploring naively, the median search precision is increased from 25% to 97%, and the median subject need only travel 40% of the distance to see 84% of the objects of interest. We also find that the neural and ocular signals contribute in a complementary fashion to the classifiers’ inference of subjects’ implicit labeling. Significance. In summary, we show that neural and ocular signals reflecting subjective assessment of objects in a 3D environment can be used to inform a graph-based learning model of that environment, resulting in an hBCI system that improves navigation and information delivery specific to the user’s interests.
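    The semi-supervised propagation step, spreading a few implicit labels through a similarity graph, can be sketched as follows (the graph and labels are invented):

```python
def propagate_labels(adj, labels, n_iter=50):
    """Semi-supervised label propagation: each unlabeled node repeatedly takes
    the average interest score of its neighbours; labelled nodes stay clamped."""
    scores = {n: labels.get(n, 0.5) for n in adj}
    for _ in range(n_iter):
        new = {}
        for n, nbrs in adj.items():
            if n in labels:
                new[n] = labels[n]  # clamp known (inferred) labels
            elif nbrs:
                new[n] = sum(scores[m] for m in nbrs) / len(nbrs)
            else:
                new[n] = scores[n]
        scores = new
    return scores

# Objects linked by visual similarity; "a" was implicitly labelled interesting,
# "d" uninteresting; "b" and "c" are unseen objects.
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
scores = propagate_labels(adj, {"a": 1.0, "d": 0.0})
```

    Unseen objects closer (in the graph) to the interesting ones receive higher scores, and the route planner would visit them first.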

  9. Aerothermal modeling program, phase 1

    NASA Technical Reports Server (NTRS)

    Sturgess, G. J.

    1983-01-01

    The physical modeling embodied in the computational fluid dynamics codes is discussed. The objectives were to identify shortcomings in the models and to provide a program plan to improve the quantitative accuracy. The physical models studied were for: turbulent mass and momentum transport, heat release, liquid fuel spray, and gaseous radiation. The approach adopted was to test the models against appropriate benchmark-quality test cases from experiments in the literature for the constituent flows that together make up the combustor real flow.

  10. Extremely red quasars from SDSS, BOSS and WISE: classification of optical spectra

    NASA Astrophysics Data System (ADS)

    Ross, Nicholas P.; Hamann, Fred; Zakamska, Nadia L.; Richards, Gordon T.; Villforth, Carolin; Strauss, Michael A.; Greene, Jenny E.; Alexandroff, Rachael; Brandt, W. Niel; Liu, Guilin; Myers, Adam D.; Pâris, Isabelle; Schneider, Donald P.

    2015-11-01

    Quasars with extremely red infrared-to-optical colours are an interesting population that can test ideas about quasar evolution as well as orientation, obscuration and geometric effects in the so-called AGN unified model. To identify such a population, we match the quasar catalogues of the Sloan Digital Sky Survey (SDSS), the Baryon Oscillation Spectroscopic Survey (BOSS) to the Wide-Field Infrared Survey Explorer (WISE) to identify quasars with extremely high infrared-to-optical ratios. We identify 65 objects with rAB - W4Vega > 14 mag (i.e. Fν(22 μm)/Fν(r) ≳ 1000). This sample spans a redshift range of 0.28 < z < 4.36 and has a bimodal distribution, with peaks at z ˜ 0.8 and z ˜ 2.5. It includes three z > 2.6 objects that are detected in the W4 band but not W1 or W2 (i.e. `W1W2 dropouts'). The SDSS/BOSS spectra show that the majority of the objects are reddened type 1 quasars, type 2 quasars (both at low and high redshift) or objects with deep low-ionization broad absorption lines (BALs) that suppress the observed r-band flux. In addition, we identify a class of type 1 permitted broad emission-line objects at z ≃ 2-3 which are characterized by emission line rest-frame equivalent widths (REWs) of ≳150 Å, much larger than those of typical quasars. In particular, 55 per cent (45 per cent) of the non-BAL type 1s with measurable C IV in our sample have REW(C IV) > 100 (150) Å, compared to only 5.8 per cent (1.3 per cent) for non-BAL quasars in BOSS. These objects often also have unusual line ratios, such as very high N V/Ly α ratios. These large REWs might be caused by suppressed continuum emission analogous to type 2 quasars; however, there is no obvious mechanism in standard unified models to suppress the continuum without also obscuring the broad emission lines.

  11. Gestalt isomorphism and the primacy of subjective conscious experience: a Gestalt Bubble model.

    PubMed

    Lehar, Steven

    2003-08-01

    A serious crisis is identified in theories of neurocomputation, marked by a persistent disparity between the phenomenological or experiential account of visual perception and the neurophysiological level of description of the visual system. In particular, conventional concepts of neural processing offer no explanation for the holistic global aspects of perception identified by Gestalt theory. The problem is paradigmatic and can be traced to contemporary concepts of the functional role of the neural cell, known as the Neuron Doctrine. In the absence of an alternative neurophysiologically plausible model, I propose a perceptual modeling approach, to model the percept as experienced subjectively, rather than modeling the objective neurophysiological state of the visual system that supposedly subserves that experience. A Gestalt Bubble model is presented to demonstrate how the elusive Gestalt principles of emergence, reification, and invariance can be expressed in a quantitative model of the subjective experience of visual consciousness. That model in turn reveals a unique computational strategy underlying visual processing, which is unlike any algorithm devised by man, and certainly unlike the atomistic feed-forward model of neurocomputation offered by the Neuron Doctrine paradigm. The perceptual modeling approach reveals the primary function of perception as that of generating a fully spatial virtual-reality replica of the external world in an internal representation. The common objections to this "picture-in-the-head" concept of perceptual representation are shown to be ill founded.

  12. Viability of piping plover Charadrius melodus metapopulations

    USGS Publications Warehouse

    Plissner, Jonathan H.; Haig, Susan M.

    2000-01-01

    The metapopulation viability analysis package, VORTEX, was used to examine viability and recovery objectives for piping plovers Charadrius melodus, an endangered shorebird that breeds in three distinct regions of North America. Baseline models indicate that while Atlantic Coast populations, under current management practices, are at little risk of near-term extinction, Great Plains and Great Lakes populations require 36% higher mean fecundity for a significant probability of persisting for the next 100 years. Metapopulation structure (i.e. the delineation of populations within the metapopulation) and interpopulation dispersal rates had varying effects on model results; however, spatially-structured metapopulations exhibited lower viability than that reported for single-population models. The models were most sensitive to variation in survivorship; hence, additional mortality data will improve their accuracy. With this information, such models become useful tools in identifying successful management objectives; and sensitivity analyses, even in the absence of some data, may indicate which options are likely to be most effective. Metapopulation viability models are best suited for developing conservation strategies for achieving recovery objectives based on maintaining an externally derived, target population size and structure.
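    A stochastic metapopulation projection with inter-population dispersal, of the kind VORTEX performs, can be sketched as follows (growth, dispersal, and noise values are illustrative, not the study's parameters):

```python
import random

def project(pops, growth, dispersal, years, sd=0.1, seed=42):
    """Project metapopulation sizes with multiplicative growth noise and a
    fixed fraction dispersing evenly among the other populations each year."""
    rng = random.Random(seed)
    pops = list(pops)
    n = len(pops)
    for _ in range(years):
        # environmental stochasticity on the annual growth rate
        pops = [p * growth * (1 + rng.gauss(0, sd)) for p in pops]
        # dispersal: each population's emigrants split among the others
        moved = [p * dispersal for p in pops]
        pops = [p - m + (sum(moved) - m) / (n - 1) for p, m in zip(pops, moved)]
    return pops

final = project([100.0, 100.0, 100.0], growth=0.98, dispersal=0.05, years=50)
```

    Running many such replicates and counting those that fall below a quasi-extinction threshold gives the persistence probabilities that the viability analysis reports.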

  13. Automatic identification of bacterial types using statistical imaging methods

    NASA Astrophysics Data System (ADS)

    Trattner, Sigal; Greenspan, Hayit; Tepper, Gapi; Abboud, Shimon

    2003-05-01

    The objective of the current study is to develop an automatic tool to identify bacterial types using computer vision and statistical modeling techniques. Bacteriophage (phage) typing methods are used to identify and extract representative profiles of bacterial types, such as Staphylococcus aureus. Current systems rely on the subjective reading of plaque profiles by a human expert. This process is time-consuming and prone to errors, especially as technology enables an increase in the number of phages used for typing. The statistical methodology presented in this work provides an automated, objective, and robust analysis of visual data, along with the ability to cope with increasing data volumes.

  14. Review: To be or not to be an identifiable model. Is this a relevant question in animal science modelling?

    PubMed

    Muñoz-Tamayo, R; Puillet, L; Daniel, J B; Sauvant, D; Martin, O; Taghipoor, M; Blavy, P

    2018-04-01

    What is a good (useful) mathematical model in animal science? For models constructed for prediction purposes, the question of model adequacy (usefulness) has been traditionally tackled by statistical analysis applied to observed experimental data relative to model-predicted variables. However, little attention has been paid to analytic tools that exploit the mathematical properties of the model equations. For example, in the context of model calibration, before attempting a numerical estimation of the model parameters, we might want to know if we have any chance of success in estimating a unique best value of the model parameters from available measurements. This question of uniqueness is referred to as structural identifiability; a mathematical property that is defined on the sole basis of the model structure within a hypothetical ideal experiment determined by a setting of model inputs (stimuli) and observable variables (measurements). Structural identifiability analysis applied to dynamic models described by ordinary differential equations (ODEs) is a common practice in control engineering and system identification. This analysis demands mathematical technicalities that are beyond the academic background of animal science, which might explain the lack of pervasiveness of identifiability analysis in animal science modelling. To fill this gap, in this paper we address the analysis of structural identifiability from a practitioner perspective by capitalizing on the use of dedicated software tools. Our objectives are (i) to provide a comprehensive explanation of the structural identifiability notion for the community of animal science modelling, (ii) to assess the relevance of identifiability analysis in animal science modelling and (iii) to motivate the community to use identifiability analysis in the modelling practice (when the identifiability question is relevant). We focus our study on ODE models. 
By using illustrative examples that include published mathematical models describing lactation in cattle, we show how structural identifiability analysis can contribute to advancing mathematical modelling in animal science towards the production of useful models and, moreover, highly informative experiments via optimal experiment design. Rather than attempting to impose a systematic identifiability analysis on the modelling community during model development, we wish to open a window towards the discovery of a powerful tool for model construction and experiment design.
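
The notion can be made concrete with a minimal example (a hypothetical two-rate compartment model, not one of the lactation models discussed above): when two elimination rates enter the observed output only through their sum, no experiment observing that output can recover them individually.

```python
import math

def observe(k1, k2, x0=1.0, times=(0.0, 0.5, 1.0, 2.0)):
    """Output of dx/dt = -(k1 + k2) * x with y = x observed:
    y(t) = x0 * exp(-(k1 + k2) * t)."""
    return [x0 * math.exp(-(k1 + k2) * t) for t in times]

# Two different parameter vectors with the same sum produce identical data,
# so k1 and k2 are structurally non-identifiable; only k1 + k2 is.
y_a = observe(0.1, 0.4)
y_b = observe(0.3, 0.2)
```

Dedicated software tools, as the authors advocate, detect such relationships symbolically from the model equations rather than by trial simulation.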

  15. Identification of time-varying structural dynamic systems - An artificial intelligence approach

    NASA Technical Reports Server (NTRS)

    Glass, B. J.; Hanagud, S.

    1992-01-01

    An application of the artificial intelligence-derived methodologies of heuristic search and object-oriented programming to the problem of identifying the form of the model and the associated parameters of a time-varying structural dynamic system is presented in this paper. Possible model variations due to changes in boundary conditions or configurations of a structure are organized into a taxonomy of models, and a variant of best-first search is used to identify the model whose simulated response best matches that of the current physical structure. Simulated model responses are verified experimentally. An output-error approach is used in a discontinuous model space, and an equation-error approach is used in the parameter space. The advantages of the AI methods used, compared with conventional programming techniques for implementing knowledge structuring and inheritance, are discussed. Convergence conditions and example problems are also discussed. In the example problem, both the time-varying model and its new parameters are identified when changes occur.
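
A best-first search over a model taxonomy can be sketched with a priority queue keyed on output error. The taxonomy, simulator and scoring below are illustrative stand-ins, not the paper's structural-dynamics implementation.

```python
import heapq

def best_first_model_search(candidates, observed, simulate, expand, tol=0.0):
    """Best-first search over a taxonomy of candidate models (sketch).

    candidates: root model ids; expand(m) -> child model ids;
    simulate(m) -> predicted response.  Models are scored by squared
    output error against the observed response, and the lowest-error
    frontier node is always expanded next.
    """
    def error(m):
        return sum((p - o) ** 2 for p, o in zip(simulate(m), observed))

    heap = [(error(m), m) for m in candidates]
    heapq.heapify(heap)
    seen = set()
    best = None
    while heap:
        err, m = heapq.heappop(heap)
        if m in seen:
            continue
        seen.add(m)
        if best is None or err < best[0]:
            best = (err, m)
        if err <= tol:          # good enough: stop searching
            return best
        for child in expand(m):
            if child not in seen:
                heapq.heappush(heap, (error(child), child))
    return best
```

With a toy taxonomy of integer "models" whose simulated response is m, 2m, 3m, the search stops as soon as a model reproduces the observed response exactly.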

  16. Tactile recognition and localization using object models: the case of polyhedra on a plane.

    PubMed

    Gaston, P C; Lozano-Perez, T

    1984-03-01

    This paper discusses how data from multiple tactile sensors may be used to identify and locate one object, from among a set of known objects. We use only local information from sensors: 1) the position of contact points and 2) ranges of surface normals at the contact points. The recognition and localization process is structured as the development and pruning of a tree of consistent hypotheses about pairings between contact points and object surfaces. In this paper, we deal with polyhedral objects constrained to lie on a known plane, i.e., having three degrees of positioning freedom relative to the sensors. We illustrate the performance of the algorithm by simulation.
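
The hypothesis-tree idea can be sketched with a 1-D stand-in for surface normals: each object surface gets a single normal angle, and each contact a measured angle range. The object data in the test are hypothetical; the point is the pruning of inconsistent branches.

```python
def consistent(pairing, contacts, surfaces, tol=0.1):
    """Local pruning rule: a contact may pair with a surface only if the
    surface's normal angle falls within the measured range at the contact."""
    for c, s in pairing:
        lo, hi = contacts[c]
        if not (lo - tol <= surfaces[s] <= hi + tol):
            return False
    return True

def interpretations(contacts, surfaces):
    """Grow a tree of contact->surface pairings, pruning any branch whose
    partial hypothesis is already inconsistent with the sensed data."""
    results = []

    def grow(pairing, remaining):
        if not consistent(pairing, contacts, surfaces):
            return                    # prune the entire subtree
        if not remaining:
            results.append(pairing)
            return
        c = remaining[0]
        for s in surfaces:
            grow(pairing + [(c, s)], remaining[1:])

    grow([], list(contacts))
    return results
```

Because consistency is checked at every partial hypothesis, whole subtrees of pairings are discarded without ever being enumerated.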

  17. Characterization and Computational Modeling of Minor Phases in Alloy LSHR

    NASA Technical Reports Server (NTRS)

    Jou, Herng-Jeng; Olson, Gregory; Gabb, Timothy; Garg, Anita; Miller, Derek

    2012-01-01

    The minor phases of the powder metallurgy disk superalloy LSHR were studied. Samples were consistently heat treated at three different temperatures for long times to approach equilibrium. Additional heat treatments were also performed for shorter times, to assess minor phase kinetics in non-equilibrium conditions. Minor phases including MC carbides, M23C6 carbides, M3B2 borides, and sigma were identified, and their average sizes and total area fractions were determined. PrecipiCalc™, a computational precipitation modeling tool, was employed with CALPHAD Ni-base thermodynamic and diffusion databases to model and simulate the phase microstructural evolution observed in the experiments, with the objective of identifying model limitations and directions for model enhancement.

  18. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Tian-Jy; Kim, Younghun

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.

  19. Extension of RCC Topological Relations for 3d Complex Objects Components Extracted from 3d LIDAR Point Clouds

    NASA Astrophysics Data System (ADS)

    Xing, Xu-Feng; Mostafavi, Mir Abolfazl; Wang, Chen

    2016-06-01

    Topological relations are fundamental for the qualitative description, querying and analysis of a 3D scene. Although topological relations for 2D objects have been extensively studied and implemented in GIS applications, their direct extension to 3D is very challenging and they cannot be directly applied to represent relations between components of complex 3D objects represented by 3D B-Rep models in R3. Herein we present an extended Region Connection Calculus (RCC) model to express and formalize topological relations between planar regions for creating a 3D model represented by a Boundary Representation model in R3. We propose a new dimension-extended 9-intersection model to represent the basic relations among components of a complex object, including disjoint, meet and intersect. The last element in the 3×3 matrix records the details of the connection through the common parts of two regions and the intersecting line of two planes. Additionally, this model can deal with the case of planar regions with holes. Finally, the geometric information is transformed into a list of strings consisting of topological relations between two planar regions and detailed connection information. The experiments show that the proposed approach helps to identify topological relations of planar segments of point clouds automatically.
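
The 9-intersection idea underlying such RCC extensions can be sketched over discretized point sets: each cell of the 3×3 matrix records whether the interior, boundary or exterior of one region intersects that of the other. The 1-D integer "regions" below are toy stand-ins for planar segments.

```python
def nine_intersection(int_a, bnd_a, int_b, bnd_b, universe):
    """3x3 matrix of empty/non-empty intersections between the interior,
    boundary and exterior point sets of two regions (sketch over finite
    point sets; 1 = non-empty intersection, 0 = empty)."""
    ext_a = universe - int_a - bnd_a
    ext_b = universe - int_b - bnd_b
    parts_a = (int_a, bnd_a, ext_a)
    parts_b = (int_b, bnd_b, ext_b)
    return [[int(bool(pa & pb)) for pb in parts_b] for pa in parts_a]
```

Disjoint regions yield the characteristic pattern with only exterior intersections; regions sharing a boundary point flip the boundary-boundary cell to 1 (meet).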

  20. Real and virtual explorations of the environment and interactive tracking of movable objects for the blind on the basis of tactile-acoustical maps and 3D environment models.

    PubMed

    Hub, Andreas; Hartter, Tim; Kombrink, Stefan; Ertl, Thomas

    2008-01-01

    Purpose: This study describes the development of a multi-functional assistant system for the blind which combines localisation, real and virtual navigation within modelled environments, and the identification and tracking of fixed and movable objects. The approximate position of buildings is determined with a global positioning sensor (GPS); the user then establishes an exact position at a specific landmark, such as a door. This location initialises indoor navigation, based on an inertial sensor, a step recognition algorithm and a map. Tracking of movable objects is provided by another inertial sensor and a head-mounted stereo camera, combined with 3D environmental models. This study developed an algorithm based on shape and colour to identify objects and used a common face detection algorithm to inform the user of the presence and position of others. The system allows blind people to determine their position with approximately 1 metre accuracy. Virtual exploration of the environment can be accomplished by moving one's finger on the touch screen of a small portable tablet PC. The names of rooms, building features and hazards, modelled objects and their positions are presented acoustically or in Braille. Given adequate environmental models, this system offers blind people the opportunity to navigate independently and safely, even within unknown environments. Additionally, the system facilitates education and rehabilitation by providing, in several languages, object names, features and relative positions.
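
The indoor dead-reckoning step can be sketched as follows. The fixed stride length and the event format are hypothetical; the real system fuses inertial heading with the step-recognition algorithm and the building map.

```python
import math

def dead_reckon(start, step_events, step_length=0.7):
    """Advance a 2-D indoor position from (heading_radians, step_count)
    events, assuming a fixed stride length (sketch of pedestrian
    dead reckoning; no map matching or drift correction)."""
    x, y = start
    for heading, steps in step_events:
        x += steps * step_length * math.cos(heading)
        y += steps * step_length * math.sin(heading)
    return (x, y)
```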

  1. Utility of a novel error-stepping method to improve gradient-based parameter identification by increasing the smoothness of the local objective surface: a case-study of pulmonary mechanics.

    PubMed

    Docherty, Paul D; Schranz, Christoph; Chase, J Geoffrey; Chiew, Yeong Shiong; Möller, Knut

    2014-05-01

    Accurate model parameter identification relies on accurate forward model simulations to guide convergence. However, some forward simulation methodologies lack the precision required to properly define the local objective surface and can cause failed parameter identification. The role of objective surface smoothness in identification of a pulmonary mechanics model was assessed using forward simulation from a novel error-stepping method and a proprietary Runge-Kutta method. The objective surfaces were compared via the identified parameter discrepancy generated in a Monte Carlo simulation and the local smoothness of the objective surfaces they generate. The error-stepping method generated significantly smoother error surfaces in each of the cases tested (p<0.0001) and more accurate model parameter estimates than the Runge-Kutta method in three of the four cases tested (p<0.0001), despite a 75% reduction in computational cost. Of note, parameter discrepancy in most cases was limited to a particular oblique plane, indicating that a non-intuitive multi-parameter trade-off was occurring. The error-stepping method consistently improved or equalled the outcomes of the Runge-Kutta time-integration method for forward simulations of the pulmonary mechanics model. This study indicates that accurate parameter identification relies on accurate definition of the local objective function, and that parameter trade-off can occur on oblique planes, resulting in prematurely halted parameter convergence. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  2. Ensemble Learning Method for Outlier Detection and its Application to Astronomical Light Curves

    NASA Astrophysics Data System (ADS)

    Nun, Isadora; Protopapas, Pavlos; Sim, Brandon; Chen, Wesley

    2016-09-01

    Outlier detection is necessary for automated data analysis, with specific applications spanning almost every domain from financial markets to epidemiology to fraud detection. We introduce a novel mixture-of-experts outlier detection model, which uses a dynamically trained, weighted network of five distinct outlier detection methods. After dimensionality reduction, individual outlier detection methods score each data point for “outlierness” in this new feature space. Our model then uses dynamically trained parameters to weigh the scores of each method, producing a final outlier score. We find that the mixture-of-experts model performs, on average, better than any single expert model in identifying both artificially and manually picked outliers. This mixture model is applied to a data set of astronomical light curves, after dimensionality reduction via time series feature extraction. Our model was tested using three fields from the MACHO catalog and generated a list of anomalous candidates. We confirm that the outliers detected using this method belong to rare classes, such as novae, He-burning stars, and red giants; other outlier light curves identified have no available information associated with them. To elucidate their nature, we created a website containing the light-curve data and information about these objects. Users can attempt to classify the light curves, give conjectures about their identities, and sign up for follow-up messages about the progress made on identifying these objects. This user-submitted data can be used to further train our mixture-of-experts model. Our code is publicly available to all who are interested.
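
The score-combination stage can be sketched as a weighted mixture over rank-normalized detector outputs. The two toy detectors and the fixed weights below are illustrative; in the paper the weights are trained dynamically.

```python
def rank_normalize(scores):
    """Map raw detector scores to [0, 1] ranks so that heterogeneous
    detectors become comparable before mixing."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = [0.0] * len(scores)
    for r, i in enumerate(order):
        ranks[i] = r / (len(scores) - 1)
    return ranks

def ensemble_outlier_scores(score_lists, weights):
    """Weighted mixture of per-detector outlier scores (illustrative)."""
    normalized = [rank_normalize(s) for s in score_lists]
    total = sum(weights)
    n = len(score_lists[0])
    return [sum(w * s[i] for w, s in zip(weights, normalized)) / total
            for i in range(n)]
```

A point flagged by several detectors ends up with a high combined score even if no single detector dominates.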

  3. Concrete crosstie fastener sub-system testing and modeling.

    DOT National Transportation Integrated Search

    2014-02-10

    The primary objective of this project is to identify methods of improving concrete railroad crosstie fastening system design and performance by conducting a thorough investigation of the behavior of the fastening system using Finite Element Analysis ...

  4. Traffic prediction using wireless cellular networks : final report.

    DOT National Transportation Integrated Search

    2016-03-01

    The major objective of this project is to obtain traffic information from existing wireless infrastructure. In this project freeway traffic is identified and modeled using data obtained from existing wireless cellular networks. Most of the prev...

  5. Rational selection of experimental readout and intervention sites for reducing uncertainties in computational model predictions.

    PubMed

    Flassig, Robert J; Migal, Iryna; van der Zalm, Esther; Rihko-Struckmann, Liisa; Sundmacher, Kai

    2015-01-16

    Understanding the dynamics of biological processes can be substantially supported by computational models in the form of nonlinear ordinary differential equations (ODE). Typically, this model class contains many unknown parameters, which are estimated from inadequate and noisy data. Depending on the ODE structure, predictions based on unmeasured states and associated parameters are highly uncertain, even undetermined. For given data, profile likelihood analysis has proven to be one of the most practically relevant approaches for analyzing the identifiability of an ODE structure, and thus model predictions. In the case of highly uncertain or non-identifiable parameters, rational experimental design based on various approaches has been shown to significantly reduce parameter uncertainties with a minimal amount of effort. In this work we illustrate how to use profile likelihood samples for quantifying the individual contribution of parameter uncertainty to prediction uncertainty. For the uncertainty quantification we introduce the profile likelihood sensitivity (PLS) index. Additionally, for the case of several uncertain parameters, we introduce the PLS entropy to quantify individual contributions to the overall prediction uncertainty. We show how to use these two criteria as an experimental design objective for selecting new, informative readouts in combination with intervention site identification. The characteristics of the proposed multi-criterion objective are illustrated with an in silico example. We further illustrate how an existing, practically non-identifiable model for chlorophyll fluorescence induction in a photosynthetic organism, D. salina, can be rendered identifiable by additional experiments with new readouts.
Having data and profile likelihood samples at hand, the here proposed uncertainty quantification based on prediction samples from the profile likelihood provides a simple way for determining individual contributions of parameter uncertainties to uncertainties in model predictions. The uncertainty quantification of specific model predictions allows identifying regions, where model predictions have to be considered with care. Such uncertain regions can be used for a rational experimental design to render initially highly uncertain model predictions into certainty. Finally, our uncertainty quantification directly accounts for parameter interdependencies and parameter sensitivities of the specific prediction.
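
The profiling idea behind the PLS index can be sketched on a toy two-parameter model: fix the parameter of interest on a grid and re-optimize the nuisance parameter at each grid point. The straight-line model and the closed-form re-optimization below are illustrative simplifications of the ODE setting.

```python
def profile_sse(a_grid, xs, ys):
    """Profile the sum-of-squares objective over parameter a of y = a*x + b.

    For each fixed a, the nuisance parameter b is re-optimized
    (closed form: b = mean(y - a*x)), mimicking the profile likelihood.
    Returns a list of (a, profiled SSE) pairs.
    """
    profile = []
    n = len(xs)
    for a in a_grid:
        b = sum(y - a * x for x, y in zip(xs, ys)) / n
        sse = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
        profile.append((a, sse))
    return profile
```

A flat profile would indicate a non-identifiable parameter; a well-defined minimum, as in the test below, indicates an identifiable one.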

  6. Methodology to improve design of accelerated life tests in civil engineering projects.

    PubMed

    Lin, Jing; Yuan, Yongbo; Zhou, Jilai; Gao, Jie

    2014-01-01

    For reliability testing, an Energy Expansion Tree (EET) and a companion Energy Function Model (EFM) are proposed and described in this paper. Different from conventional approaches, the EET provides a more comprehensive and objective way to systematically identify external energy factors affecting reliability. The EFM introduces energy loss into a traditional Function Model to identify internal energy sources affecting reliability. The combination creates a sound way to enumerate the energies to which a system may be exposed during its lifetime. We input these energies into the planning of an accelerated life test, a Multi Environment Over Stress Test. The test objective is to discover weak links and interactions among the system and the energies to which it is exposed, and to design them out. As an example, the methods are applied to the pipe in a subsea pipeline; however, they can be widely used in other civil engineering industries as well. The proposed method is compared with current methods.

  7. Rasch model of a dynamic assessment: an investigation of the children's inferential thinking modifiability test.

    PubMed

    Rittner, Linda L; Pulos, Steven M

    2014-01-01

    The purpose of this study was to develop a general procedure for evaluation of a dynamic assessment and to demonstrate an analysis of a dynamic assessment, the CITM (Tzuriel, 1995b), as an objective measure for use as a group assessment. The techniques used to determine the fit of the CITM to a Rasch partial credit model are explicitly outlined. A modified format of the CITM was administered to 266 diverse second-grade students in the USA; 58% of participants were identified as low SES. The participants (males n = 144) were White Anglo and Latino American students (55%), many of whom were first-generation Mexican immigrants. The CITM was found to adequately fit a Rasch partial credit model (PCM), indicating that the CITM is a likely candidate for a group-administered dynamic assessment that can be measured objectively. Data also supported that a model for objectively measuring change in learning ability for inferential thinking in the CITM was feasible.
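
For reference, the partial credit model assigns category probabilities from the person's ability and the item's step thresholds. A minimal sketch of Masters' parameterization (the thresholds below are illustrative, not the CITM's):

```python
import math

def rasch_probability(ability, difficulty):
    """Dichotomous Rasch model: P(correct) = exp(b - d) / (1 + exp(b - d))."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def pcm_category_probs(ability, thresholds):
    """Partial credit model: probabilities of score categories 0..m given
    the item's step thresholds (Masters' PCM for polytomous items)."""
    # cumulative sums of (ability - threshold_k); category 0 contributes exp(0)
    cum = [0.0]
    for tau in thresholds:
        cum.append(cum[-1] + (ability - tau))
    expcum = [math.exp(c) for c in cum]
    z = sum(expcum)
    return [e / z for e in expcum]
```

Higher ability shifts probability mass toward the higher score categories, which is what makes polytomous items informative about change in learning ability.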

  8. Integrating satellite actual evapotranspiration patterns into distributed model parametrization and evaluation for a mesoscale catchment

    NASA Astrophysics Data System (ADS)

    Demirel, M. C.; Mai, J.; Stisen, S.; Mendiguren González, G.; Koch, J.; Samaniego, L. E.

    2016-12-01

    Distributed hydrologic models are traditionally calibrated and evaluated against observations of streamflow. Spatially distributed remote sensing observations offer a great opportunity to enhance spatial model calibration schemes. To that end, it is important to identify the model parameters that can change spatial patterns before undertaking satellite-based hydrologic model calibration. Our study rests on two main pillars: first, we use spatial sensitivity analysis to identify the key parameters controlling the spatial distribution of actual evapotranspiration (AET). Second, we investigate the potential benefits of incorporating spatial patterns from MODIS data to calibrate the mesoscale Hydrologic Model (mHM). This distributed model is selected as it allows a change in the spatial distribution of key soil parameters through the calibration of pedo-transfer function (PTF) parameters and includes options for using fully distributed daily Leaf Area Index (LAI) directly as input. In addition, the simulated AET can be estimated at a spatial resolution suitable for comparison to the spatial patterns observed in MODIS data. We introduce a new dynamic scaling function employing remotely sensed vegetation to downscale coarse reference evapotranspiration. In total, 17 of 47 mHM parameters are identified using both sequential screening and Latin hypercube one-at-a-time sampling methods. The spatial patterns are found to be sensitive to the vegetation parameters, whereas streamflow dynamics are sensitive to the PTF parameters. The results of multi-objective model calibration show that calibrating mHM against observed streamflow does not reduce the spatial errors in AET, improving only the streamflow simulations. We will further examine the results of model calibration using only spatial objective functions measuring the association between observed and simulated AET maps, and another case including spatial and streamflow metrics together.
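
A bias-insensitive spatial objective of the kind used to compare AET maps can be sketched as a cell-by-cell correlation. This is not mHM's actual objective function; the correlation term is only one common ingredient of spatial pattern metrics.

```python
def pearson(xs, ys):
    """Pearson correlation of two equal-length value lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def spatial_pattern_objective(observed_map, simulated_map):
    """Correlate two gridded AET maps cell by cell so that only the
    spatial *pattern*, not the absolute level, is penalized.
    0 = perfect pattern match, 2 = perfectly inverted pattern."""
    obs = [v for row in observed_map for v in row]
    sim = [v for row in simulated_map for v in row]
    return 1.0 - pearson(obs, sim)
```

Because the metric centers both maps, a simulation with the right pattern but a constant bias scores perfectly, which is exactly why spatial objectives complement rather than replace streamflow objectives.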

  9. Toward Enhancing Automated Credibility Assessment: A Model for Question Type Classification and Tools for Linguistic Analysis

    ERIC Educational Resources Information Center

    Moffitt, Kevin Christopher

    2011-01-01

    The three objectives of this dissertation were to develop a question type model for predicting linguistic features of responses to interview questions, create a tool for linguistic analysis of documents, and use lexical bundle analysis to identify linguistic differences between fraudulent and non-fraudulent financial reports. First, The Moffitt…

  10. Shaken but not stirred: Multiscale habitat suitability modeling of sympatric marten species (Martes martes and Martes foina) in the northern Iberian Peninsula

    Treesearch

    Maria Vergara; Samuel A. Cushman; Fermin Urra; Aritz Ruiz-Gonzalez

    2016-01-01

    Multispecies and multiscale habitat suitability models (HSM) are important to identify the environmental variables and scales influencing habitat selection and facilitate the comparison of closely related species with different ecological requirements. Objectives This study explores the multiscale relationships of habitat suitability for the pine (Martes...

  11. Identifying Affordances of 3D Printed Tangible Models for Understanding Core Biological Concepts

    ERIC Educational Resources Information Center

    Davenport, Jodi L.; Silberglitt, Matt; Boxerman, Jonathan; Olson, Arthur

    2014-01-01

    3D models derived from actual molecular structures have the potential to transform student learning in biology. We share findings related to our research questions: 1) what types of interactions with a protein folding kit promote specific learning objectives?, and 2) what features of the instructional environment (e.g., peer interactions, teacher…

  12. A Computerized Information System Model for Decision Making for the Oklahoma State Department of Vocational and Technical Education.

    ERIC Educational Resources Information Center

    Smith, Hubert Gene

    The objectives of the study presented in the dissertation were to identify present and anticipated information requirements of the various departments within the Oklahoma State Department of Vocational and Technical Education, to design a computerized information system model utilizing an integrated systems concept to meet information…

  13. Spatial modeling of cutaneous leishmaniasis in the Andean region of Colombia.

    PubMed

    Pérez-Flórez, Mauricio; Ocampo, Clara Beatriz; Valderrama-Ardila, Carlos; Alexander, Neal

    2016-06-27

    The objective of this research was to identify environmental risk factors for cutaneous leishmaniasis (CL) in Colombia and to map high-risk municipalities. The study area was the Colombian Andean region, comprising 715 rural and urban municipalities. We used 10 years of CL surveillance data (2000-2009). We used spatial-temporal analysis - conditional autoregressive Poisson random effects modelling - in a Bayesian framework to model the dependence of municipality-level incidence on land use, climate, elevation and population density. Bivariable spatial analysis identified rainforests, forests and secondary vegetation, temperature, and annual precipitation as positively associated with CL incidence. By contrast, livestock agroecosystems and temperature seasonality were negatively associated. Multivariable analysis identified land use - rainforests and agro-livestock systems - and climate - temperature, rainfall and temperature seasonality - as the best predictors of CL. We conclude that climate and land use can be used to identify areas at high risk of CL and that this approach is potentially applicable elsewhere in Latin America.
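
The log-linear Poisson core of such incidence models (without the spatial CAR random effects) can be evaluated directly; the covariates, coefficients and counts below are hypothetical.

```python
import math

def poisson_loglik(beta, covariates, cases, population):
    """Log-likelihood of a log-linear Poisson incidence model:
    mu_i = pop_i * exp(beta . x_i).  Sketch only: no CAR spatial
    random effects and no prior terms from the Bayesian framework."""
    ll = 0.0
    for x, y, pop in zip(covariates, cases, population):
        eta = sum(b * xi for b, xi in zip(beta, x))
        mu = pop * math.exp(eta)
        ll += y * math.log(mu) - mu - math.lgamma(y + 1)
    return ll
```

Coefficients that reproduce the observed municipality counts score a higher likelihood, which is what drives the identification of land-use and climate predictors.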

  14. Practical Challenges of Systems Thinking and Modeling in Public Health

    PubMed Central

    Trochim, William M.; Cabrera, Derek A.; Milstein, Bobby; Gallagher, Richard S.; Leischow, Scott J.

    2006-01-01

    Objectives. Awareness of and support for systems thinking and modeling in the public health field are growing, yet there are many practical challenges to implementation. We sought to identify and describe these challenges from the perspectives of practicing public health professionals. Methods. A systems-based methodology, concept mapping, was used in a study of 133 participants from 2 systems-based public health initiatives (the Initiative for the Study and Implementation of Systems and the Syndemics Prevention Network). This method identified 100 key challenges to implementation of systems thinking and modeling in public health work. Results. The project resulted in a map identifying 8 categories of challenges and the dynamic interactions among them. Conclusions. Implementation by public health professionals of the 8 simple rules we derived from the clusters in the map identified here will help to address challenges and improve the organization of systems that protect the public’s health. PMID:16449581

  15. A fast 3-D object recognition algorithm for the vision system of a special-purpose dexterous manipulator

    NASA Technical Reports Server (NTRS)

    Hung, Stephen H. Y.

    1989-01-01

    A fast 3-D object recognition algorithm that can be used as a quick-look subsystem to the vision system for the Special-Purpose Dexterous Manipulator (SPDM) is described. Global features that can be easily computed from range data are used to characterize the images of a viewer-centered model of an object. This algorithm will speed up the processing by eliminating the low level processing whenever possible. It may identify the object, reject a set of bad data in the early stage, or create a better environment for a more powerful algorithm to carry the work further.

  16. [The Visual Association Test to study episodic memory in clinical geriatric psychology].

    PubMed

    Diesfeldt, Han; Prins, Marleen; Lauret, Gijs

    2018-04-01

    The Visual Association Test (VAT) is a brief learning task that consists of six line drawings of pairs of interacting objects (association cards). Subjects are asked to name or identify each object and later are presented with one object from the pair (the cue) and asked to name the other (the target). The VAT was administered to a consecutive sample of 174 psychogeriatric day care participants with mild to major neurocognitive disorder. Comparison of test performance with normative data from non-demented subjects revealed that 69% scored within the range of a major deficit (0-8 over two recall trials), 14% of a minor deficit, and 17% of no deficit (9-10 and ≥10, respectively). VAT scores correlated with another test of memory function, the Cognitive Screening Test (CST), based on the Short Portable Mental Status Questionnaire (r = 0.53). Tests of executive functioning (Expanded Mental Control Test, Category Fluency, Clock Drawing) did not add significantly to the explanation of variance in VAT scores. Fifty-five participants (31.6%) had initial problems in naming or identifying one or more objects on the cue or association cards. When necessary, naming was aided by the investigator. Initial difficulties in identifying cue objects were associated with lower VAT scores, but this did not hold for difficulties in identifying target objects. A hierarchical multiple regression analysis was used to examine whether linear or quadratic trends best fitted VAT performance across the range of CST scores. The regression model revealed a linear but not a quadratic trend. The best-fitting linear model implied that VAT scores differentiated between CST scores in the lower as well as in the upper range, indicating the absence of floor and ceiling effects, respectively.
Moreover, the VAT compares favourably to word list-learning tasks, being more attractive in its presentation of interacting visual objects and its cued recall based on incidental learning of the association between cues and targets. For practical purposes, and based on documented sensitivity and specificity, Bayesian probability tables give the predictive power of age-specific VAT cutoff scores for the presence or absence of a major neurocognitive disorder across a range of a priori probabilities or base rates.

  17. A hybrid framework for quantifying the influence of data in hydrological model calibration

    NASA Astrophysics Data System (ADS)

    Wright, David P.; Thyer, Mark; Westra, Seth; McInerney, David

    2018-06-01

    Influence diagnostics aim to identify a small number of influential data points that have a disproportionate impact on the model parameters and/or predictions. The key issues with current influence diagnostic techniques are that the regression-theory approaches do not provide hydrologically relevant influence metrics, while the case-deletion approaches are computationally expensive. The main objective of this study is to introduce a new two-stage hybrid framework that overcomes these challenges by delivering hydrologically relevant influence metrics in a computationally efficient manner. Stage one uses computationally efficient regression-theory influence diagnostics to identify the most influential points based on Cook's distance. Stage two then uses case-deletion influence diagnostics to quantify the influence of points using hydrologically relevant metrics. To illustrate the application of the hybrid framework, we conducted three experiments on 11 hydro-climatologically diverse Australian catchments using the GR4J hydrological model. The first experiment investigated how many data points from stage one need to be retained in order to reliably identify those points that have the highest influence on hydrologically relevant metrics. We found that a choice of 30-50 is suitable for hydrological applications similar to those explored in this study (30 points identified the most influential data 98% of the time and reduced the required recalibrations by 99% for a 10 year calibration period). The second experiment found little evidence of a change in the magnitude of influence with increasing calibration period length from 1, 2, 5 to 10 years. Even for 10 years the impact of influential points can still be high (>30% influence on maximum predicted flows). The third experiment compared the standard least squares (SLS) objective function with the weighted least squares (WLS) objective function on a 10 year calibration period.
In two out of three flow metrics there was evidence that SLS, with the assumption of homoscedastic residual error, identified data points with higher influence (largest changes of 40%, 10%, and 44% for the maximum, mean, and low flows, respectively) than WLS, with the assumption of heteroscedastic residual errors (largest changes of 26%, 6%, and 6% for the maximum, mean, and low flows, respectively). The hybrid framework complements existing model diagnostic tools and can be applied to a wide range of hydrological modelling scenarios.
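The two-stage framework above lends itself to a compact sketch. The following is a hedged illustration that uses an ordinary least squares surrogate in place of a calibrated hydrological model (the study itself uses GR4J); `cooks_distance` and `case_deletion_influence` are illustrative names, and the `metric` callback stands in for a hydrologically relevant metric such as maximum predicted flow:

```python
import numpy as np

def cooks_distance(X, y):
    """Stage one: Cook's distance for each observation in an OLS fit."""
    n, p = X.shape
    H = X @ np.linalg.inv(X.T @ X) @ X.T           # hat matrix (small-n sketch)
    h = np.diag(H)                                  # leverages
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (n - p)                    # residual variance
    return resid**2 / (p * s2) * h / (1 - h)**2

def case_deletion_influence(X, y, metric, top_k=30):
    """Stage two: refit with each of the top-k stage-one points deleted
    and measure the change in a user-supplied prediction metric."""
    d = cooks_distance(X, y)
    candidates = np.argsort(d)[::-1][:top_k]
    base = metric(np.linalg.lstsq(X, y, rcond=None)[0])
    influence = {}
    for i in candidates:
        keep = np.ones(len(y), dtype=bool)
        keep[i] = False
        beta_i, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        influence[int(i)] = metric(beta_i) - base   # hydrologically relevant delta
    return influence
```

Restricting the expensive recalibration loop to the top-k candidates is what gives the framework its claimed ~99% reduction in recalibrations.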

  18. Application of meandering centreline migration modelling and object-based approach of Long Nab member

    NASA Astrophysics Data System (ADS)

    Saadi, Saad

    2017-04-01

Characterizing the complexity and heterogeneity of the geometries and deposits in meandering river systems is an important concern for the reservoir modelling of fluvial environments. Re-examination of the Long Nab member in the Scalby formation of the Ravenscar Group (Yorkshire, UK), integrating digital outcrop data and forward modelling approaches, will lead to a geologically realistic numerical model of the meandering river geometry. The methodology is based on extracting geostatistics from modern analogue meandering rivers that exemplify both the confined and non-confined meandering point-bar deposits and morphodynamics of the Long Nab member. The parameters derived from the modern systems (i.e. channel width, amplitude, radius of curvature, sinuosity, wavelength, channel length and migration rate) are used as a statistical control for the forward simulation and the resulting object-oriented channel models. The statistical data derived from the modern analogues are multi-dimensional in nature, making analysis difficult. We apply data mining techniques such as parallel coordinates to investigate and identify the important relationships within the modern analogue data, which can then be used to drive the development of, and serve as input to, the forward model. This work will increase our understanding of meandering river morphodynamics, planform architecture and the stratigraphic signature of various fluvial deposits and features. We will then use these forward-modelled channel objects to build reservoir models, and compare the behaviour of the forward-modelled channels with traditional object modelling in hydrocarbon flow simulations.

  19. A comparative analysis of 9 multi-model averaging approaches in hydrological continuous streamflow simulation

    NASA Astrophysics Data System (ADS)

    Arsenault, Richard; Gatien, Philippe; Renaud, Benoit; Brissette, François; Martel, Jean-Luc

    2015-10-01

This study aims to test whether a weighted combination of several hydrological models can simulate flows more accurately than the models taken individually. In addition, the project attempts to identify the most efficient model averaging method and the optimal number of models to include in the weighting scheme. In order to address the first objective, streamflow was simulated using four lumped hydrological models (HSAMI, HMETS, MOHYSE and GR4J-6), each of which was calibrated with three different objective functions on 429 watersheds. The resulting 12 hydrographs (4 models × 3 metrics) were weighted and combined with the help of 9 averaging methods: the simple arithmetic mean (SAM), Akaike information criterion (AICA), Bates-Granger (BGA), Bayes information criterion (BICA), Bayesian model averaging (BMA), Granger-Ramanathan average variants A, B and C (GRA, GRB and GRC), and averaging by SCE-UA optimization (SCA). The same weights were then applied to the hydrographs in validation mode, and the Nash-Sutcliffe Efficiency metric was measured between the averaged and observed hydrographs. Statistical analyses were performed to compare the accuracy of the weighted methods to that of the individual models. A Kruskal-Wallis test and a multi-objective optimization algorithm were then used to identify the most efficient weighted method and the optimal number of models to integrate. Results suggest that the GRA, GRB, GRC and SCA weighted methods perform better than the individual members. Model averaging with these four methods was superior to the best of the individual members in 76% of the cases. Optimal combinations on all watersheds included at least one of each of the four hydrological models. None of the optimal combinations included all members of the ensemble of 12 hydrographs. The Granger-Ramanathan average variant C (GRC) is recommended as the best compromise between accuracy, speed of execution, and simplicity.
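The Granger-Ramanathan variants amount to a least-squares regression of observed flows on the member simulations. Below is a minimal sketch of variant A (unconstrained weights, no intercept) with a Nash-Sutcliffe helper; the function names are illustrative and the data handling is simplified relative to the study:

```python
import numpy as np

def gra_weights(sim, obs):
    """Granger-Ramanathan variant A: unconstrained least-squares weights
    for combining member simulations (columns of `sim`) to match `obs`."""
    w, *_ = np.linalg.lstsq(sim, obs, rcond=None)
    return w

def averaged_flow(sim, w):
    """Weighted multi-model average hydrograph."""
    return sim @ w

def nse(obs, mod):
    """Nash-Sutcliffe efficiency between observed and modelled flows."""
    return 1.0 - np.sum((obs - mod) ** 2) / np.sum((obs - obs.mean()) ** 2)
```

In calibration, the least-squares combination can never have a lower NSE than any single member (each member is itself a candidate weight vector), which helps explain why the GR variants performed well; validation-mode performance, as the study stresses, is the real test.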

  20. Mass storage system reference model, Version 4

    NASA Technical Reports Server (NTRS)

    Coleman, Sam (Editor); Miller, Steve (Editor)

    1993-01-01

The high-level abstractions that underlie modern storage systems are identified. The information to generate the model was collected from major practitioners who have built and operated large storage facilities, and represents a distillation of the wisdom they have acquired over the years. The model provides a common terminology and set of concepts to allow existing systems to be examined and new systems to be discussed and built. It is intended that the model and the interfaces identified from it will allow and encourage vendors to develop mutually-compatible storage components that can be combined to form integrated storage systems and services. The reference model presents an abstract view of the concepts and organization of storage systems. From this abstraction will come the identification of the interfaces and modules that will be used in IEEE storage system standards. The model is not yet suitable as a standard: it does not contain implementation decisions, such as how abstract objects should be broken up into software modules or how software modules should be mapped to hosts; it does not give policy specifications, such as when files should be migrated; it does not describe how the abstract objects should be used or connected; and it does not refer to specific hardware components. In particular, it does not fully specify the interfaces.

  1. Objective spatiotemporal proxy-model comparisons of the Asian monsoon for the last millennium

    NASA Astrophysics Data System (ADS)

    Anchukaitis, K. J.; Cook, E. R.; Ammann, C. M.; Buckley, B. M.; D'Arrigo, R. D.; Jacoby, G.; Wright, W. E.; Davi, N.; Li, J.

    2008-12-01

    The Asian monsoon system can be studied using a complementary proxy/simulation approach which evaluates climate models using estimates of past precipitation and temperature, and which subsequently applies the best understanding of the physics of the climate system as captured in general circulation models to evaluate the broad-scale dynamics behind regional paleoclimate reconstructions. Here, we use a millennial-length climate field reconstruction of monsoon season summer (JJA) drought, developed from tree-ring proxies, with coupled climate simulations from NCAR CSM1.4 and CCSM3 to evaluate the cause of large-scale persistent droughts over the last one thousand years. Direct comparisons are made between the externally forced response within the climate model and the spatiotemporal field reconstruction. In order to identify patterns of drought associated with internal variability in the climate system, we use a model/proxy analog technique which objectively selects epochs in the model that most closely reproduce those observed in the reconstructions. The concomitant ocean-atmosphere dynamics are then interpreted in order to identify and understand the internal climate system forcing of low frequency monsoon variability. We examine specific periods of extensive or intensive regional drought in the 15th, 17th, and 18th centuries, many of which are coincident with major cultural changes in the region.

  2. Toward uniform implementation of parametric map Digital Imaging and Communication in Medicine standard in multisite quantitative diffusion imaging studies.

    PubMed

    Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C

    2018-01-01

    This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
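For context, the simplest ADC parametric map assumes a mono-exponential signal model, S_b = S_0 exp(-b ADC), so a two-point estimate follows directly. A minimal sketch (actual site pipelines fit more b-values and more elaborate diffusion models, which is precisely the metadata the amended DICOM PM standard now encodes):

```python
import numpy as np

def adc_map(s0, sb, b=1000.0):
    """Two-point mono-exponential ADC estimate from DWI images:
    S_b = S_0 * exp(-b * ADC)  =>  ADC = ln(S_0 / S_b) / b.
    With b in s/mm^2, the result is in mm^2/s (typical tissue ~1e-3)."""
    return np.log(s0 / sb) / b
```

Recording the model name, b-values, and ADC units/scale alongside the map is exactly the kind of metadata the study found missing from site-specific DICOM MR objects.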

  3. Assessment of GHG models for the surface transportation sector

    DOT National Transportation Integrated Search

    1993-09-01

    The Intermodal Surface Transportation Efficiency Act (ISTEA) of 1991 calls for a study of U.S. international border crossings. The objective of the study is to identify existing and emerging trade corridors and transportation subsystems that facilita...

  4. Tracker Toolkit

    NASA Technical Reports Server (NTRS)

    Lewis, Steven J.; Palacios, David M.

    2013-01-01

    This software can track multiple moving objects within a video stream simultaneously, use visual features to aid in the tracking, and initiate tracks based on object detection in a subregion. A simple programmatic interface allows plugging into larger image chain modeling suites. It extracts unique visual features for aid in tracking and later analysis, and includes sub-functionality for extracting visual features about an object identified within an image frame. Tracker Toolkit utilizes a feature extraction algorithm to tag each object with metadata features about its size, shape, color, and movement. Its functionality is independent of the scale of objects within a scene. The only assumption made on the tracked objects is that they move. There are no constraints on size within the scene, shape, or type of movement. The Tracker Toolkit is also capable of following an arbitrary number of objects in the same scene, identifying and propagating the track of each object from frame to frame. Target objects may be specified for tracking beforehand, or may be dynamically discovered within a tripwire region. Initialization of the Tracker Toolkit algorithm includes two steps: Initializing the data structures for tracked target objects, including targets preselected for tracking; and initializing the tripwire region. If no tripwire region is desired, this step is skipped. The tripwire region is an area within the frames that is always checked for new objects, and all new objects discovered within the region will be tracked until lost (by leaving the frame, stopping, or blending in to the background).

  5. Child Care: How Do Military and Civilian Center Costs Compare? United States General Accounting Office Report to Congressional Requesters.

    ERIC Educational Resources Information Center

    Fagnoni, Cynthia M.

    The Department of Defense's (DOD) child development program has been identified as a model for the rest of the nation. To provide a benchmark cost estimate for Congress as it addresses child care issues, this report identifies the objectives of the military child development program, describes its operation, determines the full costs of DOD…

  6. A Systematic Review of Conceptual Frameworks of Medical Complexity and New Model Development.

    PubMed

    Zullig, Leah L; Whitson, Heather E; Hastings, Susan N; Beadles, Chris; Kravchenko, Julia; Akushevich, Igor; Maciejewski, Matthew L

    2016-03-01

    Patient complexity is often operationalized by counting multiple chronic conditions (MCC) without considering contextual factors that can affect patient risk for adverse outcomes. Our objective was to develop a conceptual model of complexity addressing gaps identified in a review of published conceptual models. We searched for English-language MEDLINE papers published between 1 January 2004 and 16 January 2014. Two reviewers independently evaluated abstracts and all authors contributed to the development of the conceptual model in an iterative process. From 1606 identified abstracts, six conceptual models were selected. One additional model was identified through reference review. Each model had strengths, but several constructs were not fully considered: 1) contextual factors; 2) dynamics of complexity; 3) patients' preferences; 4) acute health shocks; and 5) resilience. Our Cycle of Complexity model illustrates relationships between acute shocks and medical events, healthcare access and utilization, workload and capacity, and patient preferences in the context of interpersonal, organizational, and community factors. This model may inform studies on the etiology of and changes in complexity, the relationship between complexity and patient outcomes, and intervention development to improve modifiable elements of complex patients.

  7. Identifying Novel Phenotypes of Vulnerability and Resistance to Activity-Based Anorexia in Adolescent Female Rats

    PubMed Central

    Barbarich-Marsteller, Nicole C.; Underwood, Mark D.; Foltin, Richard W.; Myers, Michael M.; Walsh, B. Timothy; Barrett, Jeffrey S.; Marsteller, Douglas A.

    2018-01-01

    Objective Activity-based anorexia is a translational rodent model that results in severe weight loss, hyperactivity, and voluntary self-starvation. The goal of our investigation was to identify vulnerable and resistant phenotypes of activity-based anorexia in adolescent female rats. Method Sprague-Dawley rats were maintained under conditions of restricted access to food (N = 64) or unlimited access (N = 16) until experimental exit, predefined as a target weight loss of 30–35% or meeting predefined criteria for animal health. Nonlinear mixed effects statistical modeling was used to describe wheel running behavior, time to event analysis was used to assess experimental exit, and a recursive partitioning algorithm was used to classify phenotypes. Results Objective criteria were identified for distinguishing novel phenotypes of activity-based anorexia, including a vulnerable phenotype that conferred maximal hyperactivity, minimal food intake, and the shortest time to experimental exit, and a resistant phenotype that conferred minimal activity and the longest time to experimental exit. Discussion The identification of objective criteria for defining vulnerable and resistant phenotypes of activity-based anorexia in adolescent female rats provides an important framework for studying the neural mechanisms that promote vulnerability to or protection against the development of self-starvation and hyperactivity during adolescence. Ultimately, future studies using these novel phenotypes may provide important translational insights into the mechanisms that promote these maladaptive behaviors characteristic of anorexia nervosa. PMID:23853140

  8. Object extraction in photogrammetric computer vision

    NASA Astrophysics Data System (ADS)

    Mayer, Helmut

    This paper discusses the state and promising directions of automated object extraction in photogrammetric computer vision, considering also practical aspects arising for digital photogrammetric workstations (DPW). A review of the state of the art shows that there are only a few practically successful systems on the market. Therefore, important issues for the practical success of automated object extraction are identified. A sound and, most importantly, powerful theoretical background is the basis. Here, we particularly point to statistical modeling. Testing makes clear which of the approaches are best suited and how useful they are in practice. A key to commercial success of a practical system is efficient user interaction. As the means for data acquisition are changing, new promising application areas such as extremely detailed three-dimensional (3D) urban models for virtual television or mission rehearsal evolve.

  9. The ODD protocol: A review and first update

    USGS Publications Warehouse

    Grimm, Volker; Berger, Uta; DeAngelis, Donald L.; Polhill, J. Gary; Giske, Jarl; Railsback, Steve F.

    2010-01-01

    The 'ODD' (Overview, Design concepts, and Details) protocol was published in 2006 to standardize the published descriptions of individual-based and agent-based models (ABMs). The primary objectives of ODD are to make model descriptions more understandable and complete, thereby making ABMs less subject to criticism for being irreproducible. We have systematically evaluated existing uses of the ODD protocol and identified, as expected, parts of ODD needing improvement and clarification. Accordingly, we revise the definition of ODD to clarify aspects of the original version and thereby facilitate future standardization of ABM descriptions. We discuss frequently raised critiques of ODD but also two emerging, and unanticipated, benefits: ODD improves the rigorous formulation of models and helps make the theoretical foundations of large models more visible. Although the protocol was designed for ABMs, it can help with documenting any large, complex model, alleviating some general objections against such models.

  10. Reconstruction of 3d Objects of Assets and Facilities by Using Benchmark Points

    NASA Astrophysics Data System (ADS)

    Baig, S. U.; Rahman, A. A.

    2013-08-01

    Acquiring and modeling 3D geo-data of building assets and facility objects is one of the challenges. A number of methods and technologies are being utilized for this purpose. Total station, GPS, photogrammetric and terrestrial laser scanning are a few of these technologies. In this paper, points commonly shared by potential facades of assets and facilities modeled from point clouds are identified. These points are useful in the modeling process to reconstruct 3D models of assets and facilities, stored to be used for management purposes. These models are segmented along different planes to produce accurate 2D plans. This novel method improves the efficiency and quality of constructing models of assets and facilities, with the aim of utilizing them in 3D management projects such as maintenance of buildings or of groups of items that need to be replaced or renovated for new services.

  11. Encapsulating model complexity and landscape-scale analyses of state-and-transition simulation models: an application of ecoinformatics and juniper encroachment in sagebrush steppe ecosystems

    USGS Publications Warehouse

    O'Donnell, Michael

    2015-01-01

    State-and-transition simulation modeling relies on knowledge of vegetation composition and structure (states) that describe community conditions, mechanistic feedbacks such as fire that can affect vegetation establishment, and ecological processes that drive community conditions as well as the transitions between these states. However, as the need for modeling larger and more complex landscapes increases, a more advanced awareness of computing resources becomes essential. The objectives of this study include identifying challenges of executing state-and-transition simulation models, identifying common bottlenecks of computing resources, developing a workflow and software that enable parallel processing of Monte Carlo simulations, and identifying the advantages and disadvantages of different computing resources. To address these objectives, this study used the ApexRMS® SyncroSim software and embarrassingly parallel tasks of Monte Carlo simulations on a single multicore computer and on distributed computing systems. The results demonstrated that state-and-transition simulation models scale best in distributed computing environments, such as high-throughput and high-performance computing, because these environments disseminate the workloads across many compute nodes, thereby supporting analysis of larger landscapes, higher spatial resolution vegetation products, and more complex models. Using a case study and five different computing environments, the top result (high-throughput computing versus serial computations) indicated an approximate 96.6% decrease in computing time. With a single, multicore compute node (bottom result), the computing time indicated an 81.8% decrease relative to using serial computations. These results provide insight into the tradeoffs of using different computing resources when research necessitates advanced integration of ecoinformatics incorporating large and complicated data inputs and models.
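Because each Monte Carlo realization is independent, the parallelization is straightforward to sketch. Below is a hedged toy example using Python's multiprocessing on a single multicore node; the two-state transition probabilities and state names are invented for illustration and have no connection to SyncroSim's actual models:

```python
import multiprocessing as mp
import random

def one_realization(seed):
    """One Monte Carlo realization of a toy state-and-transition model:
    count the years (of 100) a cell spends 'encroached', with a small
    annual encroachment probability and fire resetting the state."""
    rng = random.Random(seed)
    state, encroached_years = "sagebrush", 0
    for _ in range(100):
        if state == "sagebrush" and rng.random() < 0.05:
            state = "encroached"                       # juniper establishes
        elif state == "encroached" and rng.random() < 0.02:
            state = "sagebrush"                        # fire resets the cell
        encroached_years += (state == "encroached")
    return encroached_years

def run_parallel(n_iter, workers=4):
    """Embarrassingly parallel dispatch: one seed per realization,
    mapped across a process pool on a multicore node."""
    with mp.Pool(workers) as pool:
        return pool.map(one_realization, range(n_iter))
```

A distributed (high-throughput) setup replaces the local pool with one job per seed range across compute nodes; the per-realization code is unchanged, which is why this workload scales so well.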

  12. Neurally and ocularly informed graph-based models for searching 3D environments.

    PubMed

    Jangraw, David C; Wang, Jun; Lance, Brent J; Chang, Shih-Fu; Sajda, Paul

    2014-08-01

    As we move through an environment, we are constantly making assessments, judgments and decisions about the things we encounter. Some are acted upon immediately, but many more become mental notes or fleeting impressions-our implicit 'labeling' of the world. In this paper, we use physiological correlates of this labeling to construct a hybrid brain-computer interface (hBCI) system for efficient navigation of a 3D environment. First, we record electroencephalographic (EEG), saccadic and pupillary data from subjects as they move through a small part of a 3D virtual city under free-viewing conditions. Using machine learning, we integrate the neural and ocular signals evoked by the objects they encounter to infer which ones are of subjective interest to them. These inferred labels are propagated through a large computer vision graph of objects in the city, using semi-supervised learning to identify other, unseen objects that are visually similar to the labeled ones. Finally, the system plots an efficient route to help the subjects visit the 'similar' objects it identifies. We show that by exploiting the subjects' implicit labeling to find objects of interest instead of exploring naively, the median search precision is increased from 25% to 97%, and the median subject need only travel 40% of the distance to see 84% of the objects of interest. We also find that the neural and ocular signals contribute in a complementary fashion to the classifiers' inference of subjects' implicit labeling. In summary, we show that neural and ocular signals reflecting subjective assessment of objects in a 3D environment can be used to inform a graph-based learning model of that environment, resulting in an hBCI system that improves navigation and information delivery specific to the user's interests.
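The semi-supervised propagation step can be illustrated generically. Below is a hedged sketch on a toy similarity graph; the damping factor, iteration count, and matrix layout are assumptions, and the actual hBCI system uses a far richer computer-vision graph and classifier:

```python
import numpy as np

def propagate_labels(W, labels, alpha=0.85, n_iter=50):
    """Semi-supervised label propagation: spread sparse interest labels
    (1 = flagged interesting, 0 = unknown) over a visual-similarity
    graph W (symmetric, nonnegative, no isolated nodes). At each step a
    node mixes its neighbors' scores with its own clamped label."""
    P = W / W.sum(axis=1)[:, None]        # row-normalized transition matrix
    f = labels.astype(float)
    for _ in range(n_iter):
        f = alpha * P @ f + (1 - alpha) * labels
    return f
```

Ranking unseen objects by the propagated score and routing the user toward the top of that ranking is the essence of the "efficient route" behavior described above.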

  13. Using the SCERTS model assessment tool to identify music therapy goals for clients with autism spectrum disorder.

    PubMed

    Walworth, Darcy D; Register, Dena; Engel, Judy Nguyen

    2009-01-01

    The purposes of this paper were to identify and compare goals and objectives addressed by music therapists that are contained in the SCERTS Model, for use with children at risk of or diagnosed with a communication impairment, including Autism Spectrum Disorder (ASD). A video analysis of music therapists working with clients at risk or diagnosed with ASD (N = 33) was conducted to: (a) identify the areas of the SCERTS assessment model that music therapists are currently addressing within their sessions for clients with ASD, and (b) compare the frequency of SCERTS domains and goals addressed by music therapists within sessions. Results of the analysis revealed that all three domains of social communication, emotional regulation, and transactional support were addressed within music therapy sessions. Within each domain, both broad goals were addressed: joint attention and symbol use for social communication, self-regulation and mutual regulation for emotional regulation, and interpersonal support and learning support for transactional support. Overall, music therapists addressed transactional support goals and subgoals more often than social communication and emotional regulation goals and subgoals. The highest frequency goal area addressed was interpersonal support (73.96%) and the lowest goal area addressed was joint attention (35.96%). For the social partner and language partner language stages, 58 of the 320 possible subgoals were addressed with 90% frequency or higher, while 13 of the same subgoals were never addressed. The SCERTS Model is designed for use by a multidisciplinary team of professionals and family members throughout a client's treatment and contains an ongoing assessment tool with resulting goals and objectives. This analysis indicates that many SCERTS goals and objectives can be addressed in music therapy interventions.
Additionally, goals and subgoals not previously recognized in music therapy treatment can be generated by the use of the SCERTS Model.

  14. Using Cultural Modeling to Inform a NEDSS-Compatible System Functionality Evaluation

    PubMed Central

    Anderson, Olympia; Torres-Urquidy, Miguel

    2013-01-01

    Objective The culture by which public health professionals work defines their organizational objectives, expectations, policies, and values. These aspects of culture are often intangible and difficult to qualify. The introduction of an information system could further complicate the culture of a jurisdiction if the intangibles of a culture are not clearly understood. This report describes how cultural modeling can be used to capture intangible elements or factors that may affect NEDSS-compatible (NC) system functionalities within the culture of public health jurisdictions. Introduction The National Notifiable Disease Surveillance System (NNDSS) comprises many activities including collaborations, processes, standards, and systems which support gathering data from US states and territories. As part of NNDSS, the National Electronic Disease Surveillance System (NEDSS) provides the standards, tools, and resources to support reporting public health jurisdictions (jurisdictions). The NEDSS Base System (NBS) is a CDC-developed software application available to jurisdictions to collect, manage, analyze and report national notifiable disease (NND) data. An evaluation of NEDSS with the objective of identifying the functionalities of NC systems and the impact of these features on the user’s culture is underway. Methods We used cultural models to capture additional NC system functionality gaps within the culture of the user. Cultural modeling is a process of graphically depicting people and organizations, referred to as influencers, and the intangible factors that affect the user’s operations or work, referred to as influences. Influencers are denoted as bubbles while influences are depicted as arrows penetrating the bubbles. In the cultural model, influence can be seen in the size and proximity (or lack thereof) of the bubbles. We restricted the models to secondary data sources and interviews of CDC programs (data users) and public health jurisdictions (data reporters).
Results Three cultural models were developed from the secondary information sources; these models include the NBS vendor, public health jurisdiction (jurisdiction) activities, and NEDSS technical consultants. The vendor cultural model identified channels of communication about functionalities flowing from the vendor and the NBS users with CDC as the approval mechanism. The jurisdiction activities model highlighted perceived issues external to the organization that had some impact in their organization. Key disconnecting issues in the jurisdiction model included situational awareness, data competency, and bureaucracy. This model also identified poor coordination as a major influencer of the jurisdiction’s activities. The NEDSS technical model identified major issues and disconnects among data access, capture and reporting, processing, and ELR functionalities (Figure 1). The data processing functionality resulted in the largest negative influencer with issues that included: loss of data specificity, lengthy submission strategies, and risk of data use. Collectively, the models depict issues with the system functionality but mostly identify other factors that may influence how jurisdictions use the system, moreover determining the functionalities to be included. Conclusions By using the cultural model as a guide, we are able to clarify complex relationships using multiple data sources and improve our understanding of the impacts of the NC system functionalities on user’s operations. Modeling the recipients of the data (e.g. CDC programs) will provide insight on additional factors that may inform the NEDSS evaluation.

  15. Using site-selection model to identify suitable sites for seagrass transplantation in the west coast of South Sulawesi

    NASA Astrophysics Data System (ADS)

    Lanuru, Mahatma; Mashoreng, S.; Amri, K.

    2018-03-01

    The success of seagrass transplantation depends very much on site selection and on suitable transplantation methods. The main objective of this study is to develop and use a site-selection model to identify the suitability of sites for seagrass (Enhalus acoroides) transplantation. Model development was based on the physical and biological characteristics of the transplantation site. The site-selection process is divided into 3 phases: Phase I identifies potential seagrass habitat using available knowledge and removes unsuitable sites before the transplantation test is performed. Phase II involves field assessment and a transplantation test of the best scoring areas identified in Phase I. Phase III is the final calculation of the TSI (Transplant Suitability Index), based on results from Phases I and II. The model was used to identify the suitability of sites for seagrass transplantation on the west coast of South Sulawesi (3 sites at Labakkang Coast, 3 sites at Awerange Bay, and 3 sites at Lale-Lae Island). Of the 9 sites, two were predicted by the site-selection model to be the most suitable for seagrass transplantation: Site II at Labakkang Coast and Site III at Lale-Lae Island.
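The record does not give the TSI formula, so the following is purely hypothetical: one plausible form is a weighted mean of phase I habitat-criterion scores and phase II transplant-test survival, with invented weights.

```python
def tsi(habitat_scores, survival_scores, w1=0.4, w2=0.6):
    """Hypothetical Transplant Suitability Index sketch: weighted mean of
    phase I habitat-criterion scores (0-1, e.g. depth, light, sediment)
    and phase II transplant-test survival fractions (0-1). The weights
    and the weighted-mean form are assumptions, not from the study."""
    s1 = sum(habitat_scores) / len(habitat_scores)   # phase I average
    s2 = sum(survival_scores) / len(survival_scores) # phase II average
    return w1 * s1 + w2 * s2
```

Ranking candidate sites by such an index, and transplanting only at the top scorers, mirrors the three-phase filtering the study describes.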

  16. A Penalized Robust Method for Identifying Gene-Environment Interactions

    PubMed Central

    Shi, Xingjie; Liu, Jin; Huang, Jian; Zhou, Yong; Xie, Yang; Ma, Shuangge

    2015-01-01

    In high-throughput studies, an important objective is to identify gene-environment interactions associated with disease outcomes and phenotypes. Many commonly adopted methods assume specific parametric or semiparametric models, which may be subject to model mis-specification. In addition, they usually use significance level as the criterion for selecting important interactions. In this study, we adopt rank-based estimation, which is much less sensitive to model specification than some of the existing methods and includes several commonly encountered data and models as special cases. Penalization is adopted for the identification of gene-environment interactions. It achieves simultaneous estimation and identification and does not rely on significance level. For computational feasibility, a smoothed rank estimation is further proposed. Simulation shows that under certain scenarios, for example with contaminated or heavy-tailed data, the proposed method can significantly outperform the existing alternatives with more accurate identification. We analyze a lung cancer prognosis study with gene expression measurements under the AFT (accelerated failure time) model. The proposed method identifies interactions different from those using the alternatives. Some of the identified genes have important implications. PMID:24616063

  17. Student generated learning objectives: extent of congruence with faculty set objectives and factors influencing their generation.

    PubMed

    Abdul Ghaffar Al-Shaibani, Tarik A; Sachs-Robertson, Annette; Al Shazali, Hafiz O; Sequeira, Reginald P; Hamdy, Hosam; Al-Roomi, Khaldoon

    2003-07-01

    A problem-based learning strategy is used for curriculum planning and implementation at the Arabian Gulf University, Bahrain. Problems are constructed so that faculty-set objectives are expected to be identified by students during tutorials. Students in small groups, along with a tutor functioning as a facilitator, identify learning issues and define their learning objectives. We compared the objectives identified by student groups with the faculty-set objectives to determine the extent of congruence, and identified factors that influenced students' ability to identify faculty-set objectives. Male and female students were segregated and randomly grouped, and a faculty tutor was allocated to each group. The study was based on 13 problems given to entry-level medical students. The pooled objectives of these problems were classified into four categories: structural, functional, clinical, and psychosocial. Univariate analysis of variance was used for comparison, and p < 0.05 was considered significant. On average, students generated 54.2% of the objectives for each problem. Students identified psychosocial learning objectives more readily than structural ones: female students identified more psychosocial objectives, whereas male students identified more structural objectives. Tutor characteristics, such as medical/non-medical background and years of teaching, were correlated with the categories of learning issues identified. Students identify part of the faculty-set learning objectives during tutorials with a faculty tutor acting as a facilitator; students' gender influences the types of learning issues identified, and tutors' content expertise does not influence students' identification of learning needs.

  18. Images of intravitreal objects projected onto posterior surface of model eye.

    PubMed

    Kawamura, Ryosuke; Shinoda, Kei; Inoue, Makoto; Noda, Toru; Ohnuma, Kazuhiko; Hirakata, Akito

    2013-11-01

    To try to recreate the images reported by patients during vitreous surgery in a model eye. A fluid-filled model eye with a posterior frosted translucent surface which corresponded to the retina was used. Three holes were made in the model eye through which an endoillumination pipe and intraocular forceps could be inserted. A thin plastic sheet simulating an epiretinal membrane and an intraocular lens (IOL) simulating a dislocated IOL were placed on the retina. The images falling on the posterior surface were photographed from the rear. The images seen through the surgical microscope were also recorded. The images from the rear were mirror images of those seen through the surgical microscope. Intraocular instruments were seen as black shafts from the rear. When the plastic sheet was picked up, the tip of the forceps was seen more sharply on the posterior surface. The images of the dislocated IOL from the posterior were similar to that seen through the surgical microscope, including the yellow optics and blue haptics. Intravitreal objects can form images on the surface of a model eye. Objects located closer to the surface are seen more sharply, and the colour of the objects can be identified. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  19. Performance and Reliability Optimization for Aerospace Systems subject to Uncertainty and Degradation

    NASA Technical Reports Server (NTRS)

    Miller, David W.; Uebelhart, Scott A.; Blaurock, Carl

    2004-01-01

    This report summarizes work performed by the Space Systems Laboratory (SSL) for NASA Langley Research Center in the field of performance optimization for systems subject to uncertainty. The objective of the research is to develop design methods and tools for the aerospace vehicle design process that take lifecycle uncertainties into account. It recognizes that uncertainty between the predictions of integrated models and data collected from the system in its operational environment is unavoidable. Given the presence of uncertainty, the goal of this work is to develop means of identifying critical sources of uncertainty and to combine these with the analytical tools used in integrated modeling. In this manner, system uncertainty analysis becomes part of the design process and can motivate redesign. The specific program objectives were: 1. To incorporate uncertainty modeling, propagation, and analysis into the integrated (controls, structures, payloads, disturbances, etc.) design process to derive the error bars associated with performance predictions. 2. To apply modern optimization tools to guide the expenditure of funds in a way that most cost-effectively improves the lifecycle productivity of the system by enhancing subsystem reliability and redundancy. The results for the second program objective are reported separately; this report describes the work and results for the first objective: uncertainty modeling, propagation, and synthesis with integrated modeling.
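
    The first objective, deriving error bars for performance predictions by propagating parameter uncertainty through an integrated model, can be illustrated with a minimal Monte Carlo sketch. The toy performance function, parameter names, and uncertainty levels are invented for illustration and are not from the report.

    ```python
    import random
    import statistics

    random.seed(42)

    def performance(stiffness, damping):
        """Toy stand-in for an integrated structures/controls model:
        predicted RMS pointing jitter (arbitrary units)."""
        return 1.0 / (stiffness * damping) ** 0.5

    def propagate(n=10000):
        """Sample uncertain parameters, run the model, and summarize
        the spread of the performance prediction."""
        samples = []
        for _ in range(n):
            k = random.gauss(100.0, 10.0)   # uncertain stiffness, ~10%
            c = random.gauss(0.02, 0.002)   # uncertain damping, ~10%
            samples.append(performance(k, c))
        return statistics.mean(samples), statistics.pstdev(samples)

    mean, sd = propagate()
    print(f"predicted jitter: {mean:.3f} +/- {2 * sd:.3f} (2-sigma)")
    ```

    Ranking the parameters by their contribution to the output variance would then identify the critical sources of uncertainty to target for redesign.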

  20. Neural network identification of aircraft nonlinear aerodynamic characteristics

    NASA Astrophysics Data System (ADS)

    Egorchev, M. V.; Tiumentsev, Yu V.

    2018-02-01

    The problem of simulating controlled aircraft motion is considered in the case of imperfect knowledge of the modeled object and its operating conditions. The work aims to develop a class of modular semi-empirical dynamic models that combine the capabilities of theoretical and neural network modeling. We consider the use of semi-empirical neural network models for solving the problem of identifying the aerodynamic characteristics of an aircraft. We also discuss the problem of forming a representative set of data characterizing the behavior of the simulated dynamic system, which is one of the critical tasks in the synthesis of ANN models. The effectiveness of the proposed approach is demonstrated using a simulation example of aircraft angular motion and identification of the corresponding coefficients of aerodynamic forces and moments.

  1. Perspectives On Dilution Jet Mixing

    NASA Technical Reports Server (NTRS)

    Holdeman, J. D.; Srinivasan, R.

    1990-01-01

    NASA recently completed a program of measurements and modeling of the mixing of transverse jets with a ducted crossflow, motivated by the need to design or tailor the temperature pattern at the combustor exit in gas turbine engines. The objectives of the program were to identify the dominant physical mechanisms governing mixing, to extend empirical models to provide near-term predictive capability, and to compare numerical code calculations with data to guide future analysis improvement efforts.

  2. Behavioral Change Theories Can Inform the Prediction of Young Adults' Adoption of a Plant-Based Diet

    ERIC Educational Resources Information Center

    Wyker, Brett A.; Davison, Kirsten K.

    2010-01-01

    Objective: Drawing on the Theory of Planned Behavior (TPB) and the Transtheoretical Model (TTM), this study (1) examines links between stages of change for following a plant-based diet (PBD) and consuming more fruits and vegetables (FV); (2) tests an integrated theoretical model predicting intention to follow a PBD; and (3) identifies associated…

  3. A Model Pilot Program for Training Personnel to Develop Solutions to Major Educational Problems. Final Report.

    ERIC Educational Resources Information Center

    Cullinan, Paul A.; Merrifield, Philip R.

    This document is the final report of the Model Educational Research Training (MERT) program, a graduate program of the New York University School of Education. MERT trains urban school staff in skills necessary to identify problems, design valid research projects, and apply research results. The long-term objective is the training of small groups…

  4. The Role of Loneliness in the Relationship between Anxiety and Depression in Clinical and School-Based Youth

    ERIC Educational Resources Information Center

    Ebesutani, Chad; Fierstein, Matthew; Viana, Andres G.; Trent, Lindsay; Young, John; Sprung, Manuel

    2015-01-01

    Identifying mechanisms that explain the relationship between anxiety and depression is needed. The Tripartite Model is one model that has been proposed to help explain the association between these two problems, positing a shared component called negative affect. The objective of the present study was to examine the role of loneliness in relation…

  5. Trends in highway construction costs in Louisiana : technical summary.

    DOT National Transportation Integrated Search

    1999-09-01

    The objectives of this study are to observe past trends in highway construction costs in Louisiana, identify factors that determine these costs, quantify their impact, and establish a model that can be used to predict future construction cost in Loui...

  6. Modeling the tagged-neutron UXO identification technique using the Geant4 toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou Y.; Mitra S.; Zhu X.

    2011-10-16

    It is proposed to use 14 MeV neutrons tagged by the associated particle neutron time-of-flight technique (APnTOF) to identify the fillers of unexploded ordnances (UXO) by characterizing their carbon, nitrogen and oxygen contents. To facilitate the design and construction of a prototype system, a preliminary simulation model was developed using the Geant4 toolkit. This work established the toolkit environment for (a) generating tagged neutrons, (b) their transport and interactions within a sample to induce emission and detection of characteristic gamma-rays, and (c) 2D and 3D-image reconstruction of the interrogated object using the neutron and gamma-ray time-of-flight information. Using the modeling, this article demonstrates the novelty of the tagged-neutron approach for extracting useful signals with high signal-to-background discrimination of an object-of-interest from that of its environment. Simulations indicated that a UXO filled with the RDX explosive, hexogen (C3H6O6N6), can be identified to a depth of 20 cm when buried in soil.

  7. Experimental and theoretical modelling of sand-water-object interaction under nonlinear progressive waves

    NASA Astrophysics Data System (ADS)

    Testik, Firat Yener

    An experimental and theoretical study has been conducted to obtain a fundamental understanding of the dynamics of sand, water and solid object interaction as progressive gravity waves impinge on a sloping beach. Aside from obvious scientific interest, this exceedingly complex physical problem is important for naval applications related to the behavior of disk/cylindrical shaped objects (mines) in coastal waters. To address this problem, it was divided into a set of simpler basic problems. To begin, nonlinear progressive waves were investigated experimentally in a wave tank for the case of a rigid (impermeable) sloping bottom. Parameterizations for wave characteristics were proposed and compared with the experiments. In parallel, a numerical wave tank model (NWT) was calibrated using experimental data from a single run, and the wave field in the wave tank was simulated numerically for the selected experiments. Subsequently, a layer of sand was placed on the slope and bottom topography evolution processes (ripple and sandbar dynamics, bottom topography relaxation under variable wave forcing, etc.) were investigated experimentally. Models for those processes were developed and verified by experimental measurements. Flow over a circular cylinder placed horizontally on a plane wall was also studied. The far-flow field of the cylinder placed in the wave tank was investigated experimentally, and numerical results from the NWT simulations were compared with the experimental data. Meanwhile, the near-flow velocity/vorticity field around a short cylinder under steady and oscillatory flow was studied in a towing tank. Horseshoe vortex formation and periodic shedding were documented and explained. With the understanding gained through the aforementioned studies, the dynamics and burial/scour around bottom objects in the wave tank were studied. Possible scenarios for the behavior of the disk-shaped objects were identified and explained. 
Scour around 3D cylindrical objects was investigated. Different scour regimes were identified experimentally and explained theoretically. Proper physical parameterizations on the time evolution and equilibrium scour characteristics were proposed and verified experimentally.

  8. Planning of reach-and-grasp movements: effects of validity and type of object information

    NASA Technical Reports Server (NTRS)

    Loukopoulos, L. D.; Engelbrecht, S. F.; Berthier, N. E.

    2001-01-01

    Individuals are assumed to plan reach-and-grasp movements by using two separate processes. In 1 of the processes, extrinsic (direction, distance) object information is used in planning the movement of the arm that transports the hand to the target location (transport planning); whereas in the other, intrinsic (shape) object information is used in planning the preshaping of the hand and the grasping of the target object (manipulation planning). In 2 experiments, the authors used primes to provide information to participants (N = 5, Experiment 1; N = 6, Experiment 2) about extrinsic and intrinsic object properties. The validity of the prime information was systematically varied. The primes were succeeded by a cue, which always correctly identified the location and shape of the target object. Reaction times were recorded. Four models of transport and manipulation planning were tested. The only model that was consistent with the data was 1 in which arm transport and object manipulation planning were postulated to be independent processes that operate partially in parallel. The authors suggest that the processes involved in motor planning before execution are primarily concerned with the geometric aspects of the upcoming movement but not with the temporal details of its execution.

  9. An enhanced digital line graph design

    USGS Publications Warehouse

    Guptill, Stephen C.

    1990-01-01

    In response to increasing information demands on its digital cartographic data, the U.S. Geological Survey has designed an enhanced version of the Digital Line Graph, termed Digital Line Graph - Enhanced (DLG-E). In the DLG-E model, the phenomena represented by geographic and cartographic data are termed entities. Entities represent individual phenomena in the real world. A feature is an abstraction of a set of entities, with the feature description encompassing only selected properties of the entities (typically the properties that have been portrayed cartographically on a map). Buildings, bridges, roads, streams, grasslands, and counties are examples of features. A feature instance, that is, one occurrence of a feature, is described in the digital environment by feature objects and spatial objects. A feature object identifies a feature instance and its nonlocational attributes. Nontopological relationships are associated with feature objects. The locational aspects of the feature instance are represented by spatial objects. Four spatial objects (points, nodes, chains, and polygons) and their topological relationships are defined. To link the locational and nonlocational aspects of the feature instance, a given feature object is associated with (or is composed of) a set of spatial objects. These objects, attributes, and relationships are the components of the DLG-E data model. To establish a domain of features for DLG-E, an approach using a set of classes, or views, of spatial entities was adopted. The five views that were developed are cover, division, ecosystem, geoposition, and morphology. The views are exclusive; each view is a self-contained analytical approach to the entire range of world features. Because each view is independent of the others, a single point on the surface of the Earth can be represented under multiple views. Under the five views, over 200 features were identified and defined. This set constitutes an initial domain of DLG-E features.
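
    The DLG-E separation between feature objects (identity and nonlocational attributes) and spatial objects (points, nodes, chains, polygons) can be sketched as a small set of linked data structures. The class and field names below are illustrative, not the USGS schema.

    ```python
    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class Node:
        """Topological spatial object: a 0-D junction point."""
        id: int
        xy: Tuple[float, float]

    @dataclass
    class Chain:
        """Topological spatial object: a 1-D directed edge between nodes."""
        id: int
        start: Node
        end: Node
        vertices: List[Tuple[float, float]] = field(default_factory=list)

    @dataclass
    class FeatureObject:
        """A feature instance: identity plus nonlocational attributes,
        composed of the spatial objects that carry its location."""
        name: str
        attributes: dict
        geometry: List[Chain] = field(default_factory=list)

    # one road feature instance linked to its spatial objects
    n1, n2 = Node(1, (0.0, 0.0)), Node(2, (1.0, 0.5))
    road = FeatureObject("road", {"surface": "paved", "lanes": 2},
                         [Chain(10, n1, n2)])
    print(road.name, len(road.geometry))
    ```

    Topological relationships (which chains meet at a node, which polygons a chain bounds) would hang off the spatial objects, while nontopological relationships attach to the feature objects, as the abstract describes.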

  10. On Specifying the Functional Design for a Protected DMS Tool

    DTIC Science & Technology

    1977-03-01

    of a secure data management system in terms of abstract entities. In keeping with this, the model identifies a security policy which is sufficient... policy of the model may be expressed, therefore, as the rules which mediate the access of subjects to objects. The access authorization of the... level of a subject; however, this possibly is not acknowledged in our model. The specification of the DMS tool embodies this protection policy

  11. Microsoft kinect-based artificial perception system for control of functional electrical stimulation assisted grasping.

    PubMed

    Strbac, Matija; Kočović, Slobodan; Marković, Marko; Popović, Dejan B

    2014-01-01

    We present a computer vision algorithm that incorporates a heuristic model mimicking a biological control system for the estimation of control signals used in functional electrical stimulation (FES) assisted grasping. The processing software acquires data from a Microsoft Kinect camera and implements real-time hand tracking and object analysis. This information can be used to identify temporal synchrony and spatial synergy modalities for FES control. The algorithm therefore acts as artificial perception, mimicking human visual perception by identifying the position and shape of the object with respect to the position of the hand in real time during the planning phase of the grasp. This artificial perception, used within the heuristically developed model, allows selection of the appropriate grasp and prehension. The experiments demonstrate that the correct grasp modality was selected in more than 90% of tested scenarios/objects. The system is portable, and its components are low-cost and robust; hence, it can be used for FES in clinical or even home environments. The main application envisioned for the system is functional electrical therapy, that is, intensive exercise assisted with FES.
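
    The heuristic mapping from measured object geometry to grasp modality might look like the following rule-based sketch. The grasp names and size thresholds are invented for illustration; the paper's actual decision rules are not reproduced here.

    ```python
    def select_grasp(width_cm, height_cm):
        """Choose a grasp modality from the object's apparent dimensions,
        as estimated from depth-camera object analysis."""
        if width_cm < 3.0 and height_cm < 3.0:
            return "precision"   # small objects: fingertip grip
        if width_cm < 6.0:
            return "lateral"     # thin objects: thumb-to-side grip
        return "palmar"          # larger objects: whole-hand grip

    # hypothetical objects measured during the grasp-planning phase
    for dims in [(2.0, 2.5), (4.0, 8.0), (9.0, 10.0)]:
        print(dims, "->", select_grasp(*dims))
    ```

    The selected modality would then drive the temporal pattern of FES applied to the forearm muscles.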

  12. Microsoft Kinect-Based Artificial Perception System for Control of Functional Electrical Stimulation Assisted Grasping

    PubMed Central

    Kočović, Slobodan; Popović, Dejan B.

    2014-01-01

    We present a computer vision algorithm that incorporates a heuristic model mimicking a biological control system for the estimation of control signals used in functional electrical stimulation (FES) assisted grasping. The processing software acquires data from a Microsoft Kinect camera and implements real-time hand tracking and object analysis. This information can be used to identify temporal synchrony and spatial synergy modalities for FES control. The algorithm therefore acts as artificial perception, mimicking human visual perception by identifying the position and shape of the object with respect to the position of the hand in real time during the planning phase of the grasp. This artificial perception, used within the heuristically developed model, allows selection of the appropriate grasp and prehension. The experiments demonstrate that the correct grasp modality was selected in more than 90% of tested scenarios/objects. The system is portable, and its components are low-cost and robust; hence, it can be used for FES in clinical or even home environments. The main application envisioned for the system is functional electrical therapy, that is, intensive exercise assisted with FES. PMID:25202707

  13. Many-objective robust decision making for water allocation under climate change.

    PubMed

    Yan, Dan; Ludwig, Fulco; Huang, He Qing; Werners, Saskia E

    2017-12-31

    Water allocation is facing profound challenges due to climate change uncertainties. To identify adaptive water allocation strategies that are robust to climate change uncertainties, a model framework combining many-objective robust decision making and biophysical modeling is developed for large rivers. The framework was applied to the Pearl River basin (PRB), China, where sufficient flow to the delta is required to reduce saltwater intrusion in the dry season. Before identifying and assessing robust water allocation plans for the future, the performance of ten state-of-the-art MOEAs (multi-objective evolutionary algorithms) is evaluated for the water allocation problem in the PRB. The Borg multi-objective evolutionary algorithm (Borg MOEA), a self-adaptive optimization algorithm, has the best performance during the historical periods; it is therefore selected to generate new water allocation plans for the future (2079-2099). This study shows that robust decision making using carefully selected MOEAs can help limit saltwater intrusion in the Pearl River Delta. However, the framework could perform poorly under larger-than-expected climate change impacts on water availability. Results also show that subjective design choices by the researchers and/or water managers could limit the ability of the model framework and cause the most robust water allocation plans to fail under future climate change. Developing robust allocation plans in a river basin suffering from increasing water shortage requires researchers and water managers to characterize well the future climate change of the study regions and the vulnerabilities of their tools. Copyright © 2017 Elsevier B.V. All rights reserved.
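
    The robustness-screening step of such a framework can be sketched with a minimax-regret rule: score each candidate plan across climate scenarios and keep the plan whose worst-case regret is smallest. The plans, scenarios, and shortfall numbers below are invented for illustration, not PRB results.

    ```python
    # dry-season delta flow shortfall (arbitrary units) per plan and scenario
    shortfall = {
        "plan_A": {"wet": 0.1, "median": 0.6, "dry": 2.0},
        "plan_B": {"wet": 0.3, "median": 0.5, "dry": 1.1},
        "plan_C": {"wet": 0.2, "median": 0.9, "dry": 1.6},
    }

    def minimax_regret(table):
        """Return the plan minimizing worst-case regret, plus all regrets.
        Regret = shortfall minus the best achievable shortfall in that
        scenario; robustness = smallest maximum regret across scenarios."""
        scenarios = next(iter(table.values())).keys()
        best = {s: min(plan[s] for plan in table.values()) for s in scenarios}
        regret = {name: max(plan[s] - best[s] for s in scenarios)
                  for name, plan in table.items()}
        return min(regret, key=regret.get), regret

    robust_plan, regrets = minimax_regret(shortfall)
    print(robust_plan, {k: round(v, 2) for k, v in regrets.items()})
    ```

    In the full framework, the candidate plans would come from an MOEA search rather than a hand-written table, and robustness would be assessed over many more scenarios.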

  14. The Master Lens Database and The Orphan Lenses Project

    NASA Astrophysics Data System (ADS)

    Moustakas, Leonidas

    2012-10-01

    Strong gravitational lenses are uniquely suited for the study of dark matter structure and substructure within massive halos of many scales, act as gravitational telescopes for distant faint objects, and can give powerful and competitive cosmological constraints. While hundreds of strong lenses are known to date, spanning five orders of magnitude in mass scale, thousands will be identified this decade. To fully exploit the power of these objects presently, and in the near future, we are creating the Master Lens Database. This is a clearinghouse of all known strong lens systems, with a sophisticated and modern database of uniformly measured and derived observational and lens-model derived quantities, using archival Hubble data across several instruments. This Database enables new science that can be done with a comprehensive sample of strong lenses. The operational goal of this proposal is to develop the process and the code to semi-automatically stage Hubble data of each system, create appropriate masks of the lensing objects and lensing features, and derive gravitational lens models, to provide a uniform and fairly comprehensive information set that is ingested into the Database. The scientific goal for this team is to use the properties of the ensemble of lenses to make a new study of the internal structure of lensing galaxies, and to identify new objects that show evidence of strong substructure lensing, for follow-up study. All data, scripts, masks, model setup files, and derived parameters, will be public, and free. The Database will be accessible online and through a sophisticated smartphone application, which will also be free.

  15. Logistic regression modeling to assess groundwater vulnerability to contamination in Hawaii, USA

    NASA Astrophysics Data System (ADS)

    Mair, Alan; El-Kadi, Aly I.

    2013-10-01

    Capture zone analysis combined with a subjective susceptibility index is currently used in Hawaii to assess vulnerability to contamination of drinking water sources derived from groundwater. In this study, we developed an alternative objective approach that combines well capture zones with multiple-variable logistic regression (LR) modeling and applied it to the highly-utilized Pearl Harbor and Honolulu aquifers on the island of Oahu, Hawaii. Input for the LR models utilized explanatory variables based on hydrogeology, land use, and well geometry/location. A suite of 11 target contaminants detected in the region, including elevated nitrate (> 1 mg/L), four chlorinated solvents, four agricultural fumigants, and two pesticides, was used to develop the models. We then tested the ability of the new approach to accurately separate groups of wells with low and high vulnerability, and the suitability of nitrate as an indicator of other types of contamination. Our results produced contaminant-specific LR models that accurately identified groups of wells with the lowest/highest reported detections and the lowest/highest nitrate concentrations. Current and former agricultural land uses were identified as significant explanatory variables for eight of the 11 target contaminants, while elevated nitrate was a significant variable for five contaminants. The utility of the combined approach is contingent on the availability of hydrologic and chemical monitoring data for calibrating groundwater and LR models. Application of the approach using a reference site with sufficient data could help identify key variables in areas with similar hydrogeology and land use but limited data. In addition, elevated nitrate may also be a suitable indicator of groundwater contamination in areas with limited data. 
The objective LR modeling approach developed in this study is flexible enough to address a wide range of contaminants and represents a suitable addition to the current subjective approach.
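
    The core of the approach, a multiple-variable logistic regression predicting contaminant detection at a well from explanatory variables, can be sketched as follows. The variables, coefficients, and synthetic data are illustrative, not the Oahu dataset; a plain gradient-ascent fit stands in for standard statistical software.

    ```python
    import math
    import random

    random.seed(1)

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    def fit_logistic(X, y, steps=2000, lr=1.0):
        """Gradient ascent on the log-likelihood; w[0] is the intercept."""
        w = [0.0] * (len(X[0]) + 1)
        n = len(X)
        for _ in range(steps):
            grad = [0.0] * len(w)
            for xi, yi in zip(X, y):
                p = sigmoid(w[0] + sum(a * b for a, b in zip(w[1:], xi)))
                err = yi - p
                grad[0] += err
                for k, xk in enumerate(xi):
                    grad[k + 1] += err * xk
            w = [wk + lr * g / n for wk, g in zip(w, grad)]
        return w

    # synthetic wells: explanatory variables are agricultural land fraction
    # and (scaled) depth to water; detections drawn from a known model
    X, y = [], []
    for _ in range(200):
        ag, depth = random.random(), random.random()
        p_detect = sigmoid(4.0 * ag - 2.0 * depth - 1.0)
        X.append([ag, depth])
        y.append(1 if random.random() < p_detect else 0)

    w = fit_logistic(X, y)
    print([round(v, 2) for v in w])
    ```

    Fitted probabilities for each well would then rank vulnerability, with significant positive coefficients (here, agricultural land use) flagging the key explanatory variables.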

  16. Traffic Behavior Recognition Using the Pachinko Allocation Model

    PubMed Central

    Huynh-The, Thien; Banos, Oresti; Le, Ba-Vui; Bui, Dinh-Mao; Yoon, Yongik; Lee, Sungyoung

    2015-01-01

    CCTV-based behavior recognition systems have gained considerable attention in recent years in the transportation surveillance domain for identifying unusual patterns, such as traffic jams, accidents, dangerous driving and other abnormal behaviors. In this paper, a novel approach for traffic behavior modeling is presented for video-based road surveillance. The proposed system combines the pachinko allocation model (PAM) and support vector machine (SVM) for a hierarchical representation and identification of traffic behavior. A background subtraction technique using Gaussian mixture models (GMMs) and an object tracking mechanism based on Kalman filters are utilized to firstly construct the object trajectories. Then, the sparse features comprising the locations and directions of the moving objects are modeled by PAM into traffic topics, namely activities and behaviors. As a key innovation, PAM captures not only the correlation among the activities, but also among the behaviors based on the arbitrary directed acyclic graph (DAG). The SVM classifier is then utilized on top to train and recognize the traffic activity and behavior. The proposed model shows more flexibility and greater expressive power than the commonly-used latent Dirichlet allocation (LDA) approach, leading to a higher recognition accuracy in the behavior classification. PMID:26151213
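
    The trajectory-construction stage described above rests on Kalman filtering of noisy object positions. The following is a minimal constant-velocity Kalman filter written out for 1-D motion; the noise levels and the synthetic track are assumptions for the sketch, and the real system would run this per object in 2-D on GMM-segmented detections.

    ```python
    import random

    random.seed(7)

    def kalman_track(measurements, dt=1.0, q=1e-3, r=0.25):
        """Filter noisy 1-D position measurements with a constant-velocity
        model. State is (position x, velocity v); r is the measurement
        variance, q the process noise added to the state covariance P."""
        x, v = measurements[0], 0.0
        P = [[1.0, 0.0], [0.0, 1.0]]
        out = []
        for z in measurements:
            # predict: x' = x + dt*v, P' = F P F^T + Q
            x, v = x + dt * v, v
            P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
                  P[0][1] + dt * P[1][1]],
                 [P[1][0] + dt * P[1][1],
                  P[1][1] + q]]
            # update with position measurement z (H = [1, 0])
            S = P[0][0] + r
            K0, K1 = P[0][0] / S, P[1][0] / S
            innov = z - x
            x, v = x + K0 * innov, v + K1 * innov
            P = [[(1 - K0) * P[0][0], (1 - K0) * P[0][1]],
                 [P[1][0] - K1 * P[0][0], P[1][1] - K1 * P[0][1]]]
            out.append(x)
        return out

    # synthetic object moving at constant speed, observed with noise
    truth = [0.5 * t for t in range(30)]
    noisy = [p + random.gauss(0, 0.5) for p in truth]
    filtered = kalman_track(noisy)
    print([round(p, 2) for p in filtered[-3:]])
    ```

    The filtered positions and their frame-to-frame directions form the sparse features that PAM then groups into activities and behaviors.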

  17. Economics of a nest-box program for the conservation of an endangered species: a reappraisal

    Treesearch

    Daniel A. Spring; Michael Bevers; John O.S. Kennedy; Dan Harley

    2001-01-01

    An optimization model is developed to identify timing and placement strategies for the installation of nest boxes and the harvesting of timber to meet joint timber–wildlife objectives. Optimal management regimes are determined on the basis of their impacts on the local abundance of a threatened species and net present value (NPV) and are identified for a range of NPV...

  18. Integrating land cover modeling and adaptive management to conserve endangered species and reduce catastrophic fire risk

    USGS Publications Warehouse

    Breininger, David; Duncan, Brean; Eaton, Mitchell J.; Johnson, Fred; Nichols, James

    2014-01-01

    Land cover modeling is used to inform land management, but most often via a two-step process, where science informs how management alternatives can influence resources, and then, decision makers can use this information to make decisions. A more efficient process is to directly integrate science and decision-making, where science allows us to learn in order to better accomplish management objectives and is developed to address specific decisions. Co-development of management and science is especially productive when decisions are complicated by multiple objectives and impeded by uncertainty. Multiple objectives can be met by the specification of tradeoffs, and relevant uncertainty can be addressed through targeted science (i.e., models and monitoring). We describe how to integrate habitat and fuel monitoring with decision-making focused on the dual objectives of managing for endangered species and minimizing catastrophic fire risk. Under certain conditions, both objectives might be achieved by a similar management policy; other conditions require tradeoffs between objectives. Knowledge about system responses to actions can be informed by developing hypotheses based on ideas about fire behavior and then applying competing management actions to different land units in the same system state. Monitoring and management integration is important to optimize state-specific management decisions and to increase knowledge about system responses. We believe this approach has broad utility and identifies a clear role for land cover modeling programs intended to inform decision-making.

  19. Blended near-optimal alternative generation, visualization, and interaction for water resources decision making

    NASA Astrophysics Data System (ADS)

    Rosenberg, David E.

    2015-04-01

    State-of-the-art systems analysis techniques focus on efficiently finding optimal solutions. Yet an optimal solution is optimal only for the modeled issues, and managers often seek near-optimal alternatives that address unmodeled objectives, preferences, limits, uncertainties, and other issues. Early on, Modeling to Generate Alternatives (MGA) formalized near-optimal as performance within a tolerable deviation from the optimal objective function value and identified a few maximally different alternatives that addressed some unmodeled issues. This paper presents new stratified Markov chain Monte Carlo sampling and parallel coordinate plotting tools that generate and communicate the structure and extent of the near-optimal region of an optimization problem. Interactive plot controls allow users to explore the region features of most interest. Controls also streamline the process of eliciting unmodeled issues and updating the model formulation in response to elicited issues. Use of the tools on an example single-objective linear water quality management problem at Echo Reservoir, Utah, identifies numerous flexible practices that reduce the phosphorus load to the reservoir while maintaining close-to-optimal performance. Flexibility is upheld by further interactive alternative generation, transforming the formulation into a multiobjective problem, and relaxing the tolerance parameter to expand the near-optimal region. Compared to MGA, the new blended tools generate more numerous alternatives faster, show the near-optimal region more fully, and help elicit a larger set of unmodeled issues.
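
    The near-optimal idea can be illustrated with a simple (unstratified) Monte Carlo sketch: sample decision vectors, keep those within a tolerable deviation of the optimal objective value, and report the region's extent per variable. The toy two-variable cost model and the 10% tolerance are invented for illustration, not the Echo Reservoir formulation.

    ```python
    import random

    random.seed(3)

    def cost(x1, x2):
        # toy management-cost surface with many near-tied alternatives:
        # x2 is much cheaper to vary than x1
        return 5.0 + (x1 - 2.0) ** 2 + 0.1 * (x2 - 1.0) ** 2

    optimal = cost(2.0, 1.0)    # known optimum of the toy model
    tolerance = 1.10            # accept solutions within 10% of optimal

    accepted = []
    for _ in range(20000):
        x1 = random.uniform(0.0, 4.0)
        x2 = random.uniform(-3.0, 5.0)
        if cost(x1, x2) <= tolerance * optimal:
            accepted.append((x1, x2))

    # extent of the near-optimal region along each decision variable
    span = lambda i: (max(p[i] for p in accepted) -
                      min(p[i] for p in accepted))
    print(len(accepted), round(span(0), 2), round(span(1), 2))
    ```

    The wide span of x2 relative to x1 is the kind of flexibility the parallel coordinate plots would reveal: managers can vary x2 freely while staying near-optimal.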

  20. [Development of an Operational Model for the Application of Planning-Programming-Budgeting Systems in Local School Districts. Program Budgeting Note 3, Cost-Effectiveness Analysis: What Is It?

    ERIC Educational Resources Information Center

    State Univ. of New York, Buffalo. Western New York School Study Council.

    Cost effectiveness analysis is used in situations where benefits and costs are not readily converted into a money base. Five elements can be identified in such an analytic process: (1) The objective must be defined in terms of what it is and how it is attained; (2) alternatives to the objective must be clearly definable; (3) the costs must be…

  1. Evaluation of Stratospheric Transport in New 3D Models Using the Global Modeling Initiative Grading Criteria

    NASA Technical Reports Server (NTRS)

    Strahan, Susan E.; Douglass, Anne R.; Einaudi, Franco (Technical Monitor)

    2001-01-01

The Global Modeling Initiative (GMI) Team developed objective criteria for model evaluation in order to identify the best representation of the stratosphere. This work created a method to quantitatively and objectively discriminate between different models. In the original GMI study, 3 different meteorological data sets were used to run an offline chemistry and transport model (CTM). Observationally-based grading criteria were derived and applied to these simulations and various aspects of stratospheric transport were evaluated; grades were assigned. Here we report on the application of the GMI evaluation criteria to CTM simulations integrated with a new assimilated wind data set and a new general circulation model (GCM) wind data set. The Finite Volume Community Climate Model (FV-CCM) is a new GCM developed at Goddard which uses the NCAR CCM physics and the Lin and Rood advection scheme. The FV-Data Assimilation System (FV-DAS) is a new data assimilation system which uses the FV-CCM as its core model. One-year CTM simulations at 2.5 degrees longitude by 2 degrees latitude resolution were run for each wind data set. We present the evaluation of temperature and annual transport cycles in the lower and middle stratosphere in the two new CTM simulations. We include an evaluation of high-latitude transport, which was not part of the original GMI criteria. Grades for the new simulations will be compared with those assigned during the original GMI evaluations and areas of improvement will be identified.

  2. Identifying and Modeling Dynamic Preference Evolution in Multipurpose Water Resources Systems

    NASA Astrophysics Data System (ADS)

    Mason, E.; Giuliani, M.; Castelletti, A.; Amigoni, F.

    2018-04-01

Multipurpose water systems are usually operated on a tradeoff of conflicting operating objectives. Under steady state climatic and socioeconomic conditions, such a tradeoff is supposed to represent a fair and/or efficient preference. Extreme variability in external forcing might affect water operators' risk aversion and force a change in their preferences. Properly accounting for these shifts is key to any rigorous retrospective assessment of the operator's behaviors, and to building descriptive models for projecting the future system evolution. In this study, we explore how the selection of different preferences is linked to variations in the external forcing. We argue that preference selection evolves according to recent, extreme variations in system performance: underperforming in one of the objectives pushes the preference toward the harmed objective. To test this assumption, we developed a rational procedure to simulate the operator's preference selection. We map this selection onto a multilateral negotiation, where multiple virtual agents independently optimize different objectives. The agents periodically negotiate a compromise policy for the operation of the system. Agents' attitudes in each negotiation step are determined by the recent system performance measured by the specific objective they maximize. We then propose a numerical model of preference dynamics that implements a concept from cognitive psychology, the availability bias. We test our modeling framework on a synthetic lake operated for flood control and water supply. Results show that our model successfully captures the operator's preference selection and dynamic evolution driven by extreme wet and dry situations.
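The core assumption above (recent underperformance pushes preference toward the harmed objective) can be sketched as a simple weight-update rule. This is a minimal illustration with made-up numbers and an invented update rate, not the paper's negotiation or availability-bias model.

```python
# Preference over two objectives (e.g. flood control, water supply),
# updated after each period from recent performance deficits.

def update_preference(weights, deficits, rate=0.5):
    """Shift preference weights toward objectives with larger recent
    deficits (0 = target met, 1 = fully missed); weights stay normalized.
    The `rate` parameter is a hypothetical responsiveness constant."""
    raw = [w + rate * d for w, d in zip(weights, deficits)]
    total = sum(raw)
    return [r / total for r in raw]

w = [0.5, 0.5]                        # balanced initial preference
w = update_preference(w, [0.8, 0.0])  # an extreme flood year harms objective 0
assert w[0] > w[1]                    # preference tilts toward flood control
```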

  3. Assessing Animal Welfare Impacts in the Management of European Rabbits (Oryctolagus cuniculus), European Moles (Talpa europaea) and Carrion Crows (Corvus corone)

    PubMed Central

    Baker, Sandra E.; Sharp, Trudy M.; Macdonald, David W.

    2016-01-01

    Human-wildlife conflict is a global issue. Attempts to manage this conflict impact upon wild animal welfare, an issue receiving little attention until relatively recently. Where human activities harm animal welfare these effects should be minimised where possible. However, little is known about the welfare impacts of different wildlife management interventions, and opinions on impacts vary widely. Welfare impacts therefore need to be assessed objectively. Our objectives were to: 1) establish whether an existing welfare assessment model could differentiate and rank the impacts of different wildlife management interventions (for decision-making purposes); 2) identify and evaluate any additional benefits of making formal welfare assessments; and 3) illustrate issues raised by application of the model. We applied the welfare assessment model to interventions commonly used with rabbits (Oryctolagus cuniculus), moles (Talpa europaea) and crows (Corvus corone) in the UK. The model ranked interventions for rabbits (least impact first: fencing, head shot, chest shot) and crows (shooting, scaring, live trapping with cervical dislocation). For moles, managing molehills and tunnels scored least impact. Both spring trapping, and live trapping followed by translocation, scored greater impacts, but these could not be compared directly as they scored on different axes of the model. Some rankings appeared counter-intuitive, highlighting the need for objective formal welfare assessments. As well as ranking the humaneness of interventions, the model highlighted future research needs and how Standard Operating Procedures might be improved. The model is a milestone in assessing wildlife management welfare impacts, but our research revealed some limitations of the model and we discuss likely challenges in resolving these. In future, the model might be developed to improve its utility, e.g. by refining the time-scales. 
It might also be used to reach consensus among stakeholders about relative welfare impacts or to identify ways of improving wildlife management practice in the field. PMID:26726808

  5. Multi-objective vs. single-objective calibration of a hydrologic model using single- and multi-objective screening

    NASA Astrophysics Data System (ADS)

    Mai, Juliane; Cuntz, Matthias; Shafii, Mahyar; Zink, Matthias; Schäfer, David; Thober, Stephan; Samaniego, Luis; Tolson, Bryan

    2016-04-01

Hydrologic models are traditionally calibrated against observed streamflow. Recent studies have shown, however, that only a few global model parameters are constrained by this kind of integral signal; these parameters can be identified using prior screening techniques. Since different objectives might constrain different parameters, it is advisable to use multiple sources of information to calibrate such models. One common approach is to combine these multiple objectives (MO) into one single objective (SO) function and allow the use of an SO optimization algorithm. Another strategy is to consider the different objectives separately and apply a MO Pareto optimization algorithm. In this study, two major research questions will be addressed: 1) How do multi-objective calibrations compare with corresponding single-objective calibrations? 2) How much do calibration results deteriorate when the number of calibrated parameters is reduced by a prior screening technique? The hydrologic model employed in this study is a distributed hydrologic model (mHM) with 52 model parameters, i.e., transfer coefficients. The model uses grid cells as a primary hydrologic unit, and accounts for processes like snow accumulation and melting, soil moisture dynamics, infiltration, surface runoff, evapotranspiration, subsurface storage and discharge generation. The model is applied in three distinct catchments over Europe. The SO calibrations are performed using the Dynamically Dimensioned Search (DDS) algorithm with a fixed budget, while the MO calibrations are achieved using the Pareto Dynamically Dimensioned Search (PA-DDS) algorithm allowing for the same budget. The two objectives used here are the Nash-Sutcliffe efficiency (NSE) of the simulated streamflow and the NSE of its logarithmic transformation. It is shown that the SO DDS results are located close to the edges of the Pareto fronts of the PA-DDS.
The MO calibrations are hence preferable because they supply multiple equivalent solutions from which the user can choose according to specific needs. Sequential single-objective parameter screening was employed prior to the calibrations, reducing the number of parameters by at least 50% in the different catchments and for the different single objectives. The single-objective calibrations led to faster convergence of the objectives and are hence beneficial when using DDS on single objectives. The above-mentioned parameter screening technique is generalized to multiple objectives and applied before calibration using the PA-DDS algorithm. Two different alternatives of this MO-screening are tested. The comparison of the calibration results using all parameters and using only screened parameters shows for both alternatives that the PA-DDS algorithm does not profit in terms of trade-off size and function evaluations required to achieve converged Pareto fronts. This is because the PA-DDS algorithm automatically reduces the search space as the calibration run progresses. This automatic reduction may differ for other search algorithms. It is therefore hypothesized that prior screening may or may not be beneficial for parameter estimation, depending on the chosen optimization algorithm.
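The two calibration objectives named above, NSE of streamflow and NSE of log-transformed streamflow, have standard definitions that can be written out directly. The equal-weight aggregation shown is one common SO choice, not necessarily the weighting used in the study; the flow series is made up.

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    1 is a perfect fit; 0 means no better than the observed mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / var

def log_nse(obs, sim):
    """NSE of log-transformed flows, which emphasizes low-flow periods."""
    return nse([math.log(o) for o in obs], [math.log(s) for s in sim])

def single_objective(obs, sim):
    """One common SO aggregation: an unweighted mean of the two objectives."""
    return 0.5 * (nse(obs, sim) + log_nse(obs, sim))

obs = [1.0, 2.0, 4.0, 8.0, 4.0, 2.0]   # synthetic streamflow
sim = [1.1, 1.9, 4.2, 7.5, 4.1, 2.2]
assert nse(obs, sim) > 0.9
```

A MO algorithm such as PA-DDS would keep `nse` and `log_nse` as separate coordinates of a Pareto front instead of collapsing them as `single_objective` does.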

  6. Risks posed by climate change to the delivery of Water Framework Directive objectives in the UK.

    PubMed

    Wilby, R L; Orr, H G; Hedger, M; Forrow, D; Blackmore, M

    2006-12-01

    The EU Water Framework Directive (WFD) is novel because it integrates water quality, water resources, physical habitat and, to some extent, flooding for all surface and groundwaters and takes forward river basin management. However, the WFD does not explicitly mention risks posed by climate change to the achievement of its environmental objectives. This is despite the fact that the time scale for the implementation process and achieving particular objectives extends into the 2020s, when climate models project changes in average temperature and precipitation. This paper begins by reviewing the latest UK climate change scenarios and the wider policy and science context of the WFD. We then examine the potential risks of climate change to key phases of the River Basin Management Process that underpin the WFD (such as characterisation of river basins and their water bodies, risk assessments to identify pressures and impacts, programmes of measures (POMs) options appraisal, monitoring and modelling, policy and management activities). Despite these risks the WFD could link new policy and participative mechanisms (being established for the River Basin Management Plans) to the emerging framework of national and regional climate change adaptation policy. The risks are identified with a view to informing policy opportunities, objective setting, adaptation strategies and the research agenda. Key knowledge gaps have already been identified during the implementation of the WFD, such as the links between hydromorphology and ecosystem status, but the overarching importance of linking climate change to these considerations needs to be highlighted. The next generation of (probabilistic) climate change scenarios will present new opportunities and challenges for risk analysis and policy-making.

  7. Search-based model identification of smart-structure damage

    NASA Technical Reports Server (NTRS)

    Glass, B. J.; Macalou, A.

    1991-01-01

This paper describes the use of a combined model and parameter identification approach, based on modal analysis and artificial intelligence (AI) techniques, for identifying damage or flaws in a rotating truss structure incorporating embedded piezoceramic sensors. This smart structure example is representative of a class of structures commonly found in aerospace systems and next generation space structures. Artificial intelligence techniques of classification, heuristic search, and an object-oriented knowledge base are used in an AI-based model identification approach. A finite model space is classified into a search tree, over which a variant of best-first search is used to identify the model whose stored response most closely matches that of the input. Newly encountered models can be incorporated into the model space. This adaptiveness demonstrates the potential for learning control. Following this output-error model identification, numerical parameter identification is used to further refine the identified model. Given the rotating truss example in this paper, noisy data corresponding to various damage configurations are input to both this approach and a conventional parameter identification method. The combination of the AI-based model identification with parameter identification is shown to lead to smaller parameter corrections than required by the use of parameter identification alone.
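The search step described above, best-first search over a tree of candidate models ranked by output error, can be sketched as follows. The tree, model names, and response vectors are invented for illustration; they are not the paper's truss models or modal data.

```python
import heapq

def mismatch(stored_response, measured):
    """Output error between a model's stored response and the measurement."""
    return sum((a - b) ** 2 for a, b in zip(stored_response, measured))

def best_first_identify(tree, responses, root, measured):
    """Return the model whose stored response best matches `measured`,
    expanding the most promising node first (a best-first variant)."""
    frontier = [(mismatch(responses[root], measured), root)]
    best_score, best_model = float("inf"), None
    seen = set()
    while frontier:
        score, node = heapq.heappop(frontier)
        if node in seen:
            continue
        seen.add(node)
        if score < best_score:
            best_score, best_model = score, node
        for child in tree.get(node, []):
            heapq.heappush(frontier, (mismatch(responses[child], measured), child))
    return best_model

# Hypothetical model space: intact truss with two candidate damage models.
tree = {"intact": ["strut3_flaw", "strut7_flaw"]}
responses = {
    "intact":      [1.0, 2.0],
    "strut3_flaw": [1.2, 1.7],
    "strut7_flaw": [0.4, 2.9],
}
assert best_first_identify(tree, responses, "intact", [1.25, 1.68]) == "strut3_flaw"
```

In the paper's scheme, the identified model would then be handed to a numerical parameter-identification stage for refinement.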

  8. The Role of Life-Space, Social Activity, and Depression on the Subjective Memory Complaints of Community-Dwelling Filipino Elderly: A Structural Equation Model

    ERIC Educational Resources Information Center

    de Guzman, Allan B.; Lagdaan, Lovely France M.; Lagoy, Marie Lauren V.

    2015-01-01

    Subjective memory complaints are one of the major concerns of the elderly and remain a challenging area in gerontology. There are previous studies that identify different factors affecting subjective memory complaints. However, an extended model that correlates life-space on subjective memory complaints remains a blank spot. The objective of this…

  9. Induced Stress, Artificial Environment, Simulated Tactical Operations Center Model

    DTIC Science & Technology

    1973-06-01

oriented activities or, at best, the application of doctrinal concepts to command post exercises. Unlike mechanical skills, weapon's...training model identified as APSTRAT, an acronym indicating aptitude and strategies, be considered as a point of reference. Several instructional...post providing visual and aural sensing tasks and training-objective-oriented performance tasks. Finally, he concludes that failure should be

  10. Temporal and Location Based RFID Event Data Management and Processing

    NASA Astrophysics Data System (ADS)

    Wang, Fusheng; Liu, Peiya

Advances in sensor and RFID technology provide significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, thus it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management. RFID data are temporal, history-oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location oriented data model for modeling and querying RFID data, and a declarative event and rule based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, significantly reducing the cost of RFID data integration.
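The temporal, location-oriented character of RFID data described above can be sketched as interval-stamped stay records supporting "where was object X at time t" queries. The field names and schema here are illustrative guesses, not the authors' actual data model.

```python
from dataclasses import dataclass

@dataclass
class Stay:
    epc: str        # unique tag ID read by the RFID reader
    location: str   # reader location
    t_start: float  # start of the interval the reading is valid for
    t_end: float    # end of that interval (history-oriented record)

def location_at(history, epc, t):
    """Temporal query: the location of object `epc` at time `t`, if known."""
    for s in history:
        if s.epc == epc and s.t_start <= t < s.t_end:
            return s.location
    return None

history = [
    Stay("tag42", "dock", 0.0, 5.0),
    Stay("tag42", "warehouse", 5.0, 12.0),
]
assert location_at(history, "tag42", 7.0) == "warehouse"
assert location_at(history, "tag42", 20.0) is None   # no record covers t=20
```

A rule-based layer, as in the paper's framework, would fire on events such as an object entering or leaving a location interval.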

  11. 3-D World Modeling For An Autonomous Robot

    NASA Astrophysics Data System (ADS)

    Goldstein, M.; Pin, F. G.; Weisbin, C. R.

    1987-01-01

    This paper presents a methodology for a concise representation of the 3-D world model for a mobile robot, using range data. The process starts with the segmentation of the scene into "objects" that are given a unique label, based on principles of range continuity. Then the external surface of each object is partitioned into homogeneous surface patches. Contours of surface patches in 3-D space are identified by estimating the normal and curvature associated with each pixel. The resulting surface patches are then classified as planar, convex or concave. Since the world model uses a volumetric representation for the 3-D environment, planar surfaces are represented by thin volumetric polyhedra. Spherical and cylindrical surfaces are extracted and represented by appropriate volumetric primitives. All other surfaces are represented using the boolean union of spherical volumes (as described in a separate paper by the same authors). The result is a general, concise representation of the external 3-D world, which allows for efficient and robust 3-D object recognition.
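The patch-classification step described above, labeling surface patches as planar, convex or concave from per-pixel curvature estimates, can be sketched with a simple sign test. The threshold and sample values are illustrative; the paper's actual estimator works on range-image normals and curvatures.

```python
def classify_patch(curvatures, eps=1e-2):
    """Classify a surface patch from its curvature samples:
    near-zero mean curvature -> planar, positive -> convex,
    negative -> concave. `eps` is a hypothetical noise tolerance."""
    mean_k = sum(curvatures) / len(curvatures)
    if abs(mean_k) < eps:
        return "planar"
    return "convex" if mean_k > 0 else "concave"

assert classify_patch([0.001, -0.002, 0.0]) == "planar"
assert classify_patch([0.3, 0.28, 0.31]) == "convex"
assert classify_patch([-0.2, -0.25, -0.22]) == "concave"
```

In the full pipeline, planar patches would then be represented as thin volumetric polyhedra and curved ones matched to cylindrical, spherical, or union-of-spheres primitives.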

  12. Integrated control strategy for autonomous decentralized conveyance systems based on distributed MEMS arrays

    NASA Astrophysics Data System (ADS)

    Zhou, Lingfei; Chapuis, Yves-Andre; Blonde, Jean-Philippe; Bervillier, Herve; Fukuta, Yamato; Fujita, Hiroyuki

    2004-07-01

In this paper, the authors propose a model and a control strategy for a two-dimensional conveyance system based on the principles of Autonomous Decentralized Microsystems (ADM). The microconveyance system is based on distributed cooperative MEMS actuators which can produce a force field on the surface of the device to grip and move a micro-object. The modeling approach proposed here is based on a simple model of a microconveyance system represented by a 5 x 5 matrix of cells. Each cell consists of a microactuator, a microsensor, and a microprocessor to provide actuation, autonomy and decentralized intelligence to the cell. Thus, each cell is able to identify a micro-object crossing it and to decide on its own the appropriate control strategy to convey the micro-object to its destination target. The control strategy could be established through five simple decision rules that the cell itself has to respect at each calculation cycle. Simulation and FPGA implementation results are given at the end of the paper in order to validate the model and control approach of the microconveyance system.
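The flavor of such per-cell decision rules can be sketched as a purely local routing step on the 5 x 5 array. The one rule below (step along the axis of largest remaining distance) is a guess at the style of decentralized rule the paper describes, not the authors' actual five rules.

```python
def cell_decide(cell, target):
    """Local rule run by the cell currently sensing the micro-object:
    actuate one step toward the target along the axis with the largest
    remaining distance. Ties go to the x axis (arbitrary choice)."""
    dx, dy = target[0] - cell[0], target[1] - cell[1]
    if dx == 0 and dy == 0:
        return (0, 0)                        # at target: stop actuating
    if abs(dx) >= abs(dy):
        return (1 if dx > 0 else -1, 0)
    return (0, 1 if dy > 0 else -1)

def convey(start, target):
    """Simulate hand-off between cells until the object reaches the target."""
    pos, steps = start, 0
    while pos != target and steps < 25:      # bound for a 5 x 5 array
        move = cell_decide(pos, target)
        pos = (pos[0] + move[0], pos[1] + move[1])
        steps += 1
    return pos, steps

assert convey((0, 0), (4, 2)) == ((4, 2), 6)
```

Because each decision uses only the cell's own position and the broadcast target, no central controller is needed, which is the point of the ADM architecture.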

  13. Structured decision making for managing pneumonia epizootics in bighorn sheep

    USGS Publications Warehouse

    Sells, Sarah N.; Mitchell, Michael S.; Edwards, Victoria L.; Gude, Justin A.; Anderson, Neil J.

    2016-01-01

    Good decision-making is essential to conserving wildlife populations. Although there may be multiple ways to address a problem, perfect solutions rarely exist. Managers are therefore tasked with identifying decisions that will best achieve desired outcomes. Structured decision making (SDM) is a method of decision analysis used to identify the most effective, efficient, and realistic decisions while accounting for values and priorities of the decision maker. The stepwise process includes identifying the management problem, defining objectives for solving the problem, developing alternative approaches to achieve the objectives, and formally evaluating which alternative is most likely to accomplish the objectives. The SDM process can be more effective than informal decision-making because it provides a transparent way to quantitatively evaluate decisions for addressing multiple management objectives while incorporating science, uncertainty, and risk tolerance. To illustrate the application of this process to a management need, we present an SDM-based decision tool developed to identify optimal decisions for proactively managing risk of pneumonia epizootics in bighorn sheep (Ovis canadensis) in Montana. Pneumonia epizootics are a major challenge for managers due to long-term impacts to herds, epistemic uncertainty in timing and location of future epizootics, and consequent difficulty knowing how or when to manage risk. The decision tool facilitates analysis of alternative decisions for how to manage herds based on predictions from a risk model, herd-specific objectives, and predicted costs and benefits of each alternative. Decision analyses for 2 example herds revealed that meeting management objectives necessitates specific approaches unique to each herd. The analyses showed how and under what circumstances the alternatives are optimal compared to other approaches and current management. 
Managers can be confident that these decisions are effective, efficient, and realistic because they explicitly account for important considerations managers implicitly weigh when making decisions, including competing management objectives, uncertainty in potential outcomes, and risk tolerance.

  14. Identifying psychophysiological indices of expert vs. novice performance in deadly force judgment and decision making

    PubMed Central

    Johnson, Robin R.; Stone, Bradly T.; Miranda, Carrie M.; Vila, Bryan; James, Lois; James, Stephen M.; Rubio, Roberto F.; Berka, Chris

    2014-01-01

Objective: To demonstrate that psychophysiology may have applications for objective assessment of expertise development in deadly force judgment and decision making (DFJDM). Background: Modern training techniques focus on improving decision-making skills with participative assessment between trainees and subject matter experts primarily through subjective observation. Objective metrics need to be developed. The current proof of concept study explored the potential for psychophysiological metrics in deadly force judgment contexts. Method: Twenty-four participants (novice, expert) were recruited. All wore a wireless electroencephalography (EEG) device to collect psychophysiological data during high-fidelity simulated deadly force judgment and decision-making simulations using a modified Glock firearm. Participants were exposed to 27 video scenarios, one-third of which would have justified use of deadly force. Pass/fail was determined by whether the participant used deadly force appropriately. Results: Experts had a significantly higher pass rate compared to novices (p < 0.05). Multiple metrics were shown to distinguish novices from experts. Hierarchical regression analyses indicate that psychophysiological variables are able to explain 72% of the variability in expert performance, but only 37% in novices. Discriminant function analysis (DFA) using psychophysiological metrics was able to discern between experts and novices with 72.6% accuracy. Conclusion: While limited due to small sample size, the results suggest that psychophysiology may be developed for use as an objective measure of expertise in DFJDM. Specifically, discriminant function measures may have the potential to objectively identify expert skill acquisition. Application: Psychophysiological metrics may create a performance model with the potential to optimize simulator-based DFJDM training.
These performance models could be used for trainee feedback, and/or by the instructor to assess performance objectively. PMID:25100966
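The classification step reported above can be illustrated with a Fisher-style discriminant on two features. The data below are synthetic stand-ins (the study's EEG-derived metrics are not reproduced here), and the diagonal-covariance simplification is a sketch, not the study's full DFA.

```python
def fisher_weights(class_a, class_b):
    """Per-feature Fisher weights (mean difference over pooled variance),
    a diagonal-covariance simplification of discriminant analysis."""
    n_feat = len(class_a[0])
    w = []
    for j in range(n_feat):
        a = [x[j] for x in class_a]
        b = [x[j] for x in class_b]
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        pooled = (sum((v - ma) ** 2 for v in a)
                  + sum((v - mb) ** 2 for v in b)) / (len(a) + len(b) - 2)
        w.append((ma - mb) / (pooled or 1.0))
    return w

def score(w, x):
    """Project a feature vector onto the discriminant axis."""
    return sum(wi * xi for wi, xi in zip(w, x))

# Synthetic two-feature data (e.g. composure, workload) for illustration.
experts = [[0.9, 0.2], [0.8, 0.3], [0.85, 0.25]]
novices = [[0.4, 0.7], [0.5, 0.6], [0.45, 0.65]]
w = fisher_weights(experts, novices)
mean_e = [sum(x[j] for x in experts) / len(experts) for j in range(2)]
mean_n = [sum(x[j] for x in novices) / len(novices) for j in range(2)]
threshold = (score(w, mean_e) + score(w, mean_n)) / 2  # midpoint decision rule

assert score(w, [0.88, 0.22]) > threshold   # classified as expert
assert score(w, [0.42, 0.68]) < threshold   # classified as novice
```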

  15. Research on artistic gymnastics training guidance model

    NASA Astrophysics Data System (ADS)

    Luo, Lin; Sun, Xianzhong

    2017-04-01

    Rhythmic gymnastics training guidance model, taking into consideration the features of artistic gymnastics training, is put forward to help gymnasts identify their deficiencies and unskilled technical movements and improve their training effects. The model is built on the foundation of both physical quality indicator model and artistic gymnastics training indicator model. Physical quality indicator model composed of bodily factor, flexibility-strength factor and speed-dexterity factor delivers an objective evaluation with reference to basic sport testing data. Training indicator model, based on physical fitness indicator, helps analyze the technical movements, through which the impact from each bodily factor on technical movements is revealed. AG training guidance model, in further combination with actual training data and in comparison with the data shown in the training indicator model, helps identify the problems in trainings, and thus improve the training effect. These three models when in combined use and in comparison with historical model data can check and verify the improvement in training effect over a certain period of time.

  16. Computational Gene Expression Modeling Identifies Salivary Biomarker Analysis that Predict Oral Feeding Readiness in the Newborn

    PubMed Central

    Maron, Jill L.; Hwang, Jooyeon S.; Pathak, Subash; Ruthazer, Robin; Russell, Ruby L.; Alterovitz, Gil

    2014-01-01

Objective: To combine mathematical modeling of salivary gene expression microarray data and systems biology annotation with RT-qPCR amplification to identify (phase I) and validate (phase II) salivary biomarker analysis for the prediction of oral feeding readiness in preterm infants. Study design: Comparative whole transcriptome microarray analysis from 12 preterm newborns pre- and post-oral feeding success was used for computational modeling and systems biology analysis to identify potential salivary transcripts associated with oral feeding success (phase I). Selected gene expression biomarkers (15 from computational modeling; 6 evidence-based; and 3 reference) were evaluated by RT-qPCR amplification on 400 salivary samples from successful (n=200) and unsuccessful (n=200) oral feeders (phase II). Genes, alone and in combination, were evaluated by a multivariate analysis controlling for sex and post-conceptional age (PCA) to determine the probability that newborns achieved successful oral feeding. Results: Advancing post-conceptional age (p < 0.001) and female sex (p = 0.05) positively predicted an infant’s ability to feed orally. A combination of five genes, NPY2R (hunger signaling), AMPK (energy homeostasis), PLXNA1 (olfactory neurogenesis), NPHP4 (visual behavior) and WNT3 (facial development), in addition to PCA and sex, demonstrated good accuracy for determining feeding success (AUROC = 0.78). Conclusions: We have identified objective and biologically relevant salivary biomarkers that noninvasively assess a newborn’s developing brain, sensory and facial development as they relate to oral feeding success. Understanding the mechanisms that underlie the development of oral feeding readiness through translational and computational methods may improve clinical decision making while decreasing morbidities and health care costs. PMID:25620512
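The accuracy metric reported above (AUROC = 0.78) has a standard rank-based definition: the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. The sketch below computes it directly on made-up predictor scores, not the study's data.

```python
def auroc(scores_pos, scores_neg):
    """Area under the ROC curve via the pairwise-comparison formulation:
    fraction of (positive, negative) pairs ranked correctly, with ties
    counted as half-correct."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical model scores for successful vs. unsuccessful oral feeders.
success = [0.9, 0.8, 0.7, 0.6]
failure = [0.65, 0.5, 0.4, 0.3]
assert abs(auroc(success, failure) - 0.9375) < 1e-9
```

An AUROC of 0.5 means the predictor is no better than chance; 1.0 means perfect separation of the two groups.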

  17. Spatial modeling of cutaneous leishmaniasis in the Andean region of Colombia

    PubMed Central

    Pérez-Flórez, Mauricio; Ocampo, Clara Beatriz; Valderrama-Ardila, Carlos; Alexander, Neal

    2016-01-01

    The objective of this research was to identify environmental risk factors for cutaneous leishmaniasis (CL) in Colombia and map high-risk municipalities. The study area was the Colombian Andean region, comprising 715 rural and urban municipalities. We used 10 years of CL surveillance: 2000-2009. We used spatial-temporal analysis - conditional autoregressive Poisson random effects modelling - in a Bayesian framework to model the dependence of municipality-level incidence on land use, climate, elevation and population density. Bivariable spatial analysis identified rainforests, forests and secondary vegetation, temperature, and annual precipitation as positively associated with CL incidence. By contrast, livestock agroecosystems and temperature seasonality were negatively associated. Multivariable analysis identified land use - rainforests and agro-livestock - and climate - temperature, rainfall and temperature seasonality - as best predictors of CL. We conclude that climate and land use can be used to identify areas at high risk of CL and that this approach is potentially applicable elsewhere in Latin America. PMID:27355214

  18. Analyzing the impact of intermodal-related risk to the design and management of biofuel supply chain.

    DOT National Transportation Integrated Search

    2014-12-01

The objective of this project is to design decision-support tools for identifying biorefinery locations that ensure a cost-efficient and reliable supply chain. We built mathematical models which take into consideration the benefits (such as acces...

  19. A Vision for the Future: Site-Based Strategic Planning.

    ERIC Educational Resources Information Center

    Herman, Jerry J.

    1989-01-01

    Presents a model to help principals with strategic planning. Success hinges on involving stakeholders, scanning for relevant data, identifying critical success factors, developing vision and mission statements, analyzing the site manager's supports and constraints, creating strategic goals and objectives, developing action plans, allocating…

  20. Gaining insights into interrill soil erosion processes using rare earth element tracers

    USDA-ARS?s Scientific Manuscript database

    Increasing interest in developing process-based erosion models requires better understanding of the relationships among soil detachment, transportation, and deposition. The objectives are to 1) identify the limiting process between soil detachment and sediment transport for interrill erosion, 2) und...

  1. Characterizing convective cold pools

    DOE PAGES

    Drager, Aryeh J.; van den Heever, Susan C.

    2017-05-09

    Cold pools produced by convective storms play an important role in Earth's climate system. However, a common framework does not exist for objectively identifying convective cold pools in observations and models. The present study investigates convective cold pools within a simulation of tropical continental convection that uses a cloud-resolving model with a coupled land-surface model. Multiple variables are assessed for their potential in identifying convective cold pool boundaries, and a novel technique is developed and tested for identifying and tracking cold pools in numerical model simulations. This algorithm is based on surface rainfall rates and radial gradients in the density potential temperature field. The algorithm successfully identifies near-surface cold pool boundaries and is able to distinguish between connected cold pools. Once cold pools have been identified and tracked, composites of cold pool evolution are then constructed, and average cold pool properties are investigated. Wet patches are found to develop within the centers of cold pools where the ground has been soaked with rainwater. These wet patches help to maintain cool surface temperatures and reduce cold pool dissipation, which has implications for the development of subsequent convection.

  3. Methodology to Improve Design of Accelerated Life Tests in Civil Engineering Projects

    PubMed Central

    Lin, Jing; Yuan, Yongbo; Zhou, Jilai; Gao, Jie

    2014-01-01

    For reliability testing, an Energy Expansion Tree (EET) and a companion Energy Function Model (EFM) are proposed and described in this paper. Different from conventional approaches, the EET provides a more comprehensive and objective way to systematically identify external energy factors affecting reliability. The EFM introduces energy loss into a traditional Function Model to identify internal energy sources affecting reliability. The combination creates a sound way to enumerate the energies to which a system may be exposed during its lifetime. We input these energies into planning an accelerated life test, a Multi Environment Over Stress Test. The test objective is to discover weak links and interactions among the system and the energies to which it is exposed, and design them out. As an example, the methods are applied to pipe in a subsea pipeline; however, they can be widely used in other civil engineering industries as well. The proposed method is compared with current methods. PMID:25111800

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.

    Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the "must test" functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.
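
    The paper proposes a library of unit tests for a simulator's essential functions without listing them here. A hedged toy example of what one such check might look like is a facility-level mass balance; the function names, units and tolerance below are invented for illustration.

```python
def mass_balance_residual(mass_in, mass_out_streams, inventory_change):
    """Facility-level conservation check: material entering a fuel cycle
    facility over a time step must equal material leaving plus the change in
    its inventory. Returns the imbalance in the caller's units (e.g. tonnes
    of heavy metal)."""
    return mass_in - (sum(mass_out_streams) + inventory_change)

def passes_mass_balance(mass_in, mass_out_streams, inventory_change, tol=1e-9):
    """One candidate unit test for a fuel cycle simulator: the residual of the
    mass balance should vanish to within numerical tolerance."""
    return abs(mass_balance_residual(mass_in, mass_out_streams, inventory_change)) <= tol
```

    A real test library would exercise each essential function (depletion, separations, fabrication, decay) against analytic or benchmark values in the same pass/fail style.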

  5. A Multi-Objective Decision Making Approach for Solving the Image Segmentation Fusion Problem.

    PubMed

    Khelifi, Lazhar; Mignotte, Max

    2017-08-01

    Image segmentation fusion is defined as the set of methods which aim at merging several image segmentations in a manner that takes full advantage of the complementarity of each one. Previous relevant research in this field has been impeded by the difficulty of identifying an appropriate single segmentation fusion criterion that provides the best possible, i.e., the most informative, result of fusion. In this paper, we propose a new model of image segmentation fusion based on multi-objective optimization which can mitigate this problem, to obtain a final improved result of segmentation. Our fusion framework incorporates the dominance concept in order to efficiently combine and optimize two complementary segmentation criteria, namely, the global consistency error and the F-measure (precision-recall) criterion. To this end, we present a hierarchical and efficient way to optimize the multi-objective consensus energy function related to this fusion model, which exploits a simple and deterministic iterative relaxation strategy combining the different image segments. This step is followed by a decision making task based on the so-called "technique for order preference by similarity to ideal solution" (TOPSIS). Results obtained on two publicly available databases with manual ground truth segmentations clearly show that our multi-objective energy-based model gives better results than the classical mono-objective one.
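
    The decision step this record relies on, the technique for order preference by similarity to ideal solution (TOPSIS), is standard enough to sketch generically. The version below is not the authors' implementation; the criterion values, weights and benefit flags are caller-supplied assumptions.

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS. matrix: rows = alternatives, columns =
    criteria; weights: one weight per criterion; benefit: one flag per
    criterion, True if larger is better. Returns one closeness score per
    alternative in [0, 1]; higher means closer to the ideal solution."""
    n = len(weights)
    # vector-normalise each criterion column, then apply the weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) or 1.0 for j in range(n)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n)] for row in matrix]
    cols = list(zip(*v))
    best = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    worst = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    scores = []
    for row in v:
        d_best = math.sqrt(sum((x - b) ** 2 for x, b in zip(row, best)))
        d_worst = math.sqrt(sum((x - w) ** 2 for x, w in zip(row, worst)))
        # closeness to ideal; 0.5 if the alternative is both best and worst
        scores.append(d_worst / (d_best + d_worst) if d_best + d_worst else 0.5)
    return scores
```

    In the paper's setting the two criteria would be the global consistency error (smaller is better) and the F-measure (larger is better), evaluated over candidate fused segmentations.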

  6. Orthodontic treatment need in Asian adult males.

    PubMed

    Soh, Jen; Sandham, Andrew

    2004-12-01

    Orthodontic treatment in adults has gained social and professional acceptance in recent years. An assessment of orthodontic treatment need helps to identify individuals who will benefit from treatment and safeguards their interests. The purpose of this study was to assess the objective and subjective levels of orthodontic treatment need in a sample of orthodontically untreated adult Asian males. A sample of male army recruits (n = 339, age 17-22 years, Chinese = 258, Malay = 60, Indian = 21) with no history of orthodontic treatment or craniofacial anomalies participated in the study on a voluntary basis with informed consent. Impressions for study models were taken. Objective treatment need was assessed from study model analysis using the Index of Orthodontic Treatment Need (IOTN). Questionnaires were used to assess subjective treatment need based on subjective esthetic component (EC) ratings. Fifty percent of the sample had a definite need for orthodontic treatment (dental health component [DHC] grades 4 and 5), whereas 29.2% had a moderate need for treatment (DHC grade 3). The occlusal trait most commonly identified was dental crossbite. Malay males had the highest percentage with a definite need for treatment for both dental health and esthetic reasons in comparison with Chinese and Indian males. However, there was no difference in the level of treatment need among the ethnic groups (P > .05). No correlation between objective and subjective EC scores was found (P > .05). A high level of investigator-identified treatment need was not supported by a similar level of subject awareness among the adult sample.
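
    The grade bands the study reports (DHC grades 4-5 definite need, grade 3 moderate, grades 1-2 little or none) can be captured in a small helper. A sketch only; the band labels are paraphrases, not official IOTN terminology.

```python
def iotn_dhc_need(grade):
    """Map an IOTN Dental Health Component grade (1-5) to the treatment-need
    band used in the study: 4-5 definite, 3 moderate, 1-2 little or none."""
    if grade not in (1, 2, 3, 4, 5):
        raise ValueError("IOTN DHC grades run from 1 to 5")
    if grade >= 4:
        return "definite"
    if grade == 3:
        return "moderate"
    return "little-or-none"
```
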

  7. An architecture for object-oriented intelligent control of power systems in space

    NASA Technical Reports Server (NTRS)

    Holmquist, Sven G.; Jayaram, Prakash; Jansen, Ben H.

    1993-01-01

    A control system for autonomous distribution and control of electrical power during space missions is being developed. This system should free the astronauts from localizing faults and reconfiguring loads if problems with the power distribution and generation components occur. The control system uses an object-oriented simulation model of the power system and first principle knowledge to detect, identify, and isolate faults. Each power system component is represented as a separate object with knowledge of its normal behavior. The reasoning process takes place at three different levels of abstraction: the Physical Component Model (PCM) level, the Electrical Equivalent Model (EEM) level, and the Functional System Model (FSM) level, with the PCM the lowest level of abstraction and the FSM the highest. At the EEM level the power system components are reasoned about as their electrical equivalents, e.g., a resistive load is thought of as a resistor. However, at the PCM level detailed knowledge about the component's specific characteristics is taken into account. The FSM level models the system at the subsystem level, a level appropriate for reconfiguration and scheduling. The control system operates in two modes, a reactive and a proactive mode, simultaneously. In the reactive mode the control system receives measurement data from the power system and compares these values with values determined through simulation to detect the existence of a fault. The nature of the fault is then identified through a model-based reasoning process using mainly the EEM. Compound component models are constructed at the EEM level and used in the fault identification process. In the proactive mode the reasoning takes place at the PCM level. Individual components determine their future health status using a physical model and measured historical data. If changes in the health status appear imminent, the component warns the control system of its impending failure. The fault isolation process uses the FSM level for its reasoning base.

  8. Development and application of a green fluorescent protein (GFP) expressing E. coli O103 surrogate for tracking contamination through grinding and identifying persistent points of contamination

    USDA-ARS?s Scientific Manuscript database

    Objective: To 1.) develop and validate an easily trackable E. coli O157:H7/non-O157 STEC surrogate that can be detected to the same level of sensitivity as E. coli O157:H7; and 2.) apply the trackable surrogate to model contamination passage through grinding and identify points where contamination ...

  9. A Critical Survey of Optimization Models for Tactical and Strategic Aspects of Air Traffic Flow Management

    NASA Technical Reports Server (NTRS)

    Bertsimas, Dimitris; Odoni, Amedeo

    1997-01-01

    This document presents a critical review of the principal existing optimization models that have been applied to Air Traffic Flow Management (TFM). Emphasis will be placed on two problems, the Generalized Tactical Flow Management Problem (GTFMP) and the Ground Holding Problem (GHP), as well as on some of their variations. To perform this task, we have carried out an extensive literature review that has covered more than 40 references, most of them very recent. Based on the review of this emerging field, our objectives were to: (i) identify the best available models; (ii) describe typical contexts for applications of the models; (iii) provide illustrative model formulations; and (iv) identify the methodologies that can be used to solve the models. We shall begin our presentation below by providing a brief context for the models that we are reviewing. In Section 3 we shall offer a taxonomy and identify four classes of models for review. In Sections 4, 5, and 6 we shall then review, respectively, models for the Single-Airport Ground Holding Problem, the Generalized Tactical FMP, and the Multi-Airport Ground Holding Problem (for the definition of these problems see Section 3 below). In each section, we identify the best available models and discuss briefly their computational performance and applications, if any, to date. Section 7 summarizes our conclusions about the state of the art.

  10. A modified multi-objective particle swarm optimization approach and its application to the design of a deepwater composite riser

    NASA Astrophysics Data System (ADS)

    Zheng, Y.; Chen, J.

    2017-09-01

    A modified multi-objective particle swarm optimization method is proposed for obtaining Pareto-optimal solutions effectively. Different from traditional multi-objective particle swarm optimization methods, Kriging meta-models and the trapezoid index are introduced and integrated with the traditional one. Kriging meta-models are built to match expensive or black-box functions. By applying Kriging meta-models, function evaluation numbers are decreased and the boundary Pareto-optimal solutions are identified rapidly. For bi-objective optimization problems, the trapezoid index is calculated as the sum of the trapezoid's area formed by the Pareto-optimal solutions and one objective axis. It can serve as a measure of whether the Pareto-optimal solutions converge to the Pareto front. Illustrative examples indicate that to obtain Pareto-optimal solutions, the method proposed needs fewer function evaluations than the traditional multi-objective particle swarm optimization method and the non-dominated sorting genetic algorithm II method, and both the accuracy and the computational efficiency are improved. The proposed method is also applied to the design of a deepwater composite riser example in which the structural performances are calculated by numerical analysis. The design aim was to enhance the tensile strength and minimize the cost. Under the buckling constraint, the optimal trade-off of tensile strength and material volume is obtained. The results demonstrated that the proposed method can effectively deal with multi-objective optimizations with black-box functions.
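
    The trapezoid index described above, the summed area of trapezoids between the bi-objective Pareto points and one objective axis, reduces to a trapezoid-rule integral under the front. A minimal sketch (the function name is an assumption):

```python
def trapezoid_index(pareto_points):
    """Area under a bi-objective Pareto front by the trapezoid rule: points are
    (f1, f2) pairs; sort by f1 and sum the trapezoid each adjacent pair forms
    with the f1 axis. Convergence of the front shows up as this value
    stabilising between iterations, which is how the abstract uses it."""
    pts = sorted(pareto_points)
    area = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area
```

    An optimizer can stop refining the front once successive trapezoid-index values differ by less than a chosen tolerance.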

  11. ON THE NATURE OF THE TERTIARY COMPANION TO FW TAU: ALMA CO OBSERVATIONS AND SED MODELING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caceres, Claudio; Hardy, Adam; Schreiber, Matthias R.

    2015-06-20

    It is thought that planetary mass companions may form through gravitational disk instabilities or core accretion. Identifying such objects in the process of formation would provide the most direct test for the competing formation theories. One of the most promising candidates for a planetary mass object still in formation is the third object in the FW Tau system. We present here ALMA cycle 1 observations confirming the recently published 1.3 mm detection of a dust disk around this third object and present for the first time a clear detection of a single-peaked ¹²CO (2–1) line, providing direct evidence for the simultaneous existence of a gas disk. We perform radiative transfer modeling of the third object in FW Tau and find that current observations are consistent with either a brown dwarf embedded in an edge-on disk or a planet embedded in a low inclination disk, which is externally irradiated by the binary companion. Further observations with ALMA, aiming for high SNR detections of non-contaminated gas lines, are required to conclusively unveil the nature of the third object in FW Tau.

  12. Advancing Land-Sea Conservation Planning: Integrating Modelling of Catchments, Land-Use Change, and River Plumes to Prioritise Catchment Management and Protection.

    PubMed

    Álvarez-Romero, Jorge G; Pressey, Robert L; Ban, Natalie C; Brodie, Jon

    2015-01-01

    Human-induced changes to river loads of nutrients and sediments pose a significant threat to marine ecosystems. Ongoing land-use change can further increase these loads, and amplify the impacts of land-based threats on vulnerable marine ecosystems. Consequently, there is a need to assess these threats and prioritise actions to mitigate their impacts. A key question regarding prioritisation is whether actions in catchments to maintain coastal-marine water quality can be spatially congruent with actions for other management objectives, such as conserving terrestrial biodiversity. In selected catchments draining into the Gulf of California, Mexico, we employed Land Change Modeller to assess the vulnerability of areas with native vegetation to conversion into crops, pasture, and urban areas. We then used SedNet, a catchment modelling tool, to map the sources and estimate pollutant loads delivered to the Gulf by these catchments. Following these analyses, we used modelled river plumes to identify marine areas likely influenced by land-based pollutants. Finally, we prioritised areas for catchment management based on objectives for conservation of terrestrial biodiversity and objectives for water quality that recognised links between pollutant sources and affected marine areas. Our objectives for coastal-marine water quality were to reduce sediment and nutrient discharges from anthropic areas, and minimise future increases in coastal sedimentation and eutrophication. Our objectives for protection of terrestrial biodiversity covered species of vertebrates. We used Marxan, a conservation planning tool, to prioritise interventions and explore spatial differences in priorities for both objectives. Notable differences in the distributions of land values for terrestrial biodiversity and coastal-marine water quality indicated the likely need for trade-offs between catchment management objectives. However, there were priority areas that contributed to both sets of objectives. 
Our study demonstrates a practical approach to integrating models of catchments, land-use change, and river plumes with conservation planning software to inform prioritisation of catchment management.

  14. Business model design for a wearable biofeedback system.

    PubMed

    Hidefjäll, Patrik; Titkova, Dina

    2015-01-01

    Wearable sensor technologies used to track daily activities have become successful in the consumer market. In order for wearable sensor technology to offer added value in the more challenging areas of stress-rehab care and occupational health, stress-related biofeedback parameters need to be monitored and more elaborate business models are needed. To identify probable success factors for a wearable biofeedback system (Affective Health) in the two mentioned market segments in a Swedish setting, we conducted literature studies and interviews with relevant representatives. Data were collected and used first to describe the two market segments and then to define likely feasible business model designs according to the Business Model Canvas framework. Needs of stakeholders were identified as inputs to business model design. Value propositions, a key building block of a business model, were defined for each segment. The value proposition for occupational health was defined as "A tool that can both identify employees at risk of stress-related disorders and reinforce healthy sustainable behavior" and for healthcare as "Providing therapists with objective data about the patient's emotional state and motivating patients to better engage in the treatment process".

  15. Long-term scale adaptive tracking with kernel correlation filters

    NASA Astrophysics Data System (ADS)

    Wang, Yueren; Zhang, Hong; Zhang, Lei; Yang, Yifan; Sun, Mingui

    2018-04-01

    Object tracking in video sequences has broad applications in both military and civilian domains. However, as the length of the input video sequence increases, a number of problems arise, such as severe object occlusion, object appearance variation, and object out-of-view (some portion or the entire object leaves the image space). To deal with these problems and identify the object being tracked against cluttered background, we present a robust appearance model using Speeded Up Robust Features (SURF) and advanced integrated features consisting of the Felzenszwalb Histogram of Oriented Gradients (FHOG) and color attributes. Since re-detection is essential in long-term tracking, we develop an effective object re-detection strategy based on moving-area detection. We employ the popular kernel correlation filters in our algorithm design, which facilitates high-speed object tracking. Our evaluation using the CVPR2013 Object Tracking Benchmark (OTB2013) dataset illustrates that the proposed algorithm outperforms reference state-of-the-art trackers in various challenging scenarios.

  16. A New Program Structuring Mechanism Based on Layered Graphs.

    DTIC Science & Technology

    1984-12-01

    which is a single-page diagram. Diagrams are constructed from some 40 symbols, chiefly A-boxes, arrows and annotations. A single model specifies a... are identified and used in describing it. The symbol "G" derives from the original use of the term "group" for "object slice". Since "G" is already an overloaded mathematical symbol, retaining "G" seems as good as any alternative. The names object slices and views reflect the interpretation placed

  17. Compassion Fatigue: An Application of the Concept to Informal Caregivers of Family Members with Dementia

    PubMed Central

    Day, Jennifer R.; Anderson, Ruth A.

    2011-01-01

    Introduction. Compassion fatigue is a concept used with increasing frequency in the nursing literature. The objective of this paper is to identify common themes across the literature and to apply these themes, and an existing model of compassion fatigue, to informal caregivers for family members with dementia. Findings. Caregivers for family members with dementia may be at risk for developing compassion fatigue. The model of compassion fatigue provides an informative framework for understanding compassion fatigue in the informal caregiver population. Limitations of the model when applied to this population were identified as traumatic memories and the emotional relationship between parent and child, suggesting areas for future research. Conclusions. Research is needed to better understand the impact of compassion fatigue on informal caregivers through qualitative interviews, to identify informal caregivers at risk for compassion fatigue, and to provide an empirical basis for developing nursing interventions for caregivers experiencing compassion fatigue. PMID:22229086

  18. National facilities study. Volume 4: Space operations facilities task group

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The principal objectives of the National Facilities Study (NFS) were to: (1) determine where U.S. facilities do not meet national aerospace needs; (2) define new facilities required to make U.S. capabilities 'world class' where such improvements are in the national interest; (3) define where consolidation and phase-out of existing facilities is appropriate; and (4) develop a long-term national plan for world-class facility acquisition and shared usage. The Space Operations Facilities Task Group defined discrete tasks to accomplish the above objectives within the scope of the study. An assessment of national space operations facilities was conducted to determine the nation's capability to meet the requirements of space operations during the next 30 years. The mission model used in the study to define facility requirements is described in Volume 3. Based on this model, the major focus of the Task Group was to identify any substantive overlap or underutilization of space operations facilities and to identify any facility shortfalls that would necessitate facility upgrades or new facilities. The focus of this initial study was directed toward facility recommendations related to consolidations, closures, enhancements, and upgrades considered necessary to efficiently and effectively support the baseline requirements model. Activities related to identifying facility needs or recommendations for enhancing U.S. international competitiveness and achieving world-class capability, where appropriate, were deferred to a subsequent study phase.

  19. Clinical prediction models for mortality and functional outcome following ischemic stroke: A systematic review and meta-analysis

    PubMed Central

    Crayton, Elise; Wolfe, Charles; Douiri, Abdel

    2018-01-01

    Objective We aim to identify and critically appraise clinical prediction models of mortality and function following ischaemic stroke. Methods Electronic databases, reference lists and citations were searched from inception to September 2015. Studies were selected for inclusion according to pre-specified criteria and critically appraised by independent, blinded reviewers. The discrimination of the prediction models was measured by the area under the receiver operating characteristic curve (c-statistic) in random effects meta-analysis. Heterogeneity was measured using I². Appropriate appraisal tools and reporting guidelines were used in this review. Results 31,395 references were screened, of which 109 articles were included in the review. These articles described 66 different predictive risk models. Appraisal identified poor methodological quality and a high risk of bias for most models. However, all models preceded the development of reporting guidelines for prediction modelling studies. Generalisability of models could be improved: less than half of the included models have been externally validated (n = 27/66). 152 predictors of mortality and 192 predictors of functional outcome were identified. No studies assessing the ability to improve patient outcome (model impact studies) were identified. Conclusions Further external validation and model impact studies are required to confirm the utility of existing models in supporting decision-making. Existing models have much potential. Those wishing to predict stroke outcome are advised to build on previous work, updating and adapting validated models to their specific contexts rather than designing new ones. PMID:29377923
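
    The pooling step the review describes, a random-effects meta-analysis of c-statistics with I² for heterogeneity, is commonly done with the DerSimonian-Laird estimator. The sketch below assumes that estimator (the abstract does not state which was used) and takes study-level estimates and variances as inputs.

```python
def dersimonian_laird(effects, variances):
    """Pool study-level estimates (e.g. c-statistics) with DerSimonian-Laird
    random effects. Returns (pooled_estimate, tau2, i2_percent)."""
    w = [1.0 / v for v in variances]            # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0   # between-study variance
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    w_re = [1.0 / (v + tau2) for v in variances]      # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2, i2
```

    Identical studies give tau² = I² = 0 and the pooled value equals each study's estimate; divergent studies inflate tau², which flattens the weights.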

  20. Nanoscale reference materials for environmental, health and safety measurements: needs, gaps and opportunities.

    PubMed

    Stefaniak, Aleksandr B; Hackley, Vincent A; Roebben, Gert; Ehara, Kensei; Hankin, Steve; Postek, Michael T; Lynch, Iseult; Fu, Wei-En; Linsinger, Thomas P J; Thünemann, Andreas F

    2013-12-01

    The authors critically reviewed published lists of nano-objects and their physico-chemical properties deemed important for risk assessment and discussed metrological challenges associated with the development of nanoscale reference materials (RMs). Five lists were identified that contained 25 (classes of) nano-objects; only four (gold, silicon dioxide, silver, titanium dioxide) appeared on all lists. Twenty-three properties were identified for characterisation; only (specific) surface area appeared on all lists. The key themes that emerged from this review were: 1) various groups have prioritised nano-objects for development as "candidate RMs" with limited consensus; 2) a lack of harmonised terminology hinders accurate description of many nano-object properties; 3) many properties identified for characterisation are ill-defined or qualitative and hence are not metrologically traceable; 4) standardised protocols are critically needed for characterisation of nano-objects as delivered in relevant media and as administered to toxicological models; 5) the measurement processes being used to characterise a nano-object must be understood because instruments may measure a given sample in different ways; 6) appropriate RMs should be used for both accurate instrument calibration and for more general testing purposes (e.g., protocol validation); 7) there is a need to clarify whether, where RMs are not available, "(representative) test materials" that lack reference or certified values may be useful for toxicology testing; and 8) there is a need for consensus building within the nanotechnology and environmental, health and safety communities to prioritise RM needs and better define the required properties and (physical or chemical) forms of the candidate materials.

  1. Towards a methodology to formulate sustainable diets for livestock: accounting for environmental impact in diet formulation.

    PubMed

    Mackenzie, S G; Leinonen, I; Ferguson, N; Kyriazakis, I

    2016-05-28

    The objective of this study was to develop a novel methodology that enables pig diets to be formulated explicitly for environmental impact objectives using a Life Cycle Assessment (LCA) approach. To achieve this, the following methodological issues had to be addressed: (1) account for environmental impacts caused by both ingredient choice and nutrient excretion, (2) formulate diets for multiple environmental impact objectives and (3) allow flexibility to identify the optimal nutritional composition for each environmental impact objective. An LCA model based on Canadian pig farms was integrated into a diet formulation tool to compare the use of different ingredients in Eastern and Western Canada. By allowing the feed energy content to vary, it was possible to identify the optimum energy density for different environmental impact objectives, while accounting for the expected effect of energy density on feed intake. A least-cost diet was compared with diets formulated to minimise the following objectives: non-renewable resource use, acidification potential, eutrophication potential, global warming potential and a combined environmental impact score (using these four categories). The resulting environmental impacts were compared using parallel Monte Carlo simulations to account for shared uncertainty. When optimising diets to minimise a single environmental impact category, reductions in that category were observed in all cases, but at the expense of increased impacts in other categories and higher dietary costs. The methodology can identify nutritional strategies to minimise environmental impacts, such as increasing the nutritional density of the diets, compared with the least-cost formulation.
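
    The formulation idea can be illustrated with a toy sketch: choose ingredient inclusion rates that satisfy nutritional constraints while minimising either cost or an environmental impact. The ingredient values, constraint thresholds, and the coarse grid search standing in for the LP solver below are all invented for illustration and are not taken from the study's LCA model.

```python
from itertools import product

# Hypothetical ingredient data per kg: cost ($), GWP (kg CO2-eq),
# energy (MJ), protein (g). All values invented for illustration.
ingredients = {
    "corn":    dict(cost=0.25, gwp=0.50, energy=14.0, protein=80),
    "soymeal": dict(cost=0.55, gwp=0.55, energy=13.0, protein=440),
    "canola":  dict(cost=0.32, gwp=1.00, energy=13.5, protein=360),
}

def formulate(objective, step=0.05):
    """Grid-search inclusion rates summing to 1, keep nutritionally
    feasible mixes, and return the one minimising `objective` per kg
    of feed (a coarse stand-in for a linear-programming solver)."""
    names = list(ingredients)
    n = round(1 / step)
    best = None
    for combo in product(range(n + 1), repeat=len(names) - 1):
        if sum(combo) > n:
            continue
        rates = [c * step for c in combo]
        rates.append(1.0 - sum(rates))
        mix = {key: sum(r * ingredients[name][key] for r, name in zip(rates, names))
               for key in ("cost", "gwp", "energy", "protein")}
        # Illustrative nutritional constraints: minimum energy and protein
        if mix["energy"] < 13.2 or mix["protein"] < 170:
            continue
        if best is None or mix[objective] < best[0][objective]:
            best = (mix, dict(zip(names, rates)))
    return best

least_cost, _ = formulate("cost")
least_gwp, _ = formulate("gwp")
```

    Comparing the two solutions reproduces the paper's qualitative trade-off: the diet minimising global warming potential costs more than the least-cost diet, and vice versa.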

  2. WIRED for EC: New White Dwarfs with WISE Infrared Excesses and New Classification Schemes from the Edinburgh–Cape Blue Object Survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennihy, E.; Clemens, J. C.; Dunlap, B. H.

    We present a simple method for identifying candidate white dwarf systems with dusty exoplanetary debris based on a single temperature blackbody model fit to the infrared excess. We apply this technique to a sample of Southern Hemisphere white dwarfs from the recently completed Edinburgh–Cape Blue Object Survey and identify four new promising dusty debris disk candidates. We demonstrate the efficacy of our selection method by recovering three of the four Spitzer confirmed dusty debris disk systems in our sample. Further investigation using archival high-resolution imaging shows that Spitzer data of the unrecovered fourth object is likely contaminated by a line-of-sight object that either led to a misclassification as a dusty disk in the literature or is confounding our method. Finally, in our diagnostic plot, we show that dusty white dwarfs, which also host gaseous debris, lie along a boundary of our dusty debris disk region, providing clues to the origin and evolution of these especially interesting systems.
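
    The core step of such a selection method, fitting a single-temperature blackbody to infrared excess fluxes, can be sketched as below. The brute-force temperature grid and the synthetic 1000 K excess are illustrative assumptions, not the authors' pipeline.

```python
import math

def planck_nu(nu_hz, t_k):
    """Planck spectral radiance B_nu(T) in SI units (W sr^-1 m^-2 Hz^-1)."""
    h, c, kb = 6.626e-34, 2.998e8, 1.381e-23
    return (2 * h * nu_hz**3 / c**2) / math.expm1(h * nu_hz / (kb * t_k))

def fit_blackbody_temperature(freqs, fluxes, t_grid):
    """Brute-force least-squares fit of T; the overall scale factor
    (the emitting solid angle) is solved analytically for each trial T."""
    best_t, best_err = None, float("inf")
    for t in t_grid:
        model = [planck_nu(nu, t) for nu in freqs]
        scale = sum(f * m for f, m in zip(fluxes, model)) / sum(m * m for m in model)
        err = sum((f - scale * m) ** 2 for f, m in zip(fluxes, model))
        if err < best_err:
            best_t, best_err = t, err
    return best_t

# Synthetic 1000 K excess sampled at the WISE band centres (3.4, 4.6, 12, 22 um)
freqs = [2.998e8 / (w * 1e-6) for w in (3.4, 4.6, 12.0, 22.0)]
fluxes = [1e-14 * planck_nu(nu, 1000.0) for nu in freqs]
t_best = fit_blackbody_temperature(freqs, fluxes, range(300, 2000, 10))
```

    With noise-free synthetic fluxes the grid search recovers the input temperature exactly; real excesses would carry photometric errors and a chi-square weighting.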

  3. Multi-point objective-oriented sequential sampling strategy for constrained robust design

    NASA Astrophysics Data System (ADS)

    Zhu, Ping; Zhang, Siliang; Chen, Wei

    2015-03-01

    Metamodelling techniques are widely used to approximate system responses of expensive simulation models. In association with the use of metamodels, objective-oriented sequential sampling methods have been demonstrated to be effective in balancing the need for searching an optimal solution versus reducing the metamodelling uncertainty. However, existing infilling criteria are developed for deterministic problems and restricted to one sampling point in one iteration. To exploit the use of multiple samples and identify the true robust solution in fewer iterations, a multi-point objective-oriented sequential sampling strategy is proposed for constrained robust design problems. In this article, earlier development of objective-oriented sequential sampling strategy for unconstrained robust design is first extended to constrained problems. Next, a double-loop multi-point sequential sampling strategy is developed. The proposed methods are validated using two mathematical examples followed by a highly nonlinear automotive crashworthiness design example. The results show that the proposed method can mitigate the effect of both metamodelling uncertainty and design uncertainty, and identify the robust design solution more efficiently than the single-point sequential sampling approach.
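
    A minimal sketch of multi-point infill is shown below. An inverse-distance surrogate and a distance-based exploration bonus stand in for the kriging metamodel and the paper's objective-oriented criterion, which are considerably more sophisticated; all data and parameters are hypothetical.

```python
def surrogate(x, samples):
    """Inverse-distance-weighted prediction from evaluated samples
    (a crude stand-in for the kriging metamodel used in the paper)."""
    num = den = 0.0
    for xs, ys in samples:
        d = abs(x - xs)
        if d < 1e-12:
            return ys
        w = 1.0 / d**2
        num += w * ys
        den += w
    return num / den

def multi_point_infill(samples, candidates, k, alpha=0.5):
    """Greedily pick k candidate points trading off a low predicted
    objective against distance from all evaluated/chosen points
    (the distance term rewards exploration of unsampled regions)."""
    chosen = []
    xs_known = [xs for xs, _ in samples]
    for _ in range(k):
        def score(x):
            d_min = min(abs(x - xo) for xo in xs_known + chosen)
            return surrogate(x, samples) - alpha * d_min
        best = min((c for c in candidates if c not in chosen), key=score)
        chosen.append(best)
    return chosen

def f(x):  # toy expensive objective
    return (x - 0.3) ** 2

samples = [(x, f(x)) for x in (0.0, 0.5, 1.0)]
cands = [i / 20 for i in range(21)]
new_points = multi_point_infill(samples, cands, k=3)
```

    Selecting several infill points per iteration, as here, is what allows expensive simulations to be evaluated in parallel between surrogate updates.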

  4. Sleuthing the Isolated Compact Stars

    NASA Astrophysics Data System (ADS)

    Drake, J. J.

    2004-08-01

    In the early 1990s, isolated thermally-emitting neutron stars accreting from the interstellar medium were predicted to show up in their thousands in the ROSAT soft X-ray all-sky survey. The glut of sources would provide unprecedented opportunities for probing the equation of state of ultra-dense matter. Only seven objects have been firmly identified to date. The reasons for this discrepancy are discussed and recent high resolution X-ray spectroscopic observations of these objects are described. Spectra of the brightest of the isolated neutron star candidates, RX J1856.5-3754, continue to present interpretational difficulties for current neutron star model atmospheres, and alternative models are briefly discussed. RX J1856.5-3754 remains a valid quark star candidate.

  5. Modeling Functional Neuroanatomy for an Anatomy Information System

    PubMed Central

    Niggemann, Jörg M.; Gebert, Andreas; Schulz, Stefan

    2008-01-01

    Objective Existing neuroanatomical ontologies, databases and information systems, such as the Foundational Model of Anatomy (FMA), represent outgoing connections from brain structures, but cannot represent the “internal wiring” of structures and as such, cannot distinguish between different independent connections from the same structure. Thus, a fundamental aspect of Neuroanatomy, the functional pathways and functional systems of the brain such as the pupillary light reflex system, is not adequately represented. This article identifies underlying anatomical objects which are the source of independent connections (collections of neurons) and uses these as basic building blocks to construct a model of functional neuroanatomy and its functional pathways. Design The basic representational elements of the model are unnamed groups of neurons or groups of neuron segments. These groups, their relations to each other, and the relations to the objects of macroscopic anatomy are defined. The resulting model can be incorporated into the FMA. Measurements The capabilities of the presented model are compared to the FMA and the Brain Architecture Management System (BAMS). Results Internal wiring as well as functional pathways can correctly be represented and tracked. Conclusion This model bridges the gap between representations of single neurons and their parts on the one hand and representations of spatial brain structures and areas on the other hand. It is capable of drawing correct inferences on pathways in a nervous system. The object and relation definitions are related to the Open Biomedical Ontology effort and its relation ontology, so that this model can be further developed into an ontology of neuronal functional systems. PMID:18579841

  6. Model for the evaluation of drug-dispensing services in primary health care

    PubMed Central

    Sartor, Vanessa de Bona; de Freitas, Sergio Fernando Torres

    2014-01-01

    OBJECTIVE To develop a model for evaluating the efficacy of drug-dispensing service in primary health care. METHODS An efficacy criterion was adopted to determine the level of achievement of the service objectives. The evaluation model was developed on the basis of a literature search and discussions with experts. The applicability test of the model was conducted in 15 primary health care units in the city of Florianópolis, state of Santa Catarina, in 2010, and data were recorded in structured and pretested questionnaires. RESULTS The model was structured around five dimensions of analysis. It was suitable for evaluating service efficacy and helped to identify the critical points of each service dimension. CONCLUSIONS Adaptations to the data collection technique may be required to adjust for the reality and needs of each situation. The evaluation of the drug-dispensing service should promote adequate access to medications supplied through the public health system. PMID:25372174

  7. Research Analysis on MOOC Course Dropout and Retention Rates

    ERIC Educational Resources Information Center

    Gomez-Zermeno, Marcela Gerogina; Aleman de La Garza, Lorena

    2016-01-01

    This research's objective was to identify the terminal efficiency of the Massive Online Open Course "Educational Innovation with Open Resources" offered by a Mexican private university. A quantitative methodology was used, combining descriptive statistics and probabilistic models to analyze the levels of retention, completion, and…

  8. Educational Guidance for Adults. Identifying Competences.

    ERIC Educational Resources Information Center

    Oakeshott, Martin

    A brief study conducted for the Further Education Unit, Great Britain, defined the competencies associated with educational guidance for adults. The objective was to develop a qualification for educational guidance workers with adults. The project provided an example of applying a competence model to a "higher level" interpersonal field…

  9. Self-organized network of fractal-shaped components coupled through statistical interaction.

    PubMed

    Ugajin, R

    2001-09-01

    A dissipative dynamics is introduced to generate self-organized networks of interacting objects, which we call coupled-fractal networks. The growth model is constructed based on a growth hypothesis in which the growth rate of each object is the product of the probability of receiving source materials from far away and the probability of receiving adhesives from other grown objects; each object grows into a random fractal if isolated, but connects with others if glued. The network is governed by the statistical interaction between fractal-shaped components, which can only be identified in a statistical manner over ensembles. This interaction is investigated using the degree of correlation between fractal-shaped components, enabling us to determine whether it is attractive or repulsive.

  10. Spatial but not object memory impairments in children with fetal alcohol syndrome.

    PubMed

    Uecker, A; Nadel, L

    1998-07-01

    Behavioral dissociations on tests of cognitive abilities are powerful tools that can help define the neuropsychology of developmentally disabling conditions. Animals gestationally exposed to alcohol demonstrate spatial (place) but not object (cue) memory impairments. Whether children with fetal alcohol syndrome demonstrate a similar dissociation has received little attention. In this experiment, 30 Native American children, 15 previously identified with fetal alcohol syndrome and 15 control children, were asked to recall places and objects in a task previously shown to be sensitive to memory skills in individuals with and without mental retardation. As in animal models, children with fetal alcohol syndrome demonstrated a spatial but not an object memory impairment. A possible role for the hippocampus was discussed.

  11. Detection of dominant flow and abnormal events in surveillance video

    NASA Astrophysics Data System (ADS)

    Kwak, Sooyeong; Byun, Hyeran

    2011-02-01

    We propose an algorithm for abnormal event detection in surveillance video. The algorithm is based on a semi-unsupervised learning method, a feature-based approach that does not need to detect and track each moving object individually. It identifies the dominant flow in crowded environments, without individual object tracking, using a latent Dirichlet allocation model, and can automatically detect and localize abnormally moving objects in real-life video. Performance tests on several real-life databases show that the proposed algorithm can efficiently detect abnormally moving objects in real time. It can be applied to any situation in which abnormal directions or abnormal speeds must be detected.
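
    A full latent Dirichlet allocation model is beyond a short sketch, but the underlying idea, quantising motion vectors into direction "words" and flagging words that are rare under the learned dominant flow, can be illustrated as follows. The histogram model, thresholds, and data below are simplified stand-ins, not the paper's method.

```python
import math
from collections import Counter

def quantize_direction(dx, dy, n_bins=8):
    """Map a motion vector to one of n_bins direction 'words'."""
    angle = math.atan2(dy, dx) % (2 * math.pi)
    return int(angle / (2 * math.pi / n_bins)) % n_bins

def learn_dominant_flow(training_vectors, n_bins=8):
    """Normalised histogram of direction words over training video
    (a simplified stand-in for the topic distribution LDA would learn)."""
    counts = Counter(quantize_direction(dx, dy, n_bins) for dx, dy in training_vectors)
    total = sum(counts.values())
    return {b: counts.get(b, 0) / total for b in range(n_bins)}

def is_abnormal(dx, dy, flow_model, threshold=0.05):
    """Flag motion whose direction word is rare under the learned model."""
    return flow_model[quantize_direction(dx, dy)] < threshold

# Toy crowd mostly moving left-to-right (+x) with slight vertical jitter
training = [(1.0, 0.05 * ((i % 3) - 1)) for i in range(300)]
model = learn_dominant_flow(training)
```

    Motion aligned with the dominant flow passes; motion against it falls below the rarity threshold and is flagged.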

  12. Robust statistical reconstruction for charged particle tomography

    DOEpatents

    Schultz, Larry Joe; Klimenko, Alexei Vasilievich; Fraser, Andrew Mcleod; Morris, Christopher; Orum, John Christopher; Borozdin, Konstantin N; Sossong, Michael James; Hengartner, Nicolas W

    2013-10-08

    Systems and methods for charged particle detection, including statistical reconstruction of object volume scattering density profiles from charged particle tomographic data. The probability distribution of charged particle scattering is determined using a statistical multiple scattering model, and a substantially maximum likelihood estimate of the object volume scattering density is determined using an expectation maximization (ML/EM) algorithm to reconstruct the object volume scattering density. The presence and/or type of object occupying the volume of interest can be identified from the reconstructed volume scattering density profile. The charged particle tomographic data can be cosmic ray muon tomographic data from a muon tracker for scanning packages, containers, vehicles, or cargo. The method can be implemented using a computer program which is executable on a computer.
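
    The generic ML/EM update that underlies such reconstructions can be sketched on a toy linear system y ≈ Ax with non-negative x. The patented method applies this idea to scattering-density voxels and a multiple-scattering model, which is considerably more involved; the tiny system below is purely illustrative.

```python
def mlem(y, A, n_iter=200):
    """Maximum-likelihood expectation-maximization for y ≈ A x with
    non-negative x: multiply each voxel by the sensitivity-normalised
    backprojection of the measured/predicted ratio."""
    n_rays, n_vox = len(A), len(A[0])
    sens = [sum(A[i][j] for i in range(n_rays)) for j in range(n_vox)]
    x = [1.0] * n_vox
    for _ in range(n_iter):
        proj = [sum(A[i][j] * x[j] for j in range(n_vox)) for i in range(n_rays)]
        back = [sum(A[i][j] * y[i] / proj[i] for i in range(n_rays)) for j in range(n_vox)]
        x = [x[j] * back[j] / sens[j] for j in range(n_vox)]
    return x

# Tiny 2-voxel, 3-ray system with noise-free data from x_true = [2, 3]
A = [[1, 0], [0, 1], [1, 1]]
x_true = [2.0, 3.0]
y = [sum(a * xt for a, xt in zip(row, x_true)) for row in A]
x_hat = mlem(y, A)
```

    The multiplicative update preserves non-negativity, which is why ML/EM suits density-like unknowns such as scattering density.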

  13. The value of microgrants for community-based health promotion: two models for practice and policy.

    PubMed

    Hartwig, Kari A; Bobbitt-Cooke, Mary; Zaharek, Margot M; Nappi, Susan; Wykoff, Randolph F; Katz, David L

    2006-01-01

    In 2001, the Office of Disease Prevention and Health Promotion in the US Department of Health and Human Services announced its intention to (1) identify innovative ways to increase public awareness and focus on Healthy People 2010 objectives and (2) broaden the participation of community-based organizations, including agencies new to public health. The mechanism selected, microfinancing, was modeled after small venture loans for economic stimulus in developing countries. The Office of Disease Prevention and Health Promotion selected one state health department and one academic research organization from 80 applicants to test models of awarding "microgrants" of 2,010 dollars to community agencies. This article describes the two models, the types of agencies that were funded, the primary Healthy People 2010 objectives targeted, examples of how the monies were used and leveraged by grantees, and the implications of microgrants for public health practice and policy.

  14. Railway obstacle detection algorithm using neural network

    NASA Astrophysics Data System (ADS)

    Yu, Mingyang; Yang, Peng; Wei, Sen

    2018-05-01

    Aiming at the difficulty of detecting obstacles in outdoor railway scenes, a data-oriented method based on a neural network is proposed. First, we annotate objects (such as people, trains, and animals) in images acquired from the Internet, and then use residual learning units to build a Fast R-CNN framework. The network is trained with a stochastic gradient descent algorithm to learn the target image characteristics. Finally, the trained model is used to analyse an outdoor railway image; if it contains trains or other objects, an alert is issued. Experiments show that the warning accuracy reached 94.85%.

  15. Stochastic resource allocation in emergency departments with a multi-objective simulation optimization algorithm.

    PubMed

    Feng, Yen-Yi; Wu, I-Chin; Chen, Tzu-Li

    2017-03-01

    The number of emergency cases or emergency room visits rapidly increases annually, thus leading to an imbalance in supply and demand and to the long-term overcrowding of hospital emergency departments (EDs). However, current solutions to increase medical resources and improve the handling of patient needs are either impractical or infeasible in the Taiwanese environment. Therefore, EDs must optimize resource allocation given limited medical resources to minimize the average length of stay of patients and medical resource waste costs. This study constructs a multi-objective mathematical model for medical resource allocation in EDs in accordance with emergency flow or procedure. The proposed mathematical model is complex and difficult to solve because its performance value is stochastic; furthermore, the model considers both objectives simultaneously. Thus, this study develops a multi-objective simulation optimization algorithm by integrating a non-dominated sorting genetic algorithm II (NSGA II) with multi-objective computing budget allocation (MOCBA) to address the challenges of multi-objective medical resource allocation. NSGA II is used to investigate plausible solutions for medical resource allocation, and MOCBA identifies effective sets of feasible Pareto (non-dominated) medical resource allocation solutions in addition to effectively allocating simulation or computation budgets. The discrete event simulation model of ED flow is inspired by a Taiwan hospital case and is constructed to estimate the expected performance values of each medical allocation solution as obtained through NSGA II. Finally, computational experiments are performed to verify the effectiveness and performance of the integrated NSGA II and MOCBA method, as well as to derive non-dominated medical resource allocation solutions from the algorithms.
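
    The non-dominated sorting at the heart of NSGA II can be sketched as follows; the toy (average length of stay, resource cost) pairs are invented for illustration and both objectives are minimised.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (minimisation convention)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Split points into successive Pareto fronts, as in NSGA II's
    first phase: peel off the non-dominated set, then repeat."""
    remaining = list(points)
    fronts = []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q != p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# Toy allocation solutions: (average length of stay, resource cost)
solutions = [(3, 9), (4, 7), (5, 5), (6, 6), (7, 4), (5, 8)]
fronts = non_dominated_sort(solutions)
```

    The first front is the set of trade-off solutions a decision maker would choose among; MOCBA's role in the paper is to decide how much simulation budget each such candidate deserves.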

  16. Probabilistic Plan Management

    DTIC Science & Technology

    2009-11-17

    set of chains, the step adds scheduled methods that have an a priori likelihood of a failure outcome (Lines 3-5). It identifies the max eul value of the...activity meeting its objective, as well as its expected contribution to the schedule. By explicitly calculating these values, PADS is able to summarize the...variables. One of the main difficulties of this model is convolving the probability density functions and value functions while solving the model; this

  17. Robust Sensitivity Analysis for Multi-Attribute Deterministic Hierarchical Value Models

    DTIC Science & Technology

    2002-03-01

    such as weighted sum method, weighted product method, and the Analytic Hierarchy Process (AHP). This research focuses on only weighted sum...different groups. They can be termed as deterministic, stochastic, or fuzzy multi-objective decision methods if they are classified according to the...weighted product model (WPM), and analytic hierarchy process (AHP). His method attempts to identify the most important criteria weight and the most

  18. Aspects of job scheduling

    NASA Technical Reports Server (NTRS)

    Phillips, K.

    1976-01-01

    A mathematical model for job scheduling in a specified context is presented. The model uses both linear programming and combinatorial methods. While designed with a view toward optimization of scheduling of facility and plant operations at the Deep Space Communications Complex, the context is sufficiently general to be widely applicable. The general scheduling problem including options for scheduling objectives is discussed and fundamental parameters identified. Mathematical algorithms for partitioning problems germane to scheduling are presented.

  19. A Watershed Modeling System for Fort Benning, GA Using the US EPA BASINS Framework

    DTIC Science & Technology

    2013-01-01

    Benning watersheds. The objective of this project was to identify, adapt, and develop watershed management models for Fort Benning that address impacts on...of Need (SON) (SERDP, 2005) which recognized that military installations needed the identification, adaptation, and development of watershed...capabilities. To accomplish these goals the Strategic Plan for SEMP (2005) notes the need for both fundamental and applied (adaptive) research; this need

  20. Identifying the Cost of Non-monetary Incentives (ICONIC)

    DTIC Science & Technology

    2009-12-01

    topics. a. Inspection Optimization Model The Environmental Protection Agency (EPA) developed a linear programming model designed for a state air...and other special pays that can distort the environment and amenities that the next assignment offers.23 The primary objective of this work is to...Government Printing Office, 2005). http://www.gao.gov/new.items/d06125.pdf (accessed September 28, 2008). Van Boening, Mark, Tanja F. Blackstone

  1. To Pass or Not to Pass: Modeling the Movement and Affordance Dynamics of a Pick and Place Task

    PubMed Central

    Lamb, Maurice; Kallen, Rachel W.; Harrison, Steven J.; Di Bernardo, Mario; Minai, Ali; Richardson, Michael J.

    2017-01-01

    Humans commonly engage in tasks that require or are made more efficient by coordinating with other humans. In this paper we introduce a task dynamics approach for modeling multi-agent interaction and decision making in a pick and place task where an agent must move an object from one location to another and decide whether to act alone or with a partner. Our aims were to identify and model (1) the affordance related dynamics that define an actor's choice to move an object alone or to pass it to their co-actor and (2) the trajectory dynamics of an actor's hand movements when moving to grasp, relocate, or pass the object. Using a virtual reality pick and place task, we demonstrate that both the decision to pass or not pass an object and the movement trajectories of the participants can be characterized in terms of a behavioral dynamics model. Simulations suggest that the proposed behavioral dynamics model exhibits features observed in human participants including hysteresis in decision making, non-straight line trajectories, and non-constant velocity profiles. The proposed model highlights how the same low-dimensional behavioral dynamics can operate to constrain multiple (and often nested) levels of human activity and suggests that knowledge of what, when, where and how to move or act during pick and place behavior may be defined by these low dimensional task dynamics and, thus, can emerge spontaneously and in real-time with little a priori planning. PMID:28701975
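
    The hysteresis in decision making that the model exhibits can be illustrated with a minimal one-dimensional behavioral dynamics sketch: a bistable system whose control parameter k (standing in for, e.g., object position relative to the pass boundary) is swept up and down. This generic cusp-type model is assumed for illustration and is not the authors' fitted task dynamics.

```python
def settle(x, k, steps=400, dt=0.01):
    """Relax the decision state x under dx/dt = -(x**3 - x - k),
    a bistable dynamic; the two stable branches represent the two
    action modes (e.g. move alone vs. pass to the co-actor)."""
    for _ in range(steps):
        x += -(x**3 - x - k) * dt
    return x

def sweep(k_values, x0):
    """Quasi-statically sweep k, letting the state settle at each value."""
    xs, x = [], x0
    for k in k_values:
        x = settle(x, k)
        xs.append(x)
    return xs

ks_up = [i / 20 for i in range(-20, 21)]   # k swept from -1.0 to 1.0
ks_down = list(reversed(ks_up))            # and back down again
up = sweep(ks_up, x0=-1.0)
down = sweep(ks_down, x0=1.0)
x_up_at_0 = up[ks_up.index(0.0)]
x_down_at_0 = down[ks_down.index(0.0)]
```

    At the same parameter value k = 0 the system sits on opposite branches depending on sweep direction, which is exactly the history-dependence (hysteresis) observed in participants' pass/no-pass decisions.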

  2. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 is on the validation of the framework and performing simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  3. A Framework to Design and Optimize Chemical Flooding Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 is on the validation of the framework and performing simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  4. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2004-11-01

    The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 is on the validation of the framework and performing simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  5. International Planetary Data Alliance (IPDA) Information Model

    NASA Technical Reports Server (NTRS)

    Hughes, John Steven; Beebe, R.; Guinness, E.; Heather, D.; Huang, M.; Kasaba, Y.; Osuna, P.; Rye, E.; Savorskiy, V.

    2007-01-01

    This document is the third deliverable of the International Planetary Data Alliance (IPDA) Archive Data Standards Requirements Identification project. The goal of the project is to identify a subset of the standards currently in use by NASA's Planetary Data System (PDS) that are appropriate for internationalization. As shown in the highlighted sections of Figure 1, the focus of this project is the Information Model component of the Data Architecture Standards, namely the object models, a data dictionary, and a set of data formats.

  6. Application of SIR-C SAR to Hydrology

    NASA Technical Reports Server (NTRS)

    Engman, Edwin T.; ONeill, Peggy; Wood, Eric; Pauwels, Valentine; Hsu, Ann; Jackson, Tom; Shi, J. C.; Prietzsch, Corinna

    1996-01-01

    The progress, results, and future plans regarding the following objectives are presented: (1) determine and compare soil moisture patterns within one or more humid watersheds using SAR data, ground-based measurements, and hydrologic modeling; (2) use radar data to characterize the hydrologic regime within a catchment and to identify the runoff-producing characteristics of humid zone watersheds; and (3) use radar data as the basis for scaling up from small-scale, near-point process models to the larger-scale water balance models necessary to define and quantify the land phase of GCMs (Global Circulation Models).

  7. Diabetes risk score in the United Arab Emirates: a screening tool for the early detection of type 2 diabetes mellitus

    PubMed Central

    Sulaiman, Nabil; Hussein, Amal; Elbadawi, Salah; Abusnana, Salah; Zimmet, Paul

    2018-01-01

    Objective The objective of this study was to develop a simple non-invasive risk score, specific to the United Arab Emirates (UAE) citizens, to identify individuals at increased risk of having undiagnosed type 2 diabetes mellitus. Research design and methods A retrospective analysis of the UAE National Diabetes and Lifestyle data was conducted. The data included demographic and anthropometric measurements, and fasting blood glucose. Univariate analyses were used to identify the risk factors for diabetes. The risk score was developed for UAE citizens using a stepwise forward regression model. Results A total of 872 UAE citizens were studied. The overall prevalence of diabetes in the UAE adult citizens in the Northern Emirates was 25.1%. The significant risk factors identified for diabetes were age (≥35 years), a family history of diabetes mellitus, hypertension, body mass index ≥30.0 and waist-to-hip ratio ≥0.90 for males and ≥0.85 for females. The performance of the model was moderate in terms of sensitivity (75.4%, 95% CI 68.3 to 81.7) and specificity (70%, 95% CI 65.8 to 73.9). The area under the receiver-operator characteristic curve was 0.82 (95% CI 0.78 to 0.86). Conclusions A simple, non-invasive risk score model was developed to help to identify those at high risk of having diabetes among UAE citizens. This score could contribute to the efficient and less expensive earlier detection of diabetes in this high-risk population. PMID:29629178
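
    A point-based screening score built from the risk factors the study identified might look like the sketch below. The point weights and the high-risk cutoff are invented for illustration; a real screening tool should use the published model's coefficients.

```python
def uae_style_risk_score(age, family_history, hypertension, bmi, whr, sex):
    """Point-based diabetes screening score from the study's risk factors.
    Point weights here are hypothetical, not the published coefficients."""
    score = 0
    score += 2 if age >= 35 else 0
    score += 2 if family_history else 0
    score += 1 if hypertension else 0
    score += 1 if bmi >= 30.0 else 0
    whr_cutoff = 0.90 if sex == "male" else 0.85  # waist-to-hip ratio
    score += 1 if whr >= whr_cutoff else 0
    return score

def at_high_risk(score, cutoff=3):
    """Illustrative cutoff; in practice it is tuned for the desired
    sensitivity/specificity trade-off reported in the paper."""
    return score >= cutoff

high = uae_style_risk_score(50, True, True, 32.0, 0.95, "male")
low = uae_style_risk_score(25, False, False, 24.0, 0.80, "female")
```

    Such scores are deliberately non-invasive: every input can be collected without a blood test, with fasting glucose reserved for those flagged as high risk.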

  8. Climate-induced lake drying causes heterogeneous reductions in waterfowl species richness

    USGS Publications Warehouse

    Roach, Jennifer K.; Griffith, Dennis B.

    2015-01-01

    Context: Lake size has declined on breeding grounds for international populations of waterfowl. Objectives: Our objectives were to (1) model the relationship between waterfowl species richness and lake size; (2) use the model and trends in lake size to project historical, contemporary, and future richness at 2500+ lakes; (3) evaluate mechanisms for the species–area relationship (SAR); and (4) identify species most vulnerable to shrinking lakes. Methods: Monte Carlo simulations of the richness model were used to generate projections. Correlations between richness and both lake size and habitat diversity were compared to identify mechanisms for the SAR. Patterns of nestedness were used to identify vulnerable species. Results: Species richness was greatest at lakes that were larger, closer to rivers, had more wetlands along their perimeters, and were within 5 km of a large lake. Average richness per lake was projected to decline by 11% from 1986 to 2050, but the trend was heterogeneous across sub-regions and lakes. Richness in sub-regions with species-rich lakes was projected to remain stable, while richness in the sub-region with species-poor lakes was projected to decline. Lake size had a greater effect on richness than did habitat diversity, suggesting that large lakes have more species because they provide more habitat but not more habitat types. The vulnerability of species to shrinking lakes was related to species rarity rather than foraging guild. Conclusions: Our maps of projected changes in species richness and rank-ordered list of species most vulnerable to shrinking lakes can be used to identify targets for conservation or monitoring.
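
    The species–area relationship at the core of such analyses is conventionally modelled as the power law S = cA^z, which can be fitted by ordinary least squares in log-log space. The sketch below uses synthetic lake data; the exponent and coefficient are illustrative, not the study's estimates.

```python
import math

def fit_power_law_sar(areas, richness):
    """Fit the species-area relationship S = c * A**z by least squares
    on log(S) = log(c) + z*log(A); returns (c, z)."""
    lx = [math.log(a) for a in areas]
    ly = [math.log(s) for s in richness]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    z = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
    c = math.exp(my - z * mx)
    return c, z

# Synthetic lakes following S = 3 * A**0.25 exactly
areas = [10.0, 100.0, 1000.0, 10000.0]
richness = [3.0 * a ** 0.25 for a in areas]
c, z = fit_power_law_sar(areas, richness)
```

    With the fitted exponent in hand, projected shrinkage in A translates directly into an expected fractional loss of richness, which is how lake-size trends become richness projections.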

  9. Object-oriented Approach to High-level Network Monitoring and Management

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    2000-01-01

    An absolute prerequisite for the management of large computer networks is the ability to measure their performance. Unless we monitor a system, we cannot hope to manage and control its performance. In this paper, we describe a network monitoring system that we are currently designing and implementing. Keeping in mind the complexity of the task and the required flexibility for future changes, we use an object-oriented design methodology. The system is built using the APIs offered by the HP OpenView system. We are investigating methods to build high-level monitoring systems that are built on top of existing monitoring tools. Due to the heterogeneous nature of the underlying systems at NASA Langley Research Center, we use an object-oriented approach for the design. First, we use UML (Unified Modeling Language) to model users' requirements. Second, we identify the existing capabilities of the underlying monitoring system. Third, we try to map the former with the latter.

  10. Fatigue design of a cellular phone folder using regression model-based multi-objective optimization

    NASA Astrophysics Data System (ADS)

    Kim, Young Gyun; Lee, Jongsoo

    2016-08-01

    In a folding cellular phone, the folding device is repeatedly opened and closed by the user, which eventually results in fatigue damage, particularly to the front of the folder. Hence, it is important to improve the safety and endurance of the folder while also reducing its weight. This article presents an optimal design for the folder front that maximizes its fatigue endurance while minimizing its thickness. Design data for analysis and optimization were obtained experimentally using a test jig. Multi-objective optimization was carried out using a nonlinear regression model. Three regression methods were employed: back-propagation neural networks, logistic regression and support vector machines. The AdaBoost ensemble technique was also used to improve the approximation. Two-objective Pareto-optimal solutions were identified using the non-dominated sorting genetic algorithm (NSGA-II). Finally, a numerically optimized solution was validated against experimental product data, in terms of both fatigue endurance and thickness index.
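    The Pareto-selection step at the heart of NSGA-II can be illustrated with a minimal non-dominated filter over (endurance, thickness) design pairs. The designs below are toy values; the full algorithm adds non-dominated ranking, crowding distance, and genetic operators:

```python
def pareto_front(designs):
    """Return the non-dominated designs for two objectives: maximize
    fatigue endurance (cycles) and minimize thickness (mm). A design is
    dominated if another design is at least as good in both objectives
    and strictly better in one."""
    front = []
    for i, (end_i, thk_i) in enumerate(designs):
        dominated = any(
            end_j >= end_i and thk_j <= thk_i and (end_j > end_i or thk_j < thk_i)
            for j, (end_j, thk_j) in enumerate(designs) if j != i
        )
        if not dominated:
            front.append((end_i, thk_i))
    return front

# Toy candidate folder-front designs: (fatigue endurance in cycles, thickness in mm).
designs = [(120_000, 1.2), (150_000, 1.5), (100_000, 1.0), (110_000, 1.4)]
# (110000, 1.4) is dominated by (120000, 1.2): fewer cycles at greater thickness.
front = pareto_front(designs)
```

    The surviving trade-off set is what the regression surrogates (neural network, logistic regression, support vector machine) are queried against during optimization, rather than the expensive test jig itself.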

  11. Information footprint of different ecohydrological data sources: using multi-objective calibration of a physically-based model as hypothesis testing

    NASA Astrophysics Data System (ADS)

    Kuppel, S.; Soulsby, C.; Maneta, M. P.; Tetzlaff, D.

    2017-12-01

    The utility of field measurements to help constrain the model solution space and identify feasible model configurations has been an increasingly central issue in hydrological model calibration. Sufficiently informative observations are necessary to ensure that the goodness of model-data fit attained effectively translates into more physically-sound information for the internal model parameters, as a basis for model structure evaluation. Here we assess to what extent the diversity of information content can inform on the suitability of a complex, process-based ecohydrological model to simulate key water flux and storage dynamics at a long-term research catchment in the Scottish Highlands. We use the fully-distributed ecohydrological model EcH2O, calibrated against long-term datasets that encompass hydrologic and energy exchanges and ecological measurements: stream discharge, soil moisture, net radiation above canopy, and pine stand transpiration. Diverse combinations of these constraints were applied using a multi-objective cost function specifically designed to avoid compensatory effects between model-data metrics. Results revealed that calibration against virtually all datasets enabled the model to reproduce streamflow reasonably well. However, parameterizing the model to adequately capture local flux and storage dynamics, such as soil moisture or transpiration, required calibration with specific observations. This indicates that the footprint of the information contained in observations varies for each type of dataset, and that a diverse database informing about the different compartments of the domain is critical to test hypotheses of catchment function and identify a consistent model parameterization. The results foster confidence in using EcH2O to help understand current and future ecohydrological couplings in Northern catchments.
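    One common way to build a multi-objective cost function that avoids compensation between model-data metrics is worst-case (Chebyshev-style) aggregation, where the cost is driven by the single worst-fitting metric. The paper's exact formulation differs, so this is only an illustration of the principle:

```python
def cost(metrics):
    """Non-compensatory aggregation of model-data fit metrics, each
    normalized so 0 is a perfect fit and 1 the worst acceptable fit.
    Taking the maximum means a good streamflow fit cannot buy back a
    poor soil-moisture fit (one standard non-compensatory choice; the
    EcH2O study's actual cost function is different)."""
    return max(metrics.values())

balanced = cost({"discharge": 0.30, "soil_moisture": 0.35, "transpiration": 0.32})
lopsided = cost({"discharge": 0.05, "soil_moisture": 0.90, "transpiration": 0.10})
# The lopsided parameterization is penalized despite its lower mean error.
```

    Under a simple sum, the lopsided fit above would look better than the balanced one; the max makes the calibration reject it, which is the behavior the abstract describes.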

  12. Simultaneous segmentation of the bone and cartilage surfaces of a knee joint in 3D

    NASA Astrophysics Data System (ADS)

    Yin, Y.; Zhang, X.; Anderson, D. D.; Brown, T. D.; Hofwegen, C. Van; Sonka, M.

    2009-02-01

    We present a novel framework for the simultaneous segmentation of multiple interacting surfaces belonging to multiple mutually interacting objects. The method is a non-trivial extension of our previously reported optimal multi-surface segmentation. Considering an example application of knee-cartilage segmentation, the framework consists of the following main steps: 1) Shape model construction: Building a mean shape for each bone of the joint (femur, tibia, patella) from interactively segmented volumetric datasets. Using the resulting mean-shape model, identification of cartilage, non-cartilage, and transition areas on the mean-shape bone model surfaces. 2) Presegmentation: Employment of an iterative optimal surface detection method to achieve approximate segmentation of individual bone surfaces. 3) Cross-object surface mapping: Detection of inter-bone equidistant separating sheets to help identify corresponding vertex pairs for all interacting surfaces. 4) Multi-object, multi-surface graph construction and final segmentation: Construction of a single multi-bone, multi-surface graph so that two surfaces (bone and cartilage) with zero and non-zero intervening distances can be detected for each bone of the joint, according to whether cartilage can be locally absent or present on the bone. To define inter-object relationships, corresponding vertex pairs identified using the separating sheets were interlinked in the graph. The graph optimization algorithm acted on the entire multi-object, multi-surface graph to yield a globally optimal solution. The segmentation framework was tested on 16 MR-DESS knee-joint datasets from the Osteoarthritis Initiative database. The average signed surface positioning error for the 6 detected surfaces ranged from 0.00 to 0.12 mm. When independently initialized, the signed reproducibility error of bone and cartilage segmentation ranged from 0.00 to 0.26 mm. 
The results showed that this framework provides robust, accurate, and reproducible segmentation of the knee joint bone and cartilage surfaces of the femur, tibia, and patella. As a general segmentation tool, the developed framework can be applied to a broad range of multi-object segmentation problems.

  13. Classification of ephemeral, intermittent, and perennial stream reaches using a TOPMODEL-based approach

    USGS Publications Warehouse

    Williamson, Tanja N.; Agouridis, Carmen T.; Barton, Christopher D.; Villines, Jonathan A.; Lant, Jeremiah G.

    2015-01-01

    Whether a waterway is temporary or permanent influences regulatory protection guidelines; however, classification can be subjective due to a combination of factors, including time of year, antecedent moisture conditions, and the previous experience of the field investigator. Our objective was to develop a standardized protocol using publicly available spatial information to classify ephemeral, intermittent, and perennial streams. Our hypothesis was that field observations of flow along the stream channel could be compared to results from a hydrologic model, providing an objective method for identifying these stream reaches. Flow-state sensors were placed at ephemeral, intermittent, and perennial stream reaches from May to December 2011 in the Appalachian coal basin of eastern Kentucky. This observed flow record was then used to calibrate the simulated saturation deficit in each channel reach based on the topographic wetness index used by TOPMODEL. Saturation deficit values were categorized as flow or no-flow days, and the simulated record of streamflow was compared to the observed record. The hydrologic model was more accurate for simulating flow during the spring and fall seasons. However, the model effectively identified stream reaches as intermittent and perennial in each of the two basins.
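    The topographic wetness index at the core of TOPMODEL, and the flow/no-flow categorization described above, can be sketched as follows (the saturation-deficit threshold is an illustrative stand-in for the value calibrated against the flow-state sensors):

```python
import math

def topographic_wetness_index(upslope_area_m2, slope_deg):
    """TOPMODEL's topographic wetness index, TWI = ln(a / tan(beta)),
    where a is the specific upslope contributing area and beta the local
    slope. Higher TWI means the reach saturates, and flows, more readily."""
    return math.log(upslope_area_m2 / math.tan(math.radians(slope_deg)))

def classify_days(saturation_deficits, threshold):
    """Label each simulated day 'flow' or 'no-flow' by comparing the
    saturation deficit to a calibrated threshold (value illustrative)."""
    return ["flow" if d <= threshold else "no-flow" for d in saturation_deficits]

# A reach draining more area at the same slope is wetter, hence flows more often.
twi_big = topographic_wetness_index(5000.0, 10.0)
twi_small = topographic_wetness_index(500.0, 10.0)
days = classify_days([2.0, 8.0, 4.9], threshold=5.0)
```

    Comparing the resulting simulated flow/no-flow record against the sensor record, day by day, is what allows the reach to be labeled ephemeral, intermittent, or perennial objectively.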

  14. A Point Kinetics Model for Estimating Neutron Multiplication of Bare Uranium Metal in Tagged Neutron Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tweardy, Matthew C.; McConchie, Seth; Hayward, Jason P.

    An extension of the point kinetics model is developed in this paper to describe the neutron multiplicity response of a bare uranium object under interrogation by an associated particle imaging deuterium-tritium (D-T) measurement system. This extended model is used to estimate the total neutron multiplication of the uranium. Both MCNPX-PoliMi simulations and data from active interrogation measurements of highly enriched and depleted uranium geometries are used to evaluate the potential of this method and to identify the sources of systematic error. The detection efficiency correction for measured coincidence response is identified as a large source of systematic error. If the detection process is not considered, results suggest that the method can estimate total multiplication to within 13% of the simulated value. Values for multiplicity constants in the point kinetics equations are sensitive to enrichment due to (n, xn) interactions by D-T neutrons and can introduce another significant source of systematic bias. This can theoretically be corrected if isotopic composition is known a priori. Finally, the spatial dependence of multiplication is also suspected of introducing further systematic bias for high multiplication uranium objects.

  15. A Point Kinetics Model for Estimating Neutron Multiplication of Bare Uranium Metal in Tagged Neutron Measurements

    DOE PAGES

    Tweardy, Matthew C.; McConchie, Seth; Hayward, Jason P.

    2017-06-13

    An extension of the point kinetics model is developed in this paper to describe the neutron multiplicity response of a bare uranium object under interrogation by an associated particle imaging deuterium-tritium (D-T) measurement system. This extended model is used to estimate the total neutron multiplication of the uranium. Both MCNPX-PoliMi simulations and data from active interrogation measurements of highly enriched and depleted uranium geometries are used to evaluate the potential of this method and to identify the sources of systematic error. The detection efficiency correction for measured coincidence response is identified as a large source of systematic error. If the detection process is not considered, results suggest that the method can estimate total multiplication to within 13% of the simulated value. Values for multiplicity constants in the point kinetics equations are sensitive to enrichment due to (n, xn) interactions by D-T neutrons and can introduce another significant source of systematic bias. This can theoretically be corrected if isotopic composition is known a priori. Finally, the spatial dependence of multiplication is also suspected of introducing further systematic bias for high multiplication uranium objects.
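    The basic relation behind a total-multiplication estimate can be illustrated with the textbook point-kinetics expression below. This is only the underlying idea; the paper's extended model works from measured coincidence multiplicities and adds multiplicity constants, detection-efficiency corrections, and enrichment-dependent (n, xn) terms:

```python
def total_multiplication(k_eff):
    """Total neutron multiplication of a subcritical multiplying object,
    M = 1 / (1 - k_eff): each source neutron produces M neutrons on average
    once all fission chains die out. A textbook point-kinetics relation,
    not the paper's full extended model."""
    if not 0.0 <= k_eff < 1.0:
        raise ValueError("requires a subcritical object: 0 <= k_eff < 1")
    return 1.0 / (1.0 - k_eff)

assert total_multiplication(0.0) == 1.0   # no fission chains: source neutrons only
assert abs(total_multiplication(0.5) - 2.0) < 1e-12
```

    The geometric-series form makes clear why multiplication is so sensitive near criticality, and hence why the systematic biases the abstract discusses matter most for high-multiplication objects.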

  16. Visual Object Recognition and Tracking of Tools

    NASA Technical Reports Server (NTRS)

    English, James; Chang, Chu-Yin; Tardella, Neil

    2011-01-01

    A method has been created to automatically build an algorithm off-line, using computer-aided design (CAD) models, and to apply this at runtime. The object type is discriminated, and the position and orientation are identified. This system can work with a single image and can provide improved performance using multiple images provided from videos. The spatial processing unit uses three stages: (1) segmentation; (2) initial type, pose, and geometry (ITPG) estimation; and (3) refined type, pose, and geometry (RTPG) calculation. The image segmentation module finds all the tools in an image and isolates them from the background. For this, the system uses edge-detection and thresholding to find the pixels that are part of a tool. After the pixels are identified, nearby pixels are grouped into blobs. These blobs represent the potential tools in the image and are the product of the segmentation algorithm. The second module uses matched filtering (or template matching). This approach is used for condensing synthetic images using an image subspace that captures key information. Three degrees of orientation, three degrees of position, and any number of degrees of freedom in geometry change are included. To do this, a template-matching framework is applied. This framework uses an off-line system for calculating template images, measurement images, and the measurements of the template images. These results are used online to match segmented tools against the templates. The final module is the RTPG processor. Its role is to find the exact states of the tools given initial conditions provided by the ITPG module. The requirement that the initial conditions exist allows this module to make use of a local search (whereas the ITPG module had global scope). To perform the local search, 3D model matching is used, where a synthetic image of the object is created and compared to the sensed data. 
The availability of low-cost PC graphics hardware allows rapid creation of synthetic images. In this approach, a function of orientation, distance, and articulation is defined as a metric on the difference between the captured image and a synthetic image with an object in the given orientation, distance, and articulation. The synthetic image is created using a model that is looked up in an object-model database. A composable software architecture is used for implementation. Video is first preprocessed to remove sensor anomalies (like dead pixels), and then is processed sequentially by a prioritized list of tracker-identifiers.
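    The matched-filtering idea in the second module can be sketched as a minimal sum-of-squared-differences template search. The data here are toy grids; the actual system matches segmented tools against precomputed template images in an image subspace:

```python
def match_score(image, template, top, left):
    """Sum of squared differences between a template and the image patch
    at (top, left); lower means a better match."""
    return sum(
        (image[top + r][left + c] - template[r][c]) ** 2
        for r in range(len(template))
        for c in range(len(template[0]))
    )

def best_match(image, template):
    """Exhaustively scan the image and return the offset of the best match,
    the brute-force analogue of the off-line/online template framework."""
    rows = len(image) - len(template) + 1
    cols = len(image[0]) - len(template[0]) + 1
    return min(
        ((r, c) for r in range(rows) for c in range(cols)),
        key=lambda rc: match_score(image, template, *rc),
    )

image = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 7, 9, 0],
         [0, 0, 0, 0]]
template = [[9, 8],
            [7, 9]]
location = best_match(image, template)   # the template sits at row 1, column 1
```

    Precomputing templates off-line and scoring them online, as the abstract describes, amortizes exactly the inner loop shown here across the pose and geometry dimensions.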

  17. Object-oriented design tools for supramolecular devices and biomedical nanotechnology.

    PubMed

    Lee, Stephen C; Bhalerao, Kaustubh; Ferrari, Mauro

    2004-05-01

    Nanotechnology provides multifunctional agents for in vivo use that increasingly blur the distinction between pharmaceuticals and medical devices. Realization of such therapeutic nanodevices requires multidisciplinary effort that is difficult for individual device developers to sustain, and identification of appropriate collaborations outside one's own field can itself be challenging. Further, as in vivo nanodevices become increasingly complex, their design will increasingly demand systems-level thinking. Systems engineering tools such as object-oriented analysis, object-oriented design (OOA/D) and unified modeling language (UML) are applicable to nanodevices built from biological components, help logically manage the knowledge needed to design them, and help identify useful collaborative relationships for device designers. We demonstrate the utility of these systems engineering tools by using them to reverse engineer an existing molecular device (the bacmid molecular cloning system), and illustrate how object-oriented approaches identify fungible components (objects) in nanodevices in a way that facilitates design of families of related devices, rather than single inventions. We also explore the utility of object-oriented approaches for design of another class of therapeutic nanodevices, vaccines. While they are useful for design of current nanodevices, the power of systems design tools for biomedical nanotechnology will become increasingly apparent as the complexity and sophistication of in vivo nanosystems increases. The nested, hierarchical nature of object-oriented approaches allows treatment of devices as objects in higher-order structures, and so will facilitate concatenation of multiple devices into higher-order, higher-function nanosystems.

  18. Cost model for biobanks.

    PubMed

    Gonzalez-Sanchez, M Beatriz; Lopez-Valeiras, Ernesto; Morente, Manuel M; Fernández Lago, Orlando

    2013-10-01

    Current economic conditions and budget constraints in publicly funded biomedical research have brought about a renewed interest in analyzing the cost and economic viability of research infrastructures. However, there are no proposals for specific cost accounting models for these types of organizations in the international scientific literature. The aim of this paper is to present the basis of a cost analysis model useful for any biobank regardless of the human biological samples that it stores for biomedical research. The development of a unique cost model for biobanks can be a complicated task due to the diversity of the biological samples they store. Different types of samples (DNA, tumor tissues, blood, serum, etc.) require different production processes. Nonetheless, the common basic steps of the production process can be identified. Thus, the costs incurred in each step can be analyzed in detail to provide cost information. Six stages and four cost objects were obtained by taking the production processes of biobanks belonging to the Spanish National Biobank Network as a starting point. Templates and examples are provided to help managers to identify and classify the costs involved in their own biobanks to implement the model. The application of this methodology will provide accurate information on cost objects, along with useful information to give an economic value to the stored samples, to analyze the efficiency of the production process and to evaluate the viability of some sample collections.
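    The stage-based costing idea can be sketched as a simple roll-up of unit costs per production stage. The stage names, figures, and overhead rate below are hypothetical illustrations, not the paper's templates or values:

```python
# Illustrative per-sample stage costs (EUR); the paper derives six stages and
# four cost objects from Spanish National Biobank Network production processes.
STAGE_COSTS = {
    "reception": 4.0,
    "processing": 12.0,
    "quality_control": 6.0,
    "storage_per_year": 2.5,
    "data_management": 3.0,
    "distribution": 5.0,
}

def sample_cost(years_stored, overhead_rate=0.15):
    """Full cost of one stored sample: direct stage costs plus a
    proportional overhead allocation (all rates hypothetical)."""
    direct = (STAGE_COSTS["reception"] + STAGE_COSTS["processing"]
              + STAGE_COSTS["quality_control"] + STAGE_COSTS["data_management"]
              + STAGE_COSTS["distribution"]
              + STAGE_COSTS["storage_per_year"] * years_stored)
    return direct * (1.0 + overhead_rate)
```

    Separating a time-dependent stage (storage) from one-off stages is what lets the model put an economic value on long-held collections and compare the viability of different sample types.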

  19. Classification and Feature Selection Algorithms for Modeling Ice Storm Climatology

    NASA Astrophysics Data System (ADS)

    Swaminathan, R.; Sridharan, M.; Hayhoe, K.; Dobbie, G.

    2015-12-01

    Ice storms account for billions of dollars of winter storm loss across the continental US and Canada. In the future, increasing concentration of human populations in areas vulnerable to ice storms such as the northeastern US will only exacerbate the impacts of these extreme events on infrastructure and society. Quantifying the potential impacts of global climate change on ice storm prevalence and frequency is challenging, as ice storm climatology is driven by complex and incompletely defined atmospheric processes, processes that are in turn influenced by a changing climate. This makes the underlying atmospheric and computational modeling of ice storm climatology a formidable task. We propose a novel computational framework that uses sophisticated stochastic classification and feature selection algorithms to model ice storm climatology and quantify storm occurrences from both reanalysis and global climate model outputs. The framework is based on an objective identification of ice storm events by key variables derived from vertical profiles of temperature, humidity and geopotential height. Historical ice storm records are used to identify days with synoptic-scale upper air and surface conditions associated with ice storms. Evaluation using NARR reanalysis and historical ice storm records corresponding to the northeastern US demonstrates that an objective computational model with standard performance measures can, with a relatively high degree of accuracy, identify ice storm events based on upper-air circulation patterns and provide insights into the relationships between key climate variables associated with ice storms.
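    A toy version of the objective-identification step, using profile-derived features of the classic freezing-rain setup, might look like the rule below. The thresholds are illustrative; the proposed framework learns such decision rules from reanalysis data with stochastic classification and feature selection:

```python
def is_ice_storm_day(surface_temp_c, max_temp_aloft_c, precip_mm):
    """Toy objective ice-storm-day rule from vertical-profile features:
    precipitation falling through an elevated warm layer (temperature
    aloft above 0 C) onto a subfreezing surface. Thresholds illustrative."""
    return precip_mm > 0.0 and max_temp_aloft_c > 0.0 and surface_temp_c <= 0.0

assert is_ice_storm_day(-2.0, 3.0, 5.0) is True    # freezing-rain setup
assert is_ice_storm_day(2.0, 3.0, 5.0) is False    # plain rain: warm surface
assert is_ice_storm_day(-8.0, -5.0, 5.0) is False  # snow: no warm layer aloft
```

    The value of learning such rules rather than hand-writing them is that the same classifier can then be applied unchanged to global climate model output, which is how future ice-storm frequency is quantified.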

  20. The role of cognitive reserve and memory self-efficacy in compensatory strategy use: A structural equation approach.

    PubMed

    Simon, Christa; Schmitter-Edgecombe, Maureen

    2016-08-01

    The use of compensatory strategies plays an important role in the ability of older adults to adapt to late-life memory changes. Even with the benefits associated with compensatory strategy use, little research has explored specific mechanisms associated with memory performance and compensatory strategies. Rather than an individual's objective memory performance directly predicting their use of compensatory strategies, it is possible that some other variables are indirectly influencing that relationship. The purpose of this study was to: (a) examine the moderating effects of cognitive reserve (CR) and (b) evaluate the potential mediating effects of memory self-efficacy on the relationship between objective memory performance and compensatory strategy use. Two structural equation models (SEM) were used to evaluate CR (latent moderator model) and memory self-efficacy (mediator model) in a sample of 155 community-dwelling older adults over the age of 55. The latent variable moderator model indicated that CR was not substantiated as a moderator variable in this sample (p = .861). However, memory self-efficacy significantly mediated the association between objective memory performance and compensatory strategy use (β = .22, 95% confidence interval, CI [.002, .437]). More specifically, better objective memory was associated with lower compensatory strategy use because of its relation to higher memory self-efficacy. These findings provide initial support for an explanatory framework of the relation between objective memory and compensatory strategy use in a healthy older adult population by identifying the importance of an individual's memory perceptions.
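    The mediation logic can be illustrated with a simplified product-of-coefficients sketch on made-up data (the study fits a full structural equation model with latent variables; the sign pattern below mirrors the reported result):

```python
def slope(x, y):
    """Ordinary least-squares slope of y on x (single predictor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
            / sum((xi - mx) ** 2 for xi in x))

# Hypothetical data: objective memory -> memory self-efficacy -> (lower)
# compensatory strategy use, mirroring the mediation reported above.
memory   = [1.0, 2.0, 3.0, 4.0, 5.0]
efficacy = [1.1, 2.0, 3.1, 3.9, 5.0]    # rises with memory (path a)
strategy = [9.0, 8.1, 7.0, 6.0, 5.1]    # falls with efficacy (path b)

a = slope(memory, efficacy)
b = slope(efficacy, strategy)
indirect = a * b   # product-of-coefficients estimate of the mediated effect
```

    A negative indirect effect with a positive path a and negative path b is exactly the pattern described: better objective memory raises self-efficacy, which in turn lowers compensatory strategy use. A real SEM would also estimate the direct path and a confidence interval for a·b.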

  1. Comparison of organs' shapes with geometric and Zernike 3D moments.

    PubMed

    Broggio, D; Moignier, A; Ben Brahim, K; Gardumi, A; Grandgirard, N; Pierrat, N; Chea, M; Derreumaux, S; Desbrée, A; Boisserie, G; Aubert, B; Mazeron, J-J; Franck, D

    2013-09-01

    The morphological similarity of organs is studied with feature vectors based on geometric and Zernike 3D moments. It is particularly investigated whether outliers and average models can be identified. For this purpose, the relative proximity to the mean feature vector is defined, and principal coordinate and clustering analyses are also performed. To study the consistency and usefulness of this approach, 17 liver and 76 heart voxel models from several sources are considered. In the liver case, models with similar morphological features are identified. For the limited number of cases studied, the liver of the ICRP male voxel model is identified as a better surrogate than the female one. For hearts, the clustering analysis shows that three heart shapes represent about 80% of the morphological variations. The relative proximity and clustering analyses rather consistently identify outliers and average models. For the two cases, identification of outliers and surrogates of average models is rather robust. However, deeper classification of morphological features is subject to caution and can only be performed after cross analysis of at least two kinds of feature vectors. Finally, the Zernike moments contain all the information needed to reconstruct the studied objects and thus appear as a promising tool to derive statistical organ shapes. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
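    Raw geometric moments, the simpler of the two feature families used above, can be computed directly from a voxel model (a minimal sketch; Zernike 3D moments require an orthogonal polynomial basis not shown here):

```python
def geometric_moment(voxels, p, q, r):
    """Raw 3D geometric moment m_pqr = sum over occupied voxels of
    x**p * y**q * z**r. Low-order moments (volume, centroid, second
    moments) form simple shape feature vectors for organ comparison."""
    return sum(x ** p * y ** q * z ** r for (x, y, z) in voxels)

# A 2x2x2 cube of occupied voxels at the origin.
cube = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
m000 = geometric_moment(cube, 0, 0, 0)        # zeroth moment: volume in voxels
cx = geometric_moment(cube, 1, 0, 0) / m000   # first moment / volume: centroid x
```

    Feature vectors built from such moments can then be compared by their distance to the mean vector, the "relative proximity" used in the paper to flag outliers and average models.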

  2. Women's journey to safety - the Transtheoretical model in clinical practice when working with women experiencing Intimate Partner Violence: a scientific review and clinical guidance.

    PubMed

    Reisenhofer, Sonia; Taft, Angela

    2013-12-01

    Review the applicability of the Transtheoretical model and provide updated guidance for clinicians working with women experiencing intimate partner violence. Critical review of related primary research conducted from 1990 to March 2013. Women's experiences of creating change within abusive relationships can be located within a stages of change continuum by identifying dominant behavioral clusters. The processes of change and constructs of decisional-balance and turning-points are evident in women's decision-making when they engage in change. Clinicians can use the stages of change to provide a means of assessing women's movement toward their nominated outcomes, and the processes of change, decisional-balance and turning-points, to enhance understanding of, and promote women's movement across stages in their journey to safety. Clinicians should assess women individually for immediate and ongoing safety and well-being, and identify their overarching stage of change. Clinicians can support women in identifying and implementing their personal objectives to enhance self-efficacy and create positive change movement across stages. The three primary objectives identified for clinician support are: 1. Minimizing harm and promoting well-being within an abusive relationship, 2. Achieving safety and well-being within the relationship; halting the abuse, or 3. Achieving safety by ending/leaving intimate relationships. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  3. Linking Science and Management in an Interactive Geospatial, Multi-Criterion, Structured Decision Support Framework: Use Case Studies of the "Future Forests Geo-visualization and Decision Support Tool"

    NASA Astrophysics Data System (ADS)

    Pontius, J.; Duncan, J.

    2017-12-01

    Land managers are often faced with balancing management activities to accomplish a diversity of management objectives, in systems faced with many stress agents. Advances in ecosystem modeling provide a rich source of information to inform management. Coupled with advances in decision support techniques and computing capabilities, interactive tools are now accessible for a broad audience of stakeholders. Here we present one such tool designed to capture information on how climate change may impact forested ecosystems, and how that impact varies spatially across the landscape. This tool integrates empirical models of current and future forest structure and function in a structured decision framework that allows users to customize weights for multiple management objectives and visualize suitability outcomes across the landscape. Combined with climate projections, the resulting products allow stakeholders to compare the relative success of various management objectives on a pixel by pixel basis and identify locations where management outcomes are most likely to be met. Here we demonstrate this approach with the integration of several of the preliminary models developed to map species distributions, sugar maple health, forest fragmentation risk and hemlock vulnerability to hemlock woolly adelgid under current and future climate scenarios. We compare three use case studies with objective weightings designed to: 1) Identify key parcels for sugarbush conservation and management, 2) Target state lands that may serve as hemlock refugia from hemlock woolly adelgid induced mortality, and 3) Examine how climate change may alter the success of managing for both sugarbush and hemlock across privately owned lands. 
This tool highlights the value of flexible models that can be easily run with customized weightings in a dynamic, integrated assessment that allows users to home in on their potentially complex management objectives, and to visualize and prioritize locations across the landscape. It also demonstrates the importance of including climate considerations for long-term management. This merging of scientific knowledge with the diversity of stakeholder needs is an important step towards using science to inform management and policy decisions.

  4. A formulation of multidimensional growth models for the assessment and forecast of technology attributes

    NASA Astrophysics Data System (ADS)

    Danner, Travis W.

    Developing technology systems requires all manner of investment---engineering talent, prototypes, test facilities, and more. Even for simple design problems the investment can be substantial; for complex technology systems, the development costs can be staggering. The profitability of a corporation in a technology-driven industry is crucially dependent on maximizing the effectiveness of research and development investment. Decision-makers charged with allocation of this investment are forced to choose between the further evolution of existing technologies and the pursuit of revolutionary technologies. At risk on the one hand is excessive investment in an evolutionary technology which has only limited availability for further improvement. On the other hand, the pursuit of a revolutionary technology may mean abandoning momentum and the potential for substantial evolutionary improvement resulting from the years of accumulated knowledge. The informed answer to this question, evolutionary or revolutionary, requires knowledge of the expected rate of improvement and the potential a technology offers for further improvement. This research is dedicated to formulating the assessment and forecasting tools necessary to acquire this knowledge. The same physical laws and principles that enable the development and improvement of specific technologies also limit the ultimate capability of those technologies. Researchers have long used this concept as the foundation for modeling technological advancement through extrapolation by analogy to biological growth models. These models are employed to depict technology development as it asymptotically approaches limits established by the fundamental principles on which the technological approach is based. This has proven an effective and accurate approach to modeling and forecasting simple single-attribute technologies. 
With increased system complexity and the introduction of multiple system objectives, however, the usefulness of this modeling technique begins to diminish. With the introduction of multiple objectives, researchers often abandon technology growth models for scoring models and technology frontiers. While both approaches possess advantages over current growth models for the assessment of multi-objective technologies, each lacks a necessary dimension for comprehensive technology assessment. By collapsing multiple system metrics into a single, non-intuitive technology measure, scoring models provide a succinct framework for multi-objective technology assessment and forecasting. Yet, with no consideration of physical limits, scoring models provide no insight as to the feasibility of a particular combination of system capabilities. They only indicate that a given combination of system capabilities yields a particular score. Conversely, technology frontiers are constructed with the distinct objective of providing insight into the feasibility of system capability combinations. Yet again, upper limits to overall system performance are ignored. Furthermore, the data required to forecast subsequent technology frontiers is often inhibitive. In an attempt to reincorporate the fundamental nature of technology advancement as bound by physical principles, researchers have sought to normalize multi-objective systems whereby the variability of a single system objective is eliminated as a result of changes in the remaining objectives. This drastically limits the applicability of the resulting technology model because it is only applicable for a single setting of all other system attributes. Attempts to maintain the interaction between the growth curves of each technical objective of a complex system have thus far been limited to qualitative and subjective consideration. 
This research proposes the formulation of multidimensional growth models as an approach to simulating the advancement of multi-objective technologies towards their upper limits. Multidimensional growth models were formulated by noticing and exploiting the correlation between technology growth models and technology frontiers. Both are frontiers in actuality. The technology growth curve is a frontier between capability levels of a single attribute and time, while a technology frontier is a frontier between the capability levels of two or more attributes. Multidimensional growth models are formulated by exploiting the mathematical significance of this correlation. The result is a model that can capture both the interaction between multiple system attributes and their expected rates of improvement over time. The fundamental nature of technology development is maintained, and interdependent growth curves are generated for each system metric with minimal data requirements. Being founded on the basic nature of technology advancement, relative to physical limits, the availability for further improvement can be determined for a single metric relative to other system measures of merit. A by-product of this modeling approach is a single n-dimensional technology frontier linking all n system attributes with time. This provides an environment capable of forecasting future system capability in the form of advancing technology frontiers. The ability of a multidimensional growth model to capture the expected improvement of a specific technological approach is dependent on accurately identifying the physical limitations to each pertinent attribute. This research investigates two potential approaches to identifying those physical limits, a physics-based approach and a regression-based approach. The regression-based approach has found limited acceptance among forecasters, although it does show potential for estimating upper limits with a specified degree of uncertainty. 
Forecasters have long favored physics-based approaches for establishing the upper limit of unidimensional growth models, but the task of accurately identifying upper limits becomes increasingly difficult as growth models are extended into multiple dimensions. A lone researcher may be able to identify the physical limitation of a single attribute of a simple system; however, as system complexity and the number of attributes increase, the attention of researchers from multiple fields of study is required. Thus, limit identification is itself an area of research and development requiring some level of investment. Whether estimated by physics-based or regression-based approaches, predicted limits will always carry some degree of uncertainty. This research takes the approach of quantifying the impact of that uncertainty on model forecasts rather than endorsing a single technique for limit identification. In addition to formulating the multidimensional growth model, this research provides a systematic procedure for applying that model to specific technology architectures. Researchers and decision-makers can investigate the potential for additional improvement within a technology architecture and estimate the expected cost of each incremental improvement relative to the cost of past improvements. In this manner, multidimensional growth models provide the information needed to set reasonable program goals for the further evolution of a particular technological approach, or to establish the need for revolutionary approaches in light of the constraining limits of conventional ones.
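The bounded growth described above is conventionally modeled with an S-shaped curve that saturates at a physical limit. The sketch below is illustrative only (the data, grids, and assumed limit are invented, not taken from the research): it fits a Pearl/logistic curve under an assumed upper limit and shows that a wrongly assumed limit degrades the fit, which is the sensitivity this research proposes to quantify.

```python
import math

def pearl_curve(t, L, k, t0):
    """Pearl (logistic) growth curve: capability approaches upper limit L."""
    return L / (1.0 + math.exp(-k * (t - t0)))

# Synthetic capability history approaching a physical limit of 100 units.
observations = [(t, pearl_curve(t, 100.0, 0.5, 10.0)) for t in range(0, 21)]

def fit_k_t0(obs, L, ks, t0s):
    """Grid-search least-squares fit of k and t0 for a *given* limit L."""
    best = None
    for k in ks:
        for t0 in t0s:
            sse = sum((y - pearl_curve(t, L, k, t0)) ** 2 for t, y in obs)
            if best is None or sse < best[0]:
                best = (sse, k, t0)
    return best

grid_k = [i / 10 for i in range(1, 11)]
grid_t0 = [float(t) for t in range(5, 16)]
sse_true, k_true, _ = fit_k_t0(observations, 100.0, grid_k, grid_t0)
# If the assumed limit is wrong (say 120), the best achievable fit is worse:
sse_bad, _, _ = fit_k_t0(observations, 120.0, grid_k, grid_t0)
print(sse_true < sse_bad, round(k_true, 1))  # → True 0.5
```

In the multidimensional formulation, each attribute would carry its own limit and the growth curves would be coupled through a shared frontier; this sketch shows only the unidimensional building block.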

  5. A public health decision support system model using reasoning methods.

    PubMed

    Mera, Maritza; González, Carolina; Blobel, Bernd

    2015-01-01

    Public health programs must be based on the real health needs of the population. However, the design of efficient and effective public health programs depends on the availability of information that allows users to identify, at the right time, the health issues that require special attention. The objective of this paper is to propose a case-based reasoning model to support decision-making in public health. The model integrates a decision-making process with case-based reasoning, reusing past experiences to promptly identify new population health priorities. A prototype of the model was implemented using the case-based reasoning framework jColibri. The proposed model helps solve problems found today in the design of public health programs in Colombia, where current programs are developed in uncertain environments because the underlying analyses are carried out on outdated and unreliable data.
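    Case-based reasoning of the kind the abstract describes retrieves the most similar past case and reuses its solution. A minimal 1-nearest-neighbor sketch, with hypothetical indicators, cases, and weights (none drawn from the paper or from jColibri's actual API):

```python
def retrieve(case_base, query, weights):
    """Retrieve the most similar past case (1-NN over weighted indicators)."""
    def distance(case):
        return sum(w * abs(case["features"][k] - query[k])
                   for k, w in weights.items())
    return min(case_base, key=distance)

# Hypothetical past cases: population health indicators -> identified priority.
case_base = [
    {"features": {"dengue_rate": 0.8, "malnutrition": 0.2},
     "priority": "vector control"},
    {"features": {"dengue_rate": 0.1, "malnutrition": 0.9},
     "priority": "nutrition program"},
]
new_region = {"dengue_rate": 0.7, "malnutrition": 0.3}
best = retrieve(case_base, new_region,
                {"dengue_rate": 1.0, "malnutrition": 1.0})
print(best["priority"])  # reuses the closest past experience
```

    A full CBR cycle would also revise the reused solution and retain the new case; only the retrieve/reuse steps are sketched here.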

  6. Improvement of the R-SWAT-FME framework to support multiple variables and multi-objective functions

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shu-Guang

    2014-01-01

    Application of numerical models is common practice in the environmental field for investigating and predicting natural and anthropogenic processes. However, process knowledge, parameter identifiability, and sensitivity and uncertainty analyses remain a challenge for large and complex mathematical models such as the hydrological/water quality model Soil and Water Assessment Tool (SWAT). In this study, the previously developed R-based SWAT-Flexible Modeling Environment (R-SWAT-FME) was improved to support multiple model variables and objectives at multiple time steps (i.e., daily, monthly, and annually). This expansion is significant because there is usually more than one variable of interest (e.g., water, nutrients, and pesticides) for environmental models like SWAT. To further facilitate ease of use, we also simplified its application requirements without compromising its merits, such as the user-friendly interface. To evaluate the performance of the improved framework, we used a case study focusing on both streamflow and nitrate nitrogen in the Upper Iowa River Basin (above Marengo) in the United States. Results indicated that R-SWAT-FME performs well and is comparable to SWAT's built-in auto-calibration tool in multi-objective model calibration. Overall, the enhanced R-SWAT-FME can be useful for the SWAT community, and the methods we used can also be valuable for wrapping potential R packages with other environmental models.
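    A multi-variable calibration objective of the kind described above can be illustrated by combining per-variable goodness-of-fit scores into one number, here a weighted Nash-Sutcliffe efficiency. The variables, weights, and data below are hypothetical, and the framework's actual objective functions may differ:

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, <0 is worse than the mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def multi_objective(series, weights):
    """Combine per-variable NSE values into one calibration objective.
    `series` maps a variable name to (observed, simulated) lists."""
    return sum(weights[v] * nse(*series[v]) for v in series)

# Hypothetical daily records for two variables of interest.
series = {
    "streamflow": ([3.0, 4.0, 6.0, 5.0], [3.2, 3.9, 5.8, 5.1]),
    "nitrate":    ([1.0, 1.5, 2.0, 1.8], [1.1, 1.4, 2.2, 1.7]),
}
score = multi_objective(series, {"streamflow": 0.5, "nitrate": 0.5})
print(round(score, 3))  # → 0.928
```

    An optimizer would then adjust SWAT parameters to maximize this combined score across all variables and time steps simultaneously.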

  7. Infrared identification of internal overheating components inside an electric control cabinet by inverse heat transfer problem

    NASA Astrophysics Data System (ADS)

    Yang, Li; Wang, Ye; Liu, Huikai; Yan, Guanghui; Kou, Wei

    2014-11-01

    Overheating components inside an object, such as an electric control cabinet, a moving object, or a running machine, can easily lead to equipment failure or fire. In recent years, infrared remote sensing has been used to inspect the surface temperature of an object in order to identify overheating components inside it, and using infrared thermal imaging of surface temperature to identify internal overheating elements in an electric control cabinet has important practical applications. In this paper, a test bench of an electric control cabinet was established, and an experimental study was conducted on inverse identification of internal overheating components using infrared thermal imaging. A heat transfer model of the electric control cabinet was built, and the temperature distribution of the cabinet with an internal overheating element was simulated using the finite volume method (FVM). The outer surface temperature of the cabinet was measured with an infrared thermal imager. Combining computer image processing with infrared temperature measurement, the surface temperature distribution of the cabinet was extracted, and the position and temperature of the internal overheating element were identified using an inverse heat transfer problem (IHTP) identification algorithm. The results show that for a single overheating element inside the cabinet, the identification errors of temperature and position were 2.11% and 5.32%; for multiple overheating elements, the errors were 3.28% and 15.63%. The feasibility and effectiveness of the IHTP method and the correctness of the FVM-based identification algorithm were validated.

  8. Predictors of Adolescent Breakfast Consumption: Longitudinal Findings from Project EAT

    ERIC Educational Resources Information Center

    Bruening, Meg; Larson, Nicole; Story, Mary; Neumark-Sztainer, Dianne; Hannan, Peter

    2011-01-01

    Objective: To identify predictors of breakfast consumption among adolescents. Methods: Five-year longitudinal study Project EAT (Eating Among Teens). Baseline surveys were completed in Minneapolis-St. Paul schools and by mail at follow-up by youth (n = 800) transitioning from middle to high school. Linear regression models examined associations…

  9. Protective Factors Based Model for Screening for Posttraumatic Distress in Adolescents

    ERIC Educational Resources Information Center

    Pat-Horenczyk, Ruth; Kenan, Avraham Max; Achituv, Michal; Bachar, Eytan

    2014-01-01

    Background: There is growing application of school-based screening to identify post-traumatic distress in students following exposure to trauma. The consensus method is based on self-report questionnaires that assess posttraumatic symptoms, functional impairment, depression or anxiety. Objective: The current research explored the possibility of…

  10. Differential Effects of Treatments for Chronic Depression: A Latent Growth Model Reanalysis

    ERIC Educational Resources Information Center

    Stulz, Niklaus; Thase, Michael E.; Klein, Daniel N.; Manber, Rachel; Crits-Christoph, Paul

    2010-01-01

    Objective: Psychotherapy-pharmacotherapy combinations are frequently recommended for the treatment of chronic depressive disorders. Our aim in this novel reanalysis of archival data was to identify patient subgroups on the basis of symptom trajectories and examine the clinical significance of the resultant classification on basis of differential…

  11. Professionalism Deficits among Medical Students: Models of Identification and Intervention

    ERIC Educational Resources Information Center

    Bennett, Aurora J.; Roman, Brenda; Arnold, Lesley M.; Kay, Jerald; Goldenhar, Linda M.

    2005-01-01

    Objective: This study compares the instruments and interventions utilized to identify and remediate unprofessional behaviors in medical students across U.S. psychiatry clerkships. Methods: A 20-item questionnaire was distributed to 120 psychiatry clerkship directors and directors of medical student education, in the U.S., inquiring into the…

  12. Development of Leaf Spectral Models for Evaluating Large Numbers of Sugarcane Genotypes

    USDA-ARS?s Scientific Manuscript database

    Leaf reflectance has been used to estimate crop leaf chemical and physiological characters. Sugarcane (Saccharum spp.) leaf N, C, and chlorophyll levels are important traits for high yields and perhaps useful for genotype evaluation. The objectives of this study were to identify sugarcane genotypic ...

  13. MoveU? Assessing a Social Marketing Campaign to Promote Physical Activity

    ERIC Educational Resources Information Center

    Scarapicchia, Tanya M. F.; Sabiston, Catherine M. F.; Brownrigg, Michelle; Blackburn-Evans, Althea; Cressy, Jill; Robb, Janine; Faulkner, Guy E. J.

    2015-01-01

    Objective: MoveU is a social marketing initiative aimed at increasing moderate-to-vigorous physical activity (MVPA) among undergraduate students. Using the Hierarchy of Effects model (HOEM), this study identified awareness of MoveU and examined associations between awareness, outcome expectations, self-efficacy, intentions, and MVPA. Participants:…

  14. Need to Address Evidence-Based Practice in Educational Administration

    ERIC Educational Resources Information Center

    Kowalski, Theodore

    2009-01-01

    Purpose: This article presents a case for addressing evidence-based practice (EBP) in educational administration. Content is arranged around four objectives: (a) summarizing the status of educational administration as a profession, (b) defining evidence and the model, (c) explaining EBP's social and professional merit, and (d) identifying barriers…

  15. A Cluster Analytic Study of Osteoprotective Behavior in Undergraduates

    ERIC Educational Resources Information Center

    Sharp, Katherine; Thombs, Dennis L.

    2003-01-01

    Objective: To derive an empirical taxonomy of osteoprotective stages using the Precaution Adoption Process Model (PAPM) and to identify the predisposing factors associated with each stage. Methods: An anonymous survey was completed by 504 undergraduates at a Midwestern public university. Results: Cluster analytic findings indicate that only 2…

  16. Association of Perceived Interest Major Fit and Objective Interest Major Fit with Academic Achievement

    ERIC Educational Resources Information Center

    Vahidi, Naghmeh; Roslan, Samsilah; Abdullah, Maria Chong; Omar, Zoharah

    2016-01-01

    Recently, despite the high budget allocated for education in Malaysia, educational performance among students is low (Blueprint, 2013). Pascarella and Terenzini (2005; 1991) have identified four theories and models that affect students' learning: (a) psychosocial, (b) cognitive-structural, (c) typological, and (d) person-environment…

  17. Range 7 Scanner Integration with PaR Robot Scanning System

    NASA Technical Reports Server (NTRS)

    Schuler, Jason; Burns, Bradley; Carlson, Jeffrey; Minich, Mark

    2011-01-01

    An interface bracket and coordinate transformation matrices were designed to allow the Range 7 scanner to be mounted on the PaR Robot detector arm for scanning the heat shield or another object placed in the test cell. A process was designed for using Rapid Form XOR to stitch data from multiple scans together into an accurate 3D model of the scanned object. An accurate model was required for the design and verification of an existing heat shield. The large physical size and complex shape of the heat shield do not allow for direct measurement of certain features in relation to other features, and any imaging device capable of capturing the heat shield in its entirety suffers reduced resolution and cannot image sections that are blocked from view. Prior methods involved tools such as commercial measurement arms, or taking images with cameras and then performing manual measurements. These methods were tedious, could not provide a 3D model of the object being scanned, and were typically limited to a few tens of measurement points at prominent locations. Integration of the scanner with the robot allows large, complex objects to be scanned at high resolution, and 3D Computer Aided Design (CAD) models to be generated for verifying items against the original design and for documenting previously undocumented items. The main components are the mounting bracket attaching the scanner to the robot and the coordinate transformation matrices used to stitch the scanner data into a 3D model. The steps involve mounting the interface bracket to the robot's detector arm, mounting the scanner to the bracket, and then scanning sections of the object while recording the location of the tool tip (in this case, the center of the scanner's focal point). A novel feature is the ability to stitch images together by coordinates instead of requiring each scan data set to have overlapping identifiable features. 
This setup allows models of complex objects to be developed even if the object is large and featureless, or has sections that lack visibility to other parts of the object for use as a reference. In addition, millions of points can be used to create an accurate model [i.e., within 0.03 in. (≈0.8 mm) over a span of 250 in. (≈6,350 mm)].
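    The stitching-by-coordinates idea can be sketched as mapping each scan's points into a common frame using the pose recorded by the robot at scan time. For clarity this illustration reduces the full transformation matrix to a yaw rotation plus translation, and all poses and points are hypothetical:

```python
import math

def transform(points, yaw, translation):
    """Map scanner-frame points into the robot's common frame using the
    recorded tool-tip pose (simplified here to yaw rotation + translation)."""
    c, s = math.cos(yaw), math.sin(yaw)
    tx, ty, tz = translation
    return [(c * x - s * y + tx, s * x + c * y + ty, z + tz)
            for x, y, z in points]

# Two scans of a featureless object, each expressed in its own scanner frame;
# no overlapping features are needed because the poses are known.
scan_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
scan_b = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]

merged = transform(scan_a, 0.0, (0.0, 0.0, 0.0)) + \
         transform(scan_b, math.pi / 2, (2.0, 0.0, 0.0))
print(merged[-1])  # scan_b's point (1,0,0) lands near (2,1,0) in the common frame
```

    The real system uses full 4x4 homogeneous transformation matrices, but the principle is the same: known poses replace feature matching.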

  18. Evaluating child welfare policies with decision-analytic simulation models.

    PubMed

    Goldhaber-Fiebert, Jeremy D; Bailey, Stephanie L; Hurlburt, Michael S; Zhang, Jinjin; Snowden, Lonnie R; Wulczyn, Fred; Landsverk, John; Horwitz, Sarah M

    2012-11-01

    The objective was to demonstrate decision-analytic modeling in support of Child Welfare policymakers considering implementing evidence-based interventions. Outcomes included permanency (e.g., adoptions) and stability (e.g., foster placement changes). Analyses of a randomized trial of KEEP (a foster parenting intervention) and NSCAW-1 estimated placement change rates and KEEP's effects. A microsimulation model generalized these findings to other Child Welfare systems. The model projected that KEEP could increase permanency and stability, identifying strategies targeting higher-risk children and geographical regions that achieve benefits efficiently. Decision-analytic models enable planners to gauge the value of potential implementations.

  19. Anomaly clustering in hyperspectral images

    NASA Astrophysics Data System (ADS)

    Doster, Timothy J.; Ross, David S.; Messinger, David W.; Basener, William F.

    2009-05-01

    The topological anomaly detection algorithm (TAD) differs from other anomaly detection algorithms in that it uses a topological/graph-theoretic model for the image background instead of modeling the image with a Gaussian normal distribution. In constructing the model, TAD produces a hard threshold separating anomalous pixels from background in the image. We build on this feature of TAD by extending the algorithm so that it gives a measure of the number of anomalous objects, rather than the number of anomalous pixels, in a hyperspectral image. This is done by identifying and integrating clusters of anomalous pixels via a graph-theoretic method combining spatial and spectral information. The method is applied to a cluttered HyMap image and combines small groups of pixels containing like materials, such as those corresponding to rooftops and cars, into individual clusters. This improves visualization and interpretation of objects.
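    Reduced to its spatial part (the actual method also combines spectral information), the clustering step amounts to finding connected components among the anomalous pixels. A minimal sketch with an invented pixel set:

```python
from collections import deque

def cluster_anomalies(pixels):
    """Group spatially adjacent anomalous pixels into connected components,
    so the output counts anomalous *objects* rather than pixels."""
    remaining = set(pixels)
    clusters = []
    while remaining:
        seed = remaining.pop()
        component, queue = {seed}, deque([seed])
        while queue:
            r, c = queue.popleft()
            # 4-neighborhood adjacency on the pixel grid.
            for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if nb in remaining:
                    remaining.discard(nb)
                    component.add(nb)
                    queue.append(nb)
        clusters.append(component)
    return clusters

# Seven anomalous pixels forming two separate objects (e.g. a rooftop and a car).
anomalous = [(0, 0), (0, 1), (1, 1), (5, 5), (5, 6), (6, 5), (6, 6)]
clusters = cluster_anomalies(anomalous)
print(len(clusters))  # → 2
```

    Adding a spectral-similarity test to the adjacency check would merge only neighboring pixels of like material, as the abstract describes.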

  20. Operational Tree Species Mapping in a Diverse Tropical Forest with Airborne Imaging Spectroscopy.

    PubMed

    Baldeck, Claire A; Asner, Gregory P; Martin, Robin E; Anderson, Christopher B; Knapp, David E; Kellner, James R; Wright, S Joseph

    2015-01-01

    Remote identification and mapping of canopy tree species can contribute valuable information towards our understanding of ecosystem biodiversity and function over large spatial scales. However, the extreme challenges posed by highly diverse, closed-canopy tropical forests have prevented automated remote species mapping of non-flowering tree crowns in these ecosystems. We set out to identify individuals of three focal canopy tree species amongst a diverse background of tree and liana species on Barro Colorado Island, Panama, using airborne imaging spectroscopy data. First, we compared two leading single-class classification methods--binary support vector machine (SVM) and biased SVM--for their performance in identifying pixels of a single focal species. From this comparison we determined that biased SVM was more precise and created a multi-species classification model by combining the three biased SVM models. This model was applied to the imagery to identify pixels belonging to the three focal species and the prediction results were then processed to create a map of focal species crown objects. Crown-level cross-validation of the training data indicated that the multi-species classification model had pixel-level producer's accuracies of 94-97% for the three focal species, and field validation of the predicted crown objects indicated that these had user's accuracies of 94-100%. Our results demonstrate the ability of high spatial and spectral resolution remote sensing to accurately detect non-flowering crowns of focal species within a diverse tropical forest. We attribute the success of our model to recent classification and mapping techniques adapted to species detection in diverse closed-canopy forests, which can pave the way for remote species mapping in a wider variety of ecosystems.

  2. Exploring silver as a contrast agent for contrast-enhanced dual-energy X-ray breast imaging

    PubMed Central

    Tsourkas, A; Maidment, A D A

    2014-01-01

    Objective: Through prior monoenergetic modelling, we have identified silver as a potential alternative to iodine in dual-energy (DE) X-ray breast imaging. The purpose of this study was to compare the performance of silver and iodine contrast agents in a commercially available DE imaging system through a quantitative analysis of signal difference-to-noise ratio (SDNR). Methods: A polyenergetic simulation algorithm was developed to model the signal intensity and noise. The model identified the influence of various technique parameters on SDNR. The model was also used to identify the optimal imaging techniques for silver and iodine, so that the two contrast materials could be objectively compared. Results: The major influences on the SDNR were the low-energy dose fraction and breast thickness. An increase in the value of either of these parameters resulted in a decrease in SDNR. The SDNR for silver was on average 43% higher than that for iodine when imaged at their respective optimal conditions, and 40% higher when both were imaged at the optimal conditions for iodine. Conclusion: A silver contrast agent should provide benefit over iodine, even when translated to the clinic without modification of imaging system or protocol. If the system were slightly modified to reflect the lower k-edge of silver, the difference in SDNR between the two materials would be increased. Advances in knowledge: These data are the first to demonstrate the suitability of silver as a contrast material in a clinical contrast-enhanced DE image acquisition system. PMID:24998157

  3. Transcriptional Reversion of Cardiac Myocyte Fate During Mammalian Cardiac Regeneration

    PubMed Central

    O’Meara, Caitlin C.; Wamstad, Joseph A.; Gladstone, Rachel; Fomovsky, Gregory M.; Butty, Vincent L.; Shrikumar, Avanti; Gannon, Joseph; Boyer, Laurie A.; Lee, Richard T.

    2014-01-01

    Rationale Neonatal mice have the capacity to regenerate their hearts in response to injury, but this potential is lost after the first week of life. The transcriptional changes that underpin mammalian cardiac regeneration have not been fully characterized at the molecular level. Objective The objectives of our study were to determine if myocytes revert the transcriptional phenotype to a less differentiated state during regeneration and to systematically interrogate the transcriptional data to identify and validate potential regulators of this process. Methods and Results We derived a core transcriptional signature of injury-induced cardiac myocyte regeneration in mouse by comparing global transcriptional programs in a dynamic model of in vitro and in vivo cardiac myocyte differentiation, in vitro cardiac myocyte explant model, as well as a neonatal heart resection model. The regenerating mouse heart revealed a transcriptional reversion of cardiac myocyte differentiation processes including reactivation of latent developmental programs similar to those observed during de-stabilization of a mature cardiac myocyte phenotype in the explant model. We identified potential upstream regulators of the core network, including interleukin 13 (IL13), which induced cardiac myocyte cell cycle entry and STAT6/STAT3 signaling in vitro. We demonstrate that STAT3/periostin and STAT6 signaling are critical mediators of IL13 signaling in cardiac myocytes. These downstream signaling molecules are also modulated in the regenerating mouse heart. Conclusions Our work reveals new insights into the transcriptional regulation of mammalian cardiac regeneration and provides the founding circuitry for identifying potential regulators for stimulating heart regeneration. PMID:25477501

  4. Searching for Unresolved Binary Brown Dwarfs

    NASA Astrophysics Data System (ADS)

    Albretsen, Jacob; Stephens, Denise

    2007-10-01

    There are currently L and T brown dwarfs (BDs) with classification errors of +/- 1 to 2 spectral types. Metallicity and gravitational differences account for some of these discrepancies, and recent studies have shown that unresolved binary BDs may offer some explanation as well. However, limitations in technology and resources often make it difficult to clearly resolve an object that may be binary in nature. Stephens and Noll (2006) identified statistically strong binary source candidates from Hubble Space Telescope (HST) images of Trans-Neptunian Objects (TNOs) that were apparently unresolved, using model point-spread functions for single and binary sources. The HST archive contains numerous observations of BDs using the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) that have never been rigorously analyzed for binary properties. Using the methods developed by Stephens and Noll (2006), BD observations from the HST data archive are being analyzed for possible unresolved binaries. Preliminary results will be presented. This technique will identify potential candidates for future observations to determine orbital information.

  5. In Search of Black Swans: Identifying Students at Risk of Failing Licensing Examinations.

    PubMed

    Barber, Cassandra; Hammond, Robert; Gula, Lorne; Tithecott, Gary; Chahine, Saad

    2018-03-01

    To determine which admissions variables and curricular outcomes are predictive of being at risk of failing the Medical Council of Canada Qualifying Examination Part 1 (MCCQE1), how quickly student risk of failure can be predicted, and to what extent predictive modeling is possible and accurate in estimating future student risk. Data from five graduating cohorts (2011-2015) at the Schulich School of Medicine & Dentistry, Western University, were collected and analyzed using hierarchical generalized linear models (HGLMs). Area under the receiver operating characteristic curve (AUC) was used to evaluate the accuracy of the predictive models and to determine whether they could be used to predict future risk, using the 2016 graduating cohort. Four predictive models were developed to predict student risk of failure at admissions, year 1, year 2, and pre-MCCQE1. The HGLM analyses identified gender, MCAT verbal reasoning score, two preclerkship course mean grades, and the year 4 summative objective structured clinical examination score as significant predictors of student risk. The predictive accuracy of the models varied: the pre-MCCQE1 model was the most accurate at predicting a student's risk of failing (AUC 0.66-0.93), while the admissions model was not predictive (AUC 0.25-0.47). Key variables predictive of students at risk were identified. The models developed suggest that, while it is not possible to identify student risk at admission, students can begin to be identified and monitored within the first year. Using such models, programs may be able to identify and monitor at-risk students quantitatively and develop tailored intervention strategies.
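    The AUC used to evaluate these models has a simple rank interpretation: the probability that a randomly chosen at-risk student receives a higher predicted risk than a randomly chosen not-at-risk student. A sketch with invented risk scores (not the study's data):

```python
def auc(scores_pos, scores_neg):
    """Rank-based AUC: fraction of (positive, negative) pairs where the
    positive case is scored higher, counting ties as half."""
    wins = ties = 0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# Hypothetical predicted failure probabilities from a pre-MCCQE1-style model.
at_risk = [0.9, 0.8, 0.7, 0.35]
not_at_risk = [0.6, 0.4, 0.3, 0.2, 0.1]
print(auc(at_risk, not_at_risk))  # → 0.9
```

    An AUC of 0.5 corresponds to chance ranking, which is why the admissions model's 0.25-0.47 range indicates no useful predictive signal.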

  6. Time-resolved infrared spectrophotometric observations of high area to mass ratio (HAMR) objects in GEO

    NASA Astrophysics Data System (ADS)

    Skinner, Mark A.; Russell, Ray W.; Rudy, Richard J.; Gutierrez, David J.; Kim, Daryl L.; Crawford, Kirk; Gregory, Steve; Kelecy, Tom

    2011-12-01

    Optical surveys have identified a class of high area-to-mass ratio (HAMR) objects in the vicinity of the Geostationary Earth Orbit (GEO) ring [1]. The exact origin and nature of these objects are not well known, although their proximity to the GEO ring poses a hazard to active GEO satellites. Due to their high area-to-mass ratios, solar radiation pressure perturbs their orbits in ways that make it difficult to predict their orbital trajectories over periods of time exceeding a week. To better understand these objects and their origins, observations that allow us to derive physical characteristics are required in order to improve the non-conservative force modeling for orbit determination and prediction. Information on their temperatures, areas, emissivities, and albedos may be obtained from thermal infrared, mid-wave infrared (MWIR), and visible measurements. Spectral features may help to identify the composition of the material, and thus possible origins for these objects. We have collected observational data on various HAMR objects from the AMOS observatory 3.6 m AEOS telescope. The thermal-IR spectra of these objects acquired by the Broadband Array Spectrograph System (BASS) span wavelengths 3-13 μm and constitute a unique data set, providing a means of measuring object fluxes as a function of time. These, in turn, allow temperatures and emissivity-area products to be calculated. In some instances we have also collected simultaneous filtered visible photometric data on the observed objects. The multi-wavelength observations provide possible clues as to the nature of the observed objects. We describe briefly the nature and status of the instrumental programs used to acquire the data, our data of record, our data analysis techniques, and our current results, as well as future plans.

  7. Coordinating the Provision of Health Services in Humanitarian Crises: a Systematic Review of Suggested Models.

    PubMed

    Lotfi, Tamara; Bou-Karroum, Lama; Darzi, Andrea; Hajjar, Rayan; El Rahyel, Ahmed; El Eid, Jamale; Itani, Mira; Brax, Hneine; Akik, Chaza; Osman, Mona; Hassan, Ghayda; El-Jardali, Fadi; Akl, Elie

    2016-08-03

    Our objective was to identify published models of coordination between entities funding or delivering health services in humanitarian crises, whether the coordination took place during or after the crises. We included reports describing models of coordination in sufficient detail to allow reproducibility. We also included reports describing implementation of identified models, as case studies. We searched Medline, PubMed, EMBASE, Cochrane Central Register of Controlled Trials, CINAHL, PsycINFO, and the WHO Global Health Library. We also searched websites of relevant organizations. We followed standard systematic review methodology. Our search captured 14,309 citations. The screening process identified 34 eligible papers describing five models of coordination for delivering health services: the "Cluster Approach" (with 16 case studies), the 4Ws "Who is Where, When, doing What" mapping tool (with four case studies), the "Sphere Project" (with two case studies), the "5x5" model (with one case study), and the "model of information coordination" (with one case study). The 4Ws and 5x5 models focus on coordination of services for mental health; the remaining models do not focus on a specific health topic. The Cluster Approach appears to be the most widely used, and one case study was a mixed implementation of the Cluster Approach and the Sphere model. We identified no model of coordination for the funding of health services. This systematic review identified five proposed coordination models that have been implemented by entities funding or delivering health services in humanitarian crises. There is a need to compare the effects of these different models on outcomes such as availability of and access to health services.

  8. Using an interdisciplinary approach to identify factors that affect clinicians' compliance with evidence-based guidelines.

    PubMed

    Gurses, Ayse P; Marsteller, Jill A; Ozok, A Ant; Xiao, Yan; Owens, Sharon; Pronovost, Peter J

    2010-08-01

Our objective was to identify factors that affect clinicians' compliance with evidence-based guidelines using an interdisciplinary approach and to develop a conceptual framework that can provide a comprehensive and practical guide for designing effective interventions. A literature review and a brainstorming session with 11 researchers from a variety of scientific disciplines were used to identify theoretical and conceptual models describing clinicians' guideline compliance. MEDLINE, EMBASE, CINAHL, and the bibliographies of the papers identified were used as data sources for identifying the relevant theoretical and conceptual models. Thirteen different models that originated from various disciplines, including medicine, rural sociology, psychology, human factors and systems engineering, organizational management, marketing, and health education, were identified. Four main categories of factors that affect compliance emerged from our analysis: clinician characteristics, guideline characteristics, system characteristics, and implementation characteristics. Based on these findings, we developed an interdisciplinary conceptual framework that specifies the expected interrelationships among these four categories of factors and their impact on clinicians' compliance. An interdisciplinary approach is needed to improve clinicians' compliance with evidence-based guidelines. The conceptual framework from this research can provide a comprehensive and systematic guide to identify barriers to guideline compliance and design effective interventions to improve patient safety.

  9. Cheminformatics-aided pharmacovigilance: application to Stevens-Johnson Syndrome

    PubMed Central

    Low, Yen S; Caster, Ola; Bergvall, Tomas; Fourches, Denis; Zang, Xiaoling; Norén, G Niklas; Rusyn, Ivan; Edwards, Ralph

    2016-01-01

Objective: Quantitative Structure-Activity Relationship (QSAR) models can predict adverse drug reactions (ADRs), and thus provide early warnings of potential hazards. Timely identification of potential safety concerns could protect patients and aid early diagnosis of ADRs among the exposed. Our objective was to determine whether global spontaneous reporting patterns might allow chemical substructures associated with Stevens-Johnson Syndrome (SJS) to be identified and utilized for ADR prediction by QSAR models. Materials and Methods: Using a reference set of 364 drugs having positive or negative reporting correlations with SJS in the VigiBase global repository of individual case safety reports (Uppsala Monitoring Center, Uppsala, Sweden), chemical descriptors were computed from drug molecular structures. Random Forest and Support Vector Machines methods were used to develop QSAR models, which were validated by external 5-fold cross validation. Models were employed for virtual screening of DrugBank to predict SJS actives and inactives, which were corroborated using knowledge bases like VigiBase, ChemoText, and MicroMedex (Truven Health Analytics Inc, Ann Arbor, Michigan). Results: We developed QSAR models that could accurately predict whether drugs were associated with SJS (area under the curve of 75%–81%). Our 10 most active and inactive predictions were substantiated by SJS reports (or lack thereof) in the literature. Discussion: Interpretation of QSAR models in terms of significant chemical descriptors suggested novel SJS structural alerts. Conclusions: We have demonstrated that QSAR models can accurately identify SJS active and inactive drugs. Requiring chemical structures only, QSAR models provide effective computational means to flag potentially harmful drugs for subsequent targeted surveillance and pharmacoepidemiologic investigations. PMID:26499102
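    The workflow described above (binary labels from reporting correlations, chemical descriptors as features, a Random Forest validated by external 5-fold cross validation) can be sketched roughly as follows. The descriptor matrix and labels here are synthetic stand-ins, not VigiBase data, and scikit-learn is assumed as the modeling toolkit.

    ```python
    # Hedged sketch of a QSAR classification workflow with external 5-fold CV.
    # 364 "drugs" x 50 chemical descriptors, all randomly generated.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import StratifiedKFold, cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(364, 50))                       # synthetic descriptors
    y = (X[:, 0] + 0.5 * X[:, 1]
         + rng.normal(scale=0.5, size=364)) > 0          # toy SJS labels

    model = RandomForestClassifier(n_estimators=200, random_state=0)
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
    print(f"mean AUC over 5 folds: {auc.mean():.2f}")
    ```

    In the actual study the fitted model would then be applied to unseen structures (the DrugBank screen) rather than evaluated only in cross validation.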

  10. Early identification of patients requiring massive transfusion, embolization, or hemostatic surgery for traumatic hemorrhage: a systematic review protocol.

    PubMed

    Tran, Alexandre; Matar, Maher; Steyerberg, Ewout W; Lampron, Jacinthe; Taljaard, Monica; Vaillancourt, Christian

    2017-04-13

Hemorrhage is a major cause of early mortality following a traumatic injury. The progression and consequences of significant blood loss occur quickly, as death from hemorrhagic shock or exsanguination often occurs within the first few hours. The mainstay of treatment therefore involves early identification of patients at risk for hemorrhagic shock in order to provide blood products and, if necessary, control of the bleeding source. The intended scope of this review is to identify and assess combinations of predictors informing therapeutic decision-making for clinicians during the initial trauma assessment. The primary objective of this systematic review is to identify and critically assess, for the purpose of external validation or updating in other study populations, any existing multivariable models predicting significant traumatic hemorrhage requiring intervention, defined as a composite outcome comprising massive transfusion, surgery for hemostasis, or angiography with embolization. If no suitable existing multivariable models are identified, the secondary objective is to identify candidate predictors to inform the development of a new prediction rule. We will search the EMBASE and MEDLINE databases for all randomized controlled trials and prospective and retrospective cohort studies developing or validating predictors of intervention for traumatic hemorrhage in adult patients 16 years of age or older. Eligible predictors must be available to the clinician during the first hour of trauma resuscitation and may be clinical, lab-based, or imaging-based. Outcomes of interest include the need for surgical intervention, angiographic embolization, or massive transfusion within the first 24 h. Data extraction will be performed independently by two reviewers. Items for extraction will be based on the CHARMS checklist. We will evaluate any existing models for relevance, quality, and the potential for external validation and updating in other populations.
Relevance will be described in terms of appropriateness of outcomes and predictors. Quality criteria will include variable selection strategies, adequacy of sample size, handling of missing data, validation techniques, and measures of model performance. This systematic review will describe the availability of multivariable prediction models and summarize evidence regarding predictors that can be used to identify the need for intervention in patients with traumatic hemorrhage. PROSPERO CRD42017054589.

  11. Monitoring asthma control in children with allergies by soft computing of lung function and exhaled nitric oxide.

    PubMed

    Pifferi, Massimo; Bush, Andrew; Pioggia, Giovanni; Di Cicco, Maria; Chinellato, Iolanda; Bodini, Alessandro; Macchia, Pierantonio; Boner, Attilio L

    2011-02-01

Asthma control is emphasized by new guidelines but remains poor in many children. Evaluation of control relies on subjective patient recall and may be overestimated by health-care professionals. This study assessed the value of spirometry and fractional exhaled nitric oxide (FeNO) measurements, used alone or in combination, in models developed by a machine learning approach for the objective classification of asthma control according to Global Initiative for Asthma guidelines, and tested the model in a second group of children with asthma. Fifty-three children with persistent atopic asthma underwent two to six evaluations of asthma control, including spirometry and FeNO. Soft computing evaluation was performed by means of artificial neural networks and principal component analysis. The model was then tested in a cross-sectional study in an additional 77 children with allergic asthma. The machine learning method was not able to distinguish different levels of control using either spirometry or FeNO values alone. However, their use in combination, modeled by soft computing, was able to discriminate levels of asthma control. In particular, the model recognized all children with uncontrolled asthma and correctly identified 99.0% of children with totally controlled asthma. In the cross-sectional study, the model correctly identified all of the uncontrolled children and 79.6% of the controlled children. Soft computing analysis of spirometry and FeNO allows objective categorization of asthma control status.
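    A minimal sketch of the "soft computing" combination described above, assuming scikit-learn as a stand-in for the study's toolchain: standardized spirometry and FeNO features reduced by principal component analysis and fed to a small neural network. Feature names, the toy "uncontrolled" rule, and all numbers are illustrative placeholders, not the study's clinical data.

    ```python
    # Hypothetical PCA + neural-network pipeline over lung function + FeNO.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    # columns: FEV1 %predicted, FEV1/FVC, FEF25-75 %predicted, FeNO (ppb)
    X = rng.normal(loc=[95, 0.85, 80, 20], scale=[10, 0.05, 15, 10],
                   size=(130, 4))
    y = (X[:, 3] > 25) & (X[:, 0] < 95)      # toy "uncontrolled" label

    clf = make_pipeline(StandardScaler(),
                        PCA(n_components=3),
                        MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                      random_state=1))
    clf.fit(X, y)
    print("training accuracy:", round(clf.score(X, y), 2))
    ```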

  12. Locating an Imaging Radar in Canada for Identifying Spaceborne Objects

    DTIC Science & Technology

    1992-12-01

    of residents. Daskin (11:48) extended that model to account for the chance that when a demand arrives at the system it will not be covered since all...Journal of Operational Research, 50: 280-297 (February 1991). 11. Daskin , Mark S. " A Maximum Expected Covering Location Model: Formulation...continue with this thesis-, and Dr. William Wiesel for his instruction and help in developing a satellite coordinate frame and understanding the mechanics

  13. Automation life-cycle cost model

    NASA Technical Reports Server (NTRS)

    Gathmann, Thomas P.; Reeves, Arlinda J.; Cline, Rick; Henrion, Max; Ruokangas, Corinne

    1992-01-01

The problem domain being addressed by this contractual effort can be summarized by the following list: Automation and Robotics (A&R) technologies appear to be viable alternatives to current, manual operations; life-cycle cost models are typically judged with suspicion due to implicit assumptions and little associated documentation; and uncertainty is a reality for increasingly complex problems, yet few models explicitly account for its effect on the solution space. The objectives for this effort range from the near-term (1-2 years) to far-term (3-5 years). In the near-term, the envisioned capabilities of the modeling tool are annotated. In addition, a framework is defined and developed in the Decision Modelling System (DEMOS) environment. Our approach is summarized as follows: assess desirable capabilities (structured into near- and far-term); identify useful existing models/data; identify parameters for utility analysis; define the tool framework; encode a scenario thread for model validation; and provide a transition path for tool development. This report contains all relevant, technical progress made on this contractual effort.

  14. Target Recognition Using Neural Networks for Model Deformation Measurements

    NASA Technical Reports Server (NTRS)

    Ross, Richard W.; Hibler, David L.

    1999-01-01

    Optical measurements provide a non-invasive method for measuring deformation of wind tunnel models. Model deformation systems use targets mounted or painted on the surface of the model to identify known positions, and photogrammetric methods are used to calculate 3-D positions of the targets on the model from digital 2-D images. Under ideal conditions, the reflective targets are placed against a dark background and provide high-contrast images, aiding in target recognition. However, glints of light reflecting from the model surface, or reduced contrast caused by light source or model smoothness constraints, can compromise accurate target determination using current algorithmic methods. This paper describes a technique using a neural network and image processing technologies which increases the reliability of target recognition systems. Unlike algorithmic methods, the neural network can be trained to identify the characteristic patterns that distinguish targets from other objects of similar size and appearance and can adapt to changes in lighting and environmental conditions.

  15. Blue Marble Matches: Using Earth for Planetary Comparisons

    NASA Technical Reports Server (NTRS)

    Graff, Paige Valderrama

    2009-01-01

Goal: This activity is designed to introduce students to geologic processes on Earth and model how scientists use Earth to gain a better understanding of other planetary bodies in the solar system. Objectives: Students will: 1. Identify common descriptor characteristics used by scientists to describe geologic features in images. 2. Identify geologic features and how they form on Earth. 3. Create a list of defining/distinguishing characteristics of geologic features. 4. Identify geologic features in images of other planetary bodies. 5. List observations and interpretations about planetary body comparisons. 6. Create summary statements about planetary body comparisons.

  16. Analysis of sensitivity of simulated recharge to selected parameters for seven watersheds modeled using the precipitation-runoff modeling system

    USGS Publications Warehouse

    Ely, D. Matthew

    2006-01-01

Recharge is a vital component of the ground-water budget, and methods for estimating it range from extremely complex to relatively simple. The most commonly used techniques, however, are limited by the scale of application. One method that can be used to estimate ground-water recharge involves process-based models that compute distributed water budgets on a watershed scale. These models should be evaluated to determine which model parameters are the dominant controls in determining ground-water recharge. Seven existing watershed models from different humid regions of the United States were chosen to analyze the sensitivity of simulated recharge to model parameters. Parameter sensitivities were determined using a nonlinear regression computer program to generate a suite of diagnostic statistics. The statistics identify model parameters that have the greatest effect on simulated ground-water recharge and that compare and contrast the hydrologic system responses to those parameters. Simulated recharge in the Lost River and Big Creek watersheds in Washington State was sensitive to small changes in air temperature. The Hamden watershed model in west-central Minnesota was developed to investigate the relations that wetlands and other landscape features have with runoff processes. Excess soil moisture in the Hamden watershed simulation was preferentially routed to wetlands, instead of to the ground-water system, so that simulated recharge showed little sensitivity to any parameter. Simulated recharge in the North Fork Pheasant Branch watershed, Wisconsin, demonstrated the greatest sensitivity to parameters related to evapotranspiration. Three watersheds were simulated as part of the Model Parameter Estimation Experiment (MOPEX).
Parameter sensitivities for the MOPEX watersheds, the Amite River, Louisiana and Mississippi, the English River, Iowa, and the South Branch Potomac River, West Virginia, were similar and most sensitive to small changes in air temperature and a user-defined flow routing parameter. Although the primary objective of this study was to identify, by geographic region, the importance of the parameter value to the simulation of ground-water recharge, the secondary objectives proved valuable for future modeling efforts. A rigorous sensitivity analysis can (1) make the calibration process more efficient, (2) guide additional data collection, (3) identify model limitations, and (4) explain simulated results.
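    The parameter screening idea above can be illustrated with a one-at-a-time perturbation on a toy surrogate. The function below is a hypothetical stand-in for a watershed model; real studies wrap PRMS runs and compute regression-based diagnostic statistics, not this simple difference quotient.

    ```python
    # Sketch of one-at-a-time scaled sensitivity of simulated recharge to
    # model parameters. The "model" is an invented surrogate in which recharge
    # responds strongly to air temperature and weakly to a routing coefficient.
    def simulated_recharge(params):
        temp, routing, soil = params["temp"], params["routing"], params["soil"]
        return 100.0 - 4.0 * temp + 0.5 * routing + 0.1 * soil

    def scaled_sensitivity(params, name, rel_step=0.01):
        """Dimensionless sensitivity: (dR / R) / (dp / p)."""
        base = simulated_recharge(params)
        bumped = dict(params)
        bumped[name] = params[name] * (1 + rel_step)
        return ((simulated_recharge(bumped) - base) / base) / rel_step

    params = {"temp": 10.0, "routing": 2.0, "soil": 30.0}
    ranked = sorted(params, key=lambda n: abs(scaled_sensitivity(params, n)),
                    reverse=True)
    print("parameters by influence on recharge:", ranked)
    ```

    With these made-up coefficients, air temperature dominates the ranking, mirroring the finding for the Washington State and MOPEX watersheds.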

  17. A Model for Sustainable Building Energy Efficiency Retrofit (BEER) Using Energy Performance Contracting (EPC) Mechanism for Hotel Buildings in China

    NASA Astrophysics Data System (ADS)

    Xu, Pengpeng

Hotel building is one of the high-energy-consuming building types, and retrofitting hotel buildings is an untapped solution to help cut carbon emissions, contributing towards sustainable development. Energy Performance Contracting (EPC) has been promulgated as a market mechanism for the delivery of energy efficiency projects. The EPC mechanism has been introduced into China relatively recently, and it has not been implemented successfully in building energy efficiency retrofit projects. The aim of this research is to develop a model for achieving the sustainability of Building Energy Efficiency Retrofit (BEER) in hotel buildings under the Energy Performance Contracting (EPC) mechanism. The objectives include: • To identify a set of Key Performance Indicators (KPIs) for measuring the sustainability of BEER in hotel buildings; • To identify Critical Success Factors (CSFs) under the EPC mechanism that have a strong correlation with sustainable BEER projects; • To develop a model explaining the relationships between the CSFs and the sustainability performance of BEER in hotel buildings. Literature reviews revealed the essence of sustainable BEER and EPC, which helped to develop a conceptual framework for analyzing sustainable BEER under the EPC mechanism in hotel buildings. 11 potential KPIs for sustainable BEER and 28 success factors of EPC were selected based on the developed framework. A questionnaire survey was conducted to ascertain the importance of the selected performance indicators and success factors. Fuzzy set theory was adopted in identifying the KPIs. Six KPIs were identified from the 11 selected performance indicators. From the questionnaire survey, 21 of the 28 success factors were identified as Critical Success Factors (CSFs). Using the factor analysis technique, the 21 identified CSFs were grouped into six clusters to help explain project success of sustainable BEER.
Finally, AHP/ANP approach was used in this research to develop a model to examine the interrelationships among the identified CSFs, KPIs, and sustainable dimensions of BEER. The findings indicate that the success of sustainable BEER in hotel buildings under the EPC mechanism is mainly decided by project objectives control mechanism, available technology, organizing capacity of team leader, trust among partners, accurate M&V, and team workers' technical skills.

  18. Identification of mutated driver pathways in cancer using a multi-objective optimization model.

    PubMed

    Zheng, Chun-Hou; Yang, Wu; Chong, Yan-Wen; Xia, Jun-Feng

    2016-05-01

New-generation high-throughput technologies, including next-generation sequencing technology, have been extensively applied to solve biological problems. As a result, large cancer genomics projects such as The Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium are producing large amounts of rich and diverse data in multiple cancer types. The identification of mutated driver genes and driver pathways from these data is a significant challenge. Genome aberrations in cancer cells can be divided into two types: random 'passenger mutations' and functional 'driver mutations'. In this paper, we introduced a Multi-objective Optimization model based on a Genetic Algorithm (MOGA) to solve the maximum weight submatrix problem, which can be employed to identify driver genes and driver pathways promoting cancer proliferation. The maximum weight submatrix problem, defined to find mutated driver pathways, is based on two specific properties, i.e., high coverage and high exclusivity. The multi-objective optimization model can adjust the trade-off between high coverage and high exclusivity. We proposed an integrative model by combining gene expression data and mutation data to improve the performance of the MOGA algorithm in a biological context.
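    The two competing objectives named above, high coverage and high exclusivity, can be made concrete for a candidate gene set. This is a hedged sketch of the scoring only, not the genetic-algorithm search itself; the mutation matrix and gene names are invented for illustration.

    ```python
    # Coverage counts patients with at least one mutation in the gene set;
    # "excess" counts extra hits beyond one per patient (lower = more exclusive).
    # rows (list entries) = patients, one binary vector per gene.
    mutations = {
        "TP53": [1, 1, 0, 0, 1],
        "KRAS": [0, 0, 1, 1, 0],
        "EGFR": [0, 1, 0, 0, 0],
    }

    def coverage_and_excess(gene_set):
        n_patients = len(next(iter(mutations.values())))
        hits = [sum(mutations[g][i] for g in gene_set)
                for i in range(n_patients)]
        coverage = sum(1 for h in hits if h >= 1)   # patients covered
        excess = sum(h - 1 for h in hits if h > 1)  # overlap penalty
        return coverage, excess

    cov, exc = coverage_and_excess(["TP53", "KRAS"])
    print(cov, exc)  # TP53+KRAS cover all 5 patients with no overlap
    ```

    A multi-objective optimizer such as MOGA would search over gene sets to maximize coverage while minimizing excess, exposing the trade-off as a Pareto front rather than a single weighted score.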

  19. Modeling a terminology-based electronic nursing record system: an object-oriented approach.

    PubMed

    Park, Hyeoun-Ae; Cho, InSook; Byeun, NamSoo

    2007-10-01

The aim of this study was to present our perspectives on healthcare information analysis at a conceptual level and the lessons learned from our experience with the development of a terminology-based enterprise electronic nursing record system, which was one of the components of an EMR system at a tertiary teaching hospital in Korea, using an object-oriented system analysis and design concept. To ensure a systematic approach and effective collaboration, the department of nursing constituted a system modeling team comprising a project manager, systems analysts, user representatives, an object-oriented methodology expert, and healthcare informaticists (including the authors). The Rational Unified Process (RUP) and the Unified Modeling Language were used as the development process and modeling notation, respectively. From the scenario and RUP approach, user requirements were formulated into use case sets, and the sequence of activities in the scenario was depicted in an activity diagram. The structure of the system was presented in a class diagram. This approach allowed us to identify clearly the structural and behavioral states and important factors of a terminology-based ENR system (e.g., business concerns and system design concerns) according to the viewpoints of both domain and technical experts.

  20. Do we have an internal model of the outside world?

    PubMed Central

    Land, Michael F.

    2014-01-01

    Our phenomenal world remains stationary in spite of movements of the eyes, head and body. In addition, we can point or turn to objects in the surroundings whether or not they are in the field of view. In this review, I argue that these two features of experience and behaviour are related. The ability to interact with objects we cannot see implies an internal memory model of the surroundings, available to the motor system. And, because we maintain this ability when we move around, the model must be updated, so that the locations of object memories change continuously to provide accurate directional information. The model thus contains an internal representation of both the surroundings and the motions of the head and body: in other words, a stable representation of space. Recent functional MRI studies have provided strong evidence that this egocentric representation has a location in the precuneus, on the medial surface of the superior parietal cortex. This is a region previously identified with ‘self-centred mental imagery’, so it seems likely that the stable egocentric representation, required by the motor system, is also the source of our conscious percept of a stable world. PMID:24395972

  1. Object-graphs for context-aware visual category discovery.

    PubMed

    Lee, Yong Jae; Grauman, Kristen

    2012-02-01

    How can knowing about some categories help us to discover new ones in unlabeled images? Unsupervised visual category discovery is useful to mine for recurring objects without human supervision, but existing methods assume no prior information and thus tend to perform poorly for cluttered scenes with multiple objects. We propose to leverage knowledge about previously learned categories to enable more accurate discovery, and address challenges in estimating their familiarity in unsegmented, unlabeled images. We introduce two variants of a novel object-graph descriptor to encode the 2D and 3D spatial layout of object-level co-occurrence patterns relative to an unfamiliar region and show that by using them to model the interaction between an image’s known and unknown objects, we can better detect new visual categories. Rather than mine for all categories from scratch, our method identifies new objects while drawing on useful cues from familiar ones. We evaluate our approach on several benchmark data sets and demonstrate clear improvements in discovery over conventional purely appearance-based baselines.

  2. Properties of resonant trans-Neptunian objects based on Herschel Space Observatory data

    NASA Astrophysics Data System (ADS)

    Farkas Anikó, Takácsné; Kiss, Csaba; Mueller, Thomas G.; Mommert, Michael; Vilenius, Esa

    2016-10-01

The goal of our work is to characterise the physical properties of resonant, detached and scattered disk objects in the trans-Neptunian region, observed in the framework of the "TNOs are Cool!" Herschel Open Time Key Program. Based on thermal emission measurements with the Herschel/PACS and Spitzer/MIPS instruments we were able to determine size, albedo, and surface thermal properties for 23 objects using radiometric modelling techniques. This is the first analysis in which the physical properties of objects in the outer resonances are determined for a larger sample. In addition to the results for individual objects, we have compared these characteristics with the bulk properties of other populations of the trans-Neptunian region. The newly analysed objects show, for example, a large variety of beaming factors, indicating diverse surfaces, and in general they follow the albedo-colour clustering identified earlier for Kuiper belt objects and Centaurs, further strengthening the evidence for a compositional discontinuity in the young solar system.

  3. Myoelectric hand prosthesis force control through servo motor current feedback.

    PubMed

    Sono, Tálita Saemi Payossim; Menegaldo, Luciano Luporini

    2009-10-01

This paper presents the prehension force closed-loop control design of a mechanical finger commanded by electromyographic signals (EMG) from a patient's arm. The control scheme was implemented and tested in a mechanical finger prototype with three degrees of freedom and one actuator, driven by the arm-muscle EMG of normal volunteers. Real-time indirect estimation of prehension force was assessed by measuring the DC servo motor actuator current. A model of the plant comprising finger, motor, and grasped object was proposed. Model parameters were identified experimentally and a classical feedback phase-lead compensator was designed. The controlled mechanical finger was able to provide more accurate prehension force modulation of a compliant object when compared to open-loop control.
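    A hypothetical sketch of the two ideas in the abstract: prehension force estimated indirectly from motor current via an assumed torque constant, and a discrete phase-lead compensator obtained by the Tustin (bilinear) transform. All gains, time constants, and the torque constant are placeholders, not the paper's identified parameters.

    ```python
    KT = 0.05     # N*m/A, assumed motor torque constant
    LEVER = 0.02  # m, assumed finger lever arm

    def force_from_current(current_amps):
        """Indirect force estimate: motor torque divided by lever arm."""
        return KT * current_amps / LEVER

    class PhaseLead:
        """Discrete phase-lead compensator from C(s) = K(s+a)/(s+b), a < b,
        discretized by the Tustin transform with sample time dt."""
        def __init__(self, K=2.0, a=5.0, b=50.0, dt=0.001):
            w = 2.0 / dt  # bilinear transform frequency scale
            self.b0 = K * (w + a) / (w + b)
            self.b1 = K * (a - w) / (w + b)
            self.a1 = (b - w) / (w + b)
            self.prev_e = 0.0
            self.prev_u = 0.0

        def step(self, error):
            # difference equation: u[n] = b0*e[n] + b1*e[n-1] - a1*u[n-1]
            u = self.b0 * error + self.b1 * self.prev_e - self.a1 * self.prev_u
            self.prev_e, self.prev_u = error, u
            return u

    comp = PhaseLead()
    print("estimated force at 0.4 A:", force_from_current(0.4), "N")
    print("compensator output for unit error:", comp.step(1.0))
    ```

    The lead term boosts the compensator's initial response well above its DC gain (K*a/b here), which is what adds phase margin around the crossover frequency in a loop like this.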

  4. Three-Dimensional Reconstruction of Coronary Arteries and Its Application in Localization of Coronary Artery Segments Corresponding to Myocardial Segments Identified by Transthoracic Echocardiography

    PubMed Central

    Zhong, Chunyan; Guo, Yanli; Huang, Haiyun; Tan, Liwen; Wu, Yi; Wang, Wenting

    2013-01-01

    Objectives. To establish 3D models of coronary arteries (CA) and study their application in localization of CA segments identified by Transthoracic Echocardiography (TTE). Methods. Sectional images of the heart collected from the first CVH dataset and contrast CT data were used to establish 3D models of the CA. Virtual dissection was performed on the 3D models to simulate the conventional sections of TTE. Then, we used 2D ultrasound, speckle tracking imaging (STI), and 2D ultrasound plus 3D CA models to diagnose 170 patients and compare the results to coronary angiography (CAG). Results. 3D models of CA distinctly displayed both 3D structure and 2D sections of CA. This simulated TTE imaging in any plane and showed the CA segments that corresponded to 17 myocardial segments identified by TTE. The localization accuracy showed a significant difference between 2D ultrasound and 2D ultrasound plus 3D CA model in the severe stenosis group (P < 0.05) and in the mild-to-moderate stenosis group (P < 0.05). Conclusions. These innovative modeling techniques help clinicians identify the CA segments that correspond to myocardial segments typically shown in TTE sectional images, thereby increasing the accuracy of the TTE-based diagnosis of CHD. PMID:24348745

  5. The probability of object-scene co-occurrence influences object identification processes.

    PubMed

    Sauvé, Geneviève; Harmand, Mariane; Vanni, Léa; Brodeur, Mathieu B

    2017-07-01

Contextual information allows the human brain to make predictions about the identity of objects that might be seen, and irregularities between an object and its background slow down perception and identification processes. Bar and colleagues modeled the mechanisms underlying this beneficial effect, suggesting that the brain stores information about the statistical regularities of object and scene co-occurrence. Their model suggests that these recurring regularities could be conceptualized along a continuum in which the probability of seeing an object within a given scene can be high (probable condition), moderate (improbable condition) or null (impossible condition). In the present experiment, we propose to disentangle the electrophysiological correlates of these context effects by directly comparing object-scene pairs found along this continuum. We recorded the event-related potentials of 30 healthy participants (18-34 years old) and analyzed their brain activity in three time windows associated with context effects. We observed anterior negativities between 250 and 500 ms after object onset for the improbable and impossible conditions (improbable more negative than impossible) compared to the probable condition, as well as a parieto-occipital positivity (improbable more positive than impossible). The brain may use different processing pathways to identify objects depending on whether the probability of co-occurrence with the scene is moderate (relying more on top-down effects) or null (relying more on bottom-up influences). The posterior positivity could index error monitoring aimed at ensuring that no false information is integrated into mental representations of the world.

  6. Basic research on design analysis methods for rotorcraft vibrations

    NASA Technical Reports Server (NTRS)

    Hanagud, S.

    1991-01-01

    The objective of the present work was to develop a method for identifying physically plausible finite element system models of airframe structures from test data. The assumed models were based on linear elastic behavior with general (nonproportional) damping. Physical plausibility of the identified system matrices was insured by restricting the identification process to designated physical parameters only and not simply to the elements of the system matrices themselves. For example, in a large finite element model the identified parameters might be restricted to the moduli for each of the different materials used in the structure. In the case of damping, a restricted set of damping values might be assigned to finite elements based on the material type and on the fabrication processes used. In this case, different damping values might be associated with riveted, bolted and bonded elements. The method itself is developed first, and several approaches are outlined for computing the identified parameter values. The method is applied first to a simple structure for which the 'measured' response is actually synthesized from an assumed model. Both stiffness and damping parameter values are accurately identified. The true test, however, is the application to a full-scale airframe structure. In this case, a NASTRAN model and actual measured modal parameters formed the basis for the identification of a restricted set of physically plausible stiffness and damping parameters.

  7. Frontal–Occipital Connectivity During Visual Search

    PubMed Central

    Pantazatos, Spiro P.; Yanagihara, Ted K.; Zhang, Xian; Meitzler, Thomas

    2012-01-01

    Abstract Although expectation- and attention-related interactions between ventral and medial prefrontal cortex and stimulus category-selective visual regions have been identified during visual detection and discrimination, it is not known if similar neural mechanisms apply to other tasks such as visual search. The current work tested the hypothesis that high-level frontal regions, previously implicated in expectation and visual imagery of object categories, interact with visual regions associated with object recognition during visual search. Using functional magnetic resonance imaging, subjects searched for a specific object that varied in size and location within a complex natural scene. A model-free, spatial-independent component analysis isolated multiple task-related components, one of which included visual cortex, as well as a cluster within ventromedial prefrontal cortex (vmPFC), consistent with the engagement of both top-down and bottom-up processes. Analyses of psychophysiological interactions showed increased functional connectivity between vmPFC and object-sensitive lateral occipital cortex (LOC), and results from dynamic causal modeling and Bayesian Model Selection suggested bidirectional connections between vmPFC and LOC that were positively modulated by the task. Using image-guided diffusion-tensor imaging, functionally seeded, probabilistic white-matter tracts between vmPFC and LOC, which presumably underlie this effective interconnectivity, were also observed. These connectivity findings extend previous models of visual search processes to include specific frontal–occipital neuronal interactions during a natural and complex search task. PMID:22708993

  8. Object-based modeling, identification, and labeling of medical images for content-based retrieval by querying on intervals of attribute values

    NASA Astrophysics Data System (ADS)

    Thies, Christian; Ostwald, Tamara; Fischer, Benedikt; Lehmann, Thomas M.

    2005-04-01

    The classification and measuring of objects in medical images is important in radiological diagnostics and education, especially when using large databases as knowledge resources, for instance a picture archiving and communication system (PACS). The main challenge is the modeling of medical knowledge and the diagnostic context to label the sought objects. This task is referred to as closing the semantic gap between low-level pixel information and high-level application knowledge. This work describes an approach which allows labeling of a priori unknown objects in an intuitive way. Our approach consists of four main components. First, an image is completely decomposed into all visually relevant partitions on different scales. This provides a hierarchically organized set of regions. Afterwards, for each of the obtained regions a set of descriptive features is computed. In this data structure, objects are represented by regions with characteristic attributes. The actual object identification is the formulation of a query. It consists of attributes on which intervals are defined, describing those regions that correspond to the sought objects. Since the objects are a priori unknown, they are described by a medical expert by means of an intuitive graphical user interface (GUI). This GUI is the fourth component. It enables complex object definitions by browsing the data structure and examining the attributes to formulate the query. The query is executed and, if the sought objects have not been identified, its parameterization is refined. By using this heuristic approach, object models for hand radiographs have been developed to extract bones from a single hand in different anatomical contexts. This demonstrates the applicability of the labeling concept. By using a rule for metacarpal bones on a series of 105 images, this type of bone could be retrieved with a precision of 0.53% and a recall of 0.6%.
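The query step described above, selecting regions whose attributes fall inside expert-defined intervals, can be sketched generically. The attribute names and values below are hypothetical placeholders, not the paper's actual feature set.

```python
def query_regions(regions, intervals):
    """Return regions (dicts of attribute values) whose every queried
    attribute lies within its [lo, hi] interval. Regions missing a
    queried attribute are excluded (NaN compares false)."""
    hits = []
    for region in regions:
        if all(lo <= region.get(attr, float("nan")) <= hi
               for attr, (lo, hi) in intervals.items()):
            hits.append(region)
    return hits
```

Refining the query then amounts to tightening or loosening the intervals until the returned regions match the sought anatomical objects.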

  9. A Model for Data Citation in Astronomical Research Using Digital Object Identifiers (DOIs)

    NASA Astrophysics Data System (ADS)

    Novacescu, Jenny; Peek, Joshua E. G.; Weissman, Sarah; Fleming, Scott W.; Levay, Karen; Fraser, Elizabeth

    2018-05-01

    Standardizing and incentivizing the use of digital object identifiers (DOIs) to aggregate and identify both data analyzed and data generated by a research project will advance the field of astronomy to match best practices in other research fields like geoscience and medicine. An increase in the use of DOIs will prepare the discipline for changing expectations among funding agencies and publishers, who increasingly expect accurate and thorough data citation to accompany scientific outputs. The use of DOIs ensures a robust, sustainable, and interoperable approach to data citation in which due credit is given to the researchers and institutions who produce and maintain the primary data. We describe in this work the advantages of DOIs for data citation and best practices for integrating a DOI service in an astronomical archive. We report on a pilot project carried out in collaboration with AAS journals. During the course of the 1.5-year long pilot, over 75% of submitting authors opted to use the integrated DOI service to clearly identify data analyzed during their research project when prompted at the time of paper submission.

  10. Sensitivity Analysis of Genetic Algorithm Parameters for Optimal Groundwater Monitoring Network Design

    NASA Astrophysics Data System (ADS)

    Abdeh-Kolahchi, A.; Satish, M.; Datta, B.

    2004-05-01

    A state-of-the-art groundwater monitoring network design method is introduced. The method combines groundwater flow and transport results with genetic algorithm (GA) optimization to identify optimal monitoring well locations. Optimization theory uses different techniques to find a set of parameter values that minimize or maximize objective functions. The suggested optimal groundwater monitoring network design is based on the objective of maximizing the probability of tracking a transient contamination plume by determining sequential monitoring locations. The MODFLOW and MT3DMS models, included as separate modules within the Groundwater Modeling System (GMS), are used to develop three-dimensional groundwater flow and contaminant transport simulations. The groundwater flow and contamination simulation results are introduced as input to the optimization model, which uses a genetic algorithm (GA) to identify the optimal monitoring network design from several candidate monitoring locations. The monitoring network design model uses a GA with binary variables representing potential monitoring locations. As the number of decision variables and constraints increases, the non-linearity of the objective function also increases, which makes it difficult to obtain optimal solutions. The genetic algorithm is an evolutionary global optimization technique capable of finding the optimal solution for many complex problems. In this study, the ability of the GA approach to find the global optimal solution to a groundwater monitoring network design problem involving 18.4 × 10¹⁸ feasible solutions will be discussed. However, to ensure the efficiency of the solution process and the global optimality of the solution obtained using the GA, appropriate GA parameter values must be specified. The sensitivity of genetic algorithm parameters such as the random number seed, crossover probability, mutation probability, and elitism is discussed for the solution of the monitoring network design problem.
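A binary-chromosome GA with the parameters named above (crossover probability, mutation probability, elitism, seeded random numbers) can be sketched generically. The selection scheme, operator rates, and fitness function below are illustrative placeholders, not the study's MODFLOW/MT3DMS-coupled plume-detection objective.

```python
import random

def run_ga(fitness, n_bits, pop_size=20, generations=40,
           p_cross=0.8, p_mut=0.02, n_elite=2, seed=0):
    """Maximize fitness over binary strings (1 = monitor this location)."""
    rng = random.Random(seed)                               # random number seed
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        nxt = [ind[:] for ind in scored[:n_elite]]          # elitism
        while len(nxt) < pop_size:
            a, b = rng.sample(scored[:pop_size // 2], 2)    # truncation selection
            if rng.random() < p_cross:                      # one-point crossover
                cut = rng.randrange(1, n_bits)
                child = a[:cut] + b[cut:]
            else:
                child = a[:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)
```

Varying `p_cross`, `p_mut`, `n_elite`, and `seed` over repeated runs is exactly the kind of sensitivity analysis the abstract describes.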

  11. Interventions developed with the Intervention Mapping protocol in the field of cancer: A systematic review.

    PubMed

    Lamort-Bouché, Marion; Sarnin, Philippe; Kok, Gerjo; Rouat, Sabrina; Péron, Julien; Letrilliart, Laurent; Fassier, Jean-Baptiste

    2018-04-01

    The Intervention Mapping (IM) protocol provides a structured framework to develop, implement, and evaluate complex interventions. The main objective of this review was to identify and describe the content of the interventions developed in the field of cancer with the IM protocol. Secondary objectives were to assess their fidelity to the IM protocol and to review their theoretical frameworks. Medline, Web of Science, PsycINFO, PASCAL, FRANCIS, and BDSP databases were searched. All titles and abstracts were reviewed. A standardized extraction form was developed. All included studies were reviewed by 2 reviewers blinded to each other. Sixteen studies were identified, and these reported 15 interventions. The objectives were to increase cancer screening participation (n = 7), early consultation (n = 1), and aftercare/quality of life among cancer survivors (n = 7). Six reported a complete participatory planning group, and 7 described a complete logic model of the problem. Ten studies described a complete logic model of change. The main theoretical frameworks used were the theory of planned behaviour (n = 8), the transtheoretical model (n = 6), the health belief model (n = 6), and the social cognitive theory (n = 6). The environment was rarely integrated in the interventions (n = 4). Five interventions were reported as effective. Culturally relevant interventions were developed with the IM protocol that were effective to increase cancer screening and reduce social disparities, particularly when they were developed through a participative approach and integrated the environment. Stakeholders' involvement and the role of the environment were heterogeneously integrated in the interventions. Copyright © 2017 John Wiley & Sons, Ltd.

  12. The determination of operational and support requirements and costs during the conceptual design of space systems

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles

    1991-01-01

    The primary objective is to develop a methodology for predicting operational and support parameters and costs of proposed space systems. The first phase consists of: (1) the identification of data sources; (2) the development of a methodology for determining system reliability and maintainability parameters; (3) the implementation of the methodology through the use of prototypes; and (4) support in the development of an integrated computer model. The phase 1 results are documented and a direction is identified to proceed to accomplish the overall objective.

  13. Univers: The construction of an internet-wide descriptive naming system

    NASA Technical Reports Server (NTRS)

    Bowman, C. Mic

    1990-01-01

    Descriptive naming systems allow clients to identify a set of objects by description. Described here is the construction of a descriptive naming system, called Univers, based on a model in which clients provide both an object description and some meta-information. The meta-information describes beliefs about the query and the naming system. Specifically, it is an ordering on a set of perfect world approximations, and it describes the preferred methods for accommodating imperfect information. The description is then resolved in a way that respects the preferred approximations.

  14. Rotationally-supported disks around Class I sources in Taurus: disk formation constraints

    NASA Astrophysics Data System (ADS)

    Harsono, D.; Jørgensen, J. K.; van Dishoeck, E. F.; Hogerheijde, M. R.; Bruderer, S.; Persson, M. V.; Mottram, J. C.

    2014-02-01

    Context. Disks are observed around pre-main sequence stars, but how and when they form is still heavily debated. While disks around young stellar objects have been identified through thermal dust emission, spatially and spectrally resolved molecular line observations are needed to determine their nature. Only a handful of embedded rotationally supported disks have been identified to date. Aims: We identify and characterize rotationally supported disks near the end of the main accretion phase of low-mass protostars by comparing their gas and dust structures. Methods: Subarcsecond observations of dust and gas toward four Class I low-mass young stellar objects in Taurus are presented at significantly higher sensitivity than previous studies. The ¹³CO and C¹⁸O J = 2-1 transitions at 220 GHz were observed with the Plateau de Bure Interferometer at a spatial resolution of ≤0.8″ (56 AU radius at 140 pc) and analyzed using uv-space position velocity diagrams to determine the nature of their observed velocity gradient. Results: Rotationally supported disks (RSDs) are detected around 3 of the 4 Class I sources studied. The derived masses identify them as Stage I objects; i.e., their stellar mass is higher than their envelope and disk masses. The outer radii of the Keplerian disks toward our sample of Class I sources are ≤100 AU. The lack of on-source C¹⁸O emission for TMR1 puts an upper limit of 50 AU on its size. Flattened structures at radii >100 AU around these sources are dominated by infalling motion (υ ∝ r⁻¹). A large-scale envelope model is required to estimate the basic parameters of the flattened structure from spatially resolved continuum data. Similarities and differences between the gas and dust disks are discussed. Combined with literature data, the sizes of the RSDs around Class I objects are best described by evolutionary models with an initial rotation of Ω = 10⁻¹⁴ Hz and slow sound speeds. Based on the comparison of gas and dust disk masses, little CO is frozen out within 100 AU in these disks. Conclusions: Rotationally supported disks with radii up to 100 AU are present around Class I embedded objects. Larger surveys of both Class 0 and I objects are needed to determine whether most disks form late or early in the embedded phase. Based on observations carried out with the IRAM Plateau de Bure Interferometer. IRAM is supported by INSU/CNRS (France), MPG (Germany) and IGN (Spain). Appendices are available in electronic form at http://www.aanda.org

  15. Logistic regression modeling to assess groundwater vulnerability to contamination in Hawaii, USA.

    PubMed

    Mair, Alan; El-Kadi, Aly I

    2013-10-01

    Capture zone analysis combined with a subjective susceptibility index is currently used in Hawaii to assess vulnerability to contamination of drinking water sources derived from groundwater. In this study, we developed an alternative objective approach that combines well capture zones with multiple-variable logistic regression (LR) modeling and applied it to the highly utilized Pearl Harbor and Honolulu aquifers on the island of Oahu, Hawaii. Input for the LR models utilized explanatory variables based on hydrogeology, land use, and well geometry/location. A suite of 11 target contaminants detected in the region, including elevated nitrate (>1 mg/L), four chlorinated solvents, four agricultural fumigants, and two pesticides, was used to develop the models. We then tested the ability of the new approach to accurately separate groups of wells with low and high vulnerability, and the suitability of nitrate as an indicator of other types of contamination. Our results produced contaminant-specific LR models that accurately identified groups of wells with the lowest/highest reported detections and the lowest/highest nitrate concentrations. Current and former agricultural land uses were identified as significant explanatory variables for eight of the 11 target contaminants, while elevated nitrate was a significant variable for five contaminants. The utility of the combined approach is contingent on the availability of hydrologic and chemical monitoring data for calibrating groundwater and LR models. Application of the approach using a reference site with sufficient data could help identify key variables in areas with similar hydrogeology and land use but limited data. In addition, elevated nitrate may also be a suitable indicator of groundwater contamination in areas with limited data.
The objective LR modeling approach developed in this study is flexible enough to address a wide range of contaminants and represents a suitable addition to the current subjective approach. © 2013 Elsevier B.V. All rights reserved.
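The multiple-variable logistic regression at the heart of this approach can be sketched in a few lines: fit P(detection) = sigmoid(Xw + b) by gradient descent on explanatory variables. This is a generic textbook LR, not the study's calibrated models; the data and variable layout are invented.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Fit P(detection) = sigmoid(X @ w + b) by batch gradient descent."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * (X.T @ (p - y)) / len(y)   # gradient of mean log-loss
        b -= lr * np.mean(p - y)
    return w, b

def predict_proba(X, w, b):
    """Predicted detection probability for rows of explanatory variables."""
    return 1.0 / (1.0 + np.exp(-(np.asarray(X, float) @ w + b)))
```

In the study's setting, the rows of `X` would hold hydrogeology, land-use, and well-geometry variables, and significant coefficients flag the explanatory variables the abstract reports (e.g., agricultural land use).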

  16. Detection of Cases of Noncompliance to Drug Treatment in Patient Forum Posts: Topic Model Approach

    PubMed Central

    Foulquié, Pierre; Texier, Nathalie; Faviez, Carole; Burgun, Anita; Schück, Stéphane

    2018-01-01

    Background Medication nonadherence is a major impediment to the management of many health conditions. A better understanding of the factors underlying noncompliance to treatment may help health professionals to address it. Patients use peer-to-peer virtual communities and social media to share their experiences regarding their treatments and diseases. Using topic models makes it possible to model themes present in a collection of posts, thus to identify cases of noncompliance. Objective The aim of this study was to detect messages describing patients’ noncompliant behaviors associated with a drug of interest. Thus, the objective was the clustering of posts featuring a homogeneous vocabulary related to nonadherent attitudes. Methods We focused on escitalopram and aripiprazole used to treat depression and psychotic conditions, respectively. We implemented a probabilistic topic model to identify the topics that occurred in a corpus of messages mentioning these drugs, posted from 2004 to 2013 on three of the most popular French forums. Data were collected using a Web crawler designed by Kappa Santé as part of the Detec’t project to analyze social media for drug safety. Several topics were related to noncompliance to treatment. Results Starting from a corpus of 3650 posts related to an antidepressant drug (escitalopram) and 2164 posts related to an antipsychotic drug (aripiprazole), the use of latent Dirichlet allocation allowed us to model several themes, including interruptions of treatment and changes in dosage. The topic model approach detected cases of noncompliance behaviors with a recall of 98.5% (272/276) and a precision of 32.6% (272/844). Conclusions Topic models enabled us to explore patients’ discussions on community websites and to identify posts related with noncompliant behaviors. After a manual review of the messages in the noncompliance topics, we found that noncompliance to treatment was present in 6.17% (276/4469) of the posts. PMID:29540337
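The latent Dirichlet allocation step can be illustrated with a toy collapsed Gibbs sampler. This is a generic LDA sketch, not the Detec't pipeline; the corpus, vocabulary indices, and hyperparameters are invented for illustration.

```python
import numpy as np

def lda_gibbs(docs, n_topics, n_vocab, n_iter=200, alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampling for LDA. docs = lists of word indices.
    Returns the topic-word count matrix (normalize rows for distributions)."""
    rng = np.random.default_rng(seed)
    z = [rng.integers(n_topics, size=len(d)) for d in docs]  # topic per token
    ndk = np.zeros((len(docs), n_topics))                    # doc-topic counts
    nkw = np.zeros((n_topics, n_vocab))                      # topic-word counts
    nk = np.zeros(n_topics)
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            ndk[d, z[d][i]] += 1; nkw[z[d][i], w] += 1; nk[z[d][i]] += 1
    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                                  # remove token
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + n_vocab * beta)
                k = rng.choice(n_topics, p=p / p.sum())      # resample topic
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return nkw
```

In the study, posts clustering into topics whose top words concern stopping treatment or changing dosage are the ones flagged for manual review as possible noncompliance.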

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manteufel, R.D.; Ahola, M.P.; Turner, D.R.

    A literature review has been conducted to determine the state of knowledge available in the modeling of coupled thermal (T), hydrologic (H), mechanical (M), and chemical (C) processes relevant to the design and/or performance of the proposed high-level waste (HLW) repository at Yucca Mountain, Nevada. The review focuses on identifying coupling mechanisms between individual processes and assessing their importance (i.e., whether the coupling is important, potentially important, or negligible). The significance of considering THMC-coupled processes lies in whether or not the processes impact the design and/or performance objectives of the repository. A review such as reported here is useful in identifying which coupled effects will be important, hence which coupled effects will need to be investigated by the US Nuclear Regulatory Commission in order to assess the assumptions, data, analyses, and conclusions in the design and performance assessment of a geologic repository. Although this work stems from regulatory interest in the design of the geologic repository, it should be emphasized that the repository design implicitly considers all of the repository performance objectives, including those associated with the time after permanent closure. The scope of this review goes beyond previous assessments in that it attempts (with the current state of knowledge) to determine which couplings are important, and identifies which computer codes are currently available to model coupled processes.

  18. Multiobjective analysis of a public wellfield using artificial neural networks

    USGS Publications Warehouse

    Coppola, E.A.; Szidarovszky, F.; Davis, D.; Spayd, S.; Poulton, M.M.; Roman, E.

    2007-01-01

    As competition for increasingly scarce ground water resources grows, many decision makers may come to rely upon rigorous multiobjective techniques to help identify appropriate and defensible policies, particularly when disparate stakeholder groups are involved. In this study, decision analysis was conducted on a public water supply wellfield to balance water supply needs with well vulnerability to contamination from a nearby ground water contaminant plume. With few alternative water sources, decision makers must balance the conflicting objectives of maximizing water supply volume from noncontaminated wells while minimizing their vulnerability to contamination from the plume. Artificial neural networks (ANNs) were developed with simulation data from a numerical ground water flow model developed for the study area. The ANN-derived state transition equations were embedded into a multiobjective optimization model, from which the Pareto frontier or trade-off curve between water supply and wellfield vulnerability was identified. Relative preference values and power factors were assigned to the three stakeholders, namely the company whose waste contaminated the aquifer, the community supplied by the wells, and the water utility company that owns and operates the wells. A compromise pumping policy that effectively balances the two conflicting objectives in accordance with the preferences of the three stakeholder groups was then identified using various distance-based methods. © 2006 National Ground Water Association.
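Extracting the Pareto frontier from candidate pumping policies evaluated on the two objectives (maximize supply, minimize vulnerability) can be sketched as below. The policy representation and numbers are invented; the study's candidates come from the ANN-embedded optimization model, not a fixed list.

```python
def pareto_frontier(policies):
    """Keep policies not strictly dominated: a policy is dominated if some
    other policy has at least as much supply and at most as much
    vulnerability, and is strictly better in one of the two."""
    front = []
    for p in policies:
        dominated = any(
            q["supply"] >= p["supply"] and q["vuln"] <= p["vuln"]
            and (q["supply"] > p["supply"] or q["vuln"] < p["vuln"])
            for q in policies)
        if not dominated:
            front.append(p)
    return front
```

The distance-based compromise step then picks the frontier point closest (under stakeholder-weighted distance) to the ideal of maximum supply and minimum vulnerability.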

  19. Identification of fuel cycle simulator functionalities for analysis of transition to a new fuel cycle

    DOE PAGES

    Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.; ...

    2016-06-09

    Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.

  20. Developmental Programming: State-of-the-Science and Future Directions

    PubMed Central

    Sutton, Elizabeth F.; Gilmore, L. Anne; Dunger, David B.; Heijmans, Bas T.; Hivert, Marie-France; Ling, Charlotte; Martinez, J. Alfredo; Ozanne, Susan E.; Simmons, Rebecca A.; Szyf, Moshe; Waterland, Robert A.; Redman, Leanne M.; Ravussin, Eric

    2016-01-01

    Objective On December 8–9, 2014, the Pennington Biomedical Research Center convened a scientific symposium to review the state-of-the-science and future directions for the study of developmental programming of obesity and chronic disease. The objectives of the symposium were to discuss: (i) past and current scientific advances in animal models, population-based cohort studies and human clinical trials, (ii) the state-of-the-science of epigenetic-based research, and (iii) considerations for future studies. Results The overarching goal was to provide a comprehensive assessment of the state of the scientific field, to identify research gaps and opportunities for future research in order to identify and understand the mechanisms contributing to the developmental programming of health and disease. Conclusions Identifying the mechanisms which cause or contribute to developmental programming of future generations will be invaluable to the scientific and medical community. The ability to intervene during critical periods of prenatal and early postnatal life to promote lifelong health is the ultimate goal. Considerations for future research including the use of animal models, the study design in human cohorts with considerations about the timing of the intrauterine exposure and the resulting tissue specific epigenetic signature were extensively discussed and are presented in this meeting summary. PMID:27037645

  1. Counter-terrorism threat prediction architecture

    NASA Astrophysics Data System (ADS)

    Lehman, Lynn A.; Krause, Lee S.

    2004-09-01

    This paper will evaluate the feasibility of constructing a system to support intelligence analysts engaged in counter-terrorism. It will discuss the use of emerging techniques to evaluate a large-scale threat data repository (or Infosphere) and to compare analyst-developed models to identify and discover potential threat-related activity, with an uncertainty metric used to evaluate the threat. This system will also employ the use of psychological (or intent) modeling to incorporate combatant (i.e. terrorist) beliefs and intent. The paper will explore the feasibility of constructing a hetero-hierarchical (a hierarchy of more than one kind or type characterized by loose connection/feedback among elements of the hierarchy) agent-based framework or "family of agents" to support "evidence retrieval", defined as combing, or searching, the threat data repository and returning information with an uncertainty metric. The counter-terrorism threat prediction architecture will be guided by a series of models, constructed to represent threat operational objectives, potential targets, or terrorist objectives. The approach would compare model representations against information retrieved by the agent family to isolate or identify patterns that match within reasonable measures of proximity. The central areas of discussion will be the construction of an agent framework to search the available threat-related information repository, evaluation of results against models that will represent the cultural foundations, mindset, sociology and emotional drive of typical threat combatants (i.e. the mind and objectives of a terrorist), and the development of evaluation techniques to compare result sets with the models representing threat behavior and threat targets.
The applicability of concepts surrounding Modeling Field Theory (MFT) will be discussed as the basis of this research into the development of proximity measures between the models and result sets, and to provide feedback in support of model adaptation (learning). The increasingly complex demands facing analysts evaluating activity threatening to the security of the United States make the family-of-agents approach to data collection (fusion) a promising area. This paper will discuss a system to support the collection and evaluation of potential threat activity as well as an approach for presentation of the information.

  2. Novel object exploration in the C58/J mouse model of autistic-like behavior.

    PubMed

    Blick, Mikkal G; Puchalski, Breann H; Bolanos, Veronica J; Wolfe, Kaitlin M; Green, Matthew C; Ryan, Bryce C

    2015-04-01

    Mouse models of autistic-like behaviors are a valuable tool to use when studying the causes, symptoms, and potential treatments for autism. The inbred C58/J strain is a strain of interest for this model and has previously been shown to possess face validity for some of the core traits of autism, including low social behavior and elevated motor stereotypies. Higher-order repetitive behaviors have not been extensively studied in this strain, or in mice in general. In this study, we looked for evidence of higher-order repetitive behaviors in the C58/J strain using a novel object assay. This assay utilized a mouse's natural exploratory behavior among unfamiliar objects to identify potential sequencing patterns in motor activity. The motor stereotypies displayed by the C58/J strain during testing were consistent with past studies. The C58/J strain also displayed a high preference for a single object in the round arena assay, with the females demonstrating elevated sequencing patterns in the round arena. Although the C58/J strain did not show pervasive evidence of higher-order repetitive behaviors across all measures, there was evidence of higher-order repetitive behaviors in certain situations. This study further demonstrates the potential of the C58/J mouse strain as a model for lower-order and, potentially, higher-order repetitive behaviors. This study also demonstrates that the shape of the novel object arena can change the behavior displayed by the test animals. Further studies utilizing the C58/J strain and further validation of the novel object assay are warranted. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Electric resistivity distribution in the Earth's crust and upper mantle for the southern East European Platform and Crimea from area-wide 2D models

    NASA Astrophysics Data System (ADS)

    Logvinov, Igor M.; Tarasov, Viktor N.

    2018-03-01

    Previously obtained magnetotelluric 2D models for 30 profiles made it possible to create an overview model of electric resistivity for the territory between 28°E and 36°E and between 44.5°N and 52.5°N. It allows us to distinguish a number of low resistivity objects (LRO) with resistivities lower than 100 Ω·m in the Earth's crust and mantle. Two regional conductivity anomalies are traced. The Kirovograd conductivity anomaly extends south to the Crimea mountains. A new regional conductivity anomaly (Konkskaya) can be distinguished along the southern slope of the Ukrainian Shield from 29° to 34°E. In addition, many local LROs have been identified. According to the modeling results, the local low resistivity objects on the East European Platform appear along fault zones activated during the last 5-7 million years, and the model suggests their relation to known zones of graphitization and polymetallic ore deposits. Local LROs in the Dnieper-Donets Basin correlate with the main oil and natural gas fields in this area. The depth of the anomalous objects amounts to 5-22 km. This is consistent with the hypotheses that hydrocarbon deposits are related to generation and transport zones of carbon-bearing fluids.

  4. A biologically plausible computational model for auditory object recognition.

    PubMed

    Larson, Eric; Billimoria, Cyrus P; Sen, Kamal

    2009-01-01

    Object recognition is a task of fundamental importance for sensory systems. Although this problem has been intensively investigated in the visual system, relatively little is known about the recognition of complex auditory objects. Recent work has shown that spike trains from individual sensory neurons can be used to discriminate between and recognize stimuli. Multiple groups have developed spike similarity or dissimilarity metrics to quantify the differences between spike trains. Using a nearest-neighbor approach the spike similarity metrics can be used to classify the stimuli into groups used to evoke the spike trains. The nearest prototype spike train to the tested spike train can then be used to identify the stimulus. However, how biological circuits might perform such computations remains unclear. Elucidating this question would facilitate the experimental search for such circuits in biological systems, as well as the design of artificial circuits that can perform such computations. Here we present a biologically plausible model for discrimination inspired by a spike distance metric using a network of integrate-and-fire model neurons coupled to a decision network. We then apply this model to the birdsong system in the context of song discrimination and recognition. We show that the model circuit is effective at recognizing individual songs, based on experimental input data from field L, the avian primary auditory cortex analog. We also compare the performance and robustness of this model to two alternative models of song discrimination: a model based on coincidence detection and a model based on firing rate.
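The nearest-prototype classification described above can be sketched with a deliberately simple spike-time distance. The metric below (greedy in-order pairing plus a count-mismatch penalty) is an illustrative stand-in, not the paper's integrate-and-fire circuit or any published spike metric; the song labels and spike times are invented.

```python
def spike_distance(a, b, penalty=1.0):
    """Crude spike-train distance: pair spikes in temporal order, sum the
    time differences, and penalize any mismatch in spike counts."""
    a, b = sorted(a), sorted(b)
    n = min(len(a), len(b))
    cost = sum(abs(x - y) for x, y in zip(a[:n], b[:n]))
    return cost + penalty * abs(len(a) - len(b))

def classify(train, prototypes):
    """Return the label of the nearest prototype spike train."""
    return min(prototypes, key=lambda lbl: spike_distance(train, prototypes[lbl]))
```

The biological question the paper addresses is how a network of spiking neurons coupled to a decision network could compute something functionally equivalent to this nearest-neighbor rule.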

  5. Embedding Sustainability Instruction across Content Areas: best Classroom Practices from Informal Environmental Education

    NASA Astrophysics Data System (ADS)

    Clary, R. M.; Walker, R. M.; Wissehr, C.

    2017-12-01

    Environmental education (EE) facilitates students' scientific and environmental literacy, and addresses content areas including sustainability, ecology, and civic responsibility. However, U.S. science content compartmentalization and EE's interdisciplinary nature historically made it a fragmented curriculum within U.S. schools. To gain a better understanding of effective EE instruction that can be transferred to traditional K-12 classrooms, we researched the interactions between a recognized environmental residential camp and students and teachers from six participating schools using grounded theory methodology. Our research identified the residential learning center's objectives, methods of instruction, and objectives' alignment to the delivered curricula. Data generated included lesson plans, survey responses, and interviews. Students (n = 215) identified wilderness and geology activities as the activities they wanted to experience more; they also identified developing curiosity and a sense of discovery as the most meaningful. Whereas most student-identified meaningful experiences aligned with the center's curricular objectives within the optional units, categories emerged that were not explicitly targeted in the unit activities but were embedded throughout the curriculum in sustainable practices, data collection, and reflections. We propose that embedded activities and implicit instruction can be included across content areas within K-12 classrooms. Teacher modeling and implicit instruction will require minimal classroom time, and facilitate students' scientific and environmental literacy in topics such as sustainability and citizen responsibility.

  6. A habitat assessment for Florida panther population expansion into central Florida

    USGS Publications Warehouse

    Thatcher, C.A.; Van Manen, F.T.; Clark, J.D.

    2009-01-01

    One of the goals of the Florida panther (Puma concolor coryi) recovery plan is to expand panther range north of the Caloosahatchee River in central Florida. Our objective was to evaluate the potential of that region to support panthers. We used a geographic information system and the Mahalanobis distance statistic to develop a habitat model based on landscape characteristics associated with panther home ranges. We used cross-validation and an independent telemetry data set to test the habitat model. We also conducted a least-cost path analysis to identify potential habitat linkages and to provide a relative measure of connectivity among habitat patches. Variables in our model were paved road density, major highways, human population density, percentage of the area permanently or semipermanently flooded, and percentage of the area in natural land cover. Our model clearly identified habitat typical of that found within panther home ranges based on model testing with recent telemetry data. We identified 4 potential translocation sites that may support a total of approximately 36 panthers. Although we identified potential habitat linkages, our least-cost path analyses highlighted the extreme isolation of panther habitat in portions of the study area. Human intervention will likely be required if the goal is to establish female panthers north of the Caloosahatchee in the near term.
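
A minimal sketch of the Mahalanobis-distance habitat scoring, assuming hypothetical covariate values (the actual model used paved road density, major highways, human population density, flooding, and natural land cover over real GIS layers): cells whose covariates resemble the mean "used habitat" signature score a small distance.

```python
import numpy as np

# Hypothetical landscape covariates for cells inside known panther home ranges:
# columns = paved road density, human density, proportion natural land cover
used = np.array([
    [0.20, 0.10, 0.90],
    [0.30, 0.20, 0.80],
    [0.10, 0.10, 0.85],
    [0.25, 0.15, 0.95],
])
mu = used.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(used, rowvar=False))

def mahalanobis_d2(cell):
    """Squared Mahalanobis distance from the mean 'used habitat' signature."""
    d = np.asarray(cell, dtype=float) - mu
    return float(d @ cov_inv @ d)

# A mostly natural cell scores closer to the home-range signature than an urban one
print(mahalanobis_d2([0.20, 0.12, 0.90]) < mahalanobis_d2([0.90, 0.80, 0.10]))
```

In practice the distance would be computed for every raster cell and thresholded (e.g., by cross-validation against telemetry data) to map potential panther habitat.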

  7. Tracking target objects orbiting earth using satellite-based telescopes

    DOEpatents

    De Vries, Willem H; Olivier, Scot S; Pertica, Alexander J

    2014-10-14

    A system for tracking objects that are in earth orbit via a constellation or network of satellites having imaging devices is provided. An object tracking system includes a ground controller and, for each satellite in the constellation, an onboard controller. The ground controller receives ephemeris information for a target object and directs that ephemeris information be transmitted to the satellites. Each onboard controller receives ephemeris information for a target object, collects images of the target object based on the expected location of the target object at an expected time, identifies actual locations of the target object from the collected images, and identifies a next expected location at a next expected time based on the identified actual locations of the target object. The onboard controller processes the collected images to identify the actual locations of the target object and transmits the actual location information to the ground controller.
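
The update step (identify actual locations from images, then predict the next expected location) can be sketched, in its simplest possible form, as linear extrapolation from the two most recent observations. The patent's propagation would use orbital dynamics; the timestamps and positions below are invented.

```python
import numpy as np

def next_expected(observed, t_next):
    """Linearly extrapolate the next expected position from the two most
    recent (time, position) observations. A real tracker would propagate
    the orbit dynamics; this is only the simplest stand-in."""
    (t0, p0), (t1, p1) = observed[-2:]
    velocity = (np.asarray(p1, dtype=float) - np.asarray(p0, dtype=float)) / (t1 - t0)
    return np.asarray(p1, dtype=float) + velocity * (t_next - t1)

# Hypothetical positions (km) identified from two collected images
obs = [(0.0, [7000.0, 0.0]), (10.0, [7000.0, 75.0])]
print(next_expected(obs, 20.0))  # next expected position: [7000, 150]
```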

  8. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections

    PubMed Central

    Bailey, Stephanie L.; Bono, Rose S.; Nash, Denis; Kimmel, April D.

    2018-01-01

    Background Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. Methods We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. Results We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). 
Conclusions Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited. PMID:29570737
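
The parallel-version comparison can be sketched as follows; the version names echo the paper's cell-referencing approaches, but the projection values are invented, and the ±5% threshold matches the paper's definition of a material error.

```python
def material_differences(versions, threshold=0.05):
    """Flag outputs whose projections differ by more than ±threshold between
    the first version and any other (concordant output = error-free)."""
    flagged = {}
    names = list(versions)
    baseline = versions[names[0]]
    for metric, value in baseline.items():
        for other in names[1:]:
            diff = (versions[other][metric] - value) / value
            if abs(diff) > threshold:
                flagged.setdefault(metric, []).append((other, round(diff, 3)))
    return flagged

# Hypothetical projections from three cell-referencing approaches
versions = {
    "named_single_cells": {"on_treatment": 1000, "in_care": 2000},
    "column_row_refs":    {"on_treatment": 1260, "in_care": 2010},
    "named_matrices":     {"on_treatment": 1000, "in_care": 2000},
}
print(material_differences(versions))  # → {'on_treatment': [('column_row_refs', 0.26)]}
```

A discordant metric does not say which version is wrong, only that at least one has an unintentional error; the paper resolves this by tracing the discrepant cells in each spreadsheet.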

  10. A Columnar Primary Visual Cortex (V1) Model Emulation Using a PS3 Cell-Be Array

    DTIC Science & Technology

    2011-09-01

    23 July 2010, pp. 1-8, Barcelona, Spain, ISSN: 1098-7576, Print ISBN: 978-1-4244-6916, INSPEC Accession No.: 11593936, Digital Object Identifier... (98 subfields) × (128 FCs per subfield) × (64 minicolumns per FC) works out to 802,816 minicolumns per hemisphere. All minicolumns within a...

  11. Stability of Initial Autism Spectrum Disorder Diagnoses in Community Settings

    ERIC Educational Resources Information Center

    Daniels, Amy M.; Rosenberg, Rebecca E.; Law, J. Kiely; Lord, Catherine; Kaufmann, Walter E.; Law, Paul A.

    2011-01-01

    The study's objectives were to assess diagnostic stability of initial autism spectrum disorder (ASD) diagnoses in community settings and identify factors associated with diagnostic instability using data from a national Web-based autism registry. A Cox proportional hazards model was used to assess the relative risk of change in initial ASD…

  12. Relationship among Food-Safety Knowledge, Beliefs, and Risk-Reduction Behavior in University Students in Japan

    ERIC Educational Resources Information Center

    Takeda, Sayaka; Akamatsu, Rie; Horiguchi, Itsuko; Marui, Eiji

    2011-01-01

    Objective: To identify whether university students who have both food-safety knowledge and beliefs perform risk-reduction behaviors. Design: Cross-sectional research using a questionnaire that included food-safety knowledge, perceptions, risk-reduction behavior, stages for the selection of safer food based on the Transtheoretical Model, and…

  13. Integrated Arts-Based Teaching (IAT) Model for Brain-Based Learning

    ERIC Educational Resources Information Center

    Inocian, Reynaldo B.

    2015-01-01

    This study analyzes teaching strategies among the eight books in Principles and Methods of Teaching recommended for use in the College of Teacher Education in the Philippines. It seeks to answer the following objectives: (1) identify the most commonly used teaching strategies congruent with the integrated arts-based teaching (IAT) and (2) design…

  14. Depression and Anxiety Symptoms: Onset, Developmental Course and Risk Factors during Early Childhood

    ERIC Educational Resources Information Center

    Cote, Sylvana M.; Boivin, Michel; Liu, Xuecheng; Nagin, Daniel S.; Zoccolillo, Mark; Tremblay, Richard E.

    2009-01-01

    Background: Depressive and anxiety disorders are among the top ten leading causes of disabilities. We know little, however, about the onset, developmental course and early risk factors for depressive and anxiety symptoms (DAS). Objective: Model the developmental trajectories of DAS during early childhood and to identify risk factors for atypically…

  15. Modeling of Students' Profile and Learning Chronicle with Data Cubes

    ERIC Educational Resources Information Center

    Ola, Ade G.; Bai, Xue; Omojokun, Emmanuel E.

    2014-01-01

    Over the years, companies have relied on On-Line Analytical Processing (OLAP) to answer complex questions relating to issues in business environments such as identifying profitability, trends, correlations, and patterns. This paper addresses the application of OLAP in education and learning. The objective of the research presented in the paper is…

  16. Conceptual Model for Assessing Restoration of Puget Sound Nearshore Ecosystems

    DTIC Science & Technology

    2006-10-01

    the response “push” to a “receiver” object (i.e., the pressure–state–response concept; Pieri et al. 1995), we can designate or identify the need for... Program. 12 p. plus appendices. Pieri, C., J. Dumanski, A. Hamblin, and A. Young. 1995. Land quality indicators. World Bank Discussion Papers, No...

  17. Improving the Quality of Home Visitation: An Exploratory Study of Difficult Situations

    ERIC Educational Resources Information Center

    LeCroy, Craig Winston; Whitaker, Kate

    2005-01-01

    Objective: The primary purpose of this study was to use an ecological assessment model to obtain a better understanding of difficult situations that home visitors confront when implementing home visitation services. Method: A mixed method study was used which included conducting focus groups to identify specific situations faced by home visitors…

  18. Planned Change in Future Models of Project Follow Through: A Concept Paper.

    ERIC Educational Resources Information Center

    Simpkins, Edward; Brown, Asa

    The three chapters included in this paper establish a basis for organizing future implementations of Project Follow Through. Specifically, chapter 1 identifies four planning objectives for coordinating such programs. Emphasis is given to the need to focus on one fundamental, pervasive variable possibly accounting for program success: time…

  19. Hazardous Drinking and Military Community Functioning: Identifying Mediating Risk Factors

    ERIC Educational Resources Information Center

    Foran, Heather M.; Heyman, Richard E.; Slep, Amy M. Smith

    2011-01-01

    Objective: Hazardous drinking is a serious societal concern in military populations. Efforts to reduce hazardous drinking among military personnel have been limited in effectiveness. There is a need for a deeper understanding of how community-based prevention models apply to hazardous drinking in the military. Community-wide prevention efforts may…

  20. Validation of SWEEP for contrasting agricultural land use types in the Tarim Basin

    USDA-ARS?s Scientific Manuscript database

    In order to aid in identifying land management practices with the potential to control soil erosion, models such as the Wind Erosion Prediction System (WEPS) have been developed to assess soil erosion. The objective of this study was to test the performance of the WEPS erosion submodel (the Single-e...

  1. Long-Term Outcomes for the Promoting CARE Suicide Prevention Program

    ERIC Educational Resources Information Center

    Hooven, Carole; Herting, Jerald R.; Snedker, Karen A.

    2010-01-01

    Objectives: To provide a long-term look at suicide risk from adolescence to young adulthood for former participants in Promoting CARE, an indicated suicide prevention program. Methods: Five hundred ninety-three suicide-vulnerable high school youth were involved in a long-term follow-up study. Latent class growth models identify patterns of change…

  2. The Choctaw Home-Centered Family Education Demonstration Project. Final Report.

    ERIC Educational Resources Information Center

    Quigley, Patrick A.; And Others

    The final report of the four-year (1972-1976) Choctaw Family Education Project identifies the major goal, "to demonstrate a workable early childhood model for rural reservation groups," and consists of an executive summary, a comprehensive report, and a personalized report. Objectives of the project, carried out in 6 Choctaw reservation…

  3. Plug-in hybrid electric vehicle value proposition study. Phase 1, task 2, select value proposition/business model for further study

    DOT National Transportation Integrated Search

    2008-04-01

    The objective of Task 2 is to identify the combination of value propositions that is believed to be achievable by 2030 and collectively hold promise for a sustainable PHEV market by 2030. This deliverable outlines what the project team (with inpu...

  4. Applying AN Object-Oriented Database Model to a Scientific Database Problem: Managing Experimental Data at Cebaf.

    NASA Astrophysics Data System (ADS)

    Ehlmann, Bryon K.

    Current scientific experiments are often characterized by massive amounts of very complex data and the need for complex data analysis software. Object-oriented database (OODB) systems have the potential of improving the description of the structure and semantics of this data and of integrating the analysis software with the data. This dissertation results from research to enhance OODB functionality and methodology to support scientific databases (SDBs) and, more specifically, to support a nuclear physics experiments database for the Continuous Electron Beam Accelerator Facility (CEBAF). This research to date has identified a number of problems related to the practical application of OODB technology to the conceptual design of the CEBAF experiments database and other SDBs: the lack of a generally accepted OODB design methodology, the lack of a standard OODB model, the lack of a clear conceptual level in existing OODB models, and the limited support in existing OODB systems for many common object relationships inherent in SDBs. To address these problems, the dissertation describes an Object-Relationship Diagram (ORD) and an Object-oriented Database Definition Language (ODDL) that provide tools that allow SDB design and development to proceed systematically and independently of existing OODB systems. These tools define multi-level, conceptual data models for SDB design, which incorporate a simple notation for describing common types of relationships that occur in SDBs. ODDL allows these relationships and other desirable SDB capabilities to be supported by an extended OODB system. A conceptual model of the CEBAF experiments database is presented in terms of ORDs and the ODDL to demonstrate their functionality and use and provide a foundation for future development of experimental nuclear physics software using an OODB approach.
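
As a rough analogy only (not the dissertation's ORD/ODDL notation), the kinds of object relationships an experiments database needs, such as composition of datasets within an experiment and aggregation of shared detectors, can be expressed with plain Python dataclasses; every class and attribute name here is invented.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Detector:
    name: str

@dataclass
class DataSet:
    run_number: int
    events: int

@dataclass
class Experiment:
    proposal_id: str
    detectors: List[Detector] = field(default_factory=list)  # aggregation: shared facility hardware
    datasets: List[DataSet] = field(default_factory=list)    # composition: owned by this experiment

exp = Experiment("E94-010")
exp.detectors.append(Detector("HMS spectrometer"))
exp.datasets.append(DataSet(run_number=1, events=50000))
print(len(exp.datasets))  # → 1
```

An OODB system of the kind the dissertation targets would additionally persist these objects and enforce the relationship semantics (e.g., cascading deletes for composition) at the database level.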

  5. Neural representations of the concepts in simple sentences: Concept activation prediction and context effects.

    PubMed

    Just, Marcel Adam; Wang, Jing; Cherkassky, Vladimir L

    2017-08-15

    Although it has been possible to identify individual concepts from a concept's brain activation pattern, there have been significant obstacles to identifying a proposition from its fMRI signature. Here we demonstrate the ability to decode individual prototype sentences from readers' brain activation patterns, by using theory-driven regions of interest and semantic properties. It is possible to predict the fMRI brain activation patterns evoked by propositions and words which are entirely new to the model with reliably above-chance rank accuracy. The two core components implemented in the model that reflect the theory were the choice of intermediate semantic features and the brain regions associated with the neurosemantic dimensions. This approach also predicts the neural representation of object nouns across participants, studies, and sentence contexts. Moreover, we find that the neural representation of an agent-verb-object proto-sentence is more accurately characterized by the neural signatures of its components as they occur in a similar context than by the neural signatures of these components as they occur in isolation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Frailty measurements and dysphagia in the outpatient setting.

    PubMed

    Hathaway, Bridget; Vaezi, Alec; Egloff, Ann Marie; Smith, Libby; Wasserman-Wincko, Tamara; Johnson, Jonas T

    2014-09-01

    Deconditioning and frailty may contribute to dysphagia and aspiration. Early identification of patients at risk of aspiration is important. Aspiration prevention would lead to reduced morbidity and health care costs. We therefore wondered whether objective measurements of frailty could help identify patients at risk for dysphagia and aspiration. Consecutive patients (n = 183) were enrolled. Patient characteristics and objective measures of frailty were recorded prospectively. Variables tested included age, body mass index, grip strength, and 5 meter walk pace. Statistical analysis tested for association between these parameters and dysphagia or aspiration, diagnosed by instrumental swallowing examination. Of variables tested for association with grip strength, only age category (P = .003) and ambulatory status (P < .001) were significantly associated with grip strength in linear regression models. Whereas walk speed was not associated with dysphagia or aspiration, ambulatory status was significantly associated with dysphagia and aspiration in multivariable model building. Nonambulatory status is a predictor of aspiration and should be included in risk assessments for dysphagia. The relationship between frailty and dysphagia deserves further investigation. Frailty assessments may help identify those at risk for complications of dysphagia. © The Author(s) 2014.

  7. Evidence-based dentistry: a model for clinical practice.

    PubMed

    Faggion, Clóvis M; Tu, Yu-Kang

    2007-06-01

    Making decisions in dentistry should be based on the best evidence available. The objective of this study was to demonstrate a practical procedure and model that clinicians can use to apply the results of well-conducted studies to patient care by critically appraising the evidence with checklists and letter grade scales. To demonstrate application of this model for critically appraising the quality of research evidence, a hypothetical case involving an adult male with chronic periodontitis is used as an example. To determine the best clinical approach for this patient, a four-step, evidence-based model is demonstrated, consisting of the following: definition of a research question using the PICO format, search and selection of relevant literature, critical appraisal of identified research reports using checklists, and the application of evidence. In this model, the quality of research evidence was assessed quantitatively based on different levels of quality that are assigned letter grades of A, B, and C by evaluating the studies against the QUOROM (Quality of Reporting Meta-Analyses) and CONSORT (Consolidated Standards of Reporting Trials) checklists in a tabular format. For this hypothetical periodontics case, application of the model identified the best available evidence for clinical decision making, i.e., one randomized controlled trial and one systematic review of randomized controlled trials. Both studies showed similar answers for the research question. The use of a letter grade scale allowed an objective analysis of the quality of evidence. A checklist-driven model that assesses and applies evidence to dental practice may substantially improve dentists' decision making skill.

  8. Identifying Chinese Microblog Users With High Suicide Probability Using Internet-Based Profile and Linguistic Features: Classification Model

    PubMed Central

    Guan, Li; Hao, Bibo; Cheng, Qijin; Yip, Paul SF

    2015-01-01

    Background Traditional offline assessment of suicide probability is time consuming, and it is difficult to convince at-risk individuals to participate. Identifying individuals with high suicide probability through online social media has an advantage in its efficiency and potential to reach out to hidden individuals, yet little research has focused on this specific field. Objective The objective of this study was to apply two classification models, Simple Logistic Regression (SLR) and Random Forest (RF), to examine the feasibility and effectiveness of identifying microblog users in China with high suicide possibility through profile and linguistic features extracted from Internet-based data. Methods Nine hundred and nine Chinese microblog users completed an Internet survey, and those scoring one SD above the mean of the total Suicide Probability Scale (SPS) score, as well as one SD above the mean in each of the four subscale scores, were labeled as high-risk individuals. Profile and linguistic features were fed into the two machine learning algorithms (SLR and RF) to train models that identify high-risk individuals in general suicide probability and in its four dimensions. Models were trained and then tested by 5-fold cross validation, in which both training and test sets were generated by stratified random sampling from the whole sample. Three classic performance metrics (Precision, Recall, F1 measure) and a specifically defined metric, “Screening Efficiency,” were adopted to evaluate model effectiveness. Results Classification performance was generally matched between SLR and RF. Given the best performance of the classification models, we were able to retrieve over 70% of the labeled high-risk individuals in overall suicide probability as well as in the four dimensions. Screening Efficiency of most models varied from 1/4 to 1/2. 
Precision of the models was generally below 30%. Conclusions Individuals in China with high suicide probability are recognizable by profile and text-based information from microblogs. Although there is still much space to improve the performance of classification models in the future, this study may shed light on preliminary screening of risky individuals via machine learning algorithms, which can work side-by-side with expert scrutiny to increase efficiency in large-scale-surveillance of suicide probability from online social media. PMID:26543921
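
A hedged sketch of the evaluation protocol (two classifiers compared under 5-fold stratified cross-validation with precision, recall, and F1) using synthetic data in place of the study's profile and linguistic features; scikit-learn's plain logistic regression stands in for SLR.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score, precision_score, recall_score
from sklearn.model_selection import StratifiedKFold, cross_val_predict

# Synthetic stand-in for the 909 users' features; the positive class marks
# users scoring one SD above the mean on the Suicide Probability Scale.
X, y = make_classification(n_samples=909, n_features=20, weights=[0.84],
                           random_state=0)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for name, clf in [("logistic", LogisticRegression(max_iter=1000)),
                  ("random_forest", RandomForestClassifier(random_state=0))]:
    pred = cross_val_predict(clf, X, y, cv=cv)
    print(name,
          f"precision={precision_score(y, pred):.2f}",
          f"recall={recall_score(y, pred):.2f}",
          f"f1={f1_score(y, pred):.2f}")
```

The study's "Screening Efficiency" metric is defined in the paper itself and is not reproduced here.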

  9. A New Objective Technique for Verifying Mesoscale Numerical Weather Prediction Models

    NASA Technical Reports Server (NTRS)

    Case, Jonathan L.; Manobianco, John; Lane, John E.; Immer, Christopher D.

    2003-01-01

    This report presents a new objective technique to verify predictions of the sea-breeze phenomenon over east-central Florida by the Regional Atmospheric Modeling System (RAMS) mesoscale numerical weather prediction (NWP) model. The Contour Error Map (CEM) technique identifies sea-breeze transition times in objectively analyzed grids of observed and forecast wind, verifies the forecast sea-breeze transition times against the observed times, and computes the mean post-sea-breeze wind direction and speed to compare the observed and forecast winds behind the sea-breeze front. The CEM technique is superior to traditional objective verification techniques and previously used subjective verification methodologies because it is automated, requiring little manual intervention; it accounts for both spatial and temporal scales and variations; it accurately identifies and verifies the sea-breeze transition times; and it provides verification contour maps and simple statistical parameters for easy interpretation. The CEM uses a parallel lowpass boxcar filter and a high-order bandpass filter to identify the sea-breeze transition times at the observed and model grid points. Once the transition times are identified, the CEM fits a Gaussian function to the histogram of transition-time differences between the model and observations. The fitted parameters of the Gaussian function explain the timing bias and variance of the timing differences across the valid comparison domain. Once the transition times are identified at each grid point, the CEM computes the mean wind direction and speed for all times and grid points after the sea-breeze transition time during the remainder of the day. The CEM technique performed quite well when compared to independent meteorological assessments of the sea-breeze transition times and results from a previously published subjective evaluation. 
The algorithm correctly identified a forecast or observed sea-breeze occurrence or absence 93% of the time during the two-month evaluation period of July and August 2000. Nearly all failures in the CEM were the result of complex precipitation features (observed or forecast) that contaminated the wind field, resulting in a false identification of a sea-breeze transition. A qualitative comparison between the CEM timing errors and the subjectively determined observed and forecast transition times indicates that the algorithm performed very well overall. Most discrepancies between the CEM results and the subjective analysis were again caused by observed or forecast areas of precipitation that led to complex wind patterns. The CEM also failed on a day when the observed sea-breeze transition affected only a very small portion of the verification domain. Based on the CEM results, RAMS tended to predict the onset and movement of the sea-breeze transition too early and/or too quickly. The domain-wide timing biases provided by the CEM indicated an early bias on 30 out of 37 days when both an observed and a forecast sea breeze occurred over the same portions of the analysis domain. These results are consistent with previous subjective verifications of the RAMS sea-breeze predictions. A comparison of the mean post-sea-breeze winds indicates that RAMS has a positive wind-speed bias for all days, which is also consistent with the early bias in the sea-breeze transition time, since the higher wind speeds resulted in a faster inland penetration of the sea breeze compared to reality.
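
The Gaussian-histogram step can be sketched with scipy: fit a Gaussian to the histogram of forecast-minus-observed transition times and read the timing bias and spread off the fitted parameters. The timing differences below are simulated with an early (negative) bias, not taken from the RAMS evaluation.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma):
    return amp * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

# Hypothetical forecast-minus-observed transition times (minutes); the
# negative mean mimics the early bias the report describes.
rng = np.random.default_rng(1)
diffs = rng.normal(loc=-30.0, scale=15.0, size=500)

counts, edges = np.histogram(diffs, bins=40)
centers = 0.5 * (edges[:-1] + edges[1:])
p0 = [counts.max(), centers[np.argmax(counts)], 10.0]  # rough initial guess
(amp, mu, sigma), _ = curve_fit(gaussian, centers, counts, p0=p0)
print(f"timing bias ≈ {mu:.1f} min, spread ≈ {abs(sigma):.1f} min")
```

A fitted mu below zero corresponds to the model transitioning earlier than observed, which is how the CEM summarizes the domain-wide early bias.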

  10. Principal Approaches to Understanding Occupation and Occupational Science Found in the Chilean Journal of Occupational Therapy (2001–2012)

    PubMed Central

    Gómez, Silvia; Tapia, María Jesús; Rueda, Laura

    2017-01-01

    Background The progression of occupational science in Chile is documented in the main scientific publication of the field, the Chilean Journal of Occupational Therapy (RChTO). Objective Identify approaches to understanding and applying occupation and occupational science as elucidated in the RChTO. Methodology A systematic qualitative review of the journal (2001–2012) identified articles elucidating an approach to understanding and application operationally defined as references to specific authors, theories, models/paradigms, definitions, and other fields that support approaches to O/OS. Results The study identified two main approaches. The first considers occupation/occupational science from a practical perspective or as a means to explain human behavior; the second considers occupation/occupational science as an object of study. Each approach is further divided into categories. Conclusion This study provides a novel perspective on regional use of occupational science concepts. These findings contribute to our understanding of this science in context and to recognition of the cultural relevance of these scientific concepts. PMID:29097971

  11. Determination of land use in Minnesota by automatic interpretation of ERTS MSS data

    NASA Technical Reports Server (NTRS)

    Zirkle, R. E.; Pile, D. R.

    1973-01-01

    This program aims to determine the feasibility of identifying land use in Minnesota by automatic interpretation of ERTS-MSS data. Ultimate objectives include establishment of land use delineation and quantification by computer processing with a minimum of human operator interaction. This implies not only that reflectivity as a function of calendar time can be catalogued effectively, but also that the effects of uncontrolled variables can be identified and compensated. Clouds are the major uncontrollable data pollutant, so part of the initial effort is devoted to determining their effect and the construction of a model to help correct or justifiably ignore affected data. Other short range objectives are to identify and verify measurements giving results of importance to land managers. Lake-counting is a prominent example. Open water is easily detected in band 7 data with some support from either band 4 or band 5 to remove ambiguities. Land managers and conservationists commission studies periodically to measure water bodies and total water count within specified areas.
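
The band-7 water detection and lake counting can be sketched as a simple threshold plus connected-component labeling; the reflectance values and threshold below are invented, and a real analysis would also consult band 4 or band 5 to remove ambiguities, as the abstract notes.

```python
import numpy as np
from scipy import ndimage

# Hypothetical 4x4 tile of MSS band-7 (near-infrared) reflectance; open
# water absorbs near-infrared strongly, so low band-7 values mark water.
band7 = np.array([
    [0.05, 0.04, 0.30, 0.35],
    [0.06, 0.05, 0.32, 0.33],
    [0.28, 0.31, 0.30, 0.02],
    [0.29, 0.30, 0.03, 0.04],
])
water = band7 < 0.10                    # simple, illustrative threshold
labels, n_lakes = ndimage.label(water)  # count connected water bodies
print(n_lakes)  # → 2
```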

  12. Screening for Dyslexia Using Eye Tracking during Reading.

    PubMed

    Nilsson Benfatto, Mattias; Öqvist Seimyr, Gustaf; Ygge, Jan; Pansell, Tony; Rydberg, Agneta; Jacobson, Christer

    2016-01-01

    Dyslexia is a neurodevelopmental reading disability estimated to affect 5-10% of the population. While there is yet no full understanding of the cause of dyslexia, or agreement on its precise definition, it is certain that many individuals suffer persistent problems in learning to read for no apparent reason. Although it is generally agreed that early intervention is the best form of support for children with dyslexia, there is still a lack of efficient and objective means to help identify those at risk during the early years of school. Here we show that it is possible to identify 9-10 year old individuals at risk of persistent reading difficulties by using eye tracking during reading to probe the processes that underlie reading ability. In contrast to current screening methods, which rely on oral or written tests, eye tracking does not depend on the subject to produce some overt verbal response and thus provides a natural means to objectively assess the reading process as it unfolds in real-time. Our study is based on a sample of 97 high-risk subjects with early identified word decoding difficulties and a control group of 88 low-risk subjects. These subjects were selected from a larger population of 2165 school children attending second grade. Using predictive modeling and statistical resampling techniques, we develop classification models from eye tracking records less than one minute in duration and show that the models are able to differentiate high-risk subjects from low-risk subjects with high accuracy. Although dyslexia is fundamentally a language-based learning disability, our results suggest that eye movements in reading can be highly predictive of individual reading ability and that eye tracking can be an efficient means to identify children at risk of long-term reading difficulties.

  13. Genome-wide association study of carcass weight in commercial Hanwoo cattle

    PubMed Central

    Edea, Zewdu; Jeoung, Yeong Ho; Shin, Sung-Sub; Ku, Jaeul; Seo, Sungbo; Kim, Il-Hoi; Kim, Sang-Wook

    2018-01-01

    Objective: The objective of the present study was to validate genes and genomic regions associated with carcass weight using a low-density single nucleotide polymorphism (SNP) chip in the Hanwoo cattle breed. Methods: Commercial Hanwoo steers (n = 220) were genotyped with the 20K GeneSeek genomic profiler BeadChip. After applying quality-control criteria of a call rate ≥90% and minor allele frequency (MAF) ≥0.01, a total of 15,235 autosomal SNPs remained for genome-wide association (GWA) analysis. The GWA tests were performed using a single-locus mixed linear model; age at slaughter was fitted as a fixed effect and sire was included as a covariate. The level of genome-wide significance was set at 3.28×10−6 (0.05/15,235), corresponding to a Bonferroni correction for 15,235 independent tests. Results: Employing the EMMAX approach, which is based on a mixed linear model and accounts for population stratification and relatedness, we identified 17 and 16 loci significantly (p<0.001) associated with carcass weight under the additive and dominant models, respectively. The second most significant (p = 0.000049) SNP (ARS-BFGL-NGS-28234), on bovine chromosome 4 (BTA4) at 21 Mb, had an allele substitution effect of 43.45 kg. Some of the identified regions on BTA2, 6, 14, 22, and 24 were previously reported to be associated with quantitative trait loci for carcass weight in several beef cattle breeds. Conclusion: This is the first genome-wide association study using SNP chips on commercial Hanwoo steers, and some of the loci newly identified here may help to develop better DNA markers for increased beef production in commercial Hanwoo cattle. Further studies using a larger sample size will allow confirmation of the candidates identified in this study. PMID:29103288
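
    The quality-control filter and the Bonferroni-corrected significance level quoted in the abstract are simple to reproduce; the sketch below shows only that arithmetic (0.05/15,235 ≈ 3.28×10−6), with the QC thresholds taken directly from the abstract.

```python
def bonferroni_threshold(alpha, n_tests):
    """Genome-wide significance level after Bonferroni correction."""
    return alpha / n_tests

def passes_qc(call_rate, maf, min_call_rate=0.90, min_maf=0.01):
    """The abstract's QC: call rate >= 90% and minor allele frequency >= 0.01."""
    return call_rate >= min_call_rate and maf >= min_maf

# The threshold quoted in the abstract, from 15,235 retained autosomal SNPs:
thr = bonferroni_threshold(0.05, 15235)
print(f"{thr:.2e}")
```
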

  14. Health Professionals' Explanations of Suicidal Behaviour: Effects of Professional Group, Theoretical Intervention Model, and Patient Suicide Experience.

    PubMed

    Rothes, Inês Areal; Henriques, Margarida Rangel

    2017-12-01

    In a help relation with a suicidal person, the theoretical models of suicidality can be essential to guide the health professional's comprehension of the client/patient. The objectives of this study were to identify health professionals' explanations of suicidal behaviors and to study the effects of professional group, theoretical intervention models, and patient suicide experience in professionals' representations. Two hundred and forty-two health professionals filled out a self-report questionnaire. Exploratory principal components analysis was used. Five explanatory models were identified: psychological suffering, affective cognitive, sociocommunicational, adverse life events, and psychopathological. Results indicated that the psychological suffering and psychopathological models were the most valued by the professionals, while the sociocommunicational was seen as the least likely to explain suicidal behavior. Differences between professional groups were found. We concluded that training and reflection on theoretical models in general and in communicative issues in particular are needed in the education of health professionals.

  15. A diagnostic model for chronic hypersensitivity pneumonitis

    PubMed Central

    Johannson, Kerri A; Elicker, Brett M; Vittinghoff, Eric; Assayag, Deborah; de Boer, Kaïssa; Golden, Jeffrey A; Jones, Kirk D; King, Talmadge E; Koth, Laura L; Lee, Joyce S; Ley, Brett; Wolters, Paul J; Collard, Harold R

    2017-01-01

    The objective of this study was to develop a diagnostic model that allows a highly specific diagnosis of chronic hypersensitivity pneumonitis using clinical and radiological variables alone. Chronic hypersensitivity pneumonitis and other interstitial lung disease cases were retrospectively identified from a longitudinal database. High-resolution CT scans were blindly scored for radiographic features (eg, ground-glass opacity, mosaic perfusion) as well as the radiologist’s diagnostic impression. Candidate models were developed and evaluated using clinical and radiographic variables and assessed by the cross-validated C-statistic. Forty-four chronic hypersensitivity pneumonitis and eighty other interstitial lung disease cases were identified. Two models were selected based on their statistical performance, clinical applicability and face validity. Key model variables included age, down feather and/or bird exposure, radiographic presence of ground-glass opacity and mosaic perfusion, and moderate or high confidence in the radiographic impression of chronic hypersensitivity pneumonitis. Models were internally validated with good performance, and cut-off values were established that resulted in high specificity for a diagnosis of chronic hypersensitivity pneumonitis. PMID:27245779
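
    The shape of such a model - a score combining the key variables, thresholded at a high-specificity cutoff - can be sketched as follows. Every coefficient and the cutoff below are hypothetical placeholders for illustration only; the paper's fitted values and validated cut-offs are not reproduced here.

```python
import math

# Hypothetical coefficients, NOT the paper's fitted values:
COEFS = {"intercept": -4.0, "age_per_decade": 0.3,
         "bird_or_feather_exposure": 1.5, "ground_glass_opacity": 0.8,
         "mosaic_perfusion": 1.2, "confident_radiographic_read": 2.0}
CUTOFF = 0.90  # a high cutoff trades sensitivity for specificity

def chp_probability(age, exposure, ggo, mosaic, confident_read):
    """Logistic combination of the clinical and radiographic predictors."""
    z = (COEFS["intercept"]
         + COEFS["age_per_decade"] * (age / 10.0)
         + COEFS["bird_or_feather_exposure"] * exposure
         + COEFS["ground_glass_opacity"] * ggo
         + COEFS["mosaic_perfusion"] * mosaic
         + COEFS["confident_radiographic_read"] * confident_read)
    return 1.0 / (1.0 + math.exp(-z))

def diagnose_chp(age, exposure, ggo, mosaic, confident_read):
    """High-specificity rule: call chronic HP only above the cutoff."""
    return chp_probability(age, exposure, ggo, mosaic, confident_read) >= CUTOFF

print(diagnose_chp(65, 1, 1, 1, 1), diagnose_chp(50, 0, 0, 0, 0))
```
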

  16. Rasmussen's model of human behavior in laparoscopy training.

    PubMed

    Wentink, M; Stassen, L P S; Alwayn, I; Hosman, R J A W; Stassen, H G

    2003-08-01

    Compared to aviation, where virtual reality (VR) training has been standardized and simulators have proven their benefits, the objectives, needs, and means of VR training in minimally invasive surgery (MIS) still have to be established. The aim of the study presented is to introduce Rasmussen's model of human behavior as a practical framework for defining the training objectives, needs, and means in MIS. Rasmussen distinguishes three levels of human behavior: skill-, rule-, and knowledge-based behavior. The training needs of a laparoscopic novice can be determined by identifying the specific skill-, rule-, and knowledge-based behavior that is required for performing safe laparoscopy. Future objectives of VR laparoscopy trainers should address all three levels of behavior. Although most commercially available simulators for laparoscopy aim at training skill-based behavior, it is especially the training of knowledge-based behavior during surgical complications that will improve safety levels. However, the cost and complexity of a training means increase as the training objectives proceed from skill-based behavior to complex knowledge-based behavior. In aviation, human behavior models have been used successfully to integrate the training of skill-, rule-, and knowledge-based behavior in a full flight simulator. Understanding surgeon behavior is one of the first steps towards a future full-scale laparoscopy simulator.

  17. Disinfection by-products (DBPs) in drinking water and predictive models for their occurrence: a review.

    PubMed

    Sadiq, Rehan; Rodriguez, Manuel J

    2004-04-05

    Disinfection of drinking water reduces the risk of pathogenic infection but may pose a chemical threat to human health due to disinfection residues and their by-products (DBPs) when organic and inorganic precursors are present in the water. More than 250 DBPs have been identified, but the behavioural profiles of only about 20 DBPs are adequately known. In the last two decades, many modelling attempts have been made to predict the occurrence of DBPs in drinking water. Models have been developed based on data generated in laboratory-scale and field-scale investigations. The objective of this paper is to review DBP predictive models, identify their advantages and limitations, and examine their potential applications as decision-making tools for water treatment analysis, epidemiological studies, and regulatory concerns. The paper concludes with a discussion of future research needs in this area.

  18. The Mediating Effect of Innovation between Total Quality Management (TQM) and Business Performance

    NASA Astrophysics Data System (ADS)

    Shan, Ang Wei; Fauzi Ahmad, Mohd; Hisyamudin Muhd Nor, Nik

    2016-11-01

    Both TQM and innovation are competitive key factors that are intensely embedded in organizational products, services, and processes. To achieve higher business performance, organizations need to adopt both quality and innovation. Therefore, the main objective of this paper is to identify the relationship between TQM and business performance with innovation as a mediator. After a detailed review of the extensive literature, a new TQM model is presented. The proposed model integrates TQM practices and different types of innovation in an attempt to develop theoretical knowledge that helps academicians and manufacturers understand the relationship between designing quality into products and services and engaging in innovation activities. To this end, SEM-PLS (Partial Least Squares Structural Equation Modelling) is used to identify and evaluate the relationships among TQM, innovation, and business performance in establishing a new TQM model.

  19. An objective rationale for the choice of regularisation parameter with application to global multiple-frequency S-wave tomography

    NASA Astrophysics Data System (ADS)

    Zaroli, C.; Sambridge, M.; Lévêque, J.-J.; Debayle, E.; Nolet, G.

    2013-06-01

    In a linear ill-posed inverse problem, the regularisation parameter (damping) controls the balance between minimising both the residual data misfit and the model norm. Poor knowledge of data uncertainties often makes the selection of damping rather arbitrary. To go beyond that subjectivity, an objective rationale for the choice of damping is presented, which is based on the coherency of delay-time estimates in different frequency bands. Our method is tailored to the problem of global Multiple-Frequency Tomography (MFT), using a data set of 287 078 S-wave delay-times measured in five frequency bands (10, 15, 22, 34, 51 s central periods). Whereas for each ray path the delay-time estimates should vary coherently from one period to the other, the noise most likely is not coherent. Thus, the lack of coherency of the information in different frequency bands is exploited, using an analogy with the cross-validation method, to identify models dominated by noise. In addition, a sharp change of behaviour of the model ℓ∞-norm, as the damping becomes lower than a threshold value, is interpreted as the signature of data noise starting to significantly pollute at least one model component. Models with damping larger than this threshold are diagnosed as being constructed with poor data exploitation. Finally, a preferred model is selected from the remaining range of permitted model solutions. This choice is quasi-objective in terms of model interpretation, as the selected model shows a high degree of similarity with almost all other permitted models (correlation superior to 98% up to spherical harmonic degree 80). The obtained tomographic model is displayed in the mid lower-mantle (660-1910 km depth), and is shown to be compatible with three other recent global shear-velocity models. A wider application of the presented rationale should permit us to converge towards more objective seismic imaging of the Earth's mantle.
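
    The damping diagnostic described above - watching the model ℓ∞-norm blow up as damping drops below a threshold - can be illustrated on a toy problem. The sketch below is not the authors' tomographic code: it solves a tiny, nearly rank-deficient Tikhonov-regularised least-squares problem with made-up "delay-time" data and scans the damping parameter.

```python
def solve2(M, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(b[0] * M[1][1] - b[1] * M[0][1]) / det,
            (M[0][0] * b[1] - M[1][0] * b[0]) / det]

def damped_solution(A, d, lam):
    """Tikhonov least squares: minimise |Am - d|^2 + lam^2 |m|^2
    via the normal equations (A^T A + lam^2 I) m = A^T d."""
    AtA = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(2)]
           for i in range(2)]
    Atd = [sum(A[k][i] * d[k] for k in range(len(A))) for i in range(2)]
    AtA[0][0] += lam ** 2
    AtA[1][1] += lam ** 2
    return solve2(AtA, Atd)

# A nearly rank-deficient design matrix with slightly noisy data:
A = [[1.0, 1.0], [1.0, 1.001], [2.0, 2.0]]
d = [1.0, 1.02, 2.01]
for lam in [1.0, 0.1, 0.01, 0.001]:
    m = damped_solution(A, d, lam)
    print(lam, round(max(abs(x) for x in m), 3))  # model infinity-norm
```

    As the damping falls, the ill-conditioned directions let noise inflate the model: the ∞-norm jumps sharply below a threshold damping, which is the signature the abstract uses to reject under-damped models.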

  20. An objective rationale for the choice of regularisation parameter with application to global multiple-frequency S-wave tomography

    NASA Astrophysics Data System (ADS)

    Zaroli, C.; Sambridge, M.; Lévêque, J.-J.; Debayle, E.; Nolet, G.

    2013-10-01

    In a linear ill-posed inverse problem, the regularisation parameter (damping) controls the balance between minimising both the residual data misfit and the model norm. Poor knowledge of data uncertainties often makes the selection of damping rather arbitrary. To go beyond that subjectivity, an objective rationale for the choice of damping is presented, which is based on the coherency of delay-time estimates in different frequency bands. Our method is tailored to the problem of global multiple-frequency tomography (MFT), using a data set of 287 078 S-wave delay times measured in five frequency bands (10, 15, 22, 34, and 51 s central periods). Whereas for each ray path the delay-time estimates should vary coherently from one period to the other, the noise most likely is not coherent. Thus, the lack of coherency of the information in different frequency bands is exploited, using an analogy with the cross-validation method, to identify models dominated by noise. In addition, a sharp change of behaviour of the model ℓ∞-norm, as the damping becomes lower than a threshold value, is interpreted as the signature of data noise starting to significantly pollute at least one model component. Models with damping larger than this threshold are diagnosed as being constructed with poor data exploitation. Finally, a preferred model is selected from the remaining range of permitted model solutions. This choice is quasi-objective in terms of model interpretation, as the selected model shows a high degree of similarity with almost all other permitted models (correlation superior to 98% up to spherical harmonic degree 80). The obtained tomographic model is displayed in the mid lower-mantle (660-1910 km depth), and is shown to be compatible with three other recent global shear-velocity models. A wider application of the presented rationale should permit us to converge towards more objective seismic imaging of Earth's mantle.

  1. Application of numerical optimization techniques to control system design for nonlinear dynamic models of aircraft

    NASA Technical Reports Server (NTRS)

    Lan, C. Edward; Ge, Fuying

    1989-01-01

    Control system design for general nonlinear flight dynamic models is considered through numerical simulation. The design is accomplished through a numerical optimizer coupled with analysis of the flight dynamic equations. The general flight dynamic equations are numerically integrated, and dynamic characteristics are then identified from the dynamic response. The design variables are determined iteratively by the optimizer to optimize a prescribed objective function that is related to desired dynamic characteristics. The generality of the method allows nonlinear aerodynamic effects and dynamic coupling to be considered in the design process. To demonstrate the method, nonlinear simulation models for F-5A and F-16 configurations are used to design dampers that satisfy specifications on flying qualities and control systems that prevent departure. The results indicate that the present method is simple in formulation and effective in satisfying the design objectives.
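
    The design loop described above - an optimizer iterating on design variables to meet a dynamic-characteristics objective - can be illustrated with a scalar example. The plant, gains, and golden-section minimiser below are illustrative stand-ins, not the paper's F-5A/F-16 models: a rate-feedback gain k is tuned so that the closed-loop damping ratio of s^2 + (2*zeta0*wn + k)s + wn^2 hits a target of 0.7.

```python
def closed_loop_damping(k, wn=2.0, zeta0=0.1):
    """Damping ratio of s^2 + (2*zeta0*wn + k)s + wn^2 with rate feedback k*s."""
    return (2.0 * zeta0 * wn + k) / (2.0 * wn)

def objective(k, target=0.7):
    """Squared error between achieved and desired damping ratio."""
    return (closed_loop_damping(k) - target) ** 2

def golden_section(f, a, b, tol=1e-6):
    """Derivative-free 1-D minimiser, standing in for the numerical optimizer."""
    g = (5 ** 0.5 - 1) / 2
    c, d = b - g * (b - a), a + g * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - g * (b - a)
        else:
            a, c = c, d
            d = a + g * (b - a)
    return (a + b) / 2

k_star = golden_section(objective, 0.0, 10.0)
print(round(k_star, 3), round(closed_loop_damping(k_star), 3))
```

    In the paper the "inner" evaluation is a full nonlinear flight simulation rather than a closed-form damping ratio, but the iterate-evaluate-update structure is the same.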

  2. A Corticothalamic Circuit Model for Sound Identification in Complex Scenes

    PubMed Central

    Otazu, Gonzalo H.; Leibold, Christian

    2011-01-01

    The identification of the sound sources present in the environment is essential for the survival of many animals. However, these sounds are not presented in isolation, as natural scenes consist of a superposition of sounds originating from multiple sources. The identification of a source under these circumstances is a complex computational problem that is readily solved by most animals. We present a model of the thalamocortical circuit that performs level-invariant recognition of auditory objects in complex auditory scenes. The circuit identifies the objects present from a large dictionary of possible elements and operates reliably for real sound signals with multiple concurrently active sources. The key model assumption is that the activities of some cortical neurons encode the difference between the observed signal and an internal estimate. Reanalysis of awake auditory cortex recordings revealed neurons with patterns of activity corresponding to such an error signal. PMID:21931668
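
    The key model assumption - cortical error units carrying the difference between the observed signal and an internal estimate - is the residual of a dictionary-based inference. The sketch below is a generic predictive-coding toy, not the authors' circuit model: a two-object "dictionary" with invented spectral templates, and gradient-descent inference that drives the error signal toward zero.

```python
def matvec(M, v):
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in M]

def transpose(M):
    return [list(col) for col in zip(*M)]

# Tiny "dictionary" of spectral templates (rows = frequency channels,
# columns = auditory objects); values are illustrative only:
D = [[1.0, 0.0],
     [1.0, 1.0],
     [0.0, 1.0]]

def infer(x, n_steps=500, eta=0.1):
    """Estimate object activations a; the 'error units' carry x - D a."""
    a = [0.0, 0.0]
    Dt = transpose(D)
    err = list(x)
    for _ in range(n_steps):
        est = matvec(D, a)                           # internal estimate
        err = [xi - ei for xi, ei in zip(x, est)]    # error signal
        grad = matvec(Dt, err)
        a = [ai + eta * gi for ai, gi in zip(a, grad)]
    return a, err

# A scene that superposes both dictionary objects:
a, err = infer([1.0, 2.0, 1.0])
print([round(ai, 2) for ai in a], max(abs(e) for e in err) < 1e-3)
```

    When inference converges, both objects are identified (activations near 1) and the residual carried by the error units vanishes, mirroring the pattern the authors report in awake auditory cortex recordings.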

  3. Radiation and Thermal Effects on Used Nuclear Fuel and Nuclear Waste Forms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, William J.; Zhang, Yanwen

    This is the final report of the NEUP project “Radiation and Thermal Effects on Used Nuclear Fuel and Nuclear Waste Forms.” This project started on July 1, 2012 and was successfully completed on June 30, 2016. This report provides an overview of the main achievements, results and findings through the duration of the project. Additional details can be found in the main body of this report and in the individual Quarterly Reports and associated Deliverables of this project, which have been uploaded in PICS-NE. The objective of this research was to advance understanding and develop validated models on the effectsmore » of self-radiation from beta and alpha decay on the response of used nuclear fuel and nuclear waste forms during high-temperature interim storage and long-term permanent disposition. To achieve this objective, model used-fuel materials and model waste form materials were identified, fabricated, and studied.« less

  4. Investigation using data from ERTS to develop and implement utilization of living marine resources

    NASA Technical Reports Server (NTRS)

    Stevenson, W. H. (Principal Investigator); Pastula, E. J., Jr.

    1973-01-01

    The author has identified the following significant results. The feasibility of utilizing ERTS-1 data in conjunction with aerial remote sensing and sea truth information to predict the distribution of menhaden in the Mississippi Sound during a specific time frame has been demonstrated by employing a number of uniquely designed empirical regression models. The construction of these models was made possible through innovative statistical routines specifically developed to meet the stated objectives.

  5. Mine safety assessment using gray relational analysis and bow tie model

    PubMed Central

    2018-01-01

    Mine safety assessment is a precondition for ensuring orderly and safe production. The main purpose of this study was to prevent mine accidents more effectively by proposing a composite risk analysis model. First, the weights of the assessment indicators were determined by the revised integrated weight method, in which the objective weights were determined by a variation coefficient method and the subjective weights by the Delphi method. A new formula was then adopted to calculate the integrated weights based on the subjective and objective weights. Second, after the assessment indicator weights were determined, gray relational analysis was used to evaluate the safety of mine enterprises. Mine enterprise safety was ranked according to the gray relational degree, and weak links in mine safety practices were identified based on gray relational analysis. Third, to validate the revised integrated weight method adopted in the process of gray relational analysis, the fuzzy evaluation method was applied to the safety assessment of mine enterprises. Fourth, for the first time, the bow tie model was adopted to identify the causes and consequences of weak links and allow corresponding safety measures to be taken to guarantee the mine’s safe production. A case study of mine safety assessment is presented to demonstrate the effectiveness and rationality of the proposed composite risk analysis model, which can be applied in other related industries for safety evaluation. PMID:29561875
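
    The gray relational ranking step follows standard formulas: relational coefficients (dmin + rho*dmax) / (delta + rho*dmax) are combined with indicator weights into a relational degree per enterprise. The sketch below uses those conventional formulas, but the indicator values and weights are hypothetical, not taken from the case study.

```python
def gray_relational_degrees(ref, series, weights, rho=0.5):
    """Weighted gray relational degree of each comparison series against the
    reference series; rho is the conventional distinguishing coefficient."""
    deltas = [[abs(r - c) for r, c in zip(ref, s)] for s in series]
    dmin = min(min(row) for row in deltas)
    dmax = max(max(row) for row in deltas)
    degrees = []
    for row in deltas:
        coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in row]
        degrees.append(sum(w * c for w, c in zip(weights, coeffs)))
    return degrees

# Normalised safety-indicator scores for two hypothetical mines;
# the reference series is the ideal (best) score on every indicator:
ref = [1.0, 1.0, 1.0]
mines = [[0.9, 0.8, 0.95], [0.6, 0.7, 0.5]]
weights = [0.5, 0.3, 0.2]   # stand-in for the integrated subjective/objective weights
scores = gray_relational_degrees(ref, mines, weights)
print([round(s, 3) for s in scores])
```

    Enterprises are then ranked by degree; indicators with low per-indicator coefficients flag the weak links that the bow tie analysis examines further.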

  6. MULTI-OBJECTIVE OPTIMIZATION OF MICROSTRUCTURE IN WROUGHT MAGNESIUM ALLOYS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radhakrishnan, Balasubramaniam; Gorti, Sarma B; Simunovic, Srdjan

    2013-01-01

    The microstructural features that govern the mechanical properties of wrought magnesium alloys include grain size, crystallographic texture, and twinning. Several processes based on shear deformation have been developed that promote grain refinement, weakening of the basal texture, and a shift of the peak intensity away from the center of the basal pole figure - features that promote room-temperature ductility in Mg alloys. At ORNL, we are currently exploring the concept of introducing nano-twins within sub-micron grains as a possible mechanism for simultaneously improving strength and ductility by exploiting potential dislocation glide along the twin-matrix interface, a mechanism that was originally proposed for face-centered cubic materials. Specifically, we have developed an integrated modeling and optimization framework in order to identify the combinations of grain size, texture, and twin spacing that maximize strength-ductility combinations. A micromechanical model that relates microstructure to material strength is coupled with a failure model that relates ductility to a critical shear strain and a critical hydrostatic stress. The micromechanical model is combined with an optimization tool based on a genetic algorithm. A multi-objective optimization technique is used to explore the strength-ductility space in a systematic fashion and identify optimum combinations of the microstructural parameters that will simultaneously maximize strength and ductility in the alloy.
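
    The multi-objective selection at the core of such a framework can be illustrated without a full genetic algorithm. The sketch below samples hypothetical (grain size, twin spacing) candidates, scores them with an illustrative Hall-Petch-style strength formula and a toy ductility formula (neither fitted to any Mg alloy), and extracts the non-dominated strength-ductility front - the Pareto step a multi-objective GA would drive.

```python
import random

def strength(grain_um, twin_nm):
    """Hall-Petch-like hardening from grain refinement plus a twin-spacing
    term; all coefficients are illustrative stand-ins."""
    return 50.0 + 180.0 / grain_um ** 0.5 + 900.0 / twin_nm ** 0.5

def ductility(grain_um, twin_nm):
    """Toy trade-off: coarser grains and wider twin spacing aid ductility."""
    return 2.0 + 3.0 * grain_um ** 0.5 + 0.04 * twin_nm

def pareto_front(points):
    """Keep candidates not dominated in (strength, ductility), both maximised."""
    return [p for p in points
            if not any(q[0] >= p[0] and q[1] >= p[1] and q != p for q in points)]

rng = random.Random(0)
candidates = []
for _ in range(200):
    g = rng.uniform(0.2, 5.0)     # grain size, micrometres
    t = rng.uniform(5.0, 100.0)   # twin spacing, nanometres
    candidates.append((strength(g, t), ductility(g, t)))
front = pareto_front(candidates)
print(len(front), "of", len(candidates), "candidates are Pareto-optimal")
```

    A genetic algorithm replaces the random sampling with selection, crossover, and mutation biased toward the current front, but the non-domination test above is the criterion it optimises.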

  7. An evaluation of soil moisture models for countermine application

    NASA Astrophysics Data System (ADS)

    Mason, George L.

    2004-09-01

    The focus of this study is the evaluation of emerging soil moisture models as they apply to infrared, radar, and acoustic sensors within the scope of countermine operations. Physical, chemical, and biological processes that change the signature of the ground are considered. The available models were not run in-house but were evaluated on the basis of the theory by which they were constructed and of the supporting documentation. The study was conducted between September and October of 2003 and represents a subset of existing models. The objective was to identify those models suited for simulation, define the general constraints of the models, and summarize the emerging functionalities that would support sensor modeling for mine detection.

  8. Stochastic multi-objective auto-optimization for resource allocation decision-making in fixed-input health systems.

    PubMed

    Bastian, Nathaniel D; Ekin, Tahir; Kang, Hyojung; Griffin, Paul M; Fulton, Lawrence V; Grannan, Benjamin C

    2017-06-01

    The management of hospitals within fixed-input health systems such as the U.S. Military Health System (MHS) can be challenging due to the large number of hospitals, as well as the uncertainty in input resources and achievable outputs. This paper introduces a stochastic multi-objective auto-optimization model (SMAOM) for resource allocation decision-making in fixed-input health systems. The model can automatically identify where to re-allocate system input resources at the hospital level in order to optimize overall system performance, while considering uncertainty in the model parameters. The model is applied to 128 hospitals in the three services (Air Force, Army, and Navy) in the MHS using hospital-level data from 2009-2013. The results are compared to the traditional input-oriented variable returns-to-scale Data Envelopment Analysis (DEA) model. The application of SMAOM to the MHS increases the expected system-wide technical efficiency by 18% over the DEA model while also accounting for uncertainty of health system inputs and outputs. The developed method is useful for decision-makers in the Defense Health Agency (DHA), who have a strategic level objective of integrating clinical and business processes through better sharing of resources across the MHS and through system-wide standardization across the services. It is also less sensitive to data outliers or sampling errors than traditional DEA methods.

  9. The Blazar 3C 66A in 2003-2004: hadronic versus leptonic model fits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reimer, A.; Joshi, M.; Boettcher, M.

    2008-12-24

    The low-frequency peaked BL Lac object 3C 66A was the subject of an extensive multi-wavelength campaign from July 2003 to April 2004, which included quasi-simultaneous observations at optical, X-ray, and very high energy gamma-ray frequencies. Here we apply the hadronic Synchrotron-Proton Blazar (SPB) model to the observed spectral energy distribution time-averaged over a flaring state, and compare the resulting model fits to those obtained from the application of the leptonic Synchrotron-Self-Compton (SSC) model. The results are used to identify diagnostic key predictions of the two blazar models for future multi-wavelength observations.

  10. Design strategies for human & earth systems modeling to meet emerging multi-scale decision support needs

    NASA Astrophysics Data System (ADS)

    Spak, S.; Pooley, M.

    2012-12-01

    The next generation of coupled human and earth systems models promises immense potential and grand challenges as they transition toward new roles as core tools for defining and living within planetary boundaries. New frontiers in community model development include not only computational, organizational, and geophysical process questions, but also the twin objectives of more meaningfully integrating the human dimension and extending applicability to informing policy decisions on a range of new and interconnected issues. We approach these challenges by posing key policy questions that require more comprehensive coupled human and geophysical models, identifying necessary model and organizational processes and outputs, and working backwards to determine design criteria in response to these needs. We find that modular community earth system model design must:
    * seamlessly scale in space (global to urban) and time (nowcasting to paleo-studies), fully coupled across all component systems
    * automatically differentiate to provide complete coupled forward and adjoint models for sensitivity studies, optimization applications, and 4DVAR assimilation across Earth and human observing systems
    * incorporate diagnostic tools to quantify uncertainty in couplings, and in how human activity affects them
    * integrate accessible community development and application with JIT compilation, cloud computing, game-oriented interfaces, and crowd-sourced problem-solving
    We outline accessible near-term objectives toward these goals, and describe attempts to incorporate these design objectives in recent pilot activities using atmosphere-land-ocean-biosphere-human models (WRF-Chem, IBIS, UrbanSim) at urban and regional scales for policy applications in climate, energy, and air quality.

  11. Modeling functional neuroanatomy for an anatomy information system.

    PubMed

    Niggemann, Jörg M; Gebert, Andreas; Schulz, Stefan

    2008-01-01

    Existing neuroanatomical ontologies, databases and information systems, such as the Foundational Model of Anatomy (FMA), represent outgoing connections from brain structures, but cannot represent the "internal wiring" of structures and as such, cannot distinguish between different independent connections from the same structure. Thus, a fundamental aspect of Neuroanatomy, the functional pathways and functional systems of the brain such as the pupillary light reflex system, is not adequately represented. This article identifies underlying anatomical objects which are the source of independent connections (collections of neurons) and uses these as basic building blocks to construct a model of functional neuroanatomy and its functional pathways. The basic representational elements of the model are unnamed groups of neurons or groups of neuron segments. These groups, their relations to each other, and the relations to the objects of macroscopic anatomy are defined. The resulting model can be incorporated into the FMA. The capabilities of the presented model are compared to the FMA and the Brain Architecture Management System (BAMS). Internal wiring as well as functional pathways can correctly be represented and tracked. This model bridges the gap between representations of single neurons and their parts on the one hand and representations of spatial brain structures and areas on the other hand. It is capable of drawing correct inferences on pathways in a nervous system. The object and relation definitions are related to the Open Biomedical Ontology effort and its relation ontology, so that this model can be further developed into an ontology of neuronal functional systems.
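
    The model's central idea - unnamed neuron groups as nodes, tagged with the macroscopic structure that contains them, with connections as edges - makes pathway tracking a graph search. The sketch below is a simplified illustration using the pupillary light reflex chain mentioned above; the group identifiers and data structures are hypothetical, not the article's formal ontology.

```python
from collections import deque

# Hypothetical "internal wiring": unnamed neuron groups (g1..g4), each tagged
# with the macroscopic structure containing it; edges are connections.
CONTAINS = {"g1": "retina", "g2": "pretectal nucleus",
            "g3": "Edinger-Westphal nucleus", "g4": "ciliary ganglion"}
CONNECTS = {"g1": ["g2"], "g2": ["g3"], "g3": ["g4"], "g4": []}

def pathway(src, dst):
    """Breadth-first search over neuron-group connections; returns the chain
    of macroscopic structures traversed, or None if dst is unreachable."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return [CONTAINS[g] for g in path]
        for nxt in CONNECTS[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(pathway("g1", "g4"))
```

    Because groups, not whole structures, carry the edges, two independent connections leaving the same macroscopic structure stay distinct, which is exactly the distinction the article says the FMA cannot represent.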

  12. Building an Ontology for Identity Resolution in Healthcare and Public Health.

    PubMed

    Duncan, Jeffrey; Eilbeck, Karen; Narus, Scott P; Clyde, Stephen; Thornton, Sidney; Staes, Catherine

    2015-01-01

    Integration of disparate information from electronic health records, clinical data warehouses, birth certificate registries and other public health information systems offers great potential for clinical care, public health practice, and research. Such integration, however, depends on correctly matching patient-specific records using demographic identifiers. Without standards for these identifiers, record linkage is complicated by issues of structural and semantic heterogeneity. Our objectives were to: 1) identify components of identity and the events subsequent to birth that result in creation, change, or sharing of identity information; 2) develop an ontology to facilitate data integration from multiple healthcare and public health sources; and 3) validate the ontology's ability to model identity-changing events over time. We interviewed domain experts in area hospitals and public health programs and developed process models describing the creation and transmission of identity information among various organizations for activities subsequent to a birth event. We searched for existing relevant ontologies. We validated the content of our ontology with simulated identity information conforming to scenarios identified in our process models. We chose the Simple Event Model (SEM) to describe events in early childhood and integrated the Clinical Element Model (CEM) for demographic information. We demonstrated the ability of the combined SEM-CEM ontology to model identity events over time. The use of an ontology can overcome issues of semantic and syntactic heterogeneity to facilitate record linkage.

  13. System Interdependency Modeling in the Design of Prognostic and Health Management Systems in Smart Manufacturing.

    PubMed

    Malinowski, M L; Beling, P A; Haimes, Y Y; LaViers, A; Marvel, J A; Weiss, B A

    2015-01-01

    The fields of risk analysis and prognostics and health management (PHM) have developed in a largely independent fashion. However, both fields share a common core goal. They aspire to manage future adverse consequences associated with prospective dysfunctions of the systems under consideration due to internal or external forces. This paper describes how two prominent risk analysis theories and methodologies - Hierarchical Holographic Modeling (HHM) and Risk Filtering, Ranking, and Management (RFRM) - can be adapted to support the design of PHM systems in the context of smart manufacturing processes. Specifically, the proposed methodologies will be used to identify targets - components, subsystems, or systems - that would most benefit from a PHM system in regards to achieving the following objectives: minimizing cost, minimizing production/maintenance time, maximizing system remaining usable life (RUL), maximizing product quality, and maximizing product output. HHM is a comprehensive modeling theory and methodology that is grounded on the premise that no system can be modeled effectively from a single perspective. It can also be used as an inductive method for scenario structuring to identify emergent forced changes (EFCs) in a system. EFCs connote trends in external or internal sources of risk to a system that may adversely affect specific states of the system. An important aspect of proactive risk management includes bolstering the resilience of the system for specific EFCs by appropriately controlling the states. Risk scenarios for specific EFCs can be the basis for the design of prognostic and diagnostic systems that provide real-time predictions and recognition of scenario changes. The HHM methodology includes visual modeling techniques that can enhance stakeholders' understanding of shared states, resources, objectives and constraints among the interdependent and interconnected subsystems of smart manufacturing systems. 
In risk analysis, HHM is often paired with Risk Filtering, Ranking, and Management (RFRM). The RFRM process provides users (e.g., technology developers, original equipment manufacturers (OEMs), technology integrators, manufacturers) with the most critical risks to the objectives, which can be used to identify the components and subsystems that would most benefit from a PHM system. A case study is presented in which HHM and RFRM are adapted for PHM in the context of an active manufacturing facility located in the United States. The methodologies help to identify the critical risks to the manufacturing process and the major components and subsystems that would most benefit from a developed PHM system.

  14. System Interdependency Modeling in the Design of Prognostic and Health Management Systems in Smart Manufacturing

    PubMed Central

    Malinowski, M.L.; Beling, P.A.; Haimes, Y.Y.; LaViers, A.; Marvel, J.A.; Weiss, B.A.

    2017-01-01

    The fields of risk analysis and prognostics and health management (PHM) have developed in a largely independent fashion. However, both fields share a common core goal: to manage future adverse consequences associated with prospective dysfunctions of the systems under consideration due to internal or external forces. This paper describes how two prominent risk analysis theories and methodologies – Hierarchical Holographic Modeling (HHM) and Risk Filtering, Ranking, and Management (RFRM) – can be adapted to support the design of PHM systems in the context of smart manufacturing processes. Specifically, the proposed methodologies will be used to identify targets – components, subsystems, or systems – that would most benefit from a PHM system with regard to achieving the following objectives: minimizing cost, minimizing production/maintenance time, maximizing system remaining useful life (RUL), maximizing product quality, and maximizing product output. HHM is a comprehensive modeling theory and methodology that is grounded on the premise that no system can be modeled effectively from a single perspective. It can also be used as an inductive method for scenario structuring to identify emergent forced changes (EFCs) in a system. EFCs connote trends in external or internal sources of risk to a system that may adversely affect specific states of the system. An important aspect of proactive risk management includes bolstering the resilience of the system for specific EFCs by appropriately controlling the states. Risk scenarios for specific EFCs can be the basis for the design of prognostic and diagnostic systems that provide real-time predictions and recognition of scenario changes. The HHM methodology includes visual modeling techniques that can enhance stakeholders’ understanding of shared states, resources, objectives and constraints among the interdependent and interconnected subsystems of smart manufacturing systems.
In risk analysis, HHM is often paired with Risk Filtering, Ranking, and Management (RFRM). The RFRM process provides users (e.g., technology developers, original equipment manufacturers (OEMs), technology integrators, manufacturers) with the most critical risks to the objectives, which can be used to identify the components and subsystems that would most benefit from a PHM system. A case study is presented in which HHM and RFRM are adapted for PHM in the context of an active manufacturing facility located in the United States. The methodologies help to identify the critical risks to the manufacturing process and the major components and subsystems that would most benefit from a developed PHM system. PMID:28664162
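
    The filtering-and-ranking step that RFRM contributes can be illustrated with a toy scoring pass. A minimal sketch (the component names, the likelihood/consequence scales, and the simple likelihood × consequence score are invented for illustration; the actual RFRM process is an eight-phase qualitative and quantitative procedure):

```python
# Toy illustration of a risk filtering/ranking pass (not the full eight-phase
# RFRM process): score each risk scenario by likelihood x consequence, then
# keep the scenarios above a criticality threshold as PHM candidates.

def rank_risks(scenarios, threshold):
    """Return scenarios whose likelihood*consequence score meets the threshold,
    sorted from most to least critical."""
    scored = [(s["name"], s["likelihood"] * s["consequence"]) for s in scenarios]
    critical = [(name, score) for name, score in scored if score >= threshold]
    return sorted(critical, key=lambda pair: pair[1], reverse=True)

# Hypothetical subsystems of a manufacturing cell.
scenarios = [
    {"name": "spindle bearing wear", "likelihood": 0.30, "consequence": 9},
    {"name": "coolant pump leak",    "likelihood": 0.10, "consequence": 4},
    {"name": "tool changer jam",     "likelihood": 0.25, "consequence": 6},
]

print(rank_risks(scenarios, threshold=1.0))
```

    In a PHM context, the scenarios that survive the filter would be the candidate components or subsystems for prognostic instrumentation.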

  15. Genome-wide association study to identify potential genetic modifiers in a canine model for Duchenne muscular dystrophy.

    PubMed

    Brinkmeyer-Langford, Candice; Balog-Alvarez, Cynthia; Cai, James J; Davis, Brian W; Kornegay, Joe N

    2016-08-22

    Duchenne muscular dystrophy (DMD) causes progressive muscle degeneration, cardiomyopathy and respiratory failure in approximately 1 in 5,000 boys. Golden Retriever muscular dystrophy (GRMD) resembles DMD both clinically and pathologically. Like DMD, GRMD exhibits remarkable phenotypic variation among affected dogs, suggesting the influence of modifiers. Understanding the role(s) of genetic modifiers of GRMD may identify genes and pathways that also modify phenotypes in DMD and reveal novel therapies. Therefore, our objective in this study was to identify genetic modifiers that affect discrete GRMD phenotypes. We performed a linear mixed-model (LMM) analysis using 16 variably affected dogs from our GRMD colony (8 dystrophic, 8 non-dystrophic). All of these dogs were full or half-siblings and were phenotyped for 19 objective, quantitative biomarkers at ages 6 and 12 months. Each biomarker was individually assessed. Gene expression profiles of 59 possible candidate genes were generated for two muscle types: the cranial tibialis and medial head of the gastrocnemius. SNPs significantly associated with GRMD biomarkers were identified on multiple chromosomes (including the X chromosome). Gene expression levels for candidate genes located near these SNPs correlated with biomarker values, suggesting possible roles as GRMD modifiers. The results of this study enhance our understanding of GRMD pathology and represent a first step toward the characterization of GRMD modifiers that may be relevant to DMD pathology. Such modifiers are likely to be useful for DMD treatment development based on their relationships to GRMD phenotypes.
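
    At its core, each per-SNP, per-biomarker test in such an analysis regresses a quantitative biomarker on genotype dosage. A minimal sketch of that underlying fixed-effect regression (all data invented; a real LMM additionally models relatedness among the sibling dogs through a random effect, which is omitted here):

```python
import math

def simple_association(genotypes, biomarker):
    """Slope, intercept and Pearson r for biomarker ~ genotype dosage (0/1/2).
    A real GWAS LMM adds a kinship random effect; this is only the bare
    fixed-effect regression underlying it."""
    n = len(genotypes)
    mx = sum(genotypes) / n
    my = sum(biomarker) / n
    sxx = sum((x - mx) ** 2 for x in genotypes)
    sxy = sum((x - mx) * (y - my) for x, y in zip(genotypes, biomarker))
    syy = sum((y - my) ** 2 for y in biomarker)
    slope = sxy / sxx
    r = sxy / math.sqrt(sxx * syy)
    return slope, my - slope * mx, r

# Invented data: 8 dogs, genotype dosage at one SNP vs. one quantitative biomarker.
g = [0, 0, 1, 1, 1, 2, 2, 2]
y = [1.0, 1.2, 1.9, 2.1, 2.0, 3.1, 2.9, 3.0]
slope, intercept, r = simple_association(g, y)
```

    A strong |r| flags the SNP for follow-up, as with the expression-correlated candidates described above.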

  16. Classification of antecedents towards safety use of health information technology: A systematic review.

    PubMed

    Salahuddin, Lizawati; Ismail, Zuraini

    2015-11-01

    This paper provides a systematic review of safety use of health information technology (IT). The first objective is to identify the antecedents towards safety use of health IT by conducting a systematic literature review (SLR). The second objective is to classify the identified antecedents based on the work system in the Systems Engineering Initiative for Patient Safety (SEIPS) model and an extension of the DeLone and McLean (D&M) information system (IS) success model. The SLR covered peer-reviewed scholarly publications between January 2000 and July 2014 and was carried out and reported based on the preferred reporting items for systematic reviews and meta-analyses (PRISMA) statement. Related articles were identified by searching the Science Direct, Medline, EMBASE, and CINAHL databases. Data extracted from the included studies were analysed based on the work system in the SEIPS model and the extended D&M IS success model. 55 articles delineating antecedents that influence the safety use of health IT were included for review. Antecedents were identified and then classified into five key categories: (1) person, (2) technology, (3) tasks, (4) organisation, and (5) environment. Specifically, person is characterised by competence, while technology is associated with system quality, information quality, and service quality. Tasks are characterised by task-related stressors. Organisation is related to training, organisation resources, and teamwork. Lastly, environment is characterised by physical layout and noise. This review provides evidence that the antecedents for safety use of health IT originate from both social and technical aspects. However, inappropriate health IT usage potentially increases the incidence of errors and produces new safety risks.
The review cautions that future implementation and adoption of health IT should carefully consider the complex interactions between social and technical elements present in healthcare settings. Copyright © 2015. Published by Elsevier Ireland Ltd.

  17. The Cost of Unintended Pregnancies for Employer-Sponsored Health Insurance Plans

    PubMed Central

    Dieguez, Gabriela; Pyenson, Bruce S.; Law, Amy W.; Lynen, Richard; Trussell, James

    2015-01-01

    Background Pregnancy is associated with a significant cost for employers providing health insurance benefits to their employees. The latest study on the topic was published in 2002, estimating the unintended pregnancy rate for women covered by employer-sponsored insurance benefits to be approximately 29%. Objectives The primary objective of this study was to update the cost of unintended pregnancy to employer-sponsored health insurance plans with current data. The secondary objective was to develop a regression model to identify the factors and associated magnitude that contribute to unintended pregnancies in the employee benefits population. Methods We developed stepwise multinomial logistic regression models using data from a national survey on maternal attitudes about pregnancy before and shortly after giving birth. The survey was conducted by the Centers for Disease Control and Prevention through mail and via telephone interviews between 2009 and 2011 of women who had had a live birth. The regression models were then applied to a large commercial health claims database from the Truven Health MarketScan to retrospectively assign the probability of pregnancy intention to each delivery. Results Based on the MarketScan database, we estimate that among employer-sponsored health insurance plans, 28.8% of pregnancies are unintended, which is consistent with national findings of 29% in a survey by the Centers for Disease Control and Prevention. These unintended pregnancies account for 27.4% of the annual delivery costs to employers in the United States, or approximately 1% of the typical employer's health benefits spending for 1 year. Using these findings, we present a regression model that employers could apply to their claims data to identify the risk for unintended pregnancies in their health insurance population. 
Conclusion The availability of coverage for contraception without employee cost-sharing, as was required by the Affordable Care Act in 2012, combined with the ability to identify women who are at high risk for an unintended pregnancy, can help employers address the costs of unintended pregnancies in their employee benefits population. This can also help to bring contraception efforts into the mainstream of other preventive and wellness programs, such as smoking cessation, obesity management, and diabetes control programs. PMID:26005515
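
    Once a multinomial logistic model has been fitted, assigning each delivery a probability of pregnancy intention is a softmax over per-category linear scores. A minimal sketch (the categories, predictors, and coefficients below are invented for illustration; the study's fitted model is not reproduced here):

```python
import math

def softmax_probs(scores):
    """Convert per-category linear scores into probabilities (numerically stable)."""
    m = max(scores.values())
    exps = {k: math.exp(v - m) for k, v in scores.items()}
    total = sum(exps.values())
    return {k: v / total for k, v in exps.items()}

def intention_scores(features, coefs):
    """Linear predictor per category: intercept + sum(coefficient * feature)."""
    return {
        cat: c["intercept"] + sum(c["weights"][f] * x for f, x in features.items())
        for cat, c in coefs.items()
    }

# Invented two-category model (intended vs. unintended) with invented predictors.
coefs = {
    "intended":   {"intercept": 0.2, "weights": {"age_30plus": 0.8, "married": 0.9}},
    "unintended": {"intercept": 0.5, "weights": {"age_30plus": -0.4, "married": -0.7}},
}
delivery = {"age_30plus": 1, "married": 0}
probs = softmax_probs(intention_scores(delivery, coefs))
```

    Applied row by row to claims records, this yields the per-delivery intention probabilities that are then aggregated into the unintended-pregnancy share.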

  18. A scoping review of malaria forecasting: past work and future directions

    PubMed Central

    Zinszer, Kate; Verma, Aman D; Charland, Katia; Brewer, Timothy F; Brownstein, John S; Sun, Zhuoyu; Buckeridge, David L

    2012-01-01

    Objectives There is a growing body of literature on malaria forecasting methods and the objective of our review is to identify and assess methods, including predictors, used to forecast malaria. Design Scoping review. Two independent reviewers searched information sources, assessed studies for inclusion and extracted data from each study. Information sources Search strategies were developed and the following databases were searched: CAB Abstracts, EMBASE, Global Health, MEDLINE, ProQuest Dissertations & Theses and Web of Science. Key journals and websites were also manually searched. Eligibility criteria for included studies We included studies that forecasted incidence, prevalence or epidemics of malaria over time. A description of the forecasting model and an assessment of the forecast accuracy of the model were requirements for inclusion. Studies were restricted to human populations and to autochthonous transmission settings. Results We identified 29 different studies that met our inclusion criteria for this review. The forecasting approaches included statistical modelling, mathematical modelling and machine learning methods. Climate-related predictors were used consistently in forecasting models, with the most common predictors being rainfall, relative humidity, temperature and the normalised difference vegetation index. Model evaluation was typically based on a reserved portion of data and accuracy was measured in a variety of ways including mean-squared error and correlation coefficients. We could not compare the forecast accuracy of models from the different studies as the evaluation measures differed across the studies. 
Conclusions Applying different forecasting methods to the same data, exploring the predictive ability of non-environmental variables, including transmission reducing interventions and using common forecast accuracy measures will allow malaria researchers to compare and improve models and methods, which should improve the quality of malaria forecasting. PMID:23180505
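
    The evaluation pattern described, scoring forecasts on a reserved portion of the data with mean squared error and a correlation coefficient, can be sketched as follows (the monthly case counts are invented):

```python
import math

def mse(actual, predicted):
    """Mean squared error over paired observations."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented monthly malaria case counts: the model is trained on earlier months
# and evaluated on this reserved holdout, as in most of the reviewed studies.
actual_holdout   = [120, 95, 140, 180, 160, 110]
forecast_holdout = [110, 100, 150, 170, 150, 120]

print("MSE:", mse(actual_holdout, forecast_holdout))
print("r:  ", pearson_r(actual_holdout, forecast_holdout))
```

    Reporting both measures on the same holdout is exactly the kind of common yardstick the authors call for to make models comparable across studies.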

  19. 2D approaches to 3D watermarking: state-of-the-art and perspectives

    NASA Astrophysics Data System (ADS)

    Mitrea, M.; Duţă, S.; Prêteux, F.

    2006-02-01

    With the advent of the Information Society, video, audio, speech, and 3D media represent the source of huge economic benefits. Consequently, there is a continuously increasing demand for protecting their related intellectual property rights. The solution can be provided by robust watermarking, a research field which has exploded in the last 7 years. However, the largest part of the scientific effort was devoted to video and audio protection, with 3D objects being comparatively neglected. In the absence of any standardisation attempt, the paper starts by summarising the approaches developed in this respect and by further identifying the main challenges to be addressed in the next years. Then, it describes an original oblivious watermarking method devoted to the protection of 3D objects represented by NURBS (Non-Uniform Rational B-Spline) surfaces. Applied to both free-form objects and CAD models, the method exhibited very good transparency (no visible differences between the marked and the unmarked model) and robustness (with respect to both traditional attacks and NURBS processing).
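
    The defining property of an oblivious (blind) watermark, extraction without access to the original model, can be illustrated on control-point coordinates using quantization index modulation. This is a generic sketch of the idea, not the paper's actual NURBS method:

```python
# Generic illustration of *oblivious* (blind) watermarking via quantization
# index modulation (QIM) on control-point coordinates. NOT the paper's actual
# NURBS scheme: extraction here needs only the marked model and the
# quantization step, never the original surface.

STEP = 0.01  # quantization step; smaller = more transparent, less robust

def embed_bit(coord, bit):
    """Snap a coordinate to the nearest quantizer cell whose parity encodes `bit`."""
    q = round(coord / STEP)
    if q % 2 != bit:
        q += 1
    return q * STEP

def extract_bit(coord):
    """Blind extraction: read the bit back from the cell parity alone."""
    return round(coord / STEP) % 2

# Hypothetical z-coordinates of NURBS control points and a 4-bit mark.
coords = [1.2345, 0.5678, 2.3456, 0.9876]
mark = [1, 0, 1, 1]
marked = [embed_bit(c, b) for c, b in zip(coords, mark)]
recovered = [extract_bit(c) for c in marked]
```

    The perturbation per control point is bounded by the quantization step, which is what keeps the marked surface visually indistinguishable from the original.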

  20. Crowding with conjunctions of simple features.

    PubMed

    Põder, Endel; Wagemans, Johan

    2007-11-20

    Several recent studies have related crowding with the feature integration stage in visual processing. In order to understand the mechanisms involved in this stage, it is important to use stimuli that have several features to integrate, and these features should be clearly defined and measurable. In this study, Gabor patches were used as target and distractor stimuli. The stimuli differed in three dimensions: spatial frequency, orientation, and color. A group of 3, 5, or 7 objects was presented briefly at 4 deg eccentricity of the visual field. The observers' task was to identify the object located in the center of the group. A strong effect of the number of distractors was observed, consistent with various spatial pooling models. The analysis of incorrect responses revealed that these were a mix of feature errors and mislocalizations of the target object. Feature errors were not purely random, but biased by the features of distractors. We propose a simple feature integration model that predicts most of the observed regularities.
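
    The set-size effect reported is what any simple spatial pooling account predicts: as more distractors fall inside the integration field, the probability that the target's features dominate the pooled representation drops. A toy rendering (the weighting scheme is an illustrative assumption, not the authors' model):

```python
def report_probabilities(n_objects, target_weight=2.0, distractor_weight=1.0):
    """Toy spatial pooling: each object's features enter the pooled
    representation with a weight (the central target weighted more heavily),
    and the reported object is drawn in proportion to its weight."""
    n_distractors = n_objects - 1
    total = target_weight + n_distractors * distractor_weight
    p_target = target_weight / total
    p_each_distractor = distractor_weight / total
    return p_target, p_each_distractor

# Group sizes used in the study: 3, 5, or 7 Gabor patches.
for n in (3, 5, 7):
    p_t, p_d = report_probabilities(n)
    print(n, round(p_t, 3), round(p_d, 3))
```

    The falling target-report probability with group size mirrors the strong distractor-number effect, and the nonzero per-distractor probability corresponds to the distractor-biased feature errors and mislocalizations observed.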

  1. A knowledge based software engineering environment testbed

    NASA Technical Reports Server (NTRS)

    Gill, C.; Reedy, A.; Baker, L.

    1985-01-01

    The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include the introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.

  2. Predictive Modeling and Concentration of the Risk of Suicide: Implications for Preventive Interventions in the US Department of Veterans Affairs

    PubMed Central

    McCarthy, John F.; Katz, Ira R.; Thompson, Caitlin; Kemp, Janet; Hannemann, Claire M.; Nielson, Christopher; Schoenbaum, Michael

    2015-01-01

    Objectives. The Veterans Health Administration (VHA) evaluated the use of predictive modeling to identify patients at risk for suicide and to supplement ongoing care with risk-stratified interventions. Methods. Suicide data came from the National Death Index. Predictors were measures from VHA clinical records incorporating patient-months from October 1, 2008, to September 30, 2011, for all suicide decedents and 1% of living patients, divided randomly into development and validation samples. We used data on all patients alive on September 30, 2010, to evaluate predictions of suicide risk over 1 year. Results. Modeling demonstrated that suicide rates were 82 and 60 times greater than the rate in the overall sample in the highest 0.01% stratum for calculated risk for the development and validation samples, respectively; 39 and 30 times greater in the highest 0.10%; 14 and 12 times greater in the highest 1.00%; and 6.3 and 5.7 times greater in the highest 5.00%. Conclusions. Predictive modeling can identify high-risk patients who were not identified on clinical grounds. VHA is developing modeling to enhance clinical care and to guide the delivery of preventive interventions. PMID:26066914
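
    The risk-concentration figures reported (event rate in a top stratum of calculated risk relative to the overall rate) can be computed directly from ranked predictions. A minimal sketch with invented data:

```python
def stratum_rate_ratio(records, top_fraction):
    """Ratio of the event rate in the top `top_fraction` of predicted risk
    to the event rate in the whole sample."""
    ranked = sorted(records, key=lambda r: r["risk"], reverse=True)
    k = max(1, int(len(ranked) * top_fraction))
    overall_rate = sum(r["event"] for r in records) / len(records)
    top_rate = sum(r["event"] for r in ranked[:k]) / k
    return top_rate / overall_rate

# Invented sample: 10 patients with model-predicted risks and 1-year outcomes.
records = [
    {"risk": 0.90, "event": 1}, {"risk": 0.80, "event": 1},
    {"risk": 0.70, "event": 0}, {"risk": 0.40, "event": 0},
    {"risk": 0.30, "event": 1}, {"risk": 0.20, "event": 0},
    {"risk": 0.10, "event": 0}, {"risk": 0.05, "event": 0},
    {"risk": 0.04, "event": 0}, {"risk": 0.02, "event": 0},
]
print(stratum_rate_ratio(records, top_fraction=0.2))
```

    Applied to strata of 0.01%, 0.10%, 1.00%, and 5.00% of a large cohort, this is the calculation behind the 82x, 39x, 14x, and 6.3x figures above.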

  3. Cognitive object recognition system (CORS)

    NASA Astrophysics Data System (ADS)

    Raju, Chaitanya; Varadarajan, Karthik Mahesh; Krishnamurthi, Niyant; Xu, Shuli; Biederman, Irving; Kelley, Troy

    2010-04-01

    We have developed a framework, Cognitive Object Recognition System (CORS), inspired by current neurocomputational models and psychophysical research in which multiple recognition algorithms (shape-based geometric primitives, 'geons,' and non-geometric feature-based algorithms) are integrated to provide a comprehensive solution to object recognition and landmarking. Objects are defined as a combination of geons, corresponding to their simple parts, and the relations among the parts. However, those objects that are not easily decomposable into geons, such as bushes and trees, are recognized by CORS using "feature-based" algorithms. The unique interaction between these algorithms is a novel approach that combines the effectiveness of both algorithms and takes us closer to a generalized approach to object recognition. CORS allows recognition of objects through a larger range of poses using geometric primitives and performs well under heavy occlusion - about 35% of the object surface is sufficient. Furthermore, geon composition of an object allows image understanding and reasoning even with novel objects. With reliable landmarking capability, the system improves vision-based robot navigation in GPS-denied environments. Feasibility of the CORS system was demonstrated with real stereo images captured from a Pioneer robot. The system can currently identify doors, door handles, staircases, trashcans and other relevant landmarks in the indoor environment.

  4. Understanding the Effect of Land Cover Classification on Model Estimates of Regional Carbon Cycling in the Boreal Forest Biome

    NASA Technical Reports Server (NTRS)

    Kimball, John; Kang, Sinkyu

    2003-01-01

    The original objectives of this proposed 3-year project were to: 1) quantify the respective contributions of land cover and disturbance (i.e., wild fire) to uncertainty associated with regional carbon source/sink estimates produced by a variety of boreal ecosystem models; 2) identify the model processes responsible for differences in simulated carbon source/sink patterns for the boreal forest; 3) validate model outputs using tower- and field-based estimates of NEP and NPP; and 4) recommend/prioritize improvements to boreal ecosystem carbon models, which will better constrain regional source/sink estimates for atmospheric CO2. These original objectives were subsequently distilled to fit within the constraints of a 1-year study. This revised study involved a regional model intercomparison over the BOREAS study region involving the Biome-BGC and TEM (A.D. McGuire, UAF) ecosystem models. The major focus of these revised activities involved quantifying the sensitivity of regional model predictions associated with land cover classification uncertainties. We also evaluated the individual and combined effects of historical fire activity, historical atmospheric CO2 concentrations, and climate change on carbon and water flux simulations within the BOREAS study region.

  5. Fundamental study of ash formation and deposition: Effect of reducing stoichiometry. Final report, April 1, 1993--June 30, 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bool, L.E. III; Helble, J.J.; Shah, N.

    1995-09-01

    The technical objectives of this project are: (1) To identify the partitioning of inorganic coal constituents among vapor, submicron fume, and fly ash products generated during the combustion of pulverized coal under a variety of combustion conditions. Fuel lean and fuel rich combustion conditions are considered. (2) To identify and quantify the fundamental processes by which the transformations of minerals and organically-associated inorganic species occur. Emphasis is placed on identifying any changes that occur as a result of combustion under sub-stoichiometric combustion conditions. (3) To incorporate the effects of combustion stoichiometry into an Engineering Model for Ash Formation.

  6. Building a balanced scorecard for a burn center.

    PubMed

    Wachtel, T L; Hartford, C E; Hughes, J A

    1999-08-01

    The Balanced Scorecard provides a model that can be adapted to the management of any burn center, burn service or burn program. This model enables an organization to translate its mission and vision into specific strategic objectives across four perspectives: (1) the financial perspective; (2) the customer service perspective; (3) the internal business perspective; and (4) the growth and learning perspective. Once the appropriate objectives are identified, the Balanced Scorecard guides the organization to develop reasonable performance measures and establishes targets, initiatives and alternatives to meet programmatic goals and pursue longer-term visionary improvements. We used the burn center at the University of Colorado Health Sciences Center to test whether the Balanced Scorecard methodology was appropriate for the core business plan of a healthcare strategic business unit (i.e. a burn center).

  7. Bunching at the kink: implications for spending responses to health insurance contracts

    PubMed Central

    Einav, Liran; Finkelstein, Amy

    2017-01-01

    A large literature in empirical public finance relies on “bunching” to identify a behavioral response to non-linear incentives and to translate this response into an economic object to be used counterfactually. We conduct this type of analysis in the context of prescription drug insurance for the elderly in Medicare Part D, where a kink in the individual’s budget set generates substantial bunching in annual drug expenditure around the famous “donut hole”. We show that different alternative economic models can match the basic bunching pattern, but have very different quantitative implications for the counterfactual spending response to alternative insurance contracts. These findings illustrate the importance of modeling choices in mapping a compelling reduced form pattern into an economic object of interest. PMID:28785121
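
    Measuring bunching starts from the excess mass in the expenditure histogram around the kink relative to a smooth counterfactual. A minimal sketch (the spending histogram is invented; published estimators fit a polynomial counterfactual to the bins outside the window rather than the flat mean used here):

```python
def excess_mass(counts, kink_bin, bandwidth):
    """Observed minus counterfactual mass in a window around the kink bin.
    Real bunching estimators fit a polynomial counterfactual to the bins
    outside the window; a flat mean keeps this sketch minimal."""
    window = range(kink_bin - bandwidth, kink_bin + bandwidth + 1)
    outside = [c for i, c in enumerate(counts) if i not in window]
    counterfactual_per_bin = sum(outside) / len(outside)
    observed = sum(counts[i] for i in window)
    return observed - counterfactual_per_bin * len(window)

# Invented histogram of annual drug spending (equal-width bins), with the
# donut-hole kink falling in bin 4, where the spike sits.
counts = [100, 98, 102, 99, 180, 101, 97, 103, 100]
print(excess_mass(counts, kink_bin=4, bandwidth=1))
```

    As the abstract stresses, this excess-mass statistic alone does not pin down counterfactual spending under a different contract: different structural models can rationalize the same bunching yet imply different responses.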

  8. The Accuracy of the VISA-P Questionnaire, Single-Leg Decline Squat, and Tendon Pain History to Identify Patellar Tendon Abnormalities in Adult Athletes.

    PubMed

    Mendonça, Luciana de Michelis; Ocarino, Juliana Melo; Bittencourt, Natália Franco Netto; Fernandes, Ludmila Maria Oliveira; Verhagen, Evert; Fonseca, Sérgio Teixeira

    2016-08-01

    Study Design Cross-sectional clinical assessment. Background Patellar tendinopathy is not always accompanied by patellar tendon abnormalities (PTAs). Thus, clinical screening tools to help identify patients with patellar tendon pain who have PTAs could enhance clinical decision making and patient prognosis. Objectives To test the diagnostic accuracy of the Victorian Institute of Sport Assessment-Patella (VISA-P) questionnaire, a single-leg decline squat (SLDS), tendon pain history, age, and years of sports participation to identify athletes with symptomatic patellar tendons who have PTAs confirmed on imaging. Methods Data provided by ultrasound examination, the VISA-P questionnaire, the SLDS, tendon pain history, age, and years of sport participation were collected in 43 athletes. A classification and regression tree (CART) model was developed to verify variables associated with PTA occurrence. Likelihood ratios (LRs) were computed for positive and negative tests. Results The SLDS, VISA-P questionnaire, and tendon pain history were associated with PTA occurrence. Athletes with negative results on all 3 tests (CART model) had a lower likelihood of having PTAs (negative LR = 0.3; 95% confidence interval [CI]: 0.2, 0.5). The isolated use of the SLDS or tendon pain history (positive LR = 4.2; 95% CI: 2.3, 7.14 and 4.5; 95% CI: 1.8, 11.1, respectively) had similar influence on probability of PTA presence compared to the CART model (positive LR = 4.1; 95% CI: 2.5, 6.3). Conclusion Although the objective was to investigate a clinical test to identify PTAs, the combined use of the tests had greater accuracy to identify individuals without PTAs. Level of Evidence Diagnosis, level 3b. J Orthop Sports Phys Ther 2016;46(8):673-680. Epub 3 Jul 2016. doi:10.2519/jospt.2016.6192.
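
    The reported likelihood ratios follow from sensitivity and specificity, and map a pre-test probability to a post-test probability through odds. A sketch (the sensitivity and specificity below are hypothetical values chosen to yield a positive LR near the reported 4.2; they are not the study's raw counts, which the abstract does not give):

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios of a diagnostic test."""
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

def post_test_probability(pre_test_prob, lr):
    """Update a pre-test probability with a likelihood ratio via odds."""
    odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = odds * lr
    return post_odds / (1 + post_odds)

# Hypothetical test characteristics for a clinical screen like the SLDS.
lr_pos, lr_neg = likelihood_ratios(0.84, 0.80)
p_after_positive = post_test_probability(0.50, lr_pos)
p_after_negative = post_test_probability(0.50, lr_neg)
```

    A negative LR of 0.3 on the combined three-test rule, as found above, shifts a 50% pre-test probability down to about 23%, which is why the test battery was most useful for ruling PTAs out.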

  9. Numerical and analytical investigation towards performance enhancement of a newly developed rockfall protective cable-net structure

    NASA Astrophysics Data System (ADS)

    Dhakal, S.; Bhandary, N. P.; Yatabe, R.; Kinoshita, N.

    2012-04-01

    In a previous companion paper, we presented a three-tier modelling of a particular type of rockfall protective cable-net structure (barrier), newly developed in Japan. Therein, we developed a three-dimensional, Finite-Element-based, nonlinear numerical model that was calibrated/back-calculated and verified with the element- and structure-level physical tests. Moreover, using a very simple, lumped-mass, single-degree-of-freedom, equivalently linear analytical model, a global-displacement-predictive correlation was devised by modifying the basic equation - obtained by combining the principles of conservation of linear momentum and energy - based on the back-analysis of the tests on the numerical model. In this paper, we use the developed models to explore the performance enhancement potential of the structure in terms of (a) the control of global displacement - possibly the major performance criterion for the proposed structure owing to a narrow space available in the targeted site, and (b) the increase in energy dissipation by the existing U-bolt-type Friction-brake Devices - which are identified to have performed weakly when integrated into the structure. A set of parametric investigations has revealed correlations to achieve the first objective in terms of the structure's mass, particularly by manipulating the wire-net's characteristics, and has additionally disclosed the effects of the impacting-block's parameters. Towards achieving the second objective, another set of parametric investigations has led to a proposal of a few innovative improvements in the constitutive behaviour (model) of the studied brake device (dissipator), in addition to an important recommendation of careful handling of the device based on the identified potential flaw.
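
    The analytical model's starting relation, before the test-based modification the authors apply, combines conservation of linear momentum through the impact with a post-impact energy balance. A hedged reconstruction in generic symbols (the paper's calibrated correlation is not reproduced here):

```latex
% Plastic impact of a block (mass m_b, impact speed v_0) into the engaged
% structural mass m_s; momentum is conserved through the impact:
m_b v_0 = (m_b + m_s)\, v_1
\quad\Rightarrow\quad
v_1 = \frac{m_b}{m_b + m_s}\, v_0

% Post-impact energy balance against an equivalent linear stiffness k_{eq}
% gives the peak global displacement \delta_{\max}:
\tfrac{1}{2}(m_b + m_s)\, v_1^2 = \tfrac{1}{2}\, k_{eq}\, \delta_{\max}^2
\quad\Rightarrow\quad
\delta_{\max} = v_0\, \frac{m_b}{\sqrt{k_{eq}\,(m_b + m_s)}}
```

    This form makes the parametric findings plausible: increasing the engaged structural mass or the equivalent stiffness of the net reduces the peak displacement, which is the first objective above.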

  10. Identifying black swans in NextGen: predicting human performance in off-nominal conditions.

    PubMed

    Wickens, Christopher D; Hooey, Becky L; Gore, Brian F; Sebok, Angelia; Koenicke, Corey S

    2009-10-01

    The objective is to validate a computational model of visual attention against empirical data (derived from a meta-analysis) of pilots' failure to notice safety-critical unexpected events. Many aircraft accidents have resulted, in part, because of failure to notice nonsalient unexpected events outside of foveal vision, illustrating the phenomenon of change blindness. A model of visual noticing, N-SEEV (noticing-salience, expectancy, effort, and value), was developed to predict these failures. First, 25 studies that reported objective data on miss rate for unexpected events in high-fidelity cockpit simulations were identified, and their miss rate data pooled across five variables (phase of flight, event expectancy, event location, presence of a head-up display, and presence of a highway-in-the-sky display). Second, the parameters of the N-SEEV model were tailored to mimic these dichotomies. The N-SEEV model output predicted variance in the obtained miss rate (r = .73). The individual miss rates of all six dichotomous conditions were predicted within 14%, and four of these were predicted within 7%. The N-SEEV model, developed on the basis of an independent data set, was able to successfully predict variance in this safety-critical measure of pilot response to abnormal circumstances, as collected from the literature. As new technology and procedures are envisioned for the future airspace, it is important to predict if these may compromise safety in terms of pilots' failing to notice unexpected events. Computational models such as N-SEEV support cost-effective means of making such predictions.
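
    The N-SEEV ingredients can be made concrete with a toy scoring function: salience, expectancy, and value raise the chance an event is noticed, while effort (e.g., eccentricity from current fixation) lowers it. The linear-logistic form and the weights below are illustrative assumptions, not the published model's parameterisation:

```python
import math

def notice_probability(salience, expectancy, effort, value,
                       weights=(1.0, 1.0, 1.0, 1.0)):
    """Toy N-SEEV-style score: salience, expectancy and value raise the odds
    of noticing an event; effort lowers them. Squashed through a logistic so
    the result is a probability (illustrative form, not the published model)."""
    w_s, w_e, w_f, w_v = weights
    score = w_s * salience + w_e * expectancy - w_f * effort + w_v * value
    return 1.0 / (1.0 + math.exp(-score))

# A salient, expected, high-value event near fixation vs. a non-salient,
# unexpected, peripheral one (all inputs on invented 0-1 scales).
p_easy = notice_probability(0.9, 0.8, 0.1, 0.9)
p_hard = notice_probability(0.1, 0.1, 0.9, 0.3)
```

    Low-salience, unexpected, peripheral events score lowest, which is the change-blindness regime the meta-analysis targets.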

  11. Prior knowledge guided active modules identification: an integrated multi-objective approach.

    PubMed

    Chen, Weiqi; Liu, Jing; He, Shan

    2017-03-14

    Active module, defined as an area in a biological network that shows striking changes in molecular activity or phenotypic signatures, is important to reveal dynamic and process-specific information that is correlated with cellular or disease states. A prior information guided active module identification approach is proposed to detect modules that are both active and enriched by prior knowledge. We formulate the active module identification problem as a multi-objective optimisation problem, which consists of two conflicting objective functions: maximising the coverage of known biological pathways and maximising the activity of the active module simultaneously. The network is constructed from a protein-protein interaction database. A beta-uniform-mixture model is used to estimate the distribution of p-values and generate scores for activity measurement from microarray data. A multi-objective evolutionary algorithm is used to search for Pareto optimal solutions. We also incorporate a novel constraint based on algebraic connectivity to ensure the connectedness of the identified active modules. Application of the proposed algorithm on a small yeast molecular network shows that it can identify modules with high activities and with more cross-talk nodes between related functional groups. The Pareto solutions generated by the algorithm provide solutions with different trade-offs between prior knowledge and novel information from data. The approach is then applied on microarray data from diclofenac-treated yeast cells to build a network and identify modules to elucidate the molecular mechanisms of diclofenac toxicity and resistance. Gene ontology analysis is applied to the identified modules for biological interpretation. Integrating knowledge of functional groups into the identification of active modules is an effective method and provides a flexible control of the balance between a pure data-driven method and prior information guidance.
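
    The Pareto optimality the evolutionary search targets has a compact definition: one candidate dominates another if it is at least as good on both objectives (here, pathway coverage and module activity) and strictly better on at least one; the front is the set of non-dominated candidates. A minimal sketch with invented scores:

```python
def dominates(a, b):
    """a dominates b if a is >= b on every objective and > on at least one
    (both objectives are maximised)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep the solutions not dominated by any other solution."""
    return [s for s in solutions
            if not any(dominates(other, s) for other in solutions if other != s)]

# Invented (pathway coverage, activity score) pairs for candidate modules.
candidates = [(0.9, 0.2), (0.7, 0.6), (0.4, 0.8), (0.6, 0.5), (0.2, 0.3)]
print(pareto_front(candidates))
```

    Each surviving point is one trade-off between prior knowledge and data-driven activity, which is exactly the choice the Pareto solutions offer the analyst.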

  12. Risk adjustment in the American College of Surgeons National Surgical Quality Improvement Program: a comparison of logistic versus hierarchical modeling.

    PubMed

    Cohen, Mark E; Dimick, Justin B; Bilimoria, Karl Y; Ko, Clifford Y; Richards, Karen; Hall, Bruce Lee

    2009-12-01

    Although logistic regression has commonly been used to adjust for risk differences in patient and case mix to permit quality comparisons across hospitals, hierarchical modeling has been advocated as the preferred methodology, because it accounts for clustering of patients within hospitals. It is unclear whether hierarchical models would yield important differences in quality assessments compared with logistic models when applied to American College of Surgeons (ACS) National Surgical Quality Improvement Program (NSQIP) data. Our objective was to evaluate differences in logistic versus hierarchical modeling for identifying hospitals with outlying outcomes in the ACS-NSQIP. Data from ACS-NSQIP patients who underwent colorectal operations in 2008 at hospitals that reported at least 100 operations were used to generate logistic and hierarchical prediction models for 30-day morbidity and mortality. Differences in risk-adjusted performance (ratio of observed-to-expected events) and outlier detections from the two models were compared. Logistic and hierarchical models identified the same 25 hospitals as morbidity outliers (14 low and 11 high outliers), but the hierarchical model identified 2 additional high outliers. Both models identified the same eight hospitals as mortality outliers (five low and three high outliers). The values of observed-to-expected events ratios and p values from the two models were highly correlated. Results were similar when data were included from hospitals providing fewer than 100 patients. When applied to ACS-NSQIP data, logistic and hierarchical models provided nearly identical results with respect to identification of hospitals' observed-to-expected events ratio outliers. As hierarchical models are prone to implementation problems, logistic regression will remain an accurate and efficient method for performing risk adjustment of hospital quality comparisons.
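    The observed-to-expected ratio reported by both models can be sketched as follows. The outcomes and predicted risks below are hypothetical; in the ACS-NSQIP the expected counts come from the fitted logistic or hierarchical risk model.

```python
def oe_ratio(outcomes, predicted_risks):
    """Observed-to-expected events ratio for one hospital.
    outcomes: 0/1 event indicators; predicted_risks: model-predicted
    probabilities for the same patients. Ratios well above 1 flag
    potential high outliers; well below 1, potential low outliers."""
    observed = sum(outcomes)
    expected = sum(predicted_risks)
    return observed / expected

# A hospital with more events than its case mix predicts has O/E > 1.
ratio = oe_ratio([1, 0, 1, 1], [0.2, 0.1, 0.3, 0.2])  # 3 observed vs 0.8 expected
```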

  13. Hepatic Proteomic Analysis Revealed Altered Metabolic Pathways in Insulin Resistant Akt1+/-/Akt2-/-Mice

    PubMed Central

    Pedersen, Brian A; Wang, Weiwen; Taylor, Jared F; Khattab, Omar S; Chen, Yu-Han; Edwards, Robert A; Yazdi, Puya G; Wang, Ping H

    2015-01-01

    Objective The aim of this study was to identify liver proteome changes in a mouse model of severe insulin resistance and markedly decreased leptin levels. Methods Two-dimensional differential gel electrophoresis was utilized to identify liver proteome changes in Akt1+/-/Akt2-/- mice. Proteins with altered levels were identified with tandem mass spectrometry. Ingenuity Pathway analysis was performed for the interpretation of the biological significance of the observed proteomic changes. Results Eleven proteins were identified from two biological replicates as differentially expressed by a ratio of at least 1.3 between age-matched insulin-resistant (Akt1+/-/Akt2-/-) and wild-type mice. Albumin and mitochondrial ornithine aminotransferase were detected in multiple spots, which suggests post-translational modification. Enzymes of the urea cycle were common members of the top regulated pathways. Conclusion Our results help to unveil the regulation of the liver proteome underlying altered metabolism in an animal model of severe insulin resistance. PMID:26455965

  14. Estimating migratory fish distribution from altitude and basin area: a case study in a large Neotropical river

    Treesearch

    Jose Ricardo Barradas; Lucas G. Silva; Bret C. Harvey; Nelson F. Fontoura

    2012-01-01

    1. The objective of this study was to identify longitudinal distribution patterns of large migratory fish species in the Uruguay River basin, southern Brazil, and construct statistical distribution models for Salminus brasiliensis, Prochilodus lineatus, Leporinus obtusidens and Pseudoplatystoma corruscans. 2. The sampling programme resulted in 202 interviews with old...

  15. Chapter 9 - Vegetation succession modeling for the LANDFIRE Prototype Project

    Treesearch

    Donald Long; B. John (Jack) Losensky; Donald Bedunah

    2006-01-01

    One of the main objectives of the Landscape Fire and Resource Management Planning Tools Prototype Project, or LANDFIRE Prototype Project, was to determine departure of current vegetation conditions from the range and variation of conditions that existed during the historical era identified in the LANDFIRE guidelines as 1600-1900 A.D. (Keane and Rollins, Ch. 3). In...

  16. Local Adaptation of Central Policies: The Policymaking and Implementation of Compulsory Education for Migrant Children in China

    ERIC Educational Resources Information Center

    Wang, Lihua

    2016-01-01

    This article looks at the central and local governments' policymaking and implementation of compulsory education for migrant children in China. Three distinct models of policy implementation were identified through a case study approach. They indicated a selective adaptation of central policy objective and principles by the local governments and…

  17. Patterns of Therapist Variability: Therapist Effects and the Contribution of Patient Severity and Risk

    ERIC Educational Resources Information Center

    Saxon, David; Barkham, Michael

    2012-01-01

    Objective: To investigate the size of therapist effects using multilevel modeling (MLM), to compare the outcomes of therapists identified as above and below average, and to consider how key variables--in particular patient severity and risk and therapist caseload--contribute to therapist variability and outcomes. Method: We used a large…

  18. Risk Factors for Unidirectional and Bidirectional Intimate Partner Violence among Young Adults

    ERIC Educational Resources Information Center

    Renner, Lynette M.; Whitney, Stephen D.

    2012-01-01

    Objective: The purpose of this study was to identify common and unique risk factors for intimate partner violence (IPV) among young adults in relationships. Guided by two models of IPV, the same set of risk factors was used to examine outcomes of unidirectional (perpetration or victimization) and bidirectional (reciprocal) IPV separately for males…

  19. Syntax "and" Semantics: A Teaching Model.

    ERIC Educational Resources Information Center

    Wolfe, Frank

    In translating perception into written language, a child must learn an encoding process which is a continuation of the process of improving sensing of the world around him or her. To verbalize an object (a perception) we use frames which name a referent, locate the referent in space and time, identify its appearance and behavior, and define terms…

  20. Testing Theories of Dietary Behavior Change in Youth Using the Mediating Variable Model with Intervention Programs

    ERIC Educational Resources Information Center

    Cerin, Ester; Barnett, Anthony; Baranowski, Tom

    2009-01-01

    Objective: To review and critique current experimentally-based evidence of theoretical mechanisms of dietary behavior change in youth and provide recommendations on ways to enhance theory evaluation. Methods: Interventions that examined mediators of dietary behavior change in youth (age 5-18 years) were identified via electronic database searches…

  1. Getting Children Home: Hospital to Community. Workbook Series for Providing Services to Children with Handicaps and Their Families.

    ERIC Educational Resources Information Center

    Bilotti, Gene

    The workbook presents a model of care for children with severe medical involvement that features the professional care manager. Three phases in planning for home/community discharge are identified: identification of the candidate for in-home care; identification of specific objectives, service providers, funding sources, etc.; and full…

  2. Near-infrared spectroscopy measurement of abdominal tissue oxygenation is a useful indicator of intestinal blood flow and necrotizing enterocolitis in premature piglets

    USDA-ARS?s Scientific Manuscript database

    A major objective of necrotizing enterocolitis (NEC) research is to devise a noninvasive method of early detection. We hypothesized that abdominal near-infrared spectroscopy (A-NIRS) readings will identify impending NEC in a large animal model. Piglets were prematurely delivered and received parenter...

  3. Changing State-University Relations: The Experiences of Japan and Lessons for Malaysia

    ERIC Educational Resources Information Center

    Sirat, Morshidi; Kaur, Sarjit

    2010-01-01

    This article investigates the changing state-university relations in Japan and Malaysia. Its main objective is to identify and examine possible lessons for Malaysia, based on the Japanese experience. Notably, since the late 1970s, Malaysia has been looking towards Japan as a model for socio-economic development (the "look-east" Policy)…

  4. A Multilevel Model to Examine Adolescent Outcomes in Outdoor Behavioral Healthcare: The Parent Perspective

    ERIC Educational Resources Information Center

    Combs, Katie Massey; Hoag, Matthew J.; Roberts, Sean D.; Javorski, Stephen

    2016-01-01

    Background: Outdoor Behavioral Healthcare (OBH) has arisen to fill a gap in mental health treatment. While research shows large positive changes in adolescent self-reports, little is known about predictors of change, longitudinal outcomes, and parent-reports of change. Objective This study sought to identify treatment outcomes up to 18 months…

  5. Automated Thermal Image Processing for Detection and Classification of Birds and Bats - FY2012 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duberstein, Corey A.; Matzner, Shari; Cullinan, Valerie I.

    Surveying wildlife at risk from offshore wind energy development is difficult and expensive. Infrared video can be used to record birds and bats that pass through the camera view, but it is also time consuming and expensive to review video and determine what was recorded. We proposed to conduct algorithm and software development to identify and to differentiate thermally detected targets of interest that would allow automated processing of thermal image data to enumerate birds, bats, and insects. During FY2012 we developed computer code within MATLAB to identify objects recorded in video and extract attribute information that describes the objects recorded. We tested the efficiency of track identification using observer-based counts of tracks within segments of sample video. We examined object attributes, modeled the effects of random variability on attributes, and produced data smoothing techniques to limit random variation within attribute data. We also began drafting and testing methodology to identify objects recorded on video. Separately, we recorded approximately 10 hours of infrared video of various marine birds, passerine birds, and bats near the Pacific Northwest National Laboratory (PNNL) Marine Sciences Laboratory (MSL) at Sequim, Washington. A total of 6 hours of bird video was captured overlooking Sequim Bay over a series of weeks. An additional 2 hours of video of birds was captured during two weeks overlooking Dungeness Bay within the Strait of Juan de Fuca. Bats and passerine birds (swallows) were also recorded at dusk on the MSL campus during nine evenings. An observer noted the identity of objects viewed through the camera concurrently with recording. These video files will provide the information necessary to produce and test software developed during FY2013. The annotation will also form the basis for creation of a method to reliably identify recorded objects.

  6. Hospital networks: how to make them work in Belgium? Facilitators and barriers of different governance models.

    PubMed

    De Pourcq, Kaat; De Regge, Melissa; Van den Heede, Koen; Van de Voorde, Carine; Gemmel, Paul; Eeckloo, Kristof

    2018-03-29

    Objectives This study aims to identify the facilitators and barriers to governance models of hospital collaborations. The country-specific characteristics of the Belgian healthcare system and legislation are taken into account. Methods A case study was carried out in six Belgian hospital collaborations. Different types of governance models were selected: two health systems, two participant-governed networks, and two lead-organization-governed networks. Within these collaborations, 43 people were interviewed. Results All structures have both advantages and disadvantages. It is important that the governance model fits the network. However, structural, procedural, and especially contextual factors also affect the collaborations, such as alignment of hospitals' and professionals' goals, competition, distance, level of integrated care, time needed for decision-making, and legal and financial incentives. Conclusion The fit between the governance model and the collaboration can facilitate the functioning of a collaboration. The main barriers we identified are contextual factors. The Belgian government needs to play a major role in facilitating collaboration.

  7. Predictive Modelling for Fisheries Management in the Colombian Amazon

    NASA Astrophysics Data System (ADS)

    Beal, Jacob; Bennett, Sara

    A group of Colombian indigenous communities and Amacayacu National Park are cooperating to make regulations for sustainable use of their shared natural resources, especially the fish populations. To aid this effort, we are modeling the interactions among these communities and their ecosystem with the objective of predicting the stability of regulations, identifying potential failure modes, and guiding investment of scarce resources. The goal is to improve the probability of actually achieving fair, sustainable and community-managed subsistence fishing in the region.

  8. System cost performance analysis (study 2.3). Volume 1: Executive summary. [unmanned automated payload programs and program planning

    NASA Technical Reports Server (NTRS)

    Campbell, B. H.

    1974-01-01

    A study is described which was initiated to identify and quantify the interrelationships between and within the performance, safety, cost, and schedule parameters for unmanned, automated payload programs. The result of the investigation was a systems cost/performance model which was implemented as a digital computer program and could be used to perform initial program planning, cost/performance tradeoffs, and sensitivity analyses for mission model and advanced payload studies. Program objectives and results are described briefly.

  9. A novel vehicle tracking algorithm based on mean shift and active contour model in complex environment

    NASA Astrophysics Data System (ADS)

    Cai, Lei; Wang, Lin; Li, Bo; Zhang, Libao; Lv, Wen

    2017-06-01

    Vehicle tracking technology is currently one of the most active research topics in machine vision and an important part of intelligent transportation systems. In both theory and technology, however, it still faces many challenges, including real-time performance and robustness. In video surveillance, targets must be detected in real time and their positions calculated accurately in order to judge their motion. The content of video sequence images and the target motion are complex, so the objects cannot be expressed by a unified mathematical model. Object tracking is defined as locating the moving target of interest in each frame of a video. Current tracking technology can achieve reliable results in simple environments for targets with easily identified characteristics. In more complex environments, however, it is easy to lose the target because of the mismatch between the target's appearance and its dynamic model. Moreover, the target usually has a complex shape, but traditional tracking algorithms represent the tracking result by a simple geometric primitive such as a rectangle or circle, and so cannot provide accurate shape information for subsequent higher-level applications. This paper combines a traditional object-tracking technique, the mean-shift algorithm, with an image segmentation algorithm, the active-contour model, to obtain the outlines of objects during tracking and to handle topology changes automatically. The outline information is in turn used to aid the tracking algorithm and improve it.
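    The mean-shift step this paper builds on can be sketched as a centroid-climbing loop over a likelihood map. This is a minimal pure-Python illustration, not the authors' implementation; the weight map stands in for a histogram back-projection of the target's appearance.

```python
def mean_shift(weights, window, iters=20):
    """Shift a rectangular window to the weighted centroid of a 2-D
    likelihood map (e.g. a colour-histogram back-projection).
    weights: list of lists of non-negative floats; window: (x, y, w, h).
    Returns the converged window."""
    x, y, w, h = window
    H, W = len(weights), len(weights[0])
    for _ in range(iters):
        m00 = m10 = m01 = 0.0
        for j in range(max(0, y), min(H, y + h)):
            for i in range(max(0, x), min(W, x + w)):
                wt = weights[j][i]
                m00 += wt
                m10 += wt * i
                m01 += wt * j
        if m00 == 0:
            break                          # no support under the window
        cx, cy = m10 / m00, m01 / m00      # weighted centroid
        nx, ny = int(round(cx - w / 2)), int(round(cy - h / 2))
        if (nx, ny) == (x, y):             # converged
            break
        x, y = nx, ny
    return x, y, w, h

# Demo: a bright 2x2 blob at rows/columns 5-6 of a 10x10 likelihood map.
grid = [[0.0] * 10 for _ in range(10)]
for j in (5, 6):
    for i in (5, 6):
        grid[j][i] = 1.0
window = mean_shift(grid, (1, 1, 5, 5))    # drifts until centred on the blob
```

    The active-contour stage would then refine the converged window into an object outline rather than leaving the result as a rectangle.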

  10. Designing concept maps for a precise and objective description of pharmaceutical innovations

    PubMed Central

    2013-01-01

    Background When a new drug is launched onto the market, information about the new manufactured product is contained in its monograph and in the evaluation report published by national drug agencies. Health professionals need to be able to determine rapidly and easily whether the new manufactured product is potentially useful for their practice. There is therefore a need to identify the best way to group together and visualize the main items of information describing the nature and potential impact of the new drug. The objective of this study was to identify these items of information and to bring them together in a model that could serve as the standard for presenting the main features of new manufactured products. Methods We developed a preliminary conceptual model of pharmaceutical innovations, based on the knowledge of the authors. We then refined this model, using a random sample of 40 new manufactured drugs recently approved by the national drug regulatory authorities in France and covering a broad spectrum of innovations and therapeutic areas. Finally, we used another sample of 20 new manufactured drugs to determine whether the model was sufficiently comprehensive. Results Our modeling led to three sub-models, described as conceptual maps, representing i) the medical context for use of the new drug (indications, type of effect, therapeutic arsenal for the same indications), ii) the nature of the novelty of the new drug (new molecule, new mechanism of action, new combination, new dosage, etc.), and iii) the impact of the drug in terms of efficacy, safety and ease of use, compared with other drugs with the same indications. Conclusions Our model can help to standardize information about new drugs released onto the market. It is potentially useful to the pharmaceutical industry, medical journals, editors of drug databases and medical software, and national or international drug regulation agencies, as a means of describing the main properties of new pharmaceutical products. It could also be used as a guide for writing comprehensive and objective texts summarizing the nature and interest of new manufactured products. PMID:23331768

  11. Webizing mobile augmented reality content

    NASA Astrophysics Data System (ADS)

    Ahn, Sangchul; Ko, Heedong; Yoo, Byounghyun

    2014-01-01

    This paper presents a content structure for building mobile augmented reality (AR) applications in HTML5 that achieves a clean separation between mobile AR content and application logic, so that applications can scale as they do on the Web. We propose that the content structure represent the physical world as well as virtual assets for mobile AR applications as document object model (DOM) elements, with their behaviour and user interactions controlled through DOM events, and with objects and places represented by uniform resource identifiers. Our content structure enables mobile AR applications to be developed seamlessly as normal HTML documents within the current Web eco-system.

  12. Electronic prototyping

    NASA Technical Reports Server (NTRS)

    Hopcroft, J.

    1987-01-01

    The potential benefits of automation in space are significant. The science base needed to support this automation not only will help control costs and reduce lead-time in the earth-based design and construction of space stations, but also will advance the nation's capability for computer design, simulation, testing, and debugging of sophisticated objects electronically. Progress in automation will require the ability to electronically represent, reason about, and manipulate objects. Discussed here is the development of representations, languages, editors, and model-driven simulation systems to support electronic prototyping. In particular, it identifies areas where basic research is needed before further progress can be made.

  13. Using CART to Identify Thresholds and Hierarchies in the Determinants of Funding Decisions.

    PubMed

    Schilling, Chris; Mortimer, Duncan; Dalziel, Kim

    2017-02-01

    There is much interest in understanding decision-making processes that determine funding outcomes for health interventions. We use classification and regression trees (CART) to identify cost-effectiveness thresholds and hierarchies in the determinants of funding decisions. The hierarchical structure of CART is suited to analyzing complex conditional and nonlinear relationships. Our analysis uncovered hierarchies where interventions were grouped according to their type and objective. Cost-effectiveness thresholds varied markedly depending on which group the intervention belonged to: lifestyle-type interventions with a prevention objective had an incremental cost-effectiveness threshold of $2356, suggesting that such interventions need to be close to cost saving or dominant to be funded. For lifestyle-type interventions with a treatment objective, the threshold was much higher at $37,024. Lower down the tree, intervention attributes such as the level of patient contribution and the eligibility for government reimbursement influenced the likelihood of funding within groups of similar interventions. Comparison between our CART models and previously published results demonstrated concurrence with standard regression techniques while providing additional insights regarding the role of the funding environment and the structure of decision-maker preferences.
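    A single CART split of the kind that yields such thresholds can be sketched as a Gini-impurity search over candidate cut-points. The ICER values and funding labels below are hypothetical, not taken from the study; CART builds its hierarchy by applying this search recursively within each resulting group.

```python
def gini(labels):
    """Gini impurity of a set of 0/1 funding decisions."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_threshold(values, labels):
    """Find the cut-point that best separates funded (1) from unfunded (0)
    interventions, as a single CART split on one predictor would."""
    best_t, best_impurity = None, float("inf")
    n = len(labels)
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        imp = len(left) / n * gini(left) + len(right) / n * gini(right)
        if imp < best_impurity:
            best_impurity, best_t = imp, t
    return best_t

# Hypothetical cost-per-QALY values and funding outcomes.
threshold = best_threshold([1500, 2000, 2300, 30000, 40000], [1, 1, 1, 0, 0])
```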

  14. Application of Non-Deterministic Methods to Assess Modeling Uncertainties for Reinforced Carbon-Carbon Debris Impacts

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.; Fasanella, Edwin L.; Melis, Matthew; Carney, Kelly; Gabrys, Jonathan

    2004-01-01

    The Space Shuttle Columbia Accident Investigation Board (CAIB) made several recommendations for improving the NASA Space Shuttle Program. An extensive experimental and analytical program has been developed to address two recommendations related to structural impact analysis. The objective of the present work is to demonstrate the application of probabilistic analysis to assess the effect of uncertainties on debris impacts on Space Shuttle Reinforced Carbon-Carbon (RCC) panels. The probabilistic analysis is used to identify the material modeling parameters controlling the uncertainty. A comparison of the finite element results with limited experimental data provided confidence that the simulations were adequately representing the global response of the material. Five input parameters were identified as significantly controlling the response.

  15. State machine analysis of sensor data from dynamic processes

    DOEpatents

    Cook, William R.; Brabson, John M.; Deland, Sharon M.

    2003-12-23

    A state machine model analyzes sensor data from dynamic processes at a facility to identify the actual processes that were performed at the facility during a period of interest for the purpose of remote facility inspection. An inspector can further input the expected operations into the state machine model and compare the expected, or declared, processes to the actual processes to identify undeclared processes at the facility. The state machine analysis enables the generation of knowledge about the state of the facility at all levels, from location of physical objects to complex operational concepts. Therefore, the state machine method and apparatus may benefit any agency or business with sensored facilities that stores or manipulates expensive, dangerous, or controlled materials or information.
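    The declared-versus-actual comparison can be sketched as a finite-state machine over sensor events. The transition table below is a hypothetical example for illustration, not the patented model: a sequence that returns the machine to its idle state matches a declared process, while an unexplained event marks an undeclared activity.

```python
class ProcessStateMachine:
    """Classify a sensor-event sequence against declared facility processes.
    transitions maps (state, sensor_event) -> next_state; a sequence that
    drives the machine from 'idle' back to 'idle' is a declared process."""
    def __init__(self, transitions):
        self.transitions = transitions

    def run(self, events):
        state = "idle"
        for e in events:
            state = self.transitions.get((state, e))
            if state is None:
                return "undeclared"   # no declared process explains this event
        return "complete" if state == "idle" else "incomplete"

# Hypothetical declared process: open vault -> move canister -> close vault.
declared = {
    ("idle", "vault_open"): "open",
    ("open", "canister_moved"): "open",
    ("open", "vault_closed"): "idle",
}
fsm = ProcessStateMachine(declared)
```

    An inspector comparing sensor logs to declarations would flag any sequence classified as "undeclared" or "incomplete" for follow-up.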

  16. Efficient Calibration of Distributed Catchment Models Using Perceptual Understanding and Hydrologic Signatures

    NASA Astrophysics Data System (ADS)

    Hutton, C.; Wagener, T.; Freer, J. E.; Duffy, C.; Han, D.

    2015-12-01

    Distributed models offer the potential to resolve catchment systems in more detail, and therefore simulate the hydrological impacts of spatial changes in catchment forcing (e.g. landscape change). Such models may contain a large number of model parameters which are computationally expensive to calibrate. Even when calibration is possible, insufficient data can result in model parameter and structural equifinality. In order to help reduce the space of feasible models and supplement traditional outlet discharge calibration data, semi-quantitative information (e.g. knowledge of relative groundwater levels) may also be used to identify behavioural models when applied to constrain spatially distributed predictions of states and fluxes. The challenge is to combine these different sources of information together to identify a behavioural region of state-space, and efficiently search a large, complex parameter space to identify behavioural parameter sets that produce predictions that fall within this behavioural region. Here we present a methodology to incorporate different sources of data to efficiently calibrate distributed catchment models. Metrics of model performance may be derived from multiple sources of data (e.g. perceptual understanding and measured or regionalised hydrologic signatures). For each metric, an interval or inequality is used to define the behaviour of the catchment system, accounting for data uncertainties. These intervals are then combined to produce a hyper-volume in state space. The search for this hyper-volume is then recast as a multi-objective optimisation problem, and the Borg MOEA is applied to first find, and then populate, the hyper-volume, thereby identifying acceptable model parameter sets. We apply the methodology to calibrate the PIHM model at Plynlimon, UK by incorporating perceptual and hydrologic data into the calibration problem. Furthermore, we explore how to improve calibration efficiency through search initialisation from shorter model runs.
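    The interval-based acceptance test at the heart of this approach can be sketched as follows. The signature names and bounds are hypothetical stand-ins for limits derived from data or perceptual understanding.

```python
def is_behavioural(signatures, limits):
    """Accept a parameter set only if every simulated hydrologic signature
    falls inside its behavioural interval (bounds widened for data
    uncertainty). signatures: name -> simulated value; limits: name -> (lo, hi)."""
    return all(lo <= signatures[name] <= hi for name, (lo, hi) in limits.items())

# Hypothetical limits: runoff ratio from data, baseflow index from perception.
limits = {"runoff_ratio": (0.3, 0.5), "baseflow_index": (0.6, 0.9)}
ok = is_behavioural({"runoff_ratio": 0.42, "baseflow_index": 0.7}, limits)
bad = is_behavioural({"runoff_ratio": 0.55, "baseflow_index": 0.7}, limits)
```

    In the full methodology an optimiser such as the Borg MOEA searches the parameter space for sets whose simulations pass every such test, populating the behavioural hyper-volume rather than returning a single best fit.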

  17. Intervention mapping protocol for developing a theory-based diabetes self-management education program.

    PubMed

    Song, Misoon; Choi, Suyoung; Kim, Se-An; Seo, Kyoungsan; Lee, Soo Jin

    2015-01-01

    Development of behavior theory-based health promotion programs is encouraged with the paradigm shift from contents to behavior outcomes. This article describes the development process of the diabetes self-management program for older Koreans (DSME-OK) using intervention mapping (IM) protocol. The IM protocol includes needs assessment, defining goals and objectives, identifying theory and determinants, developing a matrix to form change objectives, selecting strategies and methods, structuring the program, and planning for evaluation and pilot testing. The DSME-OK adopted seven behavior objectives developed by the American Association of Diabetes Educators as behavioral outcomes. The program applied an information-motivation-behavioral skills (IMB) model, and interventions targeted its three determinants to change health behaviors. Specific methods were selected to achieve each objective, guided by the IM protocol. As the final step, program evaluation was planned, including a pilot test. The DSME-OK was structured so that the three determinants of the IMB model were addressed in each session to achieve its behavior objectives. The program has 12 weekly 90-min sessions tailored for older adults. Using the IM protocol in developing a theory-based self-management program was beneficial in terms of providing a systematic guide to developing theory-based and behavior outcome-focused health education programs.

  18. Interactive and independent associations between the socioeconomic and objective built environment on the neighbourhood level and individual health: a systematic review of multilevel studies.

    PubMed

    Schüle, Steffen Andreas; Bolte, Gabriele

    2015-01-01

    The research question of how contextual factors of neighbourhood environments influence individual health has gained increasing attention in public health research. Both socioeconomic neighbourhood characteristics and factors of the built environment play an important role in health and health-related behaviours. However, their reciprocal relationships have not been systematically reviewed so far. This systematic review aims to identify studies applying a multilevel modelling approach which consider both neighbourhood socioeconomic position (SEP) and factors of the objective built environment simultaneously in order to disentangle their independent and interactive effects on individual health. The three databases PubMed, PsycINFO, and Web of Science were systematically searched with terms for title and abstract screening. Grey literature was not included. Observational studies from USA, Canada, Australia, New Zealand, and Western European countries were considered which analysed simultaneously factors of neighbourhood SEP and the objective built environment with a multilevel modelling approach. Adjustment for individual SEP was a further inclusion criterion. Thirty-three studies were included in qualitative synthesis. Twenty-two studies showed an independent association between characteristics of neighbourhood SEP or the built environment and individual health outcomes or health-related behaviours. Twenty-one studies found cross-level or within-level interactions either between neighbourhood SEP and the built environment, or between neighbourhood SEP or the built environment and individual characteristics, such as sex, individual SEP or ethnicity. Due to the large variation in study design and heterogeneous reporting of results, the identification of consistent findings was problematic and quantitative analysis was not possible. 
    There is a need for studies considering multiple neighbourhood dimensions and applying multilevel modelling in order to clarify their causal relationships with individual health. In particular, more studies using comparable characteristics of neighbourhood SEP and the objective built environment and analysing interactive effects are necessary to disentangle health impacts and identify vulnerable neighbourhoods and population groups.

  19. Interactive and Independent Associations between the Socioeconomic and Objective Built Environment on the Neighbourhood Level and Individual Health: A Systematic Review of Multilevel Studies

    PubMed Central

    Schüle, Steffen Andreas; Bolte, Gabriele

    2015-01-01

    Background The research question how contextual factors of neighbourhood environments influence individual health has gained increasing attention in public health research. Both socioeconomic neighbourhood characteristics and factors of the built environment play an important role for health and health-related behaviours. However, their reciprocal relationships have not been systematically reviewed so far. This systematic review aims to identify studies applying a multilevel modelling approach which consider both neighbourhood socioeconomic position (SEP) and factors of the objective built environment simultaneously in order to disentangle their independent and interactive effects on individual health. Methods The three databases PubMed, PsycINFO, and Web of Science were systematically searched with terms for title and abstract screening. Grey literature was not included. Observational studies from USA, Canada, Australia, New Zealand, and Western European countries were considered which analysed simultaneously factors of neighbourhood SEP and the objective built environment with a multilevel modelling approach. Adjustment for individual SEP was a further inclusion criterion. Results Thirty-three studies were included in qualitative synthesis. Twenty-two studies showed an independent association between characteristics of neighbourhood SEP or the built environment and individual health outcomes or health-related behaviours. Twenty-one studies found cross-level or within-level interactions either between neighbourhood SEP and the built environment, or between neighbourhood SEP or the built environment and individual characteristics, such as sex, individual SEP or ethnicity. Due to the large variation of study design and heterogeneous reporting of results the identification of consistent findings was problematic and made quantitative analysis not possible. 
Conclusions There is a need for studies considering multiple neighbourhood dimensions and applying multilevel modelling in order to clarify their causal relationships with individual health. In particular, more studies using comparable characteristics of neighbourhood SEP and the objective built environment and analysing interactive effects are necessary to disentangle health impacts and identify vulnerable neighbourhoods and population groups. PMID:25849569

  20. A guide to multi-objective optimization for ecological problems with an application to cackling goose management

    USGS Publications Warehouse

    Williams, Perry J.; Kendall, William L.

    2017-01-01

    Choices in ecological research and management are the result of balancing multiple, often competing, objectives. Multi-objective optimization (MOO) is a formal decision-theoretic framework for solving multiple objective problems. MOO is used extensively in other fields including engineering, economics, and operations research. However, its application for solving ecological problems has been sparse, perhaps due to a lack of widespread understanding. Thus, our objective was to provide an accessible primer on MOO, including a review of methods common in other fields, a review of their application in ecology, and a demonstration on an applied resource management problem. A large class of methods for solving MOO problems can be separated into two strategies: modelling preferences pre-optimization (the a priori strategy), or modelling preferences post-optimization (the a posteriori strategy). The a priori strategy requires describing preferences among objectives without knowledge of how preferences affect the resulting decision. In the a posteriori strategy, the decision maker simultaneously considers a set of solutions (the Pareto optimal set) and makes a choice based on the trade-offs observed in the set. We describe several methods for modelling preferences pre-optimization, including the bounded objective function method, the lexicographic method, and the weighted-sum method. We discuss modelling preferences post-optimization through examination of the Pareto optimal set. We applied each MOO strategy to the natural resource management problem of selecting a population target for cackling goose (Branta hutchinsii minima) abundance. Cackling geese provide food security to Native Alaskan subsistence hunters in the goose's nesting area, but depredate crops on private agricultural fields in wintering areas.
We developed objective functions to represent the competing objectives related to the cackling goose population target and identified an optimal solution first using the a priori strategy, and then by examining trade-offs in the Pareto set using the a posteriori strategy. We used four approaches for selecting a final solution within the a posteriori strategy: the most common optimal solution, the most robust optimal solution, and two solutions based on maximizing a restricted portion of the Pareto set. We discuss MOO with respect to natural resource management, but MOO is sufficiently general to cover any ecological problem that contains multiple competing objectives that can be quantified using objective functions.
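    The two strategies above can be sketched in a few lines. This is a minimal illustration, not the authors' actual model: the candidate solutions and objective values are invented, with both objectives treated as costs to minimize (e.g. subsistence-harvest shortfall and crop-depredation cost).

```python
# A posteriori strategy: compute the Pareto optimal set and inspect trade-offs.
# A priori strategy: scalarize the objectives with preference weights first.

def pareto_optimal(solutions):
    """Return the solutions not dominated by any other.

    Solution a dominates b if a is no worse in every objective and
    strictly better in at least one.
    """
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))

    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t != s)]

def weighted_sum(solutions, weights):
    """A priori weighted-sum method: pick the minimum scalarized cost."""
    return min(solutions, key=lambda s: sum(w * x for w, x in zip(weights, s)))

# Four candidate population targets scored on two competing objectives
# (values invented for illustration).
candidates = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0)]

front = pareto_optimal(candidates)              # keep the trade-off set
choice = weighted_sum(candidates, (0.5, 0.5))   # commit to preferences up front
```

    The contrast shown here mirrors the paper's framing: the a posteriori route keeps the whole Pareto set for the decision maker to examine, while the a priori weighted sum commits to preferences before any solution is seen.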

  1. Analyzing the Formation of Ultra-compact Dwarfs through Stellar Populations

    NASA Astrophysics Data System (ADS)

    Seshadri, Anish; Wang, Carolyn; Romanowsky, Aaron J.; Martin-navarro, Ignacio

    2017-01-01

    Since their discovery in 1999, ultra-compact dwarfs (UCDs) have been the subjects of intense study. Their small size, yet tremendous mass, brings into question their place among celestial objects. Are they galaxies or globular clusters? The answer to this question could come from analyzing how they formed. Thus, the goal of this project is to test one of the theories for the formation of UCDs, the theory of tidal stripping. This project approaches the issue by looking at dwarf galaxies currently in the process of stripping to understand formation history. Over twenty such dwarf galaxies were identified and their stellar populations analyzed. Using modeling techniques on spectroscopic and photometric data, the age, metallicity, and color of each object were identified. By objectively categorizing each object into a stage of evolution in the process of tidal stripping, a virtual timeline was built for the formation of UCDs. Data for each object were plotted vs. stage of formation, with pristine dwarfs and UCDs signifying the endpoints. Trends in the data revealed a natural progression over all stages of evolution, showing that tidally stripped dwarfs likely represent an intermediate stage in the formation of UCDs. This research was supported by NSF Grant AST-1515084. Most of this work was carried out by high school students working under the auspices of the Science Internship Program at UC Santa Cruz.

  2. On the potential for using immersive virtual environments to support laboratory experiment contextualisation

    NASA Astrophysics Data System (ADS)

    Machet, Tania; Lowe, David; Gütl, Christian

    2012-12-01

    This paper explores the hypothesis that embedding a laboratory activity into a virtual environment can provide a richer experimental context and hence improve the understanding of the relationship between a theoretical model and the real world, particularly in terms of the model's strengths and weaknesses. While an identified learning objective of laboratories is to support the understanding of the relationship between models and reality, the paper illustrates that this understanding is hindered by inherently limited experiments and that there is scope for improvement. Despite the contextualisation of learning activities having been shown to support learning objectives in many fields, there is traditionally little contextual information presented during laboratory experimentation. The paper argues that enhancing the laboratory activity with contextual information affords an opportunity to improve students' understanding of the relationship between the theoretical model and the experiment (which is effectively a proxy for the complex real world), thereby improving their understanding of the relationship between the model and reality. The authors propose that these improvements can be achieved by setting remote laboratories within context-rich virtual worlds.

  3. Scalable persistent identifier systems for dynamic datasets

    NASA Astrophysics Data System (ADS)

    Golodoniuc, P.; Cox, S. J. D.; Klump, J. F.

    2016-12-01

    Reliable and persistent identification of objects, whether tangible or not, is essential in information management. Many Internet-based systems have been developed to identify digital data objects, e.g., PURL, LSID, Handle, ARK. These were largely designed for identification of static digital objects. The amount of data made available online has grown exponentially over the last two decades, and fine-grained identification of dynamically generated data objects within large datasets using conventional systems (e.g., PURL) has become impractical. We have compared the capabilities of various technological solutions to enable resolvability of data objects in dynamic datasets, and developed a dataset-centric approach to resolution of identifiers. This is particularly important in Semantic Linked Data environments where dynamic, frequently changing data is delivered live via web services, so registration of individual data objects to obtain identifiers is impractical. We use identifier patterns and pattern hierarchies for identification of data objects, which allows relationships between identifiers to be expressed, and also provides means for resolving a single identifier into multiple forms (i.e. views or representations of an object). The latter can be implemented through (a) HTTP content negotiation, or (b) use of URI querystring parameters. The pattern and hierarchy approach has been implemented in the Linked Data API supporting the United Nations Spatial Data Infrastructure (UNSDI) initiative and later in the implementation of geoscientific data delivery for the Capricorn Distal Footprints project using International Geo Sample Numbers (IGSN). This enables flexible resolution of multi-view persistent identifiers and provides a scalable solution for large heterogeneous datasets.
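    The pattern-and-hierarchy idea can be sketched roughly as follows. This is a speculative illustration, not the Linked Data API implementation: the pattern table, prefixes, and target URLs are all invented, and only the querystring variant (b) of view selection is shown.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical registry: identifier prefixes (patterns) map to dataset-level
# resolver templates, so individual objects never need registration. A more
# specific pattern (longer prefix) overrides a general one in the hierarchy.
PATTERNS = {
    "igsn/AU1234/": "https://example.org/samples/{id}?view={view}",
    "igsn/": "https://example.org/igsn/{id}?view={view}",
}

def resolve(uri):
    """Resolve a persistent identifier URI to a view-specific target URL."""
    parsed = urlparse(uri)
    path = parsed.path.lstrip("/")
    # Querystring parameter selects among multiple representations (views).
    view = parse_qs(parsed.query).get("view", ["html"])[0]
    # Walk patterns from most to least specific: longest matching prefix wins.
    for prefix in sorted(PATTERNS, key=len, reverse=True):
        if path.startswith(prefix):
            return PATTERNS[prefix].format(id=path[len(prefix):], view=view)
    raise KeyError(f"no pattern matches {uri!r}")
```

    Because resolution is driven by the pattern table rather than per-object records, adding a whole dynamically generated dataset costs one registry entry.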

  4. In-phase thermomechanical fatigue mechanisms in an unidirectional SCS-6/Ti 15-3 MMC

    NASA Technical Reports Server (NTRS)

    Newaz, Golam M.; Majumdar, Bhaskar S.

    1995-01-01

    The objective of this investigation was to identify the inelastic deformation and damage mechanisms under in-phase (IP) thermomechanical fatigue (TMF) in a unidirectional SCS-6/Ti 15-3 metal matrix composite (MMC). Load-controlled IP TMF tests were conducted at 300-538 C at various stress ranges in high-purity argon. A major emphasis of this work was to identify damage mechanisms well before final fracture of specimens, rather than to generate life diagrams, in order to aid development of a realistic deformation/damage and life model.

  5. Method for Statically Checking an Object-oriented Computer Program Module

    NASA Technical Reports Server (NTRS)

    Bierhoff, Kevin M. (Inventor); Aldrich, Jonathan (Inventor)

    2012-01-01

    A method for statically checking an object-oriented computer program module includes the step of identifying objects within a computer program module, at least one of the objects having a plurality of references thereto, possibly from multiple clients. A discipline of permissions is imposed on the objects identified within the computer program module. The permissions enable tracking, from among a discrete set of changeable states, a subset of states each object might be in. A determination is made regarding whether the imposed permissions are violated by a potential reference to any of the identified objects. The results of the determination are output to a user.

  6. Addressing drug adherence using an operations management model.

    PubMed

    Nunlee, Martin; Bones, Michelle

    2014-01-01

    OBJECTIVE To provide a model that enables health systems and pharmacy benefit managers to provide medications reliably and test for reliability and validity in the analysis of adherence to drug therapy of chronic disease. SUMMARY The quantifiable model described here can be used in conjunction with behavioral designs of drug adherence assessments. The model identifies variables that can be reproduced and expanded across the management of chronic diseases with drug therapy. By creating a reorder point system for reordering medications, the model uses a methodology commonly seen in operations research. The design includes a safety stock of medication and current supply of medication, which increases the likelihood that patients will have a continuous supply of medications, thereby positively affecting adherence by removing barriers. CONCLUSION This method identifies an adherence model that quantifies variables related to recommendations from health care providers; it can assist health care and service delivery systems in making decisions that influence adherence based on the expected order cycle days and the expected daily quantity of medication administered. This model addresses the possession of medication as a barrier to adherence.
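    A reorder-point rule of the kind described can be sketched in a few lines. The rule itself is standard operations-research practice; the specific function names and parameter values below are invented for illustration, not taken from the article.

```python
# Reorder point (ROP) = expected demand during the refill lead time plus a
# safety stock that buffers against delays, so the patient's supply of
# medication is never interrupted.

def reorder_point(daily_dose_units, lead_time_days, safety_stock_days):
    """On-hand level at which a refill should be triggered."""
    return daily_dose_units * (lead_time_days + safety_stock_days)

def needs_reorder(units_on_hand, daily_dose_units, lead_time_days,
                  safety_stock_days):
    """True when current supply has fallen to or below the reorder point."""
    return units_on_hand <= reorder_point(
        daily_dose_units, lead_time_days, safety_stock_days)

# Example: 2 tablets/day, 5-day refill lead time, 7-day safety stock.
rop = reorder_point(2, 5, 7)   # refill when 24 tablets remain
```

    The safety-stock term is what removes possession of medication as a barrier to adherence: even if the refill is delayed by up to the safety-stock window, the patient still has a continuous supply.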

  7. Use of structured decision making to identify monitoring variables and management priorities for salt marsh ecosystems

    USGS Publications Warehouse

    Neckles, Hilary A.; Lyons, James E.; Guntenspergen, Glenn R.; Shriver, W. Gregory; Adamowicz, Susan C.

    2015-01-01

    Most salt marshes in the USA have been degraded by human activities, and coastal managers are faced with complex choices among possible actions to restore or enhance ecosystem integrity. We applied structured decision making (SDM) to guide selection of monitoring variables and management priorities for salt marshes within the National Wildlife Refuge System in the northeastern USA. In general, SDM is a systematic process for decomposing a decision into its essential elements. We first engaged stakeholders in clarifying regional salt marsh decision problems, defining objectives and attributes to evaluate whether objectives are achieved, and developing a pool of alternative management actions for achieving objectives. Through this process, we identified salt marsh attributes that were applicable to monitoring National Wildlife Refuges on a regional scale and that targeted management needs. We then analyzed management decisions within three salt marsh units at Prime Hook National Wildlife Refuge, coastal Delaware, as a case example of prioritizing management alternatives. Values for salt marsh attributes were estimated from 2 years of baseline monitoring data and expert opinion. We used linear value modeling to aggregate multiple attributes into a single performance score for each alternative, constrained optimization to identify alternatives that maximized total management benefits subject to refuge-wide cost constraints, and used graphical analysis to identify the optimal set of alternatives for the refuge. SDM offers an efficient, transparent approach for integrating monitoring into management practice and improving the quality of management decisions.
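    The two quantitative steps named above, linear value modeling and cost-constrained selection of alternatives, can be sketched as follows. This is a toy illustration under invented attribute names, weights, scores, and costs; the study's actual analysis used expert-elicited values and graphical analysis of the optimization results.

```python
from itertools import combinations

# Hypothetical attribute weights for the linear (weighted additive) value model.
WEIGHTS = {"vegetation": 0.5, "birds": 0.3, "invertebrates": 0.2}

def performance(scores):
    """Aggregate multiple attribute scores into one performance score."""
    return sum(WEIGHTS[a] * scores[a] for a in WEIGHTS)

def best_portfolio(actions, budget):
    """Constrained optimization: the subset of management actions that
    maximizes total benefit subject to a refuge-wide cost constraint.
    Brute force is fine for the handful of alternatives per marsh unit."""
    best, best_value = (), 0.0
    for r in range(len(actions) + 1):
        for subset in combinations(actions, r):
            cost = sum(a["cost"] for a in subset)
            value = sum(performance(a["scores"]) for a in subset)
            if cost <= budget and value > best_value:
                best, best_value = subset, value
    return [a["name"] for a in best], best_value

# Invented management alternatives for three marsh units.
actions = [
    {"name": "A", "cost": 10,
     "scores": {"vegetation": 1.0, "birds": 1.0, "invertebrates": 1.0}},
    {"name": "B", "cost": 5,
     "scores": {"vegetation": 0.8, "birds": 0.5, "invertebrates": 0.0}},
    {"name": "C", "cost": 8,
     "scores": {"vegetation": 0.0, "birds": 1.0, "invertebrates": 1.0}},
]
chosen, value = best_portfolio(actions, budget=13)
```

    Note how the budget constraint can favour two cheaper, complementary actions over a single high-scoring but expensive one.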

  8. Segmentation of 3d Models for Cultural Heritage Structural Analysis - Some Critical Issues

    NASA Astrophysics Data System (ADS)

    Gonizzi Barsanti, S.; Guidi, G.; De Luca, L.

    2017-08-01

    Cultural Heritage documentation and preservation has become a fundamental concern in this historical period. 3D modelling offers a perfect aid to record ancient buildings and artefacts and can be used as a valid starting point for restoration, conservation and structural analysis, which can be performed by using Finite Element Methods (FEA). The models derived from reality-based techniques, made up of the exterior surfaces of the objects captured at high resolution, are - for this reason - made of millions of polygons. Such meshes are not directly usable in structural analysis packages and need to be properly pre-processed in order to be transformed into volumetric meshes suitable for FEA. In addition, when dealing with ancient objects, a proper segmentation of 3D volumetric models is needed to analyse the behaviour of the structure with the most suitable level of detail for the different sections of the structure under analysis. Segmentation of 3D models is still an open issue, especially when dealing with ancient, complicated and geometrically complex objects that imply the presence of anomalies and gaps, due to environmental agents such as earthquakes, pollution, wind and rain, or human factors. The aim of this paper is to critically analyse some of the different methodologies and algorithms available to segment a 3D point cloud or a mesh, identifying difficulties and problems by showing examples on different structures.

  9. Utilization of Expert Knowledge in a Multi-Objective Hydrologic Model Automatic Calibration Process

    NASA Astrophysics Data System (ADS)

    Quebbeman, J.; Park, G. H.; Carney, S.; Day, G. N.; Micheletty, P. D.

    2016-12-01

    Spatially distributed continuous simulation hydrologic models have a large number of parameters for potential adjustment during the calibration process. Traditional manual calibration of such a modeling system is extremely laborious, which has historically motivated the use of automatic calibration procedures. With a large selection of model parameters, high degrees of objective space fitness - measured with typical metrics such as Nash-Sutcliffe, Kling-Gupta, RMSE, etc. - can easily be achieved using a range of evolutionary algorithms. A concern with this approach is the high degree of compensatory calibration, with many similarly performing solutions yet grossly varying parameter sets. To help alleviate this concern, and to mimic manual calibration processes, expert knowledge is proposed for inclusion within the multi-objective functions that evaluate the parameter decision space. As a result, Pareto solutions are identified with high degrees of fitness, but with parameter sets that maintain and utilize available expert knowledge, resulting in more realistic and consistent solutions. This process was tested using the joint SNOW-17 and Sacramento Soil Moisture Accounting method (SAC-SMA) within the Animas River basin in Colorado. Three different elevation zones, each with a range of parameters, resulted in over 35 model parameters being calibrated simultaneously. As a result, high degrees of fitness were achieved, in addition to the development of more realistic and consistent parameter sets such as those typically achieved during manual calibration procedures.
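    The fitness metrics named above have standard definitions, sketched below in minimal form. These are the textbook formulas for the named metrics, not code from the study; inputs are simulated and observed flow series.

```python
# Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the simulation is
# no better than predicting the observed mean at every time step.
def nse(sim, obs):
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

# Kling-Gupta efficiency: combines correlation (r), variability ratio
# (alpha), and bias ratio (beta); 1 is a perfect fit.
def kge(sim, obs):
    n = len(obs)
    ms, mo = sum(sim) / n, sum(obs) / n
    ss = (sum((s - ms) ** 2 for s in sim) / n) ** 0.5
    so = (sum((o - mo) ** 2 for o in obs) / n) ** 0.5
    r = sum((s - ms) * (o - mo) for s, o in zip(sim, obs)) / (n * ss * so)
    alpha, beta = ss / so, ms / mo
    return 1.0 - ((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2) ** 0.5
```

    The compensatory-calibration concern in the abstract is visible here: many different parameter sets can produce hydrographs scoring near 1 on both metrics, which is precisely why the expert-knowledge objectives are added.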

  10. Modelling the Cost Effectiveness of Disease-Modifying Treatments for Multiple Sclerosis

    PubMed Central

    Thompson, Joel P.; Abdolahi, Amir; Noyes, Katia

    2013-01-01

    Several cost-effectiveness models of disease-modifying treatments (DMTs) for multiple sclerosis (MS) have been developed for different populations and different countries. Vast differences in the approaches and discrepancies in the results give rise to heated discussions and limit the use of these models. Our main objective is to discuss the methodological challenges in modelling the cost effectiveness of treatments for MS. We conducted a review of published models to describe the approaches taken to date, to identify the key parameters that influence the cost effectiveness of DMTs, and to point out major areas of weakness and uncertainty. Thirty-six published models and analyses were identified. The greatest source of uncertainty is the absence of head-to-head randomized clinical trials. Modellers have used various techniques to compensate, including utilizing extension trials. The use of large observational cohorts in recent studies aids in identifying population-based, ‘real-world’ treatment effects. Major drivers of results include the time horizon modelled and DMT acquisition costs. Model endpoints must target either policy makers (using cost-utility analysis) or clinicians (conducting cost-effectiveness analyses). Lastly, the cost effectiveness of DMTs outside North America and Europe is currently unknown, with the lack of country-specific data as the major limiting factor. We suggest that limited data should not preclude analyses, as models may be built and updated in the future as data become available. Disclosure of modelling methods and assumptions could improve the transferability and applicability of models designed to reflect different healthcare systems. PMID:23640103

  11. Terrestrial Ecosystem Science 2017 ECRP Annual Report: Tropical Forest Response to a Drier Future: Turnover Times of Soil Organic Matter, Roots, Respired CO 2, and CH 4 Across Moisture Gradients in Time and Space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McFarlane, Karis J.

    The overall goal of my Early Career research is to constrain belowground carbon turnover times for tropical forests across a broad range in moisture regimes. My group is using 14C analysis and modeling to address two major objectives: quantify age and belowground carbon turnover times across tropical forests spanning a moisture gradient from wetlands to dry forest; and identify specific areas for focused model improvement and data needs through site-specific model-data comparison and belowground carbon modeling for tropical forests.

  12. Extending 3D city models with legal information

    NASA Astrophysics Data System (ADS)

    Frank, A. U.; Fuhrmann, T.; Navratil, G.

    2012-10-01

    3D city models represent existing physical objects and their topological and functional relations. In everyday life the rights and responsibilities connected to these objects, primarily legally defined rights and obligations but also other socially and culturally established rights, are of importance. The rights and obligations are defined in various laws and it is often difficult to identify the rules applicable to a certain case. The existing 2D cadastres show civil-law rights and obligations, and plans to extend them to provide information about public-law restrictions on land use are under way in several countries. It is tempting to design extensions to the 3D city models to provide information about legal rights in 3D. The paper analyses the different types of information that are needed to reduce conflicts and to facilitate decisions about land use. We identify the role 3D city models augmented with planning information in 3D can play, but do not advocate a general conversion from 2D to 3D for the legal cadastre. Space is not isotropic: the up/down dimension is practically very different from the two-dimensional plane, and this difference must be respected when designing spatial information systems. The conclusions are: (1) continue the current regime for ownership of apartments, which is not ownership of a 3D volume, but co-ownership of a building with exclusive use of some rooms; such exclusive use rights could be shown in a 3D city model; (2) ownership of 3D volumes for complex and unusual building situations can be reported in a 3D city model, but is not required everywhere; (3) indicate restrictions for land use and building in 3D city models, with links to the legal sources.

  13. Addressing subjective decision-making inherent in GLUE-based multi-criteria rainfall-runoff model calibration

    NASA Astrophysics Data System (ADS)

    Shafii, Mahyar; Tolson, Bryan; Shawn Matott, L.

    2015-04-01

    GLUE is one of the most commonly used informal methodologies for uncertainty estimation in hydrological modelling. Despite the ease-of-use of GLUE, it involves a number of subjective decisions, such as the strategy for identifying the behavioural solutions. This study evaluates the impact of behavioural solution identification strategies in GLUE on the quality of model output uncertainty. Moreover, two new strategies are developed to objectively identify behavioural solutions. The first strategy considers Pareto-based ranking of parameter sets, while the second ranks the parameter sets based on an aggregated criterion. The proposed strategies, as well as the traditional strategies in the literature, are evaluated with respect to reliability (coverage of observations by the envelope of model outcomes) and sharpness (width of the envelope of model outcomes) in different numerical experiments. These experiments include multi-criteria calibration and uncertainty estimation of three rainfall-runoff models with different numbers of parameters. To further demonstrate the importance of the behavioural solution identification strategy, GLUE is also compared with two other informal multi-criteria calibration and uncertainty estimation methods (Pareto optimization and DDS-AU). The results show that the model output uncertainty varies with the behavioural solution identification strategy, and furthermore, a robust GLUE implementation would require considering multiple behavioural solution identification strategies and choosing the one that generates the desired balance between sharpness and reliability. The proposed objective strategies prove to be the best options in most of the case studies investigated in this research. Implementing such an approach for a high-dimensional calibration problem enables GLUE to generate robust results in comparison with Pareto optimization and DDS-AU.
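    The two evaluation measures defined parenthetically above can be sketched directly. This is a minimal illustration of the definitions as stated in the abstract, not the study's code; the tiny behavioural ensemble and observations are invented.

```python
# Reliability: share of observations covered by the envelope of behavioural
# model outputs. Sharpness: mean width of that envelope (narrower = sharper).

def envelope(ensemble):
    """Per-time-step (min, max) across the behavioural simulations."""
    return [(min(col), max(col)) for col in zip(*ensemble)]

def reliability(ensemble, obs):
    env = envelope(ensemble)
    inside = sum(lo <= o <= hi for o, (lo, hi) in zip(obs, env))
    return inside / len(obs)

def sharpness(ensemble):
    env = envelope(ensemble)
    return sum(hi - lo for lo, hi in env) / len(env)

ensemble = [[1.0, 2.0, 3.0],   # behavioural simulation 1 (invented)
            [2.0, 3.0, 5.0]]   # behavioural simulation 2 (invented)
obs = [1.5, 2.5, 6.0]          # last observation falls outside the envelope
```

    The trade-off the study evaluates is visible even here: admitting more behavioural solutions widens the envelope, raising reliability at the cost of sharpness.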

  14. [Barriers to the normalization of telemedicine in a healthcare system model based on purchasing of healthcare services using providers' contracts].

    PubMed

    Roig, Francesc; Saigí, Francesc

    2011-01-01

    Despite the clear political will to promote telemedicine and the large number of initiatives, the incorporation of this modality in clinical practice remains limited. The objective of this study was to identify the barriers perceived by key professionals who actively participate in the design and implementation of telemedicine in a healthcare system model based on purchasing of healthcare services using providers' contracts. We performed a qualitative study based on data from semi-structured interviews with 17 key informants belonging to distinct Catalan health organizations. The barriers identified were grouped in four areas: technological, organizational, human and economic. The main barriers identified were changes in the healthcare model caused by telemedicine, problems with strategic alignment, resistance to change in the (re)definition of roles, responsibilities and new skills, and lack of a business model that incorporates telemedicine in the services portfolio to ensure its sustainability. In addition to suitable management of change and of the necessary strategic alignment, the definitive normalization of telemedicine in a mixed healthcare model based on purchasing of healthcare services using providers' contracts requires a clear and stable business model that incorporates this modality in the services portfolio and allows healthcare organizations to obtain reimbursement from the payer. 2010 SESPAS. Published by Elsevier Espana. All rights reserved.

  15. Improved multi-objective ant colony optimization algorithm and its application in complex reasoning

    NASA Astrophysics Data System (ADS)

    Wang, Xinqing; Zhao, Yang; Wang, Dong; Zhu, Huijie; Zhang, Qing

    2013-09-01

    The problem of fault reasoning has aroused great concern in scientific and engineering fields. However, fault investigation and reasoning of a complex system is not a simple reasoning decision-making problem. It has become a typical multi-constraint and multi-objective reticulate optimization decision-making problem under many influencing factors and constraints. So far, little research has been carried out in this field. This paper transforms the fault reasoning problem of a complex system into a path-searching problem starting from known symptoms to fault causes. Three optimization objectives are considered simultaneously: maximum probability of average fault, maximum average importance, and minimum average complexity of test. Under the constraints of both known symptoms and the causal relationship among different components, a multi-objective optimization mathematical model is set up, taking minimizing the cost of fault reasoning as the target function. Since the problem is non-deterministic polynomial-hard (NP-hard), a modified multi-objective ant colony algorithm is proposed, in which a reachability matrix is set up to constrain the feasible search nodes of the ants, and a new pseudo-random-proportional rule and a pheromone adjustment mechanism are constructed to balance conflicts between the optimization objectives. Finally, a Pareto optimal set is acquired. Evaluation functions based on validity and tendency of reasoning paths are defined to optimize the noninferior set, through which the final fault causes can be identified according to decision-making demands, thus realizing fault reasoning of the multi-constraint and multi-objective complex system. 
Reasoning results demonstrate that the improved multi-objective ant colony optimization (IMACO) can realize reasoning and locate fault positions precisely by solving the multi-objective fault diagnosis model, which provides a new method to solve the problem of multi-constraint and multi-objective fault diagnosis and reasoning of complex systems.
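    The pseudo-random-proportional rule mentioned above has a well-known general form in Ant Colony System, sketched below. This shows the standard rule, not the paper's modified variant; the parameter values and the toy pheromone/heuristic tables are invented.

```python
import random

def next_node(candidates, pheromone, heuristic, q0=0.9, alpha=1.0, beta=2.0,
              rng=random):
    """Choose an ant's next node among the feasible candidates
    (e.g. those permitted by a reachability matrix).

    With probability q0 the ant exploits: it greedily takes the node with
    the highest pheromone**alpha * heuristic**beta attractiveness.
    Otherwise it explores: it samples a node with probability proportional
    to attractiveness (roulette-wheel selection).
    """
    attract = {j: pheromone[j] ** alpha * heuristic[j] ** beta
               for j in candidates}
    if rng.random() < q0:                       # exploitation
        return max(attract, key=attract.get)
    total = sum(attract.values())               # biased exploration
    r, acc = rng.random() * total, 0.0
    for j, a in attract.items():
        acc += a
        if acc >= r:
            return j
    return j                                    # guard against rounding
```

    Tuning q0 shifts the balance between converging on known-good reasoning paths and exploring the rest of the symptom-to-cause network.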

  16. Validation of self-reported figural drawing scales against anthropometric measurements in adults.

    PubMed

    Dratva, Julia; Bertelsen, Randi; Janson, Christer; Johannessen, Ane; Benediktsdóttir, Bryndis; Bråbäck, Lennart; Dharmage, Shyamali C; Forsberg, Bertil; Gislason, Thorarinn; Jarvis, Debbie; Jogi, Rain; Lindberg, Eva; Norback, Dan; Omenaas, Ernst; Skorge, Trude D; Sigsgaard, Torben; Toren, Kjell; Waatevik, Marie; Wieslander, Gundula; Schlünssen, Vivi; Svanes, Cecilie; Real, Francisco Gomez

    2016-08-01

    The aim of the present study was to validate figural drawing scales depicting extremely lean to extremely obese subjects to obtain proxies for BMI and waist circumference in postal surveys. Reported figural scales and anthropometric data from a large population-based postal survey were validated with measured anthropometric data from the same individuals by means of receiver-operating characteristic curves and a BMI prediction model. Adult participants in a Scandinavian cohort study first recruited in 1990 and followed up twice since. Individuals aged 38-66 years with complete data for BMI (n 1580) and waist circumference (n 1017). Median BMI and waist circumference increased exponentially with increasing figural scales. Receiver-operating characteristic curve analyses showed a high predictive ability to identify individuals with BMI > 25·0 kg/m2 in both sexes. The optimal figural scales for identifying overweight or obese individuals with a correct detection rate were 4 and 5 in women, and 5 and 6 in men, respectively. The prediction model explained 74 % of the variance among women and 62 % among men. Predicted BMI differed only marginally from objectively measured BMI. Figural drawing scales explained a large part of the anthropometric variance in this population and showed a high predictive ability for identifying overweight/obese subjects. These figural scales can be used with confidence as proxies of BMI and waist circumference in settings where objective measures are not feasible.
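    Selecting an "optimal figural scale" from a ROC analysis can be sketched as follows. The abstract does not state which criterion was used, so Youden's J (sensitivity + specificity - 1), a common choice, is shown here as an assumption; the records are invented.

```python
# For each candidate figural-scale cutoff, classify subjects with
# scale >= cutoff as overweight (BMI > 25) and compute Youden's J.

def youden_optimal_scale(records):
    """records: (figural_scale, overweight_flag) pairs.
    Returns (best cutoff, its Youden's J)."""
    scales = sorted({s for s, _ in records})
    pos = sum(flag for _, flag in records)
    neg = len(records) - pos
    best_scale, best_j = None, -1.0
    for cutoff in scales:
        tp = sum(1 for s, flag in records if s >= cutoff and flag)
        tn = sum(1 for s, flag in records if s < cutoff and not flag)
        j = tp / pos + tn / neg - 1.0   # sensitivity + specificity - 1
        if j > best_j:
            best_scale, best_j = cutoff, j
    return best_scale, best_j

# Invented (figural scale, measured BMI > 25?) pairs.
records = [(1, 0), (2, 0), (3, 0), (4, 1), (5, 1), (6, 1), (3, 1), (4, 0)]
best = youden_optimal_scale(records)
```

    Sweeping the cutoff over all scale values traces the ROC curve; the reported sex-specific optimal scales (4-5 for women, 5-6 for men) correspond to the cutoffs maximizing such a criterion.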

  17. Section-Based Tree Species Identification Using Airborne LIDAR Point Cloud

    NASA Astrophysics Data System (ADS)

    Yao, C.; Zhang, X.; Liu, H.

    2017-09-01

    The application of LiDAR data in forestry initially focused on mapping forest communities, primarily for large-scale forest management and planning. With smaller-footprint and higher-sampling-density LiDAR data available, detecting individual overstory trees, estimating crown parameters, and identifying tree species have been demonstrated to be practicable. This paper proposes a section-based protocol for tree species identification, taking palm trees as an example. The section-based method detects objects through profiles along different directions, basically along the X-axis or Y-axis, and improves the utilization of spatial information to generate accurate results. Firstly, the tree points are separated from manmade-object points by decision-tree-based rules, and a Crown Height Model (CHM) is created by subtracting the Digital Terrain Model (DTM) from the Digital Surface Model (DSM). Then key points are calculated and extracted to locate individual trees, and specific tree parameters related to species information are estimated, such as crown height, crown radius, and cross point. Finally, with these parameters certain tree species can be identified. Compared to species information measured on the ground, the proportion of correctly identified trees across all plots reached 90.65 %. The identification results of this research demonstrate the ability to distinguish palm trees using LiDAR point clouds. Furthermore, with more prior knowledge, the section-based method enables trees to be classified into different classes.
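    The first processing steps described above (CHM = DSM minus DTM, then locating individual trees) can be sketched on toy grids. This is a minimal illustration only: the grids are invented, and treetop detection is reduced to a simple local-maximum test rather than the paper's section-based profiles.

```python
# CHM: per-cell canopy height above ground, from rasterized DSM and DTM grids.
def crown_height_model(dsm, dtm):
    return [[s - t for s, t in zip(srow, trow)]
            for srow, trow in zip(dsm, dtm)]

# Candidate treetops: cells strictly higher than all 8 neighbours and above
# a minimum-height threshold (to reject shrubs and ground noise).
def treetops(chm, min_height=2.0):
    tops = []
    rows, cols = len(chm), len(chm[0])
    for i in range(rows):
        for j in range(cols):
            h = chm[i][j]
            if h < min_height:
                continue
            neighbours = [chm[a][b]
                          for a in range(max(0, i - 1), min(rows, i + 2))
                          for b in range(max(0, j - 1), min(cols, j + 2))
                          if (a, b) != (i, j)]
            if all(h > n for n in neighbours):
                tops.append((i, j))
    return tops

dsm = [[10, 10, 10], [10, 18, 10], [10, 10, 10]]   # invented surface heights
dtm = [[8, 8, 8], [8, 8, 8], [8, 8, 8]]            # invented terrain heights
chm = crown_height_model(dsm, dtm)
```

    From each located treetop, crown radius and other species-related parameters would then be measured along the X- and Y-axis profiles.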

  18. Identifying novel phenotypes of vulnerability and resistance to activity-based anorexia in adolescent female rats.

    PubMed

    Barbarich-Marsteller, Nicole C; Underwood, Mark D; Foltin, Richard W; Myers, Michael M; Walsh, B Timothy; Barrett, Jeffrey S; Marsteller, Douglas A

    2013-11-01

    Activity-based anorexia is a translational rodent model that results in severe weight loss, hyperactivity, and voluntary self-starvation. The goal of our investigation was to identify vulnerable and resistant phenotypes of activity-based anorexia in adolescent female rats. Sprague-Dawley rats were maintained under conditions of restricted access to food (N = 64; or unlimited access, N = 16) until experimental exit, predefined as a target weight loss of 30-35% or meeting predefined criteria for animal health. Nonlinear mixed effects statistical modeling was used to describe wheel running behavior, time to event analysis was used to assess experimental exit, and a regressive partitioning algorithm was used to classify phenotypes. Objective criteria were identified for distinguishing novel phenotypes of activity-based anorexia, including a vulnerable phenotype that conferred maximal hyperactivity, minimal food intake, and the shortest time to experimental exit, and a resistant phenotype that conferred minimal activity and the longest time to experimental exit. The identification of objective criteria for defining vulnerable and resistant phenotypes of activity-based anorexia in adolescent female rats provides an important framework for studying the neural mechanisms that promote vulnerability to or protection against the development of self-starvation and hyperactivity during adolescence. Ultimately, future studies using these novel phenotypes may provide important translational insights into the mechanisms that promote these maladaptive behaviors characteristic of anorexia nervosa. Copyright © 2013 Wiley Periodicals, Inc.

  19. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Trial Calculation. Work Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-02-01

    The overall objective of the SFR Regulatory Technology Development Plan (RTDP) effort is to identify and address potential impediments to the SFR regulatory licensing process. In FY14, an analysis by Argonne identified the development of an SFR-specific MST methodology as an existing licensing gap with high regulatory importance and a potentially long lead-time to closure. This work was followed by an initial examination of the current state-of-knowledge regarding SFR source term development (ANL-ART-3), which reported several potential gaps. Among these were the potential inadequacies of current computational tools to properly model and assess the transport and retention of radionuclides during a metal fuel pool-type SFR core damage incident. The objective of the current work is to determine the adequacy of existing computational tools, and the associated knowledge database, for the calculation of an SFR MST. To accomplish this task, a trial MST calculation will be performed using available computational tools to establish their limitations with regard to relevant radionuclide release/retention/transport phenomena. The application of existing modeling tools will provide a definitive test to assess their suitability for an SFR MST calculation, while also identifying potential gaps in the current knowledge base and providing insight into open issues regarding regulatory criteria/requirements. The findings of this analysis will assist in determining future research and development needs.

  20. [Identification of ecological corridors and its importance by integrating circuit theory].

    PubMed

    Song, Li Li; Qin, Ming Zhou

    2016-10-01

    Landscape connectivity is considered an extraordinarily important factor affecting various ecological processes. The least-cost path (LCP) based on the minimum cumulative resistance model (MCRM) provides an efficient approach to identifying functional connectivity in heterogeneous landscapes, and has already been adopted in research on landscape functional connectivity assessment and ecological corridor simulation. The connectivity model based on circuit theory (CMCT) replaces the edges of graph theory with resistors, and cost distance with resistance distance, to measure functional connectivity in heterogeneous landscapes. Using the Linkage Mapper tool and Circuitscape software, a simulated landscape generated with SIMMAP 2.0 was taken as the study object in this article, with the aim of exploring how to integrate the MCRM with the CMCT to identify ecological corridors and the relative importance of landscape factors. The results showed that the two models had individual advantages and complemented each other. The MCRM could effectively identify least-cost corridors among habitats. The CMCT could effectively identify important landscape factors and pinch points, which strongly influence landscape connectivity. We also found that the position of a pinch point was not affected by corridor width, an obvious advantage when identifying the importance of corridors. The integrated method could provide a scientific basis for regional ecological protection planning and ecological corridor design.
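
The least-cost-path idea behind the MCRM can be sketched as Dijkstra's algorithm over a resistance raster. This is an illustrative sketch only: the grid values and function name are invented, and real corridor analyses use tools such as the Linkage Mapper and Circuitscape software named above:

```python
import heapq

# Dijkstra over a small resistance raster: the cumulative cost of a path
# is the sum of the resistance of each cell entered along it.
def least_cost(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + grid[nr][nc]  # cost of entering the next cell
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

# Invented raster: a high-resistance barrier forces the corridor to
# detour along the low-resistance bottom row.
resistance = [
    [1, 9, 1],
    [1, 9, 1],
    [1, 1, 1],
]
cost = least_cost(resistance, (0, 0), (0, 2))
```

Here the detour through the bottom row (total cost 6) beats the direct route across the barrier (cost 10), which is exactly the behavior the MCRM exploits to trace least-cost corridors between habitats.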
